There is a theorem in algebra, whose name I don't recall, stating that given a polynomial and its roots I can easily factor it. So, for instance:
p(x)=x^2-36 ,
assuming that p(x) is a real function,
p(x)=0 \Leftrightarrow x=\pm 6
then p(x) can be written as:
p(x)=(x-6)(x+6)
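Not part of the original question, but as a quick numerical sanity check of this factorization (helper names `p` and `factored` are mine), one can compare the two expressions at a few sample points:

```python
# Check that x^2 - 36 and (x-6)(x+6) agree at several points.
def p(x):
    return x**2 - 36

def factored(x):
    return (x - 6) * (x + 6)

for x in [-7, -6, 0, 3, 6, 10]:
    assert p(x) == factored(x)
```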
I was trying to derive the good old difference-of-two-cubes formula using this factorization technique:
let p(x)=x^3-a^3 ; then p(x)=0 \Leftrightarrow x=a (over the reals), so p(x)=(x-a)(x-a)(x-a)=(x-a)(x^2-2ax+a^2) , while the formula should be (x-a)(x^2+ax+a^2)
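Again not in the original post, but the discrepancy can be seen numerically (helper names are mine; `attempted` is the (x-a)(x^2-2ax+a^2) expression from above, `standard` is the textbook one):

```python
# Compare x^3 - a^3 against the two candidate factorizations at sample points.
def p(x, a):
    return x**3 - a**3

def attempted(x, a):
    # (x-a)(x^2 - 2ax + a^2), i.e. the product used in the derivation above
    return (x - a) * (x**2 - 2*a*x + a**2)

def standard(x, a):
    # (x-a)(x^2 + ax + a^2), the textbook difference-of-cubes factorization
    return (x - a) * (x**2 + a*x + a**2)

for x in [0, 1, 2, 5]:
    for a in [1, 3]:
        assert standard(x, a) == p(x, a)

# The attempted factorization disagrees: e.g. attempted(2, 1) == 1 but p(2, 1) == 7.
```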
What's wrong?