naptor
There is a theorem in algebra, whose name I don't recall, that states that given a polynomial and its roots, I can easily factor it. For instance:
[itex]p(x)=x^2-36[/itex] ,
assuming that p(x) is a real function,
[itex]p(x)=0 \Leftrightarrow x=6 \text{ or } x=-6[/itex]
then p(x) can be written as :
[itex]p(x)=(x-6)(x+6)[/itex]
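As a quick sanity check (a minimal Python sketch, not part of the original question), the factored form agrees with [itex]p(x)[/itex] at every sample point:

```python
# Check that x^2 - 36 equals (x - 6)(x + 6) at several sample points.
for x in [-10, -6, 0, 6, 10]:
    assert x**2 - 36 == (x - 6) * (x + 6), x
print("x^2 - 36 == (x - 6)(x + 6) at all sample points")
```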
I was trying to derive the good old difference-of-two-cubes formula using this factorization technique:
Let [itex]p(x)=x^3-a^3[/itex] and [itex]p(x)=0 \Leftrightarrow x=a[/itex]; then [itex]p(x)=(x-a)(x-a)(x-a)=(x-a)(x^2-2xa+a^2)[/itex], while the formula should be [itex](x-a)(x^2+ax+a^2)[/itex].
What's wrong?
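Numerically one can at least confirm which factorization matches [itex]x^3-a^3[/itex] (a hedged Python sketch with an arbitrary choice [itex]a=2[/itex], just to compare the two candidates at sample points):

```python
# Evaluate both candidate factorizations of p(x) = x^3 - a^3 at sample points.
a = 2
for x in [-3, -1, 1, 3, 5]:
    target = x**3 - a**3
    # The standard formula matches at every sample point:
    assert (x - a) * (x**2 + a * x + a**2) == target, x
# (x - a)^3 does not equal x^3 - a^3 in general, e.g. at x = 1:
assert (1 - a) ** 3 != 1 - a**3
print("(x - a)(x^2 + ax + a^2) matches x^3 - a^3; (x - a)^3 does not")
```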