How Do You Prove Equivalence of Two Polynomials?

jeremy22511
Can somebody prove the equivalence statement for two real polynomials in one variable ##x## for me? My math teacher just told us to remember it as a definition, so I never got a proof for it; I attempted to prove it myself and ended up confusing myself with a lot of symbols.

So, can somebody help me with this?

Thanks
Jeremy
 
So, what exactly does this "equivalence statement" state?
 
The only thing I can think of is that if two polynomials are equal for all ##x##, then they have the same degree and corresponding coefficients are equal.

If ##a_0 + a_1x + a_2x^2 + \cdots + a_nx^n = b_0 + b_1x + b_2x^2 + \cdots + b_nx^n## for all ##x##, then we must have
##(a_0 - b_0) + (a_1 - b_1)x + (a_2 - b_2)x^2 + \cdots + (a_n - b_n)x^n = 0##, so it is sufficient to show that if
##a_0 + a_1x + a_2x^2 + \cdots + a_nx^n = 0## for all ##x##, then ##a_0 = a_1 = a_2 = \cdots = a_n = 0##. That is, prove that the functions ##1, x, x^2, \dots, x^n## are "independent".
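To see the statement concretely, here is a minimal sympy sketch (sympy and the specific example are my own choices, not anything from the posts above): two expressions that agree for every ##x## expand to the same coefficient list, and their difference is the zero polynomial.

```python
# Sketch only: the equivalence statement for a concrete pair of polynomials.
from sympy import symbols, Poly, expand

x = symbols('x')
p = Poly(expand((x + 1)**2), x)      # x^2 + 2x + 1
q = Poly(x**2 + 2*x + 1, x)

print(p.all_coeffs())                # [1, 2, 1]
print(q.all_coeffs())                # [1, 2, 1]
print((p - q).all_coeffs())          # [0] -- the difference is the zero polynomial
```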

One way to do that is to take ##n+1## different values for ##x##, say ##x = 0, 1, 2, \dots, n##, to get ##n+1## equations in the coefficients and show that those equations are independent: ##x = 0## gives ##a_0 = 0##, so that's easy; ##x = 1## gives ##a_0 + a_1 + a_2 + \cdots + a_n = 0##; ##x = 2## gives ##a_0 + 2a_1 + 4a_2 + \cdots + 2^n a_n = 0##; and so on.
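Here is a small numpy sketch of that argument (numpy and the choice ##n = 4## are just illustrative assumptions): evaluating the polynomial at ##x = 0, 1, \dots, n## produces a Vandermonde system, its matrix is invertible because the evaluation points are distinct, so the only possible set of coefficients is all zeros.

```python
# Sketch only: the "plug in n+1 values" argument as a linear system V a = 0.
import numpy as np

n = 4
xs = np.arange(n + 1)                        # x = 0, 1, ..., n
V = np.vander(xs, n + 1, increasing=True)    # row i = [1, x_i, x_i^2, ..., x_i^n]

print(np.linalg.det(V) != 0)                 # True: the system has full rank
a = np.linalg.solve(V, np.zeros(n + 1))
print(a)                                     # only solution: every coefficient is 0
```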

More sophisticated but simpler is to note that if ##a_0 + a_1x + a_2x^2 + \cdots + a_nx^n = 0## for all ##x##, then it is a constant, so its derivative, ##a_1 + 2a_2x + \cdots + na_nx^{n-1}##, is also equal to 0 for all ##x##, and so that derivative's derivative is 0 for all ##x##, etc. Setting ##x = 0## in the formula for the polynomial and each of its derivatives gives ##a_0 = 0##, ##a_1 = 0##, ##2a_2 = 0##, ..., ##n!\,a_n = 0##, which again says that all coefficients are 0.
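A sympy sketch of the derivative argument (again just an illustration, assuming a generic degree-3 polynomial with symbolic coefficients): differentiating ##k## times and setting ##x = 0## isolates ##k!\,a_k##, so an identically zero polynomial forces every ##a_k## to be 0.

```python
# Sketch only: the k-th derivative at x = 0 equals k! * a_k.
from sympy import symbols, diff, factorial

x = symbols('x')
a = symbols('a0:4')                          # symbolic coefficients a0, a1, a2, a3
p = sum(a[k] * x**k for k in range(4))

for k in range(4):
    value_at_0 = diff(p, x, k).subs(x, 0)
    print(k, value_at_0, factorial(k) * a[k])   # the two expressions agree
```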
 
Thanks. That really helped.

Jeremy
 