# Remainder Theorem proof

1. Aug 8, 2009

### avec_holl

1. The problem statement, all variables and given/known data

Prove that for any polynomial function $$f$$ and number $$a$$, there exists a polynomial function $$g$$ and number $$b$$ such that: $$f(x) = (x-a)g(x) + b$$

2. Relevant equations

N/A

3. The attempt at a solution

Proof: For each natural number $$n$$, let $$P(n)$$ be the statement that for every polynomial function $$f$$ of degree $$n$$ there exist a polynomial function $$g$$ and a number $$\beta$$ such that,

$$f(x) = a_nx^n + \dots + a_0 = (x-\alpha)g(x) + \beta$$

Clearly $$P(1)$$ is true: taking $$g(x) = a_1$$ and $$\beta = a_1\alpha + a_0$$ gives $$a_1x + a_0 = (x-\alpha)(a_1) + \beta$$. Now suppose that $$P(k)$$ holds. To complete the proof, we need only show that $$P(k+1)$$ follows. Consider a polynomial function $$f$$ of degree $$k+1$$ and apply the induction hypothesis to its degree-$$k$$ tail $$a_kx^k + \dots + a_0$$, so that,

$$f(x) = a_{k+1}x^{k+1} + \dots + a_0 = a_{k+1}x^{k+1}\; + \;(x-\alpha)g(x)\; + \;b_1$$

$$f(x) = a_{k+1}(x)(x^k)\; + \;(x-\alpha)g(x)\; + \;b_1$$

Next, apply the result to the polynomials $$a_{k+1}x$$ and $$x^k$$; this is justified because we have already proved the case $$n=1$$ and assumed the case $$n=k$$. This yields,

$$f(x) = [(x-\alpha)(a_{k+1}) + b_2][(x-\alpha)h(x) + b_3] + (x-\alpha)g(x) + b_1$$

$$f(x) = a_{k+1}(x-\alpha)^2h(x) + b_2(x-\alpha)h(x) + a_{k+1}b_3(x-\alpha) + (x-\alpha)g(x) + b_1 + b_2b_3$$

$$f(x) = (x-\alpha)[a_{k+1}(x-\alpha)h(x) + b_2h(x) + g(x) + a_{k+1}b_3] + b_1+b_2b_3$$

Letting $$L(x) = a_{k+1}(x-\alpha)h(x) + b_2h(x) + g(x) + a_{k+1}b_3$$ and $$\beta = b_1 + b_2b_3$$ we find that,

$$f(x) = (x-\alpha)L(x) + \beta$$
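Not part of the proof, but as a numerical sanity check: the $$g$$ and $$\beta$$ above can be computed concretely by synthetic division (Horner's scheme), since dividing $$f$$ by $$(x-\alpha)$$ leaves remainder $$\beta = f(\alpha)$$. A minimal Python sketch (the function name and example coefficients are just for illustration):

```python
def divide_by_linear(coeffs, alpha):
    """Synthetic division of f by (x - alpha), where coeffs lists
    f's coefficients from highest degree down: [a_n, ..., a_1, a_0].
    Returns (g_coeffs, beta) with f(x) = (x - alpha) * g(x) + beta."""
    acc = [coeffs[0]]
    for a in coeffs[1:]:
        acc.append(acc[-1] * alpha + a)
    return acc[:-1], acc[-1]  # quotient coefficients, remainder

# Example: f(x) = 2x^3 - 3x + 5 divided by (x - 2)
g, beta = divide_by_linear([2, 0, -3, 5], 2)
# g = [2, 4, 5], i.e. g(x) = 2x^2 + 4x + 5, and beta = 15 = f(2)
```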

Is this a valid proof? It seems awfully fishy . . . if anyone could point out mistakes or make suggestions, it would be much appreciated. Thanks!

2. Aug 9, 2009

### Elucidus

The statement you have is not the usual statement of the Remainder Theorem that I am familiar with. I usually see it stated as:

For any polynomial P and real number a, if r is the remainder when P is divided by (x - a), then r = P(a).

What you stated is a specific instance of the Division Algorithm for Polynomials. But your proof seems to have met the challenge of the original question regardless. Induction works, but if you are assuming the Division Algorithm for Polynomials is already established, then you can prove your statement more directly: take d(x) = x - a, so that r must be a constant b, and f(x) = (x - a)q(x) + b at once.

For reference:

DIVISION ALGORITHM FOR POLYNOMIALS
Given any two polynomials f and d (d not identically 0), there exist unique polynomials q and r so that

$$f(x)=d(x)\cdot{q(x)}+r(x)$$

where either $r(x) = 0$ or $\deg(r) < \deg(d)$.

EDIT: Changed $\leq$ to < in previous line.
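To illustrate the algorithm concretely (just a sketch, not library code; the function name is made up): polynomial long division produces q and r directly from the coefficient lists.

```python
def poly_divmod(f, d):
    """Polynomial long division with coefficients listed from highest
    degree down. Returns (q, r) such that f = d*q + r and either
    r == [0] or deg(r) < deg(d). Assumes d[0] != 0."""
    f = list(f)  # working copy; whatever is left over is the remainder
    q = []
    while len(f) >= len(d):
        c = f[0] / d[0]          # next quotient coefficient
        q.append(c)
        for i in range(len(d)):  # subtract c * d, aligned at the front
            f[i] -= c * d[i]
        f.pop(0)                 # leading term has cancelled
    return (q or [0]), (f or [0])

# Example: (x^3 + 2x + 1) / (x^2 + 1) gives q(x) = x, r(x) = x + 1
q, r = poly_divmod([1, 0, 2, 1], [1, 0, 1])
```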

--Elucidus

Last edited: Aug 9, 2009
3. Aug 11, 2009

### avec_holl

Thanks for your reply! Sorry about the misnomer; my textbook refers to this result as the Remainder Theorem, but I guess it isn't.