Linear Independence of Polynomials

psholtz

Homework Statement


Given a set of polynomials in x:

x^{r_1}, x^{r_2},...,x^{r_n}

where r_i \neq r_j for all i \neq j (in other words, the exponents are distinct), and where the functions are defined on an interval (a,b) with 0 < a < x < b (in particular, x \neq 0), I'd like to show that this set of functions is linearly independent.

Homework Equations


See above.


The Attempt at a Solution


I would proceed by induction.

Clearly, for n=1, this is true, since:

a_1 x^{r_1} = 0

necessarily implies (since x \neq 0) that a_1=0.

Suppose now that it is true for n-1 polynomials, and consider the expression relating the n polynomials:

a_1x^{r_1} + a_2x^{r_2} + ... + a_nx^{r_n} = 0

Divide by x^{r_1}:

a_1 + a_2x^{r_2-r_1} + ... + a_nx^{r_n-r_1} = 0

and differentiate:

a_2(r_2-r_1)x^{r_2-r_1-1} + ... + a_n(r_n-r_1)x^{r_n-r_1-1} = 0

But now we have a linear combination of n-1 polynomials, and since x \neq 0 and since r_i \neq r_j for all i \neq j, we must necessarily have (based on our construction/presumption) that:

a_2 = ... = a_n = 0

Therefore, the expression:

a_1x^{r_1} + ... + a_nx^{r_n} = 0

necessarily implies that:

a_2 = ... = a_n = 0

So the expression:

a_1x^{r_1} + ... + a_nx^{r_n} = 0

now reduces to:

a_1x^{r_1} = 0

but, as we showed before, this necessarily implies that a_1=0. Hence

a_1x^{r_1} + ... + a_nx^{r_n} = 0

implies that a_1=...=a_n=0, and so the polynomials are linearly independent.
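As a numerical sanity check (a sketch in pure Python; the exponents and sample points below are arbitrary illustrative choices, not part of the problem): if the functions x^{r_1},...,x^{r_n} were linearly dependent on (a,b), then the matrix M with entries M[i][j] = x_i^{r_j} would be singular for every choice of distinct sample points x_i in (a,b). Finding one choice of points where det(M) is nonzero therefore supports independence.

```python
# Sanity check: build the "generalized Vandermonde" matrix M[i][j] = x_i^{r_j}
# for distinct real exponents r_j and distinct sample points x_i in (a,b).
# A nonzero determinant means only the trivial combination vanishes at these points.

def det(m):
    """Determinant by Gaussian elimination with partial pivoting."""
    m = [row[:] for row in m]          # work on a copy
    n = len(m)
    d = 1.0
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        if abs(m[pivot][col]) < 1e-12:
            return 0.0                 # (numerically) singular
        if pivot != col:
            m[col], m[pivot] = m[pivot], m[col]
            d = -d
        d *= m[col][col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n):
                m[r][c] -= f * m[col][c]
    return d

exponents = [-2.0, 0.5, 1.0, 3.0]      # distinct real powers, as in the problem
points = [0.5, 1.0, 1.5, 2.0]          # distinct points with 0 < a < x < b
M = [[x ** r for r in exponents] for x in points]
print(abs(det(M)))                      # nonzero, consistent with independence
```

This is only a spot check at finitely many points, of course, not a proof; but a singular M here would have been a counterexample.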

What I'm wondering is: is this line of reasoning correct?
 
One technical problem with the proof is that you're assuming r_1 is the smallest r_i. If it's not, when you divide by x^{r_1}, you don't end up with a polynomial.

What I think is more serious is that the induction hypothesis says that x^{r_1}, x^{r_2}, ..., x^{r_{n-1}} are independent. That's not the same as saying any n-1 powers of x are independent, and in particular, it doesn't mean that x^{r_2-r_1-1}, x^{r_3-r_1-1}, ..., x^{r_n-r_1-1} are necessarily independent.
 
vela said:
One technical problem with the proof is that you're assuming r_1 is the smallest r_i. If it's not, when you divide by x^{r_1}, you don't end up with a polynomial.
Good point, though the issue may be more an ambiguity in my use of the word "polynomial"...

What I mean is simply powers of x raised to some real number r_i. After all, x^{-2} and x^{-3} are linearly independent also, yes?

What I think is more serious is that the induction hypothesis says that x^{r_1}, x^{r_2}, ..., x^{r_{n-1}} are independent. That's not the same as saying any n-1 powers of x are independent, and in particular, it doesn't mean that x^{r_2-r_1-1}, x^{r_3-r_1-1}, ..., x^{r_n-r_1-1} are necessarily independent.
True, but we could just multiply by x^{r_1+1} to get back to the induction hypothesis.

Let me word it more carefully. Suppose we take as our induction hypothesis that:

a_1x^{r_1} + a_2x^{r_2} + ... + a_{n-1}x^{r_{n-1}} = 0

implies that

a_1 = ... = a_{n-1} = 0

Now consider the expression:

a_1x^{r_1} + a_2x^{r_2} + ... + a_nx^{r_n} = 0

Divide by x^{r_n}, to get:

a_1x^{r_1-r_n} + a_2x^{r_2-r_n} + ... + a_{n-1}x^{r_{n-1} - r_n} + a_n = 0

Differentiate:

a_1(r_1-r_n)x^{r_1-r_n-1} + a_2(r_2-r_n)x^{r_2-r_n-1} + ... + a_{n-1}(r_{n-1} - r_n)x^{r_{n-1}-r_n-1} = 0

Multiply by x^{r_n+1} to get:

a_1(r_1-r_n)x^{r_1} + a_2(r_2-r_n)x^{r_2} + ... + a_{n-1}(r_{n-1} - r_n)x^{r_{n-1}} = 0

We now have a linear combination of the terms x^{r_1},x^{r_2},...,x^{r_{n-1}}, and this linear combination is equal to 0. Now the induction hypothesis says that any such linear combination implies that all the coefficients are zero (i.e., the terms of the combination are linearly independent) ...

We can, for instance, write:

a_i' = a_i(r_i-r_n)

to get:

a_1'x^{r_1} + a_2'x^{r_2} + ... + a_{n-1}'x^{r_{n-1}} = 0

which, by the induction hypothesis, implies that:

a_1' = ... = a_{n-1}' = 0

Because a_i' = a_i(r_i-r_n) = 0 and because r_i \neq r_n for 1 \leq i < n, we conclude that a_i = 0 for 1 \leq i < n. Thus, if we are given:

a_1x^{r_1} + a_2x^{r_2} + ... + a_nx^{r_n} = 0

we can conclude that:

a_1 = ... = a_{n-1} = 0

and so that

a_nx^{r_n} = 0

But since, as already argued above, x \neq 0, we must have a_n=0. Thus, if

a_1x^{r_1} + ... + a_nx^{r_n} = 0

then a_1 = ... = a_n = 0, and the functions are linearly independent.
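The divide-differentiate-multiply manipulation at the heart of this step can also be checked numerically (a sketch; the coefficients and exponents below are arbitrary choices): dividing f(x) = sum a_i x^{r_i} by x^{r_n}, differentiating, and multiplying by x^{r_n+1} should reproduce sum_{i<n} a_i (r_i - r_n) x^{r_i}, since the a_n term becomes a constant and drops out.

```python
# Numeric check of the induction step: for f(x) = sum_i a_i * x^{r_i},
#   x^{r_n+1} * d/dx [ f(x) / x^{r_n} ]  ==  sum_{i<n} a_i * (r_i - r_n) * x^{r_i}.
# Coefficients, exponents, and the test point are arbitrary illustrative choices.

r = [-1.0, 0.5, 2.0, 3.0]      # distinct real exponents; r_n is the last one
a = [1.3, -0.7, 2.1, 0.4]      # arbitrary coefficients
rn = r[-1]

def f(x):
    return sum(ai * x ** ri for ai, ri in zip(a, r))

def g(x):
    return f(x) / x ** rn      # step 1: divide by x^{r_n}

def reduced(x, h=1e-6):
    # steps 2-3: central-difference derivative of g, then multiply by x^{r_n+1}
    return (g(x + h) - g(x - h)) / (2 * h) * x ** (rn + 1)

def expected(x):
    # the claimed result: the a_n term has vanished
    return sum(ai * (ri - rn) * x ** ri for ai, ri in zip(a[:-1], r[:-1]))

x0 = 1.7                        # any point with x > 0
print(abs(reduced(x0) - expected(x0)))   # ~0, up to finite-difference error
```

The agreement (up to the finite-difference approximation) illustrates why the reduced combination involves exactly the n-1 original powers x^{r_1},...,x^{r_{n-1}}, which is what lets the induction hypothesis apply.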
 
Looks okay to me now.
 