Linear Independence of Polynomials

Homework Help Overview

The discussion revolves around the linear independence of a set of polynomials defined as \(x^{r_1}, x^{r_2},...,x^{r_n}\) with distinct powers \(r_i\). The original poster seeks to demonstrate that these polynomials are linearly independent on an interval where \(0 < a < x < b\) and \(x \neq 0\).

Discussion Character

  • Exploratory, assumption-checking

Approaches and Questions Raised

  • The original poster attempts to use mathematical induction to prove the linear independence of the polynomials.
  • Some participants question the validity of the induction hypothesis and the assumptions made regarding the smallest power of \(x\).
  • Others suggest alternative approaches to the proof, including rephrasing the induction hypothesis and considering different polynomial combinations.

Discussion Status

The discussion is ongoing, with participants providing feedback on the original proof attempt. Some have expressed concerns about the assumptions made, while others have offered clarifications and alternative reasoning. There appears to be a productive exchange of ideas, though no consensus has been reached yet.

Contextual Notes

Participants have noted potential ambiguities in the definitions used, particularly regarding the term "polynomial" and the implications of the induction hypothesis. The discussion also highlights the importance of the distinctness of the powers \(r_i\) in establishing linear independence.

psholtz

Homework Statement


Given a set of polynomials in x:

[tex]x^{r_1}, x^{r_2},...,x^{r_n}[/tex]

where [tex]r_i \neq r_j[/tex] for all [tex]i \neq j[/tex] (in other words, the powers are distinct), and where the functions are defined on an interval [tex](a,b)[/tex] with [tex]0 < a < x < b[/tex] (in particular, [tex]x \neq 0[/tex]), I'd like to show that this set of functions is linearly independent.

Homework Equations


See above.


The Attempt at a Solution


I would proceed by induction.

Clearly, for n=1, this is true, since:

[tex]a_1 x^{r_1} = 0[/tex]

necessarily implies (since [tex]x \neq 0[/tex]) that [tex]a_1=0[/tex].

Suppose now that it is true for n-1 polynomials, and consider the expression relating the n polynomials:

[tex]a_1x^{r_1} + a_2x^{r_2} + ... + a_nx^{r_n} = 0[/tex]

Divide by [tex]x^{r_1}[/tex]:

[tex]a_1 + a_2x^{r_2-r_1} + ... + a_nx^{r_n-r_1} = 0[/tex]

and differentiate:

[tex]a_2(r_2-r_1)x^{r_2-r_1-1} + ... + a_n(r_n-r_1)x^{r_n-r_1-1} = 0[/tex]

But now we have a linear combination of n-1 polynomials, and since [tex]x \neq 0[/tex] and since [tex]r_i \neq r_j[/tex] for all [tex]i \neq j[/tex], we must necessarily have (based on our construction/presumption) that:

[tex]a_2 = ... = a_n = 0[/tex]

Therefore, the expression:

[tex]a_1x^{r_1} + ... + a_nx^{r_n} = 0[/tex]

necessarily implies that:

[tex]a_2 = ... = a_n = 0[/tex]

So the expression:

[tex]a_1x^{r_1} + ... + a_nx^{r_n} = 0[/tex]

now reduces to:

[tex]a_1x^{r_1} = 0[/tex]

but, as we showed before, this necessarily implies that [tex]a_1=0[/tex]. Hence

[tex]a_1x^{r_1} + ... + a_nx^{r_n} = 0[/tex]

implies that [tex]a_1=...=a_n=0[/tex], and the polynomials are linearly independent.

What I'm wondering is: is this line of reasoning correct?
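Not part of the proof, but a quick numerical sanity check of the claim is easy to run: if the functions [tex]x^{r_1},...,x^{r_n}[/tex] were linearly dependent on [tex](a,b)[/tex], then the matrix of their values at any [tex]n[/tex] sample points in the interval would be singular. The exponents and sample points below are arbitrary choices for illustration.

```python
import numpy as np

# Sanity check (not a proof): build M[i,j] = x_i^{r_j} from n sample
# points x_i in (a, b) with a > 0.  Linear dependence of the functions
# would force this matrix to be singular; a clearly nonzero determinant
# is consistent with independence.
r = [0.5, 1.0, 2.0, 3.7]             # distinct real exponents (arbitrary)
x = np.array([0.3, 0.9, 1.4, 2.2])   # distinct points in (a, b) (arbitrary)

M = np.column_stack([x**ri for ri in r])
print(abs(np.linalg.det(M)) > 1e-12)   # nonsingular: only the trivial combination vanishes
```

Of course, nonsingularity at one set of sample points only rules out dependence; the induction argument is still needed for the general statement.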
 
One technical problem with the proof is that you're assuming [tex]r_1[/tex] is the smallest [tex]r_i[/tex]. If it's not, when you divide by [tex]x^{r_1}[/tex], you don't end up with a polynomial.

What I think is more serious is that the induction hypothesis says that [tex]x^{r_1}, x^{r_2}, ..., x^{r_{n-1}}[/tex] are independent. That's not the same as saying any n-1 powers of x are independent, and in particular, it doesn't mean that [tex]x^{r_2-r_1-1}, x^{r_3-r_1-1}, ..., x^{r_n-r_1-1}[/tex] are necessarily independent.
 
vela said:
One technical problem with the proof is that you're assuming [tex]r_1[/tex] is the smallest [tex]r_i[/tex]. If it's not, when you divide by [tex]x^{r_1}[/tex], you don't end up with a polynomial.
Good point, but that may be due more to ambiguity in my use of the word "polynomial"...

What I mean is simply x raised to some real power [tex]r_i[/tex]. After all, [tex]x^{-2}[/tex] and [tex]x^{-3}[/tex] are linearly independent also, yes?
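(That side claim is easy to check numerically: if [tex]c_1 x^{-2} + c_2 x^{-3}[/tex] vanished identically away from 0, it would vanish at any two sample points, and the resulting 2x2 system would have to be singular. The sample points below are arbitrary.)

```python
import numpy as np

# Check that x^-2 and x^-3 are linearly independent away from x = 0:
# the 2x2 matrix of their values at two sample points is nonsingular,
# so c1*x^-2 + c2*x^-3 = 0 at both points forces c1 = c2 = 0.
xs = np.array([1.0, 2.0])                  # arbitrary points with x != 0
M = np.column_stack([xs**-2.0, xs**-3.0])
print(abs(np.linalg.det(M)) > 1e-12)       # True: only the trivial combination
```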

What I think is more serious is that the induction hypothesis says that [tex]x^{r_1}, x^{r_2}, ..., x^{r_{n-1}}[/tex] are independent. That's not the same as saying any n-1 powers of x are independent, and in particular, it doesn't mean that [tex]x^{r_2-r_1-1}, x^{r_3-r_1-1}, ..., x^{r_n-r_1-1}[/tex] are necessarily independent.
True, but we could just multiply by [tex]x^{r_1+1}[/tex] to get back to the induction hypothesis.

Let me word it more carefully. Suppose we take as our induction hypothesis that:

[tex]a_1x^{r_1} + a_2x^{r_2} + ... + a_{n-1}x^{r_{n-1}} = 0[/tex]

implies that

[tex]a_1 = ... = a_{n-1} = 0[/tex]

Now consider the expression:

[tex]a_1x^{r_1} + a_2x^{r_2} + ... + a_nx^{r_n} = 0[/tex]

Divide by [tex]x^{r_n}[/tex], to get:

[tex]a_1x^{r_1-r_n} + a_2x^{r_2-r_n} + ... + a_{n-1}x^{r_{n-1} - r_n} + a_n = 0[/tex]

Differentiate:

[tex]a_1(r_1-r_n)x^{r_1-r_n-1} + a_2(r_2-r_n)x^{r_2-r_n-1} + ... + a_{n-1}(r_{n-1} - r_n)x^{r_{n-1}-r_n-1} = 0[/tex]

Multiply by [tex]x^{r_n+1}[/tex] to get:

[tex]a_1(r_1-r_n)x^{r_1} + a_2(r_2-r_n)x^{r_2} + ... + a_{n-1}(r_{n-1} - r_n)x^{r_{n-1}} = 0[/tex]

We now have a linear combination of the terms [tex]x^{r_1},x^{r_2},...,x^{r_{n-1}}[/tex], and this linear combination is equal to 0. Now the induction hypothesis says that any such linear combination implies that all the coefficients are zero (i.e., the terms of the combination are linearly independent) ...

We can, for instance, write:

[tex]a_i' = a_i(r_i-r_n)[/tex]

to get:

[tex]a_1'x^{r_1} + a_2'x^{r_2} + ... + a_{n-1}'x^{r_{n-1}} = 0[/tex]

which, by the induction hypothesis, implies that:

[tex]a_1' = ... = a_{n-1}' = 0[/tex]

Because [tex]a_i' = a_i(r_i-r_n) = 0[/tex] and because [tex]r_i \neq r_n[/tex] for [tex]1 \leq i < n[/tex], we conclude that [tex]a_i = 0[/tex] for [tex]1 \leq i < n[/tex]. Thus, if we are given:

[tex]a_1x^{r_1} + a_2x^{r_2} + ... + a_nx^{r_n} = 0[/tex]

we can conclude that:

[tex]a_1 = ... = a_{n-1} = 0[/tex]

and so that

[tex]a_nx^{r_n} = 0[/tex]

But, as already argued above, since [tex]x \neq 0[/tex], we must have [tex]a_n=0[/tex]. Thus, if

[tex]a_1x^{r_1} + ... + a_nx^{r_n} = 0[/tex]

then [tex]a_1 = ... = a_n = 0[/tex], and the functions are linearly independent.
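The core algebraic manipulation in this version (divide by [tex]x^{r_n}[/tex], differentiate, multiply by [tex]x^{r_n+1}[/tex]) can also be verified symbolically. This is only an illustration for n = 3, not the general proof; the coefficients and exponents are left symbolic.

```python
import sympy as sp

# Symbolic check of the induction-step manipulation for n = 3:
# dividing by x^{r_3}, differentiating, then multiplying by x^{r_3 + 1}
# should kill the x^{r_3} term and rescale a_i by (r_i - r_3) for i < 3.
x = sp.symbols('x', positive=True)
a1, a2, a3, r1, r2, r3 = sp.symbols('a1 a2 a3 r1 r2 r3')

f = a1*x**r1 + a2*x**r2 + a3*x**r3
g = sp.expand(x**(r3 + 1) * sp.diff(f / x**r3, x))

expected = a1*(r1 - r3)*x**r1 + a2*(r2 - r3)*x**r2
print((g - expected).equals(0))   # the x^{r_3} term has dropped out
```

Since [tex]r_i \neq r_3[/tex] for i < 3, the rescaled coefficients vanish exactly when the original ones do, which is what the argument above uses.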
 
Looks okay to me now.
 
