Linear Independence of Polynomials

The discussion focuses on proving the linear independence of a set of distinct polynomials of the form x^{r_1}, x^{r_2}, ..., x^{r_n} on the interval (a, b) where x ≠ 0. The proof uses mathematical induction, starting with the base case of one polynomial, which is trivially independent. The main argument involves showing that if a linear combination of these polynomials equals zero, then all coefficients must be zero, thus confirming their independence. Concerns are raised about the assumption that the smallest power is used in the induction step, but adjustments are made to address this by manipulating the expressions correctly. Ultimately, the conclusion is reached that the set of polynomials is indeed linearly independent.
psholtz

Homework Statement


Given a set of polynomials in x:

x^{r_1}, x^{r_2},...,x^{r_n}

where r_i \neq r_j for all i \neq j (in other words, the powers are distinct), and where the functions are defined on an interval (a,b) with 0 < a < x < b (in particular, x \neq 0), I'd like to show that this set of functions is linearly independent.

Homework Equations


See above.


The Attempt at a Solution


I would proceed by induction.

Clearly, for n=1, this is true, since:

a_1 x^{r_1} = 0

necessarily implies (since x \neq 0) that a_1=0.

Suppose now that it is true for n-1 polynomials, and consider the expression relating the n polynomials:

a_1x^{r_1} + a_2x^{r_2} + ... + a_nx^{r_n} = 0

Divide by x^{r_1}:

a_1 + a_2x^{r_2-r_1} + ... + a_nx^{r_n-r_1} = 0

and differentiate:

a_2(r_2-r_1)x^{r_2-r_1-1} + ... + a_n(r_n-r_1)x^{r_n-r_1-1} = 0

But now we have a linear combination of n-1 such power functions, and since x \neq 0 and since r_i \neq r_j for all i \neq j, we must necessarily have (by the induction hypothesis) that:

a_2 = ... = a_n = 0

Therefore, the expression:

a_1x^{r_1} + ... + a_nx^{r_n} = 0

necessarily implies that:

a_2 = ... = a_n = 0

So the expression:

a_1x^{r_1} + ... + a_nx^{r_n} = 0

now reduces to:

a_1x^{r_1} = 0

but, as we showed before, this necessarily implies that a_1=0. Hence, the expression:

a_1x^{r_1} + ... + a_nx^{r_n} = 0

implies that a_1=...=a_n=0, and so the polynomials are linearly independent.

What I'm wondering is: is this line of reasoning correct?
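As a quick numerical sanity check (not a proof, and the exponents and interval below are just illustrative choices), one can sample the functions x^{r_i} at n distinct points in (a,b) and verify that the resulting matrix has full rank, i.e. that no nonzero coefficient vector annihilates all the samples:

```python
# Sanity check: sample x^{r_i} at n distinct points in (a, b) = (1, 2)
# and confirm the sample matrix M has full rank, so the only solution
# of M a = 0 is a = 0, consistent with linear independence on (a, b).

def rank(M, eps=1e-9):
    """Rank of a small dense matrix via Gaussian elimination with pivoting."""
    M = [row[:] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        # pick the largest pivot in column c at or below row r
        pivot = max(range(r, rows), key=lambda i: abs(M[i][c]), default=None)
        if pivot is None or abs(M[pivot][c]) < eps:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(rows):
            if i != r:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

exponents = [0.5, 2.0, -1.0, 3.7]                         # distinct real r_i
points = [1.0 + 0.25 * j for j in range(len(exponents))]  # samples in (1, 2)
M = [[x ** e for e in exponents] for x in points]
print(rank(M))  # expect 4: no nonzero coefficients kill all samples
```

Note this checks independence only at finitely many points, which is weaker than independence as functions on (a,b), but a full-rank sample matrix already rules out any nonzero coefficient vector.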
 
One technical problem with the proof is that you're assuming r_1 is the smallest r_i. If it's not, when you divide by x^{r_1}, you don't end up with a polynomial.

What I think is more serious is that the induction hypothesis says that x^{r_1}, x^{r_2}, ..., x^{r_{n-1}} are independent. That's not the same as saying any n-1 powers of x are independent, and in particular, it doesn't mean that x^{r_2-r_1-1}, x^{r_3-r_1-1}, ..., x^{r_n-r_1-1} are necessarily independent.
 
vela said:
One technical problem with the proof is that you're assuming r_1 is the smallest r_i. If it's not, when you divide by x^{r_1}, you don't end up with a polynomial.
Good point, though that may be due more to ambiguity in my use of the word "polynomial"...

What I mean is simply powers of x raised to some real number r_i. After all, x^{-2} and x^{-3} are linearly independent also, yes?

What I think is more serious is that the induction hypothesis says that x^{r_1}, x^{r_2}, ..., x^{r_{n-1}} are independent. That's not the same as saying any n-1 powers of x are independent, and in particular, it doesn't mean that x^{r_2-r_1-1}, x^{r_3-r_1-1}, ..., x^{r_n-r_1-1} are necessarily independent.
True, but we could just multiply by x^{r_1+1} to get back to the induction hypothesis.

Let me word it more carefully. Suppose we take as our induction hypothesis that:

a_1x^{r_1} + a_2x^{r_2} + ... + a_{n-1}x^{r_{n-1}} = 0

implies that

a_1 = ... = a_{n-1} = 0

Now consider the expression:

a_1x^{r_1} + a_2x^{r_2} + ... + a_nx^{r_n} = 0

Divide by x^{r_n}, to get:

a_1x^{r_1-r_n} + a_2x^{r_2-r_n} + ... + a_{n-1}x^{r_{n-1} - r_n} + a_n = 0

Differentiate:

a_1(r_1-r_n)x^{r_1-r_n-1} + a_2(r_2-r_n)x^{r_2-r_n-1} + ... + a_{n-1}(r_{n-1} - r_n)x^{r_{n-1}-r_n-1} = 0

Multiply by x^{r_n+1} to get:

a_1(r_1-r_n)x^{r_1} + a_2(r_2-r_n)x^{r_2} + ... + a_{n-1}(r_{n-1} - r_n)x^{r_{n-1}} = 0

We now have a linear combination of the terms x^{r_1},x^{r_2},...,x^{r_{n-1}}, and this linear combination is equal to 0. Now the induction hypothesis says that any such linear combination implies that all the coefficients are zero (i.e., the terms of the combination are linearly independent) ...

We can, for instance, write:

a_i' = a_i(r_i-r_n)

to get:

a_1'x^{r_1} + a_2'x^{r_2} + ... + a_{n-1}'x^{r_{n-1}} = 0

which, by the induction hypothesis, implies that:

a_1' = ... = a_{n-1}' = 0

Because a_i' = a_i(r_i-r_n) = 0 and because r_i \neq r_n for 1 \leq i < n, we conclude that a_i = 0 for 1 \leq i < n. Thus, if we are given:

a_1x^{r_1} + a_2x^{r_2} + ... + a_nx^{r_n} = 0

we can conclude that:

a_1 = ... = a_{n-1} = 0

and so that

a_nx^{r_n} = 0

But since, as already argued above, x \neq 0, we must have a_n=0. Thus, if

a_1x^{r_1} + ... + a_nx^{r_n} = 0

then a_1 = ... = a_n = 0, and so the functions are linearly independent.
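The three algebraic steps above (divide by x^{r_n}, differentiate, multiply by x^{r_n+1}) can also be checked mechanically, representing a combination \sum a_i x^{r_i} as a dict mapping exponents to coefficients. The particular exponent and coefficient values below are just illustrative:

```python
# Mechanical check of the induction-step algebra, using exact arithmetic.
# A combination sum_i a_i * x^{r_i} is stored as {exponent: coefficient}.
from fractions import Fraction as F

def divide_by_power(terms, s):
    """Divide each term a * x^e by x^s."""
    return {e - s: a for e, a in terms.items()}

def differentiate(terms):
    """d/dx of a * x^e is a*e * x^(e-1); the constant term vanishes."""
    return {e - 1: a * e for e, a in terms.items() if e != 0}

def multiply_by_power(terms, s):
    """Multiply each term a * x^e by x^s."""
    return {e + s: a for e, a in terms.items()}

r = [F(1), F(3), F(-2), F(7, 2)]   # r_1 .. r_n, distinct reals
a = [F(2), F(-1), F(5), F(4)]      # a_1 .. a_n
combo = dict(zip(r, a))

rn = r[-1]
out = multiply_by_power(differentiate(divide_by_power(combo, rn)), rn + 1)

# As in the post: the x^{r_n} term is gone, and each surviving
# coefficient is a_i' = a_i * (r_i - r_n).
expected = {ri: ai * (ri - rn) for ri, ai in zip(r[:-1], a[:-1])}
print(out == expected)  # True
```

This confirms the bookkeeping: the result is a combination of x^{r_1}, ..., x^{r_{n-1}} with coefficients a_i(r_i - r_n), exactly the form handed to the induction hypothesis.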
 
Looks okay to me now.
 
