# Finding maximum number of functions that can be linearly independent

1. Oct 21, 2014

### QuantumCurt

1. The problem statement, all variables and given/known data

Hey everyone. I'm in an introductory differential equations class, and I think this homework problem has got me stumped.

The functions $y_1(x), y_2(x), \dots, y_n(x)$ are linearly independent on an interval $I$. $c_1y_1(x)+c_2y_2(x)+\dots+c_ny_n(x)=0$ for all $x$ in $I$ implies that $c_1=c_2=\dots=c_n=0$. What is the maximum number of functions that can be linearly independent?

$y_1=1,\quad y_2=1+x,\quad y_3=x^2,\quad y_4=x(1-x),\quad y_5=x$

A friend and I concluded that this was more of a conceptual question. Since the set is already defined as being linearly independent, we figured that the maximum number of linearly independent functions must be "n many functions," since the sum goes up to $c_ny_n$.

I also tried simply adding all of the functions together, setting the sum equal to zero, and solving for x. That gave me x = -2/3, which clearly isn't a meaningful answer here.

Am I using the right logic in that this is simply a conceptual question, or am I missing something? Do I need to use the Wronskian or do something otherwise involving matrices? Any help would be very much appreciated. :)

2. Oct 21, 2014

### RUber

You are correct in saying that this is a conceptual question.
Do you have any restrictions on what types of functions are allowed in the space?
If you consider just powers of x, is there a way to add a+bx so that they are equal to zero for all x?
How about higher powers? Is there a maximum power that could be independent such that any higher power of x could be resolved as the sum of lower order terms?

3. Oct 21, 2014

### RUber

If so, try to see if any one can be made up of the others.
$\sum_{i=1}^n c_i y_i = 0 \implies c_i = 0 \;\forall\, i = 1,\dots,n$ means the set is linearly independent.
Conversely, if $y_n$ is linearly dependent on the others, then there exist coefficients
$c_i,\, i = 1,\dots,n-1$ such that $y_n = \sum_{i=1}^{n-1} c_i y_i$,
which would mean that
$-1\cdot y_n + \sum_{i=1}^{n-1} c_i y_i = 0$

4. Oct 21, 2014

### QuantumCurt

I worded the problem exactly as it's written on the worksheet. The wording of the problem seems rather confusing to me, as we've seen nothing even remotely like this in my course. I guess I'm just asking about the 5 given functions. I'm not aware of any restrictions on functions that can be used. There's no mention of it in the problem.

Adding a+bx? Are you referring to a method of undetermined coefficients? If so, I don't think that's the method we're supposed to be using. We just saw those for the first time yesterday, and this assignment was given about a week ago. When you say to add a+bx so that they're equal to 0 for all x, are you referring to combinations of the 5 functions, or adding a+bx in such a way that each individual function is equal to 0 for all x?

5. Oct 21, 2014

### Dick

The first sentence of the problem statement is just defining what linearly independent means. It isn't telling you that the given functions are linearly independent. You are supposed to figure out the maximum number of functions in that set that can be linearly independent.

6. Oct 21, 2014

### QuantumCurt

Is that what it's asking? That makes a lot of sense. That seems like a far more logical type of question to ask. If that's the case, this is a much simpler matter. Thanks for pointing that out. This explains my confusion.

7. Oct 21, 2014

### QuantumCurt

Am I right in thinking that the Wronskian is going to be my best bet for solving this? Should I set up a matrix with all 5 functions?

8. Oct 21, 2014

### QuantumCurt

$W(y_1, y_2, y_3, y_4, y_5)=\begin{vmatrix} 1&1+x&x^2&x-x^2&x\\ 0&1&2x&1-2x&1\\ 0&0&2&-2&0\\ 0&0&0&0&0\\ 0&0&0&0&0 \end{vmatrix}$

And then just find the determinant of this 5x5 matrix? That sounds like a rather daunting task. It's seeming like it may be simpler without the Wronskian. Any thoughts?
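
The determinant can be sanity-checked without expanding a 5x5 by hand. Below is a short sketch in exact rational arithmetic (my own illustration, not part of the coursework; the two zero rows already force the determinant to vanish at every x, so one sample point is enough to see it):

```python
# Evaluate the Wronskian matrix at a sample point and compute its determinant
# exactly. Any sample x works: the two identically-zero rows force det = 0.
from fractions import Fraction
from itertools import permutations

x = Fraction(1)  # arbitrary sample point
W = [
    [1, 1 + x, x**2, x - x**2, x],  # y1..y5
    [0, 1, 2*x, 1 - 2*x, 1],        # first derivatives
    [0, 0, 2, -2, 0],               # second derivatives
    [0, 0, 0, 0, 0],                # third derivatives
    [0, 0, 0, 0, 0],                # fourth derivatives
]

def det(m):
    """Leibniz-formula determinant (fine for a 5x5)."""
    n = len(m)
    total = Fraction(0)
    for perm in permutations(range(n)):
        sign = 1
        for i in range(n):          # count inversions to get the sign
            for j in range(i + 1, n):
                if perm[i] > perm[j]:
                    sign = -sign
        prod = Fraction(1)
        for i in range(n):
            prod *= m[i][perm[i]]
        total += sign * prod
    return total

print(det(W))  # 0
```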

9. Oct 21, 2014

### QuantumCurt

Okay, forget the Wronskian. I was overcomplicating that. This is where I'm at now.

$c_1+c_2(1+x)+c_3x^2+c_4x(1-x)+c_5x=0$

Now I just need to examine it to determine if any of the functions are constant multiples of another function, right?

That being the case, there are no functions in this set that are constant multiples of another one of the functions, thus proving that all 5 of the functions are linearly independent? That seems too simple.

edit-

I'm concluding via this relation that all 5 of the functions are linearly independent, but when I plug this matrix into Wolfram and find the determinant, it's coming out as equaling 0. This would imply linear dependence. So I'm getting different answers via the different methods. What am I doing wrong?

The matrix in Wolfram - https://www.wolframalpha.com/input/...0,+2,+-2,+0},{0,+0,+0,+0,+0},{0,+0,+0,+0,+0}}

Last edited: Oct 22, 2014
10. Oct 22, 2014

### vela

Staff Emeritus
You're trying to determine if any of the functions can be written as a linear combination of the others, not just multiples of the others.

The determinant of that matrix is obviously 0 because there's a row of zeros.

11. Oct 22, 2014

### QuantumCurt

(sorry for posting so many times in a row)

I think I'm figuring out where I went wrong. If I take the first and second derivative of the relation -

$c_1+c_2(1+x)+c_3x^2+c_4x(1-x)+c_5x=0$

I get -

$0+c_2+2c_3x+c_4(1-2x)+c_5=0$

and

$0+0+2c_3-2c_4+0=0$

Which implies that

$2c_3=2c_4$
and that
$c_3=c_4$

So this implies that at least functions 3 and 4 are linearly independent, correct?

12. Oct 22, 2014

### QuantumCurt

Since $y_1+y_5=y_2$, or
$1+x=1+x$,
this implies a linear combination that can equal zero, correct? Does this imply that functions 1, 2, and 5 are linearly independent?

13. Oct 22, 2014

### vela

Staff Emeritus
No, it says $y_1$, $y_2$, and $y_5$ are linearly dependent. Independence requires that $c_1 y_1 + c_2 y_2 + c_3 y_5 = 0$ has as its only solution $c_1 = c_2 = c_3 = 0$, but you've just shown that $c_1 = 1$, $c_2 = -1$, and $c_3 = 1$ is a possible solution.
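
That dependence relation can be spot-checked at a handful of points (a minimal sketch of my own; the sample points are an arbitrary choice):

```python
# Check that c1 = 1, c2 = -1, and coefficient 1 on y5 kill the combination
# identically: 1*(1) + (-1)*(1 + x) + 1*(x) = 0 for every x.
from fractions import Fraction

def combo(x):
    y1, y2, y5 = 1, 1 + x, x
    return 1 * y1 + (-1) * y2 + 1 * y5

samples = [Fraction(n, 3) for n in range(-6, 7)]
print(all(combo(x) == 0 for x in samples))  # True: vanishes at every sample
```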

14. Oct 22, 2014

### QuantumCurt

Okay, I see. I was mixing the terms up. It's been a long night. -_-

So since these three functions are linearly dependent, and functions 3 and 4 are linearly independent, would I say that the maximum number of linearly independent functions is 2? Or is there still more to it? I'm not seeing any other combinations that would work.

15. Oct 22, 2014

### Staff: Mentor

I can see by inspection that y1 = 1, y2 = 1 + x, and y3 = x^2 are linearly independent. That's three functions.
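
One way to confirm the inspection with arithmetic: sample the three functions at three distinct points and check that the resulting 3x3 determinant is nonzero (a sketch; the sample points 0, 1, 2 are an arbitrary choice):

```python
# Rows: 1, 1+x, x^2 evaluated at x = 0, 1, 2. A nonzero determinant means no
# nontrivial combination vanishes at all three points, so the functions
# cannot be linearly dependent.
M = [[1, 1, 1],   # y1 = 1
     [1, 2, 3],   # y2 = 1 + x
     [0, 1, 4]]   # y3 = x^2

# 3x3 determinant by cofactor expansion along the first row
d = (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
     - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
     + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))
print(d)  # 2, nonzero => the three functions are linearly independent
```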

16. Oct 22, 2014

### QuantumCurt

I'm not seeing this. How are they linearly independent?

I don't think my instructor ever adequately explained how to figure this out, and my textbook barely even touches on it.

Would I take a linear combination such that $$x^2+x+2=0$$, which shows that the function has zeroes for constant multiples of 1?

$c_1+c_2+c_2x+c_3x^2$

Given constant values of 1, this becomes the aforementioned quadratic function with non-zero roots.

edit-never mind. It's getting late and I'm not thinking right. That function would only have non-real roots.

Last edited: Oct 22, 2014
17. Oct 22, 2014

### Staff: Mentor

That's not the question to ask. The question to ask is: does the equation c1 * 1 + c2 * (1 + x) + c3 * x^2 = 0 have any nontrivial solutions for the constants?

Every such equation always has the trivial solution, c1 = c2 = c3 = 0, whether the functions are linearly independent or linearly dependent. For these three functions to be linearly independent, the trivial solution is the only solution, independent of x. Note the "for all x in I" phrase in the definition you wrote in post #1.

The equation you wrote, x^2 + x + 2 = 0, is irrelevant to the discussion, since the equation for linear independence has to hold for all x. (It's also not relevant that this equation doesn't have real solutions.)
Again, the roots of a polynomial have nothing to do with this.
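
A one-line check makes the point concrete: the combination with c1 = c2 = c3 = 1 already fails to vanish at x = 0, so it is not a dependence relation no matter where its roots lie (a minimal sketch):

```python
# The combination 1*(1) + 1*(1 + x) + 1*(x^2) = x^2 + x + 2. Linear dependence
# would require it to equal zero for ALL x; it already fails at x = 0.
def combo(x):
    return 1 * 1 + 1 * (1 + x) + 1 * (x**2)

print(combo(0))  # 2, not 0: this choice of constants is not a dependence relation
```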

18. Oct 22, 2014

### RUber

Linear independence doesn't mean the combination can never equal zero; it means that no nontrivial combination equals zero for all x.
With the functions provided, this is just addition. Start with the first: a single nonzero function is always linearly independent.
Check whether the second can be written as a constant multiple of the first.
Then check whether the 3rd can be written as a sum of constants times the first 2, etc.
Alternatively, write the system as a matrix and row reduce. The number of nonzero rows is the number of linearly independent functions.
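
The row-reduction route can be sketched as follows, assuming the five polynomials are sampled at five distinct points (enough to capture polynomials of degree at most 2; this is my own illustration, not course material):

```python
# Rows: y1..y5 sampled at x = 0, 1, 2, 3, 4. Row reduce with exact rational
# arithmetic and count the nonzero rows; that count is the maximum number of
# linearly independent functions in the set.
from fractions import Fraction

M = [[Fraction(v) for v in row] for row in [
    [1, 1, 1, 1, 1],       # y1 = 1
    [1, 2, 3, 4, 5],       # y2 = 1 + x
    [0, 1, 4, 9, 16],      # y3 = x^2
    [0, 0, -2, -6, -12],   # y4 = x(1 - x)
    [0, 1, 2, 3, 4],       # y5 = x
]]

r = 0  # current pivot row
for col in range(5):
    pivot = next((i for i in range(r, 5) if M[i][col] != 0), None)
    if pivot is None:
        continue
    M[r], M[pivot] = M[pivot], M[r]
    for i in range(5):
        if i != r and M[i][col] != 0:
            f = M[i][col] / M[r][col]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
    r += 1

nonzero = sum(1 for row in M if any(v != 0 for v in row))
print(nonzero)  # 3: at most three of the five functions are linearly independent
```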

19. Oct 22, 2014

### QuantumCurt

Okay, I get what you're saying. I'm just not sure how to go about proving that it doesn't have any nontrivial solutions. I don't see any way of solving this equation to show that. Is a solution to the equation even a part of this process, or is it more of a process of inspection?

20. Oct 22, 2014

### QuantumCurt

This may be a technique that we haven't looked at yet. This doesn't sound like anything that we've seen in this course yet.

In either case, I'm done for the night. It's bedtime. Thanks for the help, everyone. I'll tackle it again tomorrow.