# Linear independence proof


#### Homework Helper
I need to check my proof of the proposition below, which we got for homework. Thanks in advance!

Proposition. Let V be a vector space over a field F, and $$S = \left\{a_{1}, \cdots, a_{k}\right\}\subset V$$, $$k\geq 2$$. If S is linearly dependent, $$a_{1} \neq 0$$, and S is equipped with an order, then at least one element of S can be written as a linear combination of its predecessors.

Proof [?]. If S is dependent, then some element of S can be written as a linear combination of the remaining vectors of S, so:
$$a_{j+1}=\alpha_{1}a_{1}+\cdots+\alpha_{j}a_{j}+\alpha_{j+2}a_{j+2}+\cdots+\alpha_{k}a_{k}$$. (*) Further, let's assume that the vector $$a_{j}$$ can be written as:
$$a_{j}=\beta_{1}a_{1}+\cdots+\beta_{j-1}a_{j-1}$$. Substituting $$a_{j}$$ into equation (*), we get:
$$a_{j+1}= \alpha_{1}a_{1}+\cdots+\alpha_{j}(\beta_{1}a_{1}+\cdots+\beta_{j-1}a_{j-1})+\alpha_{j+2}a_{j+2}+\cdots+\alpha_{k}a_{k}$$, which implies $$a_{j+1}=\gamma_{1}a_{1}+\cdots+\gamma_{j-1}a_{j-1}+\alpha_{j+2}a_{j+2}+\cdots+\alpha_{k}a_{k}$$. We assumed that $$a_{j+1}$$ is a linear combination of all the vectors in S, so, since this linear combination does not explicitly contain the vector $$a_{j}$$, we conclude that $$a_{j}$$ must be a linear combination of the vectors $$\left\{a_{1}, \cdots, a_{j-1}\right\}$$ with the coefficients $$\gamma_{i}$$.

Gee, I have the feeling I missed something big here. :uhh:

P.S. The thread should be called 'Linear dependence proof' or something like that, but never mind.


#### gonzo

I think you are overcomplicating it. Just show they add to zero with not all coefficients zero, and then find the first non-zero coefficient starting on the right side, and move this term to the other side of the equation. You should be able to figure it out from there.
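The recipe above can be sketched in code (a sketch only, assuming the vectors live over the reals and the set really is dependent; the name `last_dependent` is mine, not standard):

```python
import numpy as np

def last_dependent(vectors, tol=1e-10):
    """Find a non-trivial relation sum(c_i * a_i) = 0, take the last
    non-zero coefficient c_j, and move that term to the other side,
    expressing a_j through its predecessors a_1, ..., a_{j-1}."""
    A = np.array(vectors, dtype=float).T      # columns are the a_i
    # For a dependent set, a right singular vector belonging to a zero
    # singular value gives a non-trivial relation: A @ c == 0.
    _, _, vt = np.linalg.svd(A)
    c = vt[-1]
    # Index of the last coefficient that is non-zero.
    j = max(i for i, ci in enumerate(c) if abs(ci) > tol)
    # a_j = -(1/c_j) * (c_1 a_1 + ... + c_{j-1} a_{j-1})
    coeffs = [-c[i] / c[j] for i in range(j)]
    return j, coeffs

# Example: a_3 = (1, 1) equals a_1 + a_2 for a_1 = (1, 0), a_2 = (0, 1).
j, coeffs = last_dependent([[1, 0], [0, 1], [1, 1]])
```

Here `j == 2` and `coeffs` is approximately `[1.0, 1.0]`, matching $$a_{3}=a_{1}+a_{2}$$.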

#### Homework Helper
gonzo said:
I think you are overcomplicating it. Just show they add to zero with not all coefficients zero, and then find the first non-zero coefficient starting on the right side, and move this term to the other side of the equation. You should be able to figure it out from there.
Thanks for the advice, I'll look at it later, but I'd still like to know if the proof I did is valid.

#### matt grime

Homework Helper
No, it isn't valid. The 'proof' doesn't even make any statement about an ordering on the elements a_i, does it?

#### Homework Helper
matt grime said:
No, it isn't valid. The 'proof' doesn't even make any statement about an ordering on the elements a_i, does it?
Ok, I'm lost now. Any hints would be appreciated. It seems that all that is correct is equation (*), which says that the set S is dependent. Further, I know that $$a_{1} \neq 0$$, and I know that there is an ordering, i.e. one knows which element of S follows which.

#### matt grime

You know there is a subset of the a_i and a non-trivial relation between them, i.e. one where not all coefficients are zero. Now what do you need to do?

#### Homework Helper
Ok, I may have got it now. Since S is dependent, there exist $$\alpha_{1}, \cdots, \alpha_{k} \in F$$, not all zero, such that
$$\alpha_{1}a_{1}+\cdots+\alpha_{k}a_{k}=0$$. Let's assume $$\alpha_{k} \neq 0$$. Then
$$-\alpha_{k}a_{k}=\alpha_{1}a_{1}+\cdots+\alpha_{k-1}a_{k-1}$$, and multiplying by $$-\frac{1}{\alpha_{k}}$$ shows that $$a_{k}$$ can be written as a linear combination of its predecessors. Now, if $$\alpha_{k} = 0$$, we take the last non-zero coefficient $$\alpha_{j}$$ instead and show in the same way that $$a_{j}$$ can be written as a linear combination of its predecessors, and so on. If the set has 2 elements (for k = 2), the element $$a_{2}$$ can be written as a linear combination of $$a_{1}$$: if $$\alpha_{2}$$ were zero, then $$\alpha_{1}a_{1}=0$$ with $$\alpha_{1}\neq 0$$ would force $$a_{1}=0$$, contradicting $$a_{1} \neq 0$$, so $$\alpha_{2}\neq 0$$. I hope I got this right now.
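For a concrete sanity check (my own example, not part of the exercise): in $$V=\mathbb{R}^{2}$$, take $$a_{1}=(1,0)$$, $$a_{2}=(0,1)$$, $$a_{3}=(1,1)$$. The relation $$a_{1}+a_{2}-a_{3}=0$$ has $$\alpha_{3}=-1\neq 0$$, and multiplying by $$-\frac{1}{\alpha_{3}}$$ gives $$a_{3}=a_{1}+a_{2}$$, a linear combination of its predecessors.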

#### matt grime

Apart from the fact that there is nothing to suggest that the order on the a_i's is the one inherited from the subscripts...

#### Homework Helper
matt grime said:
Apart from the fact that there is nothing to suggest that the order on the a_i's is the one inherited from the subscripts...
The order is irrelevant; what matters is only that the a_i's are predecessors. But I feel you wanted to imply something else...

#### matt grime

You are assuming that a_k's predecessors are the a_i with i<k. The question does not imply that at all. It just says they are ordered, somehow.

#### Homework Helper
matt grime said:
You are assuming that a_k's predecessors are the a_i with i<k. The question does not imply that at all. It just says they are ordered, somehow.
Hm, but maybe this was informally implied by the question. Honestly, I don't see any other way out here. If you could give me further tips, I'd be very grateful, but this is just the beginning of my linear algebra course, so I'm still a bit lost.

#### matt grime

But you've done the linear algebra part. The ordering has nothing to do with the linear algebra.

You can pick a subset of the a_i and non-zero coefficients b_i so that the sum $$\sum b_{i}a_{i}$$ is zero. Now, you picked out the one that is largest in the ordering by using the subscripts. But since we aren't actually told that the ordering is the one given by the subscripts (not as the question is written), that is not quite what we should do: we need to pick out the largest element in whatever the ordering is. In particular, after relabelling, you may assume that the ordering on the a_i is precisely the one given by the subscripts. None of this part of the question has anything to do with linear algebra.
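The relabelling step can be sketched as follows (my illustration; `ranks` encodes the assumed arbitrary ordering and is not something given in the thread):

```python
def relabel(vectors, ranks):
    """Return the vectors listed in the order given by ranks, where
    ranks[i] is the position of vectors[i] in the assumed ordering.
    Afterwards the subscript argument applies verbatim."""
    return [v for _, v in sorted(zip(ranks, vectors))]

# Example: "b" comes second in the assumed order, "a" comes first.
relabelled = relabel(["b", "a"], [1, 0])   # → ["a", "b"]
```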

#### Homework Helper
I get it, thanks for your help!
