
## Linear independence proof

I need to check my proof of the proposition below, which we got for homework. Thanks in advance!

Proposition. Let V be a vector space over a field F, and let $$S = \left\{a_{1}, \cdots, a_{k}\right\}\subset V$$, $$k\geq 2$$. If the set S is linearly dependent, $$a_{1} \neq 0$$, and S is equipped with an ordering, then there exists at least one element of S which can be written as a linear combination of its predecessors.

Proof [?]. If S is dependent, then there exists an element of S that can be written as a linear combination of the remaining vectors of S, say:
$$a_{j+1}=\alpha_{1}a_{1}+\cdots+\alpha_{j}a_{j}+\alpha_{j+2}a_{j+2}+\cdots+\alpha_{k}a_{k}$$. (*) Further on, let's assume that the vector $$a_{j}$$ can be written as:
$$a_{j}=\beta_{1}a_{1}+\cdots+\beta_{j-1}a_{j-1}$$. So, after plugging $$a_{j}$$ into equation (*), we get:
$$a_{j+1}= \alpha_{1}a_{1}+\cdots+\alpha_{j}(\beta_{1}a_{1}+\cdots+\beta_{j-1}a_{j-1})+\alpha_{j+2}a_{j+2}+\cdots+\alpha_{k}a_{k}$$, which implies $$a_{j+1}=\gamma_{1}a_{1}+\cdots+\gamma_{j-1}a_{j-1}+\alpha_{j+2}a_{j+2}+\cdots+\alpha_{k}a_{k}$$. We assumed that $$a_{j+1}$$ is a linear combination of all the vectors in S, so, since this linear combination does not explicitly contain the vector $$a_{j}$$, we conclude that $$a_{j}$$ must be a linear combination of the vectors $$\left\{a_{1}, \cdots, a_{j-1}\right\}$$ with the coefficients $$\gamma_{i}$$.

Gee, I have the feeling I missed something big here.

P.S. The thread should be called 'Linear dependence proof' or something like that, but never mind.
 I think you are overcomplicating it. Just show they add to zero with not all coefficients zero, and then find the first non-zero coefficient starting on the right side, and move this term to the other side of the equation. You should be able to figure it out from there.

 Quote by gonzo I think you are overcomplicating it. Just show they add to zero with not all coefficients zero, and then find the first non-zero coefficient starting on the right side, and move this term to the other side of the equation. You should be able to figure it out from there.
Thanks for the advice, I'll look at it later, but I'd still like to know if the proof I did is valid.



No, it isn't valid. The 'proof' doesn't even make any statement about any ordering on the elements a_i, does it?

 Quote by matt grime No, it isn't valid. The 'proof' doesn't even make any statement about any ordering on the elements a_i, does it?
Ok, I'm lost now. Any hints would be appreciated. It seems that all that is correct is equation (*), which says that the set S is dependent. Further on, I know that $$a_{1} \neq 0$$, and I know that there is an ordering, i.e. one knows which element of S follows after the one before.
 You know there is a subset of the a_i and a non-trivial relation between them, i.e. one where not all coefficients are zero. Now what do you need to do?
 Ok, I may have got it now. Since S is dependent, there exist $$\alpha_{1}, \cdots, \alpha_{k} \in F$$, not all zero, such that $$\alpha_{1}a_{1}+\cdots+\alpha_{k}a_{k}=0$$. Let's assume $$\alpha_{k} \neq 0$$. Then we have $$-\alpha_{k}a_{k}=\alpha_{1}a_{1}+\cdots+\alpha_{k-1}a_{k-1}$$, which we can multiply by $$-\frac{1}{\alpha_{k}}$$, so $$a_{k}$$ can be written as a linear combination of its predecessors. Now, if $$\alpha_{k} = 0$$, then we can show in the same way that the vector belonging to the last non-zero coefficient can be written as a linear combination of its predecessors, and so on. If the set has 2 elements (for k = 2), the element $$a_{2}$$ can be written as a linear combination of $$a_{1}$$, since $$a_{1} \neq 0$$. I hope I got this right now.
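To see the recipe in action, here is a quick numerical sketch (a toy example of my own in R^3, not from the homework): take a dependence relation with not all coefficients zero, find the last non-zero coefficient, move that term to the other side, and divide by its negative.

```python
# Sketch of the argument above (toy example, not from the thread):
# given  c_1*a_1 + ... + c_k*a_k = 0  with not all c_i zero,
# find the last non-zero coefficient c_j and solve for a_j
# in terms of its predecessors a_1, ..., a_{j-1}.

def last_dependent(vectors, coeffs):
    """Return (j, betas) with vectors[j] == sum(betas[i]*vectors[i] for i < j)."""
    j = max(i for i, c in enumerate(coeffs) if c != 0)  # last non-zero coefficient
    betas = [-coeffs[i] / coeffs[j] for i in range(j)]  # move term over, divide by -c_j
    return j, betas

# Dependent set in R^3:  a_3 = a_1 + 2*a_2,  i.e.  1*a_1 + 2*a_2 - 1*a_3 = 0
a = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (1.0, 2.0, 0.0)]
c = [1.0, 2.0, -1.0]

j, betas = last_dependent(a, c)  # j == 2, betas == [1.0, 2.0]
# Reconstruct a_j from its predecessors, component by component:
recon = tuple(sum(betas[i] * a[i][m] for i in range(j)) for m in range(3))
# recon == (1.0, 2.0, 0.0) == a[2], so a_3 is a combination of its predecessors
```

This is exactly the step of moving the term with the last non-zero coefficient to the other side of the equation and scaling.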
 Apart from the fact that there is nothing to say that the order on the a_i's is the one inherited from the subscripts....

 Quote by matt grime Apart from the fact that there is nothing to say that the order on the a_i's is the one inherited from the subscripts....
The order is irrelevant; it's just important that the a_i's are predecessors. But I feel you wanted to imply something else...
 You are assuming that a_k's predecessors are the a_i with i < k.

 Quote by matt grime You are assuming that a_k's predecessors are the a_i with i < k.
Hm, but maybe it's possible that this was informally implied by the question. Honestly, I don't see any other way out here. If you could give me further tips, I'd be very grateful, but this is just the beginning of my linear algebra course, so I'm still a bit lost.
 But you've done the linear algebra part; the ordering has nothing to do with the linear algebra. You can pick a subset of the a_i and non-zero coefficients b_i so that the sum of the b_ia_i is zero. Now, you picked out the one that is largest in the ordering by using the subscripts. But since we aren't actually told that it is the subscripts we're using (not as written above), that is not what we should do: we just need to pick out the largest in whatever the ordering is. In particular, you may, after relabelling, assume that the ordering on the a_i is precisely the one given by the subscripts. None of this part of the question has anything to do with linear algebra.
 I get it, thanks for your help!
