Proving Linear Independence in Vector Spaces with Ordered Sets

  • #1
radou
I need to check the proof of the proposition below, which we got for homework. Thanks in advance!

Proposition. Let V be a vector space over a field F, and [tex]S = \left\{a_{1}, \cdots, a_{k}\right\}\subset V, k\geq 2[/tex]. If the set S is linearly dependent, [tex]a_{1} \neq 0[/tex], and we assume there is an order on S, then at least one element of S can be written as a linear combination of its predecessors.

Proof [?]. If S is dependent, then there exists an element in S that can be written as a linear combination of the remaining vectors of S, so:
[tex]a_{j+1}=\alpha_{1}a_{1}+\cdots+\alpha_{j}a_{j}+\alpha_{j+2}a_{j+2}+\cdots+\alpha_{k}a_{k}[/tex]. (*) Further, let's assume that the vector [tex]a_{j}[/tex] can be written as:
[tex]a_{j}=\beta_{1}a_{1}+\cdots+\beta_{j-1}a_{j-1}[/tex]. So, after plugging [tex]a_{j}[/tex] into equation (*), we get:
[tex]a_{j+1}= \alpha_{1}a_{1}+\cdots+\alpha_{j}(\beta_{1}a_{1}+\cdots+\beta_{j-1}a_{j-1})+\alpha_{j+2}a_{j+2}+\cdots+\alpha_{k}a_{k}[/tex], which implies [tex]a_{j+1}=\gamma_{1}a_{1}+\cdots+\gamma_{j-1}a_{j-1}+\alpha_{j+2}a_{j+2}+\cdots+\alpha_{k}a_{k}[/tex]. We assumed that [tex]a_{j+1}[/tex] is a linear combination of all vectors in S, so, since this linear combination does not explicitly contain the vector [tex]a_{j}[/tex], we conclude that [tex]a_{j}[/tex] must be a linear combination of the vectors [tex]\left\{a_{1}, \cdots, a_{j-1}\right\}[/tex] with the coefficients [tex]\gamma_{i}[/tex].

Gee, I have the feeling I missed something big here. :uhh:

P.S. The thread should be called 'Linear dependence proof' or something like that, but never mind.
 
  • #2
I think you are overcomplicating it. Just write a combination of the vectors that adds to zero with not all coefficients zero, then find the first non-zero coefficient starting from the right, and move that term to the other side of the equation. You should be able to figure it out from there.
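For concreteness, a sketch of the manipulation gonzo describes (the index j here is mine, chosen for illustration): if [tex]\alpha_{1}a_{1}+\cdots+\alpha_{k}a_{k}=0[/tex] and [tex]\alpha_{j}[/tex] is the last non-zero coefficient, then [tex]\alpha_{1}a_{1}+\cdots+\alpha_{j}a_{j}=0[/tex], so [tex]a_{j}=-\frac{\alpha_{1}}{\alpha_{j}}a_{1}-\cdots-\frac{\alpha_{j-1}}{\alpha_{j}}a_{j-1}[/tex], a combination of the predecessors of [tex]a_{j}[/tex].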
 
  • #3
gonzo said:
I think you are overcomplicating it. Just write a combination of the vectors that adds to zero with not all coefficients zero, then find the first non-zero coefficient starting from the right, and move that term to the other side of the equation. You should be able to figure it out from there.

Thanks for the advice; I'll look at it later, but I'd still like to know whether the proof I wrote is valid. :smile:
 
  • #4
No, it isn't valid. The 'proof' doesn't even make any statement about an ordering on the elements a_i, does it?
 
  • #5
matt grime said:
No, it isn't valid. The 'proof' doesn't even make any statement about an ordering on the elements a_i, does it?

Ok, I'm lost now. Any hints would be appreciated. It seems that all that is correct is equation (*), which says that the set S is dependent. Furthermore, I know that [tex]a_{1} \neq 0[/tex], and I know that there is an ordering, i.e., one knows which element of S follows the one before it.
 
  • #6
You know there is a subset of the a_i, and a non-trivial relation between them, i.e. one where not all coefficients are zero. Now what do you need to do?
 
  • #7
Ok, I may have got it now. Since S is dependent, there exist [tex]\alpha_{1}, \cdots, \alpha_{k} \in F[/tex], not all zero, such that
[tex]\alpha_{1}a_{1}+\cdots+\alpha_{k}a_{k}=0[/tex]. Let's assume [tex]\alpha_{k} \neq 0[/tex]. Then
[tex]-\alpha_{k}a_{k}=\alpha_{1}a_{1}+\cdots+\alpha_{k-1}a_{k-1}[/tex], which we can multiply by [tex]-\frac{1}{\alpha_{k}}[/tex], so [tex]a_{k}[/tex] can be written as a linear combination of its predecessors. Now, if instead [tex]\alpha_{k} = 0[/tex], then we can show in the same way that [tex]a_{k-1}[/tex] can be written as a linear combination of its predecessors, and so on down to the last non-zero coefficient. If the set has 2 elements (k = 2), the element [tex]a_{2}[/tex] can be written as a linear combination of [tex]a_{1}[/tex]: if only [tex]\alpha_{1}[/tex] were non-zero, we would get [tex]a_{1} = 0[/tex], contradicting [tex]a_{1} \neq 0[/tex]. I hope I got this right now.
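As a quick numerical sanity check of this manipulation (a sketch only; the example vectors and the use of numpy are my own choices, not from the thread):

[code]
# Sanity check: for a dependent set with a_3 = 2*a_1 + 3*a_2,
# recover the coefficients expressing a_3 in terms of its predecessors.
import numpy as np

a1 = np.array([1.0, 0.0, 2.0])
a2 = np.array([0.0, 1.0, 1.0])
a3 = 2.0 * a1 + 3.0 * a2  # deliberately dependent on a1, a2

# Solve the least-squares problem [a1 a2] x = a3; for a consistent
# system this recovers the exact coefficients.
A = np.column_stack([a1, a2])
coeffs, residuals, rank, _ = np.linalg.lstsq(A, a3, rcond=None)
print(coeffs)  # approximately [2. 3.]
[/code]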
 
  • #8
Apart from the fact that there is nothing to say that the order on the a_i's is the one inherited from the subscripts...
 
  • #9
matt grime said:
Apart from the fact that there is nothing to say that the order on the a_i's is the one inherited from the subscripts...

The order is irrelevant; it's just important that the a_i's are predecessors. But I feel you wanted to imply something else...
 
  • #10
You are assuming that a_k's predecessors are the a_i with i<k. The question does not imply that at all. It just says they are ordered, somehow.
 
  • #11
matt grime said:
You are assuming that a_k's predecessors are the a_i with i<k. The question does not imply that at all. It just says they are ordered, somehow.

Hm, but maybe it's possible that this was informally implied by the question. Honestly, I don't see any other way out here. If you could give me further tips, I'd be very grateful, but this is just the beginning of my linear algebra course, so I'm still a bit lost.
 
  • #12
But you've done the linear algebra part. The ordering has nothing to do with the linear algebra.

You can pick a subset of the a_i and non-zero coefficients b_i so that the sum of the b_i a_i is zero. Now, you picked out the one that is largest in the ordering by using the subscripts. Well, since we aren't actually told that it is the subscripts we're using (not as written above), that is not what we should do. We just need to pick out the largest in whatever the ordering is. In particular you may, after relabelling, assume that the ordering on the a_i is precisely the one given by the subscripts. But none of this part of the question has anything to do with linear algebra.
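A minimal sketch of that relabelling step in Python (the vectors and the order are hypothetical placeholders of my own):

[code]
# Relabelling: given vectors and an arbitrary total order on them,
# sort once so that "predecessor" simply means "smaller subscript".
vectors = ["b", "c", "a"]           # stand-ins for the a_i
order = {"c": 0, "a": 1, "b": 2}    # some arbitrary total order (hypothetical)

relabelled = sorted(vectors, key=lambda v: order[v])
# relabelled == ["c", "a", "b"]; now index i < j means "predecessor",
# and the subscript-based argument above applies verbatim.
print(relabelled)
[/code]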
 
  • #13
I get it, thanks for your help!
 

What is a linear independence proof?

A linear independence proof is a mathematical argument that determines whether a set of vectors in a vector space is linearly independent. It involves showing that a linear combination of the vectors can equal zero only when all the coefficients are zero.
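As a concrete illustration (a sketch with example vectors of my own choosing): numerically, a set of vectors is linearly independent exactly when the matrix having those vectors as columns has full column rank.

[code]
# Independence check via matrix rank: the columns are independent
# iff the rank equals the number of columns.
import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 2.0])  # v3 = v1 + v2, so the set is dependent

M = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(M) == M.shape[1])  # False: dependent
[/code]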

Why is it important to prove linear independence?

Proving linear independence is important because it allows us to determine the dimension of a vector space, which is crucial in many areas of mathematics and science. It also helps in solving systems of linear equations and in understanding the behavior of matrices.

What is the process of proving linear independence?

The process of proving linear independence involves setting up a linear combination of the vectors in question and then showing that the only solution is when all the coefficients are zero. This can be done through various methods such as Gaussian elimination, matrix operations, or using the definition of linear independence.
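For instance, a minimal sketch of that process using sympy (the example vectors are my own, chosen for illustration):

[code]
# Set up c1*v1 + c2*v2 = 0 and check that only the trivial solution exists.
from sympy import Matrix, symbols, solve

c1, c2 = symbols("c1 c2")
v1 = Matrix([1, 2])
v2 = Matrix([3, 4])

equations = c1 * v1 + c2 * v2  # a 2x1 vector of linear expressions
solution = solve(list(equations), [c1, c2], dict=True)
print(solution)  # [{c1: 0, c2: 0}] -> only the trivial solution: independent
[/code]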

What are some common techniques used in linear independence proofs?

Some common techniques used in linear independence proofs include Gaussian elimination, which involves transforming a matrix into reduced row echelon form, and using the definition of linear independence to show that a linear combination of vectors cannot equal zero unless all coefficients are zero.
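And a short Gaussian-elimination sketch, again with illustrative vectors of my own choosing:

[code]
# Gaussian elimination: a pivot in every column of the reduced row
# echelon form means the column vectors are linearly independent.
from sympy import Matrix

M = Matrix([[1, 0, 1],
            [0, 1, 1],
            [1, 1, 2]])  # columns: v1, v2, v3 with v3 = v1 + v2

rref_matrix, pivot_columns = M.rref()
print(pivot_columns)                 # (0, 1): no pivot in column 2
print(len(pivot_columns) == M.cols)  # False -> dependent
[/code]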

Are there any tips for successfully proving linear independence?

Some tips for successfully proving linear independence include carefully setting up the linear combination and being familiar with the properties of vector spaces. It is also helpful to practice and work through different examples to gain a better understanding of the process.
