Proving a set of vectors is linearly independent

In summary, the original poster asks how to prove that a given set of vectors is linearly independent. He outlines a row-reduction argument and asks whether the reasoning is sufficient; the replies confirm that, for given vectors, the procedure is sound.
  • #1
sakodo
Hi, I came across a question where I needed to prove that a set of vectors is linearly independent. The thing is, I am not sure how to write the proof up properly.

Say you have three vectors x1, x2, x3 ∈ R^3, and you want to prove that they are linearly independent.

Put them as the columns of a 3x3 matrix A and row-reduce. If every column has a leading entry (i.e. there are no non-leading columns), then the only solution of Ax = 0 is x = (0, 0, 0). Thus x1, x2, x3 are linearly independent.
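For what it's worth, here is a minimal sketch of that check in Python/NumPy (the three example vectors are made up, not from the question; the rank test stands in for counting leading columns after row reduction):

```python
import numpy as np

# Hypothetical example vectors; independence means the 3x3 matrix with
# these as columns has full rank, i.e. every column gets a leading entry.
x1 = np.array([1.0, 0.0, 2.0])
x2 = np.array([0.0, 1.0, 1.0])
x3 = np.array([1.0, 1.0, 0.0])

A = np.column_stack([x1, x2, x3])              # vectors as the columns of A
print(np.linalg.matrix_rank(A) == A.shape[1])  # True: Ax = 0 only for x = (0, 0, 0)
```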

Is this reasoning good enough? I feel like I am missing something.

Any help would be appreciated.
 
  • #2
I think you'll have to put some restrictions on your vectors. Right now, it seems to me that the vectors x1, x2, and x3 can be any vectors in R3. However, won't you need to put a bit more restriction on that? What if I give x1=<1,2,3> and x2=<2,4,6>... then they aren't linearly independent.

I hope you got some advice from all that babble.
 
  • #3
Thanks for the reply Char.Limit.

Assuming that x1, x2, x3 are indeed linearly independent, is my reasoning good enough? I am not sure if my proof is sufficient.

Thanks.
 
  • #4
You can't assume they are independent, if this is what you need to show.

If these are some given vectors, then your procedure is OK, since x = (0, 0, 0) represents the coefficients of your linear combination.
 
  • #5
Yeah, sorry, I didn't put it clearly. What I meant was that x1, x2, x3 are given vectors.

If the row-reduced matrix has no non-leading columns, you can already deduce that the vectors are linearly independent. It's just that I don't know how to set up the proof properly lol.
 
  • #6
If the vectors are not independent, then row reduction will result in the last row being all zeros, and vice versa. That's essentially what you mean by "non-leading term," isn't it?
 
  • #7
HallsofIvy said:
If the vectors are not independent, then row reduction will result in the last row being all zeros, and vice versa. That's essentially what you mean by "non-leading term," isn't it?

Yeah, is that the right term? If a matrix has a row of zeros, then that row has no leading term. It was either "leading term" or "leading column"; sorry, I forgot the exact name for it.
 

1. How do you prove that a set of vectors is linearly independent?

To prove that a set of vectors is linearly independent, you can use the definition of linear independence: show that the only solution to the equation c1v1 + c2v2 + ... + cnvn = 0 is c1 = c2 = ... = cn = 0. You can also use other methods, such as Gaussian elimination or the rank-nullity theorem, to establish linear independence.
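As an illustration of the definition-based check (a sketch with made-up vectors; SymPy's rref() carries out the Gaussian elimination step):

```python
from sympy import Matrix

# Hypothetical example vectors; the same check works for any finite list.
vectors = [Matrix([1, 0, 2]), Matrix([0, 1, 1]), Matrix([1, 1, 0])]

A = Matrix.hstack(*vectors)         # the vectors as columns of A
_, pivots = A.rref()                # row-reduce; pivots lists the leading columns
# Every column is a pivot exactly when c1 = c2 = ... = cn = 0 is the only solution.
print(len(pivots) == len(vectors))  # True for this choice of vectors
```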

2. Can a set of linearly independent vectors contain a zero vector?

No, a set of linearly independent vectors cannot contain the zero vector. If 0 is in the set, then 1·0 = 0 is a nontrivial linear combination that equals zero, so the set is automatically linearly dependent.

3. Is the number of vectors in a linearly independent set always equal to the dimension of the vector space?

Not necessarily. The number of vectors in a linearly independent set can be less than or equal to the dimension of the vector space, but never greater: any collection of more than n vectors in an n-dimensional space must be linearly dependent.
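A quick numerical illustration (my own, with arbitrary random vectors): four vectors in R^3 can never be independent, because the rank of the matrix they form is at most 3:

```python
import numpy as np

# Four hypothetical vectors in R^3, stacked as the columns of a 3x4 matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
print(np.linalg.matrix_rank(A) < 4)  # True: rank <= 3, so the columns are dependent
```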

4. Can a set of linearly dependent vectors be made linearly independent by adding or removing vectors?

Yes, by removing vectors: every linearly dependent set contains a linearly independent subset with the same span. Adding vectors can never help, however, since any set that contains a linearly dependent subset is itself linearly dependent.
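As a sketch of the removal direction (with a hypothetical dependent set, not taken from the original answer), the pivot columns reported by rref() identify a linearly independent subset:

```python
from sympy import Matrix

# Hypothetical dependent set: v3 = v1 + v2.
v1, v2, v3 = Matrix([1, 0, 0]), Matrix([0, 1, 0]), Matrix([1, 1, 0])

A = Matrix.hstack(v1, v2, v3)
_, pivots = A.rref()  # pivot columns mark a maximal independent subset
print(pivots)         # (0, 1): keeping v1 and v2 gives an independent set
```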

5. How does linear independence relate to the concept of a basis for a vector space?

A basis for a vector space is a set of linearly independent vectors that spans the entire vector space. This means that any vector in the space can be written as a linear combination of the basis vectors. In other words, the basis vectors form a "framework" for the space, and linear independence guarantees that each vector's representation in that framework is unique.
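To make that concrete (a sketch with a hypothetical basis, not part of the original answer): if three independent vectors form a basis of R^3, the coefficients expressing any target vector in that basis come from solving one linear system:

```python
import numpy as np

# Hypothetical basis of R^3 (three independent vectors, so B is invertible).
B = np.column_stack([[1.0, 0.0, 2.0], [0.0, 1.0, 1.0], [1.0, 1.0, 0.0]])
v = np.array([3.0, 2.0, 1.0])  # an arbitrary target vector

c = np.linalg.solve(B, v)      # coefficients c with B @ c == v
print(np.allclose(B @ c, v))   # True: v is a linear combination of the basis
```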
