Linear Dependence in High-Dimensional Vector Spaces

SUMMARY

The discussion centers on the concept of linear dependence in high-dimensional vector spaces, specifically addressing the conditions under which a set of vectors S = {v1, v2, ..., vn} is considered linearly dependent. It is established that S is linearly dependent if at least one vector in S can be expressed as a linear combination of the others. The conversation also covers the single-vector case and clarifies that if S contains more vectors than the dimension of the space, linear dependence is guaranteed. The geometric interpretation of these concepts is emphasized, particularly in relation to spans and dimensions.

PREREQUISITES
  • Understanding of linear combinations and vector spaces
  • Familiarity with the concept of span in linear algebra
  • Knowledge of the definition of linear independence and dependence
  • Basic geometric interpretation of vectors in Rn
NEXT STEPS
  • Study the properties of vector spaces and bases in linear algebra
  • Learn about the Rank-Nullity Theorem and its implications for linear dependence
  • Explore the concept of dimension in relation to linear independence
  • Investigate applications of linear dependence in computational problems and data analysis
USEFUL FOR

Students of linear algebra, mathematicians, and anyone interested in understanding the foundational concepts of vector spaces and their applications in higher dimensions.

QuarkCharmer

Homework Statement


Let S = {v_1, v_2, ..., v_n}.
Show that S is linearly dependent iff at least one vector v in S is a linear combination of the others.

Homework Equations



The Attempt at a Solution



From here on, take v to be a vector and x to be a scalar.

I really just wanted to check my understanding of this.

If I specialize this to the case where S contains a single vector v, then S is linearly independent iff v is not the zero vector. This is because the only linear combination of v alone is xv, and when v ≠ 0 the equation xv = 0 has only the trivial solution x = 0. Likewise, if v = 0, then x can be any real number in xv = 0, so there are infinitely many non-trivial solutions (linearly dependent).
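The single-vector case can be checked numerically with the usual rank criterion: a set of vectors is independent iff the matrix whose columns are those vectors has rank equal to the number of vectors. A minimal sketch (the helper name and the sample vector are made up for illustration):

```python
import numpy as np

def is_linearly_independent(vectors):
    """Independent iff the matrix with the vectors as columns
    has rank equal to the number of vectors."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

v = np.array([2.0, -1.0, 3.0])
zero = np.zeros(3)

print(is_linearly_independent([v]))     # a single nonzero vector is independent
print(is_linearly_independent([zero]))  # the zero vector alone is dependent
```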

This all makes sense to me from a geometric standpoint. I am more concerned about the case where S contains n + 1 vectors.

S is linearly dependent iff at least one v in S is a linear combination of the others.

So, if S = {v, u, w} and w is a linear combination of v and u, then w is in span{v, u} and S is linearly dependent. The same argument works in R^3 without any issue. I am having trouble checking whether this is true in R^n.

For instance, suppose S = {a, b, c, d} in R^4, where a, b, c each lie along a different coordinate axis and d is some linear combination of a, b, c. Then span(S) is some 3-dimensional subspace, and by that theorem S is clearly linearly dependent. So any time one vector in a set is a linear combination of some of the others, the whole set is linearly dependent? Regardless of dimension?
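That R^4 scenario can be tested with the same rank criterion; a sketch, where the specific vectors a, b, c, d are made up to match the description (three axis vectors plus one combination of them):

```python
import numpy as np

# Hypothetical vectors in R^4: a, b, c each lie along a different
# coordinate axis, and d is a linear combination of them.
a = np.array([1.0, 0.0, 0.0, 0.0])
b = np.array([0.0, 2.0, 0.0, 0.0])
c = np.array([0.0, 0.0, 3.0, 0.0])
d = 2 * a - b + 0.5 * c            # by construction, d is in span{a, b, c}

M = np.column_stack([a, b, c, d])  # 4x4 matrix with the vectors as columns
rank = np.linalg.matrix_rank(M)

# rank 3 < 4 vectors, so S = {a, b, c, d} is linearly dependent
print(rank)  # 3
```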
 
A set of vectors is linearly dependent if you can write c1*v1 + ... + cn*vn = 0 with not all of the ci equal to 0. Why don't you try using that?
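One way to apply that definition in practice is to look for a nonzero solution of the homogeneous system V c = 0, where the vi are the columns of V. A sketch using the SVD to extract a null-space vector (the example vectors are made up; here v3 = v1 + v2, so a nontrivial dependence must exist):

```python
import numpy as np

# Columns are v1, v2, v3 with v3 = v1 + v2, so some nonzero c
# satisfies c1*v1 + c2*v2 + c3*v3 = 0.
V = np.column_stack([
    np.array([1.0, 0.0]),
    np.array([0.0, 1.0]),
    np.array([1.0, 1.0]),
])

# Rows of Vt beyond the numerical rank span the null space of V.
_, s, Vt = np.linalg.svd(V)
rank = int(np.sum(s > 1e-10))
c = Vt[rank:]                    # null-space basis vectors (as rows)

print(c.shape[0] > 0)            # True: a nontrivial dependence exists
print(np.allclose(V @ c[0], 0))  # True: V c = 0 with c nonzero
```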
 
QuarkCharmer said:

This all makes sense to me from a geometric standpoint. I am more concerned about the case where S contains n + 1 vectors.
You seem to be leaving out some information here. Before, you simply noted that you had a set of n vectors. Now, making it n+ 1 doesn't change anything. In this last part, are we to assume that these vectors are in a vector space of dimension n?

If that is the case, then the definition of "dimension" says that there exists a basis for the space containing n vectors. Writing a linear combination of the n+1 vectors in terms of the basis and setting it to zero gives a homogeneous system of n equations in n+1 unknown coefficients, which always has a nontrivial solution.
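This counting argument can be illustrated numerically: any n + 1 vectors in an n-dimensional space form a matrix with only n rows, so its rank is at most n and the columns must be dependent. A sketch with random vectors (n = 5 is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# n + 1 vectors in R^n, as the columns of an n x (n+1) matrix.
M = rng.standard_normal((n, n + 1))
rank = np.linalg.matrix_rank(M)

# The rank cannot exceed the number of rows, so it is always
# smaller than the number of columns: the set is dependent.
print(rank <= n)      # True
print(rank < n + 1)   # True
```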

