Linear Dependence in High-Dimensional Vector Spaces

QuarkCharmer

Homework Statement


Let S = {v_1, v_2, ..., v_n}.
S is linearly dependent iff at least one v in S is a linear combination of the others.

Homework Equations



The Attempt at a Solution



From here on, just take v to be a vector and x to be a scalar.

I really just wanted to check my understanding of this.

If I specialize this to the case where S contains a single vector v, then S is linearly independent iff v is not the zero vector. This is because if v != 0, then the equation xv = 0 has only the trivial solution x = 0. Likewise, if v = 0, then x could be any real number in xv = 0, so there are infinitely many non-trivial solutions (linearly dependent).
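The single-vector case can be sketched numerically. The thread never names a tool, so numpy here is purely an illustration, and the tolerance is an arbitrary floating-point cutoff:

```python
import numpy as np

def is_independent_single(v, tol=1e-12):
    """A one-vector set {v} is linearly independent iff v != 0:
    if v != 0, then x*v = 0 forces x = 0 (only the trivial solution);
    if v == 0, then any x solves x*v = 0 (non-trivial solutions exist)."""
    return np.linalg.norm(v) > tol

print(is_independent_single(np.array([1.0, 2.0, 3.0])))  # True
print(is_independent_single(np.zeros(3)))                # False
```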

This all makes sense to me from a geometric standpoint. I am more concerned about the case where S contains n + 1 vectors.

S is linearly dependent iff at least one v in S is a linear combination of the others.

So, if S = {v, u, w} and w is a linear combination of v and u, then w is in span{v, u} and S is linearly dependent. The same argument works in R^3 without any issue. I am having trouble checking whether this holds in R^n.

For instance, suppose S = {a, b, c, d}, where a, b, c each lie along a different coordinate axis and d is some linear combination of a, b, c. Then span{S} is some higher-dimensional subspace, and by that theorem S is clearly linearly dependent. So any time one vector in a set is a linear combination of the others, the set is linearly dependent? Regardless of dimension?
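That example can be checked with a rank computation. The specific vectors below are hypothetical stand-ins for a, b, c, d (and numpy is an assumption, not something the thread uses): three vectors along coordinate axes of R^4 plus one combination of them, so the rank of the matrix with these columns is less than 4.

```python
import numpy as np

# Hypothetical instance of the example: a, b, c along three coordinate
# axes of R^4, and d a linear combination of them.
a = np.array([1.0, 0.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0, 0.0])
c = np.array([0.0, 0.0, 1.0, 0.0])
d = 2*a - b + 3*c

# Stack the four vectors as columns; rank < 4 means they are dependent.
M = np.column_stack([a, b, c, d])
rank = np.linalg.matrix_rank(M)
print(rank)  # 3, so {a, b, c, d} is linearly dependent
```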
 
A set of vectors is linearly dependent if you can write c1*v1 + ... + cn*vn = 0 with not all of the ci equal to 0. Why don't you try using that?
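That definition can be tested directly: hunt for coefficients c1, ..., cn, not all zero, with c1*v1 + ... + cn*vn = 0. A minimal sketch using numpy's SVD (an assumption; the thread names no tools), where a zero singular value signals a null vector of the column matrix:

```python
import numpy as np

def dependence_coefficients(vectors, tol=1e-10):
    """Search for c with c[0]*v1 + ... + c[n-1]*vn = 0 and c != 0.
    Returns such coefficients if the vectors are dependent, else None."""
    M = np.column_stack(vectors)
    _, s, vt = np.linalg.svd(M)
    # More columns than rows, or a (numerically) zero singular value,
    # means the columns are linearly dependent.
    if s.size < M.shape[1] or s[-1] < tol:
        return vt[-1]  # a unit vector in the null space of M
    return None

v1 = np.array([1.0, 2.0])
v2 = np.array([2.0, 4.0])  # v2 = 2*v1, so {v1, v2} is dependent
c = dependence_coefficients([v1, v2])
print(c)  # nontrivial coefficients: c[0]*v1 + c[1]*v2 = 0

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])  # independent pair
print(dependence_coefficients([e1, e2]))  # None
```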
 
QuarkCharmer said:

Homework Statement


Let S = {v_1, v_2, ..., v_n}.
S is linearly dependent iff at least one v in S is a linear combination of the others.

Homework Equations



The Attempt at a Solution



From here on, just take v to be a vector and x to be a scalar.

I really just wanted to check my understanding of this.

If I specialize this to the case where S contains a single vector v, then S is linearly independent iff v is not the zero vector. This is because if v != 0, then the equation xv = 0 has only the trivial solution x = 0. Likewise, if v = 0, then x could be any real number in xv = 0, so there are infinitely many non-trivial solutions (linearly dependent).

This all makes sense to me from a geometric standpoint. I am more concerned about the case where S contains n + 1 vectors.
You seem to be leaving out some information here. Before, you simply noted that you had a set of n vectors. Now, making it n+ 1 doesn't change anything. In this last part, are we to assume that these vectors are in a vector space of dimension n?

If that is the case, then the definition of "dimension" says that there exists a basis for the space containing n vectors. Writing a linear combination of the n + 1 vectors in terms of the basis and setting it equal to zero gives n equations in n + 1 unknown coefficients, which always has a non-trivial solution.
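That counting argument can be checked numerically (numpy here is just an illustration, not part of the thread): any n + 1 vectors in an n-dimensional space form a matrix with n rows and n + 1 columns, whose rank is at most n, so the columns must be dependent.

```python
import numpy as np

# n + 1 vectors in R^n: the matrix with them as columns has shape
# (n, n+1), so its rank is at most n < n + 1 and the columns are
# always linearly dependent, no matter how the vectors are chosen.
rng = np.random.default_rng(0)
n = 3
vectors = rng.standard_normal((n, n + 1))  # 4 random vectors in R^3
rank = np.linalg.matrix_rank(vectors)
print(rank, "<", n + 1)  # rank is at most 3, columns dependent
```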

S is linearly dependent iff at least one v in S is a linear combination of the others.

So, if S = {v, u, w} and w is a linear combination of v and u, then w is in span{v, u} and S is linearly dependent. The same argument works in R^3 without any issue. I am having trouble checking whether this holds in R^n.

For instance, suppose S = {a, b, c, d}, where a, b, c each lie along a different coordinate axis and d is some linear combination of a, b, c. Then span{S} is some higher-dimensional subspace, and by that theorem S is clearly linearly dependent. So any time one vector in a set is a linear combination of the others, the set is linearly dependent? Regardless of dimension?
 