Linear Algebra Dimension Proof

  • #1

Homework Statement



Let [tex]V\subset\,R^n[/tex] be a subspace with [tex]dim(V) = D[/tex]. Prove that any [tex]S[/tex] vectors in [tex]V[/tex] are linearly dependent if [tex]S > D[/tex].

Homework Equations



Rank-nullity theorem?

The Attempt at a Solution



[tex]dim(V) = D[/tex] implies that any basis for [tex]V[/tex] contains exactly [tex]D[/tex] vectors. If [tex]S > D[/tex], then expressing the [tex]S[/tex] vectors in a basis and row-reducing the resulting homogeneous system leaves at least one free variable, and thus infinitely many solutions; this implies that the set is linearly dependent.
 

Answers and Replies

  • #2
For the sake of illustration, suppose we are working in [tex]V = R^2, D=2[/tex], and suppose you choose [tex]S=3[/tex] vectors. You desire to prove that these three vectors must be linearly dependent in [tex]R^2[/tex].

Form a linear combination of these vectors. You desire to verify whether or not

[tex]a_1v_1 + a_2v_2 + a_3v_3 = 0[/tex]

holds true when not all [tex]a_i = 0[/tex].

Form a matrix whose columns are your S vectors from V. The matrix will then have n rows and S columns. The linear combination above can then be expressed (in the case of our specific example) something like:

[tex]\left(\begin{array}{ccc}v_{11} & v_{12} & v_{13} \\
v_{21} & v_{22} & v_{23} \end{array}\right) \cdot \left(\begin{array}{c}a_1 \\ a_2 \\ a_3 \end{array}\right) = 0[/tex]

where

[tex]v_i = v_{1i}\hat{e_1} + v_{2i}\hat{e_2}[/tex]

for some basis consisting of D vectors.

You can easily generalize this to a matrix with n rows and S columns.

The question is: for which values of the [tex]a_i[/tex] does the linear system above have solutions?

Clearly, if [tex]a_1=a_2=a_3=0[/tex], the system above holds true.

The question is, does it hold true when not all [tex]a_i = 0[/tex]?

If so, then the three vectors must be linearly dependent, which is what you desire to prove.

The direction you were heading in with the rank of a matrix is probably the right direction to go.
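
If you want to see this numerically before proving it, here is a quick sketch (illustration only, not part of the proof; it assumes you have NumPy and SciPy available, and the vectors are arbitrarily chosen) that stacks three vectors from R^2 as the columns of a 2-by-3 matrix and asks SciPy for a non-trivial solution of the homogeneous system:

[code]
import numpy as np
from scipy.linalg import null_space

# Three arbitrary vectors in R^2, stacked as the columns of A.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])

# null_space returns an orthonormal basis of {a : A @ a = 0}.
# With 3 columns but only 2 rows, rank(A) <= 2 < 3, so the
# null space is at least one-dimensional.
N = null_space(A)
a = N[:, 0]                    # a non-trivial coefficient vector
print(a)                       # not the zero vector
print(np.allclose(A @ a, 0))   # True: a1*v1 + a2*v2 + a3*v3 = 0
[/code]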
 
  • #3
Thank you. So say [tex]V = R^n, D=n[/tex] and [tex]S > n[/tex]. I'm trying to determine if there exists a non-trivial solution to the augmented matrix:

[tex]
\left[ \begin{array}{cccc|c}
v_{11} & v_{12} & \cdots & v_{1S} & 0 \\
\vdots & \vdots & \ddots & \vdots & \vdots \\
v_{n1} & v_{n2} & \cdots & v_{nS} & 0
\end{array} \right]
[/tex]

which will row-reduce to something like:

[tex]
\left[ \begin{array}{cccc|c}
1 & 0 & \cdots & x & 0 \\
0 & 1 & \cdots & y & 0 \\
\vdots & \vdots & \ddots & \vdots & \vdots \\
0 & 0 & 1 & z & 0
\end{array} \right]
[/tex]


where [tex][x, y, \dots, z][/tex] is some non-zero pivot-free column that must arise since [tex]S > n[/tex]. Then one non-trivial solution is

[tex][a_{1}, a_{2}, \dots, a_{S}] = [x, y, \dots, z, -1][/tex]

And the set will be linearly dependent as a result. Or is that just what I had originally?
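
As a concrete check of the free-variable claim (an illustration with arbitrarily chosen numbers, n = 2 and S = 3, assuming SymPy is available), the rref shows the missing pivot directly:

[code]
from sympy import Matrix

# n = 2 rows, S = 3 columns: there can be at most 2 pivots.
A = Matrix([[1, 2, 0],
            [0, 1, 1]])

rref_matrix, pivot_cols = A.rref()
print(rref_matrix)   # Matrix([[1, 0, -2], [0, 1, 1]])
print(pivot_cols)    # (0, 1) -- only 2 pivots for 3 columns,
                     # so the third column is a free variable
[/code]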
 
  • #4
Given two vector (sub)spaces [tex]V \subseteq W[/tex], we have [tex]\dim(V) \le \dim(W)[/tex]. You can use this fact to derive a contradiction.
 
  • #5
I wouldn't even mess around with an augmented matrix, or try to actually, "physically" reduce the matrix into some kind of row-echelon form.

Just consider the linear system:

[tex]A \cdot x = 0[/tex]

A is a k-by-n matrix; that is, it has k rows and n columns.

x is an n-dimensional vector.

0 is the k-dimensional 0-vector.

This equation is a statement on the linear dependence of the vectors which make up the columns of the matrix A.

If the only "solution" to this equation is the "trivial" solution x=0, then by definition the vectors which make up the columns of A are linearly independent.

If there exist "non-trivial" solutions to this linear system, where [tex]x \neq 0[/tex], then by definition the vectors which make up the columns of A are linearly dependent.

In your problem statement, you have a vector space where dim(V)=k, and you are selecting n vectors from this space where n > k. You want to show that these n vectors must be linearly dependent. You can model your problem with the above system: since dim(V)=k, every vector v in V can be expressed as a linear combination of k basis vectors. Choose any basis of k linearly independent vectors, and construct the matrix A as above.

You will have, by construction of your problem, a linear system where k < n.

By construction, you know that rank(A) <= k < n.

If you have a matrix with n columns and k rows, and you know that k < n, i.e., that the rank of the matrix is less than n, what can you conclude about the linear dependence of the columns making up the matrix?

I don't want to give away the *whole* answer, but it's a standard result from linear algebra; it's probably in your textbook somewhere.

Shilov treats it at the start of Chapter 3 in his classic on linear algebra.
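
As a quick numerical sanity check on the rank bound above (illustration only; the entries here are random and nothing depends on them), NumPy's matrix_rank shows that a k-by-n matrix can never reach rank n when k < n:

[code]
import numpy as np

k, n = 2, 4                        # k rows, n columns, with k < n
rng = np.random.default_rng(0)
A = rng.standard_normal((k, n))    # an arbitrary k-by-n matrix

r = np.linalg.matrix_rank(A)
print(r)        # at most k = 2
print(r < n)    # True: the n columns must be linearly dependent
[/code]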
 
  • #6
Given two vector (sub)spaces [tex]V \subseteq W[/tex], we have [tex]\dim(V) \le \dim(W)[/tex]. You can use this fact to derive a contradiction.

I just realised you might need the statement from the first post to prove this claim. Not sure, though.
 
  • #7
If you have a matrix with n columns and k rows, and you know that k < n, i.e., that the rank of the matrix is less than n, what can you conclude about the linear dependence of the columns making up the matrix?

I can conclude that the columns must be linearly dependent. I know that the rank is the number of linearly independent columns, so if I've got k of those but k < n columns in total, the leftover column(s) will have to be a non-trivial linear combination of the others. That then implies that the x vector in Ax = 0 will be non-trivial as well, right?

I'm still not sure how that's different from just making an augmented matrix, though (sorry; I'm useless at linear algebra).
 
  • #8
This is merely a question of knowing basic vocabulary and definitions.

Proof: Take any basis {v1, v2, ..., vD} of the D-dimensional space V; by definition the set is linearly independent and spans V. Now introduce a new vector vK from V. Adjoining it does not enlarge the span, i.e. [tex]Span(\{v_1, \dots, v_D, v_K\}) \subseteq Span(\{v_1, \dots, v_D\})[/tex], because the basis already spans the full space V. In particular, vK itself can be written as a linear combination of the D basis vectors:

[tex]a_1v_1 + a_2v_2 + \dots + a_Dv_D = v_K \qquad (\ast)[/tex]

where [tex]a_1, \dots, a_D[/tex] are scalars from the underlying field. Now concatenate that extra vector to the basis to construct the set under question, {v1, v2, ..., vD, vK}, and ask: does

[tex]c_1v_1 + c_2v_2 + \dots + c_Dv_D + c_{D+1}v_K = 0[/tex]

hold only when [tex]c_1 = c_2 = \dots = c_{D+1} = 0[/tex]?

NO! Rearranging (*) gives [tex]a_1v_1 + \dots + a_Dv_D + (-1)v_K = 0[/tex], a dependence relation with [tex]c_{D+1} = -1 \neq 0[/tex]. Look at (*) again if you don't immediately see it.
QED.
 
  • #9
I can conclude that the columns must be linearly dependent. I know that the rank is the number of linearly independent columns, so if I've got k of those but k < n columns in total, the leftover column(s) will have to be a non-trivial linear combination of the others. That then implies that the x vector in Ax = 0 will be non-trivial as well, right?
Yes, essentially.

You mentioned in the OP something about a "rank-nullity" theorem.

I'm not sure exactly what you meant by the "rank-nullity" theorem, but yes, the rank of a matrix is the number of linearly independent columns (or rows), and so if the number of rows is k, then you must have rank(A) <= k; and if you know k < n (strictly less than), where n is the number of columns, then you can conclude that the columns of the matrix must be linearly dependent.

You can also think of it in terms like the following: suppose you are working in an n-dimensional vector space, and you have a basis (x1,...,xn) consisting of n vectors. That means that if we have:

[tex]a_1x_1 + ... + a_nx_n = 0[/tex]

then necessarily

[tex]a_1 = ... = a_n = 0[/tex]

Now consider another vector x0 from this same vector space, and consider the linear combination:

[tex]a_0x_0 + a_1x_1 + ... + a_nx_n = 0[/tex]

Suppose [tex]a_0=0[/tex]. Then we must necessarily have [tex]a_1 = ... = a_n = 0[/tex], since the basis is linearly independent.

Suppose now that [tex]a_0 \neq 0[/tex]. Then we can rewrite this equation as:

[tex]x_0 = - \left(\frac{a_1}{a_0}x_1 + \dots + \frac{a_n}{a_0}x_n\right)[/tex]

So x0 must be a linear combination of the basis vectors; and since the n basis vectors span the whole space, such a combination always exists, which guarantees a non-trivial relation with [tex]a_0 \neq 0[/tex].

In other words, the set consisting of x0 together with the basis vectors is linearly dependent.
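
To see those coefficients concretely (a numerical sketch with an arbitrarily chosen basis of R^2, not part of the argument): stack the basis as the columns of an invertible matrix B, and solving B a = x0 produces the coordinates of any extra vector x0, i.e. exactly the dependence relation above.

[code]
import numpy as np

# An arbitrary basis of R^2 as the columns of B, plus one extra vector x0.
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])
x0 = np.array([3.0, 4.0])

# B is invertible, so B @ a = x0 has a unique solution:
# the coordinates of x0 in this basis.
a = np.linalg.solve(B, x0)
print(a)                        # [1. 2.]
print(np.allclose(B @ a, x0))   # True

# Hence x0 - 1*x1 - 2*x2 = 0 is a non-trivial relation
# (the coefficient of x0 is 1), so {x0, x1, x2} is dependent.
[/code]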
 
  • #10
I see now. Thank you so much to everyone :smile:
 
