# Linear Algebra Dimension Proof

## Homework Statement

Let $$V \subset R^n$$ be a subspace with $$\dim(V) = D$$. Prove that any set of $$S$$ vectors in $$V$$ is linearly dependent if $$S > D$$.

## Homework Equations

Rank-nullity theorem?

## The Attempt at a Solution

$$\dim(V) = D$$ implies that a basis for $$V$$ contains $$D$$ vectors. If $$S > D$$, then the set contains at least one more vector than a basis does, which forces a free variable in the corresponding linear system and hence infinitely many solutions, implying that the set is linearly dependent.

For the sake of illustration, suppose we are working in $$V = R^2$$, so $$D = 2$$, and suppose you choose $$S = 3$$ vectors. You want to prove that these three vectors must be linearly dependent in $$R^2$$.

Form a linear combination of these vectors. You want to verify whether or not

$$a_1v_1 + a_2v_2 + a_3v_3 = 0$$

holds true when not all $$a_i = 0$$.

Form a matrix whose columns are your $$S$$ vectors from $$V$$. The matrix will then have $$n$$ rows and $$S$$ columns. The linear combination above can then be expressed, in the case of our specific example, as:

$$\left(\begin{array}{ccc}v_{11} & v_{12} & v_{13} \\ v_{21} & v_{22} & v_{23} \end{array}\right) \cdot \left(\begin{array}{c}a_1 \\ a_2 \\ a_3 \end{array}\right) = 0$$

where

$$v_i = v_{1i}\hat{e}_1 + v_{2i}\hat{e}_2$$

for some basis consisting of D vectors.

You can easily generalize this to a matrix with $$n$$ rows and $$S$$ columns.
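As a numerical sanity check (not a step of the proof), you can build this matrix in Python with NumPy and look for a non-trivial null vector via the SVD; the specific vectors below are made up purely for illustration:

```python
import numpy as np

# Three vectors in R^2 (S = 3 > D = 2), stacked as the columns of A.
v1, v2, v3 = [1.0, 2.0], [3.0, 1.0], [4.0, 3.0]
A = np.column_stack([v1, v2, v3])  # shape (2, 3): n rows, S columns

# A right-singular vector outside the row space spans the null space,
# giving a non-trivial solution a with A @ a = 0.
_, _, Vt = np.linalg.svd(A)
a = Vt[-1]  # A is 2x3, so rank(A) <= 2 and the last row of V^T is a null vector

print(np.allclose(A @ a, 0))  # True: a1*v1 + a2*v2 + a3*v3 = 0
print(not np.allclose(a, 0))  # True: a is not the trivial solution
```

Any three vectors in $$R^2$$ would work here; the SVD just finds the dependence explicitly.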

The question then is: for which values of the $$v_{ij}$$ and $$a_i$$ does the above linear system have solutions?

Clearly, if $$a_1=a_2=a_3=0$$, the system above holds true.

The question is, does it hold true when not all $$a_i = 0$$?

If so, then the three vectors must be linearly dependent, which is what you want to prove.

The direction you were heading in with the rank of a matrix is probably the right direction to go.

Thank you. So say $$V = R^n$$, $$D = n$$, and $$S > n$$. I'm trying to determine whether there exists a non-trivial solution to the augmented matrix:

$$\left[ \begin{array}{cccc|c} v_{11} & v_{12} & \cdots & v_{1S} & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ v_{n1} & v_{n2} & \cdots & v_{nS} & 0 \end{array} \right]$$

which will row-reduce to:

$$\left[ \begin{array}{ccccc|c} 1 & 0 & \cdots & 0 & x & 0 \\ 0 & 1 & \cdots & 0 & y & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots & \vdots \\ 0 & 0 & \cdots & 1 & z & 0 \end{array} \right]$$

where $$[x, y, \ldots, z]$$ is some non-zero column that will arise since $$S > n$$. Then the solution set will be

$$[a_{1}, a_{2} ... a_{S}] = [x, y ... z]$$

And the set will be linearly dependent as a result. Or is that just what I had originally?
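If you want to see such a row reduction concretely, SymPy's `rref` does it symbolically and reports the pivot columns; with $$S > n$$ columns, at least one column must be pivot-free. The entries below are arbitrary illustrative numbers:

```python
from sympy import Matrix

# S = 3 columns but only n = 2 rows; the entries are arbitrary.
A = Matrix([[1, 3, 4],
            [2, 1, 3]])

rref_matrix, pivot_cols = A.rref()  # rref() returns (reduced matrix, pivot column indices)
free_cols = [j for j in range(A.cols) if j not in pivot_cols]

print(pivot_cols)  # at most n = 2 pivot columns are possible
print(free_cols)   # S > n forces at least one free column, hence a free variable
```

Each free column corresponds to a free variable in $$Ax = 0$$, which is exactly what produces a non-trivial solution.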

Given two vector (sub)spaces $$V \subseteq W$$, we have $$\dim(V) \le \dim(W)$$. You can use this fact to derive a contradiction.

I wouldn't even mess around with an augmented matrix, or try to actually, "physically" reduce the matrix into some kind of row-echelon form.

Just consider the linear system:

$$A \cdot x = 0$$

A is a k-by-n matrix; that is, it has k rows and n columns.

x is an n-dimensional vector.

0 is the k-dimensional 0-vector.

This equation is a statement on the linear dependence of the vectors which make up the columns of the matrix A.

If the only "solution" to this equation is the "trivial" solution x=0, then by definition the vectors which make up the columns of A are linearly independent.

If there exist "non-trivial" solutions to this linear system, where $$x \neq 0$$, then by definition the vectors which make up the columns of A are linearly dependent.

In your problem statement, you have a vector space where dim(V) = k, and you are selecting n vectors from this space, where n > k. You want to show that these n vectors must be linearly dependent. You can model your problem in the above system: since dim(V) = k, every vector v in V can be expressed as a linear combination of k basis vectors. Choose any basis of k linearly independent vectors, and construct the matrix A as above.

You will have, by construction of your problem, a linear system where k < n.

By construction, you know that rank(A) <= k < n.

If you have a matrix with n columns and k rows, and you know that k < n, i.e., that the rank of the matrix is less than n, what can you conclude about the linear dependence of the columns making up the matrix?

I don't want to give away the *whole* answer, but it's a standard result from linear algebra, probably in your textbook somewhere.
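To build intuition for this rank bound without giving away the result, here is a quick NumPy check on a random k-by-n matrix with k < n; the shapes and seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
k, n = 3, 5                        # k rows, n columns, with k < n
A = rng.standard_normal((k, n))

rank = np.linalg.matrix_rank(A)
print(rank <= k)                   # True: the rank never exceeds the number of rows...
print(rank < n)                    # True: ...so it is strictly less than the number of columns
```

No matter how the entries are chosen, the rank can never reach n when k < n.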

Shilov treats it at the start of Chapter 3 in his classic on linear algebra.

> Given two vector (sub)spaces $$V \subseteq W$$, we have $$\dim(V) \le \dim(W)$$. You can use this fact to derive a contradiction.

I just realised you might need the statement from the first post to prove this claim. Not sure, though.

> If you have a matrix with n columns and k rows, and you know that k < n, i.e., that the rank of the matrix is less than n, what can you conclude about the linear dependence of the columns making up the matrix?

I can conclude that the columns must be linearly dependent. I know that the rank is the number of linearly independent columns, so if I've got k of those but k < n columns in total, the leftover column(s) will have to be a non-trivial linear combination of the others. That then implies that the x vector in Ax = 0 will be non-trivial as well, right?

I'm still not sure how that's different from just making an augmented matrix though (sorry; I'm useless at Linear Algebra).

This is merely a question of knowing basic vocabulary and definitions.

Proof: Given any basis $$\{v_1, v_2, \ldots, v_D\}$$ of the $$D$$-dimensional space $$V$$, the set is both linearly independent and spans $$V$$. Now introduce a new vector $$v_K$$. Then $$\mathrm{Span}(\{v_1, v_2, \ldots, v_D, v_K\}) \subseteq \mathrm{Span}(\{v_1, v_2, \ldots, v_D\})$$, where this relation follows because the second set already spans all of $$V$$. Because the basis spans the space, $$v_K$$ can be written as a linear combination of the $$D$$ basis vectors:

$$a_1v_1 + a_2v_2 + \cdots + a_{D-1}v_{D-1} + a_Dv_D = v_K \quad (****)$$

where $$a_1, \ldots, a_D$$ are scalars from the underlying field. Now take that extra vector and concatenate it to the basis to construct the set in question:

$$\{v_1, v_2, \ldots, v_D, v_K\}$$

Now is it true that $$c_1v_1 + c_2v_2 + \cdots + c_Dv_D + c_{D+1}v_K = 0$$ only when $$c_1 = c_2 = \cdots = c_D = c_{D+1} = 0$$?

NO! Look at (****) if you don't immediately see it now: moving $$v_K$$ to the left-hand side gives a vanishing combination with $$c_{D+1} = -1 \neq 0$$.

QED.
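The construction in this proof can be checked numerically: write the extra vector in the basis and read off a non-trivial vanishing combination. This sketch uses a made-up basis of $$R^2$$:

```python
import numpy as np

# A made-up basis of R^2 (D = 2): columns v1, v2; plus an extra vector vK.
B = np.column_stack([[1.0, 0.0], [1.0, 1.0]])
vK = np.array([3.0, 5.0])

# The basis spans the space, so solve B @ a = vK for the coefficients a.
a = np.linalg.solve(B, vK)

# Then (-a1)*v1 + (-a2)*v2 + 1*vK = 0 is a non-trivial vanishing combination.
c = np.concatenate([-a, [1.0]])
M = np.column_stack([B, vK])
print(np.allclose(M @ c, 0))  # True: the enlarged set is linearly dependent
```

The last coefficient is 1, never zero, so the dependence relation is always non-trivial, exactly as in the proof.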

> I can conclude that the columns must be linearly dependent. I know that the rank is the number of linearly independent columns, so if I've got k of those but k < n columns in total, the leftover column(s) will have to be a non-trivial linear combination of the others. That then implies that the x vector in Ax = 0 will be non-trivial as well, right?
Yes, essentially.

You mentioned in the OP something about a "rank-nullity" theorem.

I'm not sure exactly what you meant about the "rank-nullity" theorem, but yes, the rank of a matrix is the number of linearly independent columns (or rows), so if the number of rows is k, then you must have rank(A) <= k; and if you know k < n (strictly less than), where n is the number of columns, then you can conclude that the columns of the matrix must be linearly dependent.
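Incidentally, the rank-nullity theorem mentioned in the OP makes this precise: rank(A) + dim null(A) = n, so rank(A) <= k < n forces a null space of positive dimension. A quick numerical illustration with random data (the shapes and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
k, n = 2, 4                   # fewer rows than columns
A = rng.standard_normal((k, n))

rank = np.linalg.matrix_rank(A)
nullity = n - rank            # rank-nullity: rank(A) + nullity(A) = n

print(nullity >= n - k)       # True: at least n - k independent null vectors
print(nullity > 0)            # True: a non-trivial solution of A @ x = 0 exists
```

A positive nullity is exactly the existence of a non-trivial x with Ax = 0, i.e., a dependence among the columns.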

You can also think of it in the following terms: suppose you are working in an n-dimensional vector space, and you have a basis (x1,...,xn) consisting of n vectors. That means that if we have:

$$a_1x_1 + ... + a_nx_n = 0$$

then necessarily

$$a_1 = ... = a_n = 0$$

Now consider another vector x0 from this same vector space, and consider the linear combination:

$$a_0x_0 + a_1x_1 + ... + a_nx_n = 0$$

Suppose $$a_0=0$$. Then we must necessarily have $$a_1 = ... = a_n = 0$$, since the basis is linearly independent.

Suppose now that $$a_0 \neq 0$$. Then we can rewrite this equation as:

$$x_0 = - \left(\frac{a_1}{a_0}x_1 + \cdots + \frac{a_n}{a_0}x_n\right)$$

So $$x_0$$ must be a linear combination of the basis vectors; and such a combination always exists, since the basis spans the space.

In other words, the set consisting of $$x_0$$ together with the basis vectors is linearly dependent.

I see now. Thank you so much, everyone!