# Eigenspace and lin dependency proof

1. Apr 6, 2005

### EvLer

I have these questions and cannot find a proof in the textbook:

When one finds the eigenvectors of a matrix, they form its eigenspace, i.e. they are linearly independent. How is that proved?

And also, a matrix is deficient when one does not get "enough" eigenvectors to span R^n. So maybe I am wrong, but it seems that the technique of finding eigenvectors eliminates linearly dependent eigenvectors if there were any. How so?

Thanks as usual!

2. Apr 6, 2005

### Davorak

$$u_1 X_1+u_2 X_2+u_3 X_3+\cdots= u_1\begin{pmatrix} x_{1,1}\\ x_{1,2}\\ x_{1,3}\\ \vdots \end{pmatrix} + u_2\begin{pmatrix} x_{2,1}\\ x_{2,2}\\ x_{2,3}\\ \vdots \end{pmatrix} + u_3\begin{pmatrix} x_{3,1}\\ x_{3,2}\\ x_{3,3}\\ \vdots \end{pmatrix} + \cdots = \begin{pmatrix} x_{1,1}&x_{2,1}&x_{3,1}&\cdots\\ x_{1,2}&x_{2,2}&x_{3,2}&\cdots\\ x_{1,3}&x_{2,3}&x_{3,3}&\cdots\\ \vdots&\vdots&\vdots&\ddots \end{pmatrix} \begin{pmatrix} u_1\\ u_2\\ u_3\\ \vdots \end{pmatrix}= \begin{pmatrix} 0\\ 0\\ 0\\ \vdots \end{pmatrix}$$
If this is only true when $u_1 =u_2=u_3=...=0$ then the vectors
$X_1,X_2,X_3...$ are linearly independent.
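Davorak's criterion can be checked numerically: stack the vectors as the columns of a matrix and test whether $Xu = 0$ forces $u = 0$, which is equivalent to the columns having full rank. A minimal sketch with numpy, using three example vectors of my own choosing (not from the thread):

```python
import numpy as np

# Hypothetical example vectors (my own choice, for illustration only).
X1 = np.array([1.0, 0.0, 1.0])
X2 = np.array([0.0, 1.0, 1.0])
X3 = np.array([1.0, 1.0, 0.0])

# Stack the vectors as columns, matching the matrix equation X u = 0 above.
X = np.column_stack([X1, X2, X3])

# The vectors are linearly independent exactly when X u = 0 has only the
# trivial solution, i.e. when the columns of X have full rank.
independent = np.linalg.matrix_rank(X) == X.shape[1]
print(independent)  # True: the only solution is u1 = u2 = u3 = 0
```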

3. Apr 6, 2005

### Hurkyl

Staff Emeritus
Why do you think the eigenvectors are linearly independent?

I think you're thinking of an eigenbasis.

4. Apr 6, 2005

### hypermorphism

It is not inherent in the derivation at all. Although it is obvious in R^2 that eigenvectors for two distinct eigenvalues are independent, is it immediately obvious in 3 dimensions (and in higher dimensions) that a third eigenvector is not a linear combination of the first two?
Here is a proof from "Linear Algebra Done Right", a highly recommended text:
Let T be a linear operator on V such that T has n distinct eigenvalues L1,...,Ln, and let v1,...,vn be eigenvectors corresponding to each respective eigenvalue. Suppose, for contradiction, that the list is linearly dependent, and let i be the smallest index such that vi is a linear combination of the previous vectors. Then $v_i = \sum_{j=1}^{i-1} a_jv_j$, and applying T gives $Tv_i = \sum_{j=1}^{i-1} a_jL_jv_j = L_iv_i$.
But also $L_iv_i = \sum_{j=1}^{i-1} a_jL_iv_j$, so $(\sum_{j=1}^{i-1} a_jL_jv_j) - \sum_{j=1}^{i-1} a_jL_iv_j = 0$, i.e. $\sum_{j=1}^{i-1} a_j(L_j - L_i)v_j = 0$.
By the minimality of i, the vectors v1,...,v(i-1) are linearly independent, so every coefficient a_j(L_j - L_i) must be zero; and since the eigenvalues are distinct, L_j - L_i is nonzero, so every a_j is zero. But then vi = 0, contradicting the fact that eigenvectors are nonzero.
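The theorem can be illustrated numerically: pick a matrix with n distinct eigenvalues and check that its eigenvector matrix has full rank. A sketch, assuming an upper-triangular example matrix of my own choosing:

```python
import numpy as np

# A hypothetical 3x3 matrix (my choice): upper triangular, so its
# eigenvalues are the diagonal entries 2, 3, 5 -- three distinct values.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)  # eigenvectors are the columns

# Three distinct eigenvalues...
assert len(set(np.round(eigenvalues, 8))) == 3

# ...so, by the theorem, the corresponding eigenvectors are linearly
# independent: the eigenvector matrix has full rank.
print(np.linalg.matrix_rank(eigenvectors))  # 3
```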

5. Apr 6, 2005

### EvLer

Yeah, you're right. It makes more sense now that Davorak has pointed out the dependency equation.
If they are linearly dependent, then the matrix is deficient, I guess?

6. Apr 7, 2005

### Hurkyl

Staff Emeritus
Not true.

For example, [1, 0] and [2, 0] are both eigenvectors of the identity matrix, and are clearly linearly dependent.

In general, if v is an eigenvector of A, then so is 2v, and {v, 2v} is clearly a linearly dependent set.
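Hurkyl's counterexample is easy to verify directly: both vectors satisfy $Iv = 1\cdot v$, yet the pair is dependent. A quick numpy check:

```python
import numpy as np

I = np.eye(2)               # the 2x2 identity matrix
v = np.array([1.0, 0.0])
w = 2 * v                   # [2, 0]

# Both are eigenvectors of I with eigenvalue 1: I v = 1 * v.
assert np.allclose(I @ v, v)
assert np.allclose(I @ w, w)

# Yet {v, w} is clearly a linearly dependent set: stacked as columns,
# the matrix has rank 1, not 2.
print(np.linalg.matrix_rank(np.column_stack([v, w])))  # 1
```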

7. Apr 7, 2005

### EvLer

Ok, now I am confused....
Taking the 2x2 identity matrix:
1 0
0 1
Subtracting lambda from the diagonal and setting the determinant to zero gives (1 - lambda)^2 = 0, so lambda = 1. But then the modified matrix A - lambda*I turns out to be the 2x2 zero matrix, so how do you find the eigenvectors? They cannot be zero, right?
Oh, I think I got it.
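The identity-matrix case above can be checked in a few lines: since A - lambda*I is the zero matrix, *every* nonzero vector solves the eigenvector equation, so the eigenspace is all of R^2. A sketch of this (my own illustration):

```python
import numpy as np

A = np.eye(2)                # the 2x2 identity matrix
lam = 1.0                    # its only eigenvalue, from (1 - lambda)^2 = 0
M = A - lam * np.eye(2)      # the "modified matrix": the 2x2 zero matrix

# M v = 0 holds for EVERY vector v, so every nonzero vector is an
# eigenvector; the eigenspace is all of R^2.
rng = np.random.default_rng(0)
v = rng.standard_normal(2)   # an arbitrary (almost surely nonzero) vector
assert np.allclose(M @ v, 0)
assert np.allclose(A @ v, lam * v)

# In practice one picks a basis of the eigenspace, e.g. [1, 0] and [0, 1].
```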

Last edited: Apr 7, 2005
8. Apr 7, 2005

### hypermorphism

By distinct eigenvectors, I was shortening "eigenvectors for distinct eigenvalues", as I thought was the intent of the original poster.

9. Apr 7, 2005

### mathwonk

This result is proved from a slightly more sophisticated point of view on page 66 of Sharipov's text and page 10 of mine; both are free, and both are cited in posts 25 and 26 of the thread "sharipov's linear algebra textbook" from 2/21/2005.