Eigenspace and linear dependence proof

EvLer
I have these questions and cannot find a proof in the textbook:

When one finds the eigenvectors of a matrix, they form its eigenspace, i.e. they are linearly independent. How is that proved?

Also, a matrix is deficient when one does not get "enough" eigenvectors to span R^n. So maybe I am wrong, but it seems that the technique of finding eigenvectors eliminates linearly dependent eigenvectors if there were any. How so?

Thanks as usual!
 
u_1 X_1 + u_2 X_2 + u_3 X_3 + \cdots =
u_1 \begin{bmatrix} x_{1,1}\\ x_{1,2}\\ x_{1,3}\\ \vdots \end{bmatrix} +
u_2 \begin{bmatrix} x_{2,1}\\ x_{2,2}\\ x_{2,3}\\ \vdots \end{bmatrix} +
u_3 \begin{bmatrix} x_{3,1}\\ x_{3,2}\\ x_{3,3}\\ \vdots \end{bmatrix} + \cdots =
\begin{bmatrix}
x_{1,1} & x_{2,1} & x_{3,1} & \cdots \\
x_{1,2} & x_{2,2} & x_{3,2} & \cdots \\
x_{1,3} & x_{2,3} & x_{3,3} & \cdots \\
\vdots & \vdots & \vdots & \ddots
\end{bmatrix}
\begin{bmatrix} u_1 \\ u_2 \\ u_3 \\ \vdots \end{bmatrix} =
\begin{bmatrix} 0 \\ 0 \\ 0 \\ \vdots \end{bmatrix}
If this is only true when u_1 = u_2 = u_3 = \cdots = 0, then the vectors X_1, X_2, X_3, \ldots are linearly independent.
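
To make that concrete, here is a minimal numpy sketch (the three vectors are made-up examples, not from this thread): stacking the X's as columns, the only solution of Xu = 0 is u = 0 exactly when the matrix has full column rank.

```python
import numpy as np

# Made-up example vectors X1, X2, X3 (not from the thread), stacked as
# the columns of the coefficient matrix in the equation above.
X = np.column_stack(([1, 0, 0], [1, 1, 0], [1, 1, 1]))

# X @ u = 0 forces u = 0 exactly when X has full column rank,
# which is the linear-independence condition stated above.
print(np.linalg.matrix_rank(X) == X.shape[1])  # True: independent columns
```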
 
EvLer said:
When one finds the eigenvectors of a matrix, they form its eigenspace, i.e. they are linearly independent. How is that proved?

Why do you think the eigenvectors are linearly independent?

I think you're thinking of an eigenbasis.
 
It is not inherent in the derivation at all. Note that although it is obvious in R^2 that two distinct eigenvectors are independent, is it immediately obvious in 3 dimensions (and in higher dimensions) that a third eigenvector is not a linear combination of the first two eigenvectors?
Here is a proof from "Linear Algebra Done Right", a highly recommended text:
Let T be a linear operator on V with n distinct eigenvalues L_1, ..., L_n, and let v_1, ..., v_n be eigenvectors corresponding to each respective eigenvalue. Suppose the list is linearly dependent, and let i be the smallest index such that v_i is a linear combination of the previous eigenvectors. Then v_i = \sum_{j=1}^{i-1} a_jv_j, and Tv_i = \sum_{j=1}^{i-1} a_jL_jv_j = L_iv_i.
But also L_iv_i = \sum_{j=1}^{i-1} a_jL_iv_j, which implies (\sum_{j=1}^{i-1} a_jL_jv_j) - \sum_{j=1}^{i-1} a_jL_iv_j = 0, i.e. \sum_{j=1}^{i-1} a_j(L_j - L_i)v_j = 0.
By the minimality of i, the vectors v_1, ..., v_{i-1} are linearly independent, so each coefficient a_j(L_j - L_i) must be zero. Since the eigenvalues are distinct, L_j - L_i \neq 0 for every j < i, forcing every a_j = 0. But then v_i = \sum_{j=1}^{i-1} a_jv_j = 0, contradicting the fact that eigenvectors are nonzero.
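
As a quick numerical illustration of the theorem, here is a sketch in numpy; the matrix A is my own toy example, chosen only because its eigenvalues 2, 3, 5 are distinct.

```python
import numpy as np

# Toy matrix (my choice, not from the text) with distinct eigenvalues 2, 3, 5.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

eigvals, eigvecs = np.linalg.eig(A)  # eigenvectors are the columns of eigvecs
print(np.round(eigvals, 6))          # 2, 3, 5 -- all distinct

# Per the theorem, eigenvectors for distinct eigenvalues are independent,
# so the matrix whose columns are those eigenvectors has full rank.
print(np.linalg.matrix_rank(eigvecs) == 3)  # True
```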
 
Hurkyl said:
Why do you think the eigenvectors are linearly independent?

I think you're thinking of an eigenbasis.

Yeah, you're right. It makes more sense now that Davorak pointed out the dependency equation.
If they are linearly dependent, then the matrix is deficient, I guess?
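
For concreteness, here is a sketch of a standard deficient (defective) matrix; the Jordan-block example is mine, not from the textbook.

```python
import numpy as np

# A standard deficient (defective) matrix -- my example, not from the thread:
# a 2x2 Jordan block with the eigenvalue 1 repeated twice.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)  # [1. 1.] -- one eigenvalue of algebraic multiplicity 2

# But the eigenspace is only one-dimensional: every eigenvector is a
# multiple of [1, 0], so the eigenvectors cannot span R^2.
print(np.linalg.matrix_rank(eigvecs))  # 1
```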
 
It is not inherent in the derivation at all. Note that although it is obvious in R^2 that two distinct eigenvectors are independent

Not true.

For example, [1, 0] and [2, 0] are both eigenvectors of the identity matrix, and are clearly linearly dependent.

In general, if v is an eigenvector of A, then so is 2v, and {v, 2v} is clearly a linearly dependent set.
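
A short numpy check of exactly this point, using the identity-matrix example above:

```python
import numpy as np

A = np.eye(2)                      # the identity matrix from the example
v = np.array([1.0, 0.0])

# Both v and 2v satisfy A x = 1 * x, so both are eigenvectors for lambda = 1 ...
print(np.allclose(A @ v, v), np.allclose(A @ (2 * v), 2 * v))  # True True
# ... yet {v, 2v} is linearly dependent: stacked as columns, the rank is 1, not 2.
print(np.linalg.matrix_rank(np.column_stack((v, 2 * v))))      # 1
```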
 
Hurkyl said:
Not true.

For example, [1, 0] and [2, 0] are both eigenvectors of the identity matrix, and are clearly linearly dependent.

In general, if v is an eigenvector of A, then so is 2v, and {v, 2v} is clearly a linearly dependent set.
Ok, now I am confused...
Taking the 2x2 identity matrix:
1 0
0 1
Subtracting \lambda along the diagonal and setting the determinant to zero gives (1 - \lambda)^2 = 0, so \lambda = 1. But then the modified matrix A - \lambda I turns out to be the 2x2 zero matrix, so how do you find the eigenvectors? They cannot be zero, right?
[edit] Oh, I think I got it [/edit]
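
For the record, a small numpy sketch of the resolution: for \lambda = 1 the matrix A - \lambda I is zero, so every nonzero vector of R^2 is an eigenvector, and [1, 0] and [2, 0] are just two linearly dependent choices from that eigenspace.

```python
import numpy as np

I = np.eye(2)
# For lambda = 1, A - lambda*I is the zero matrix, so (A - I)x = 0 holds
# for *every* x: the eigenspace for lambda = 1 is all of R^2.
for x in ([1.0, 0.0], [2.0, 0.0], [3.0, 7.0]):
    x = np.array(x)
    print(np.allclose(I @ x, 1.0 * x))  # True -- each nonzero x is an eigenvector
```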
 
Hurkyl said:
Not true.

For example, [1, 0] and [2, 0] are both eigenvectors of the identity matrix, and are clearly linearly dependent.
By distinct eigenvectors, I was shortening "eigenvectors for distinct eigenvalues", as I thought was the intent of the original poster. :smile:
 
This result is proved from a slightly more sophisticated point of view on page 66 of Sharipov's text and page 10 of mine, both free, cited in posts 25 and 26 of the thread "Sharipov's linear algebra textbook" from 2/21/2005.
 