Eigenspace and linear dependency proof

In summary: a matrix is deficient (defective) when one does not get "enough" linearly independent eigenvectors to span R^n, and eigenvectors corresponding to distinct eigenvalues are automatically linearly independent, as the thread proves.
  • #1
EvLer
I have these questions and cannot find a proof in the textbook:

When one finds the eigenvectors of a matrix, they form its eigenspace, i.e. they are linearly independent. How is that proved?

And also, a matrix is deficient when one does not get "enough" eigenvectors to span R^n. So maybe I am wrong, but it seems that the technique of finding eigenvectors eliminates linearly dependent eigenvectors, if there were any. How so?

Thanks as usual!
 
  • #2
[tex]
u_1 X_1 + u_2 X_2 + u_3 X_3 + \cdots =
u_1\begin{bmatrix}
x_{1,1}\\
x_{1,2}\\
x_{1,3}\\
\vdots
\end{bmatrix} +
u_2\begin{bmatrix}
x_{2,1}\\
x_{2,2}\\
x_{2,3}\\
\vdots
\end{bmatrix} +
u_3\begin{bmatrix}
x_{3,1}\\
x_{3,2}\\
x_{3,3}\\
\vdots
\end{bmatrix} + \cdots =
\begin{bmatrix}
x_{1,1} & x_{2,1} & x_{3,1} & \cdots\\
x_{1,2} & x_{2,2} & x_{3,2} & \cdots\\
x_{1,3} & x_{2,3} & x_{3,3} & \cdots\\
\vdots & \vdots & \vdots & \ddots
\end{bmatrix}
\begin{bmatrix}
u_1\\
u_2\\
u_3\\
\vdots
\end{bmatrix} =
\begin{bmatrix}
0\\
0\\
0\\
\vdots
\end{bmatrix}
[/tex]
If this is only true when [itex]u_1 = u_2 = u_3 = \cdots = 0[/itex], then the vectors
[itex]X_1, X_2, X_3, \dots[/itex] are linearly independent.
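The same criterion can be checked in code; here is a minimal NumPy sketch (the example vectors are made up, not from the thread): stack the vectors as columns of a matrix M and test whether M u = 0 forces u = 0, i.e. whether M has full column rank.
[code]
# Minimal NumPy sketch (example vectors are made up): the vectors X1, X2, X3
# are linearly independent exactly when M u = 0 has only the trivial
# solution, i.e. when the column-stacked matrix M has full column rank.
import numpy as np

X1 = np.array([1.0, 0.0, 0.0])
X2 = np.array([1.0, 1.0, 0.0])
X3 = np.array([1.0, 1.0, 1.0])

M = np.column_stack([X1, X2, X3])
independent = np.linalg.matrix_rank(M) == M.shape[1]
print(independent)  # True: u1*X1 + u2*X2 + u3*X3 = 0 forces u1 = u2 = u3 = 0
[/code]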
 
  • #3
EvLer said:
When one finds the eigenvectors of a matrix, they form its eigenspace, i.e. they are linearly independent. How is that proved?

Why do you think the eigenvectors are linearly independent?

I think you're thinking of an eigenbasis.
 
  • #4
It is not inherent in the derivation at all. Note that although it is obvious in R^2 that two distinct eigenvectors are independent, is it immediately obvious in 3 dimensions (and in higher dimensions) that a third eigenvector is not a linear combination of the first two eigenvectors?
Here is a proof from "Linear Algebra Done Right", a highly recommended text:
Let T be a linear operator on V with n distinct eigenvalues L1, ..., Ln, and let v1, ..., vn be eigenvectors corresponding to the respective eigenvalues. Suppose, for contradiction, that the list is linearly dependent, and let vi be the first vector in the list that is a linear combination of the previous ones. Then [itex]v_i = \sum_{j=1}^{i-1} a_jv_j[/itex], and applying T gives [itex]Tv_i = \sum_{j=1}^{i-1} a_jL_jv_j = L_iv_i[/itex].
But also [itex]L_iv_i = \sum_{j=1}^{i-1} a_jL_iv_j[/itex], and subtracting gives [itex](\sum_{j=1}^{i-1} a_jL_jv_j) - \sum_{j=1}^{i-1} a_jL_iv_j = 0[/itex], i.e. [itex]\sum_{j=1}^{i-1} a_j(L_j - L_i)v_j = 0[/itex].
Since i was chosen as small as possible, v1, ..., v(i-1) are linearly independent, so every coefficient a_j(L_j - L_i) must be zero; and since the eigenvalues are distinct, L_j - L_i is nonzero for j < i, so every a_j is zero. But then vi = 0, contradicting the fact that eigenvectors are nonzero.
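As a numerical sanity check of this theorem, here is a NumPy sketch (the example matrix is arbitrary, not from the text): for a matrix with distinct eigenvalues, the matrix whose columns are the computed eigenvectors has full rank.
[code]
# Numerical illustration (arbitrary example matrix): distinct eigenvalues
# yield linearly independent eigenvectors, so the eigenvector matrix V
# has full rank.
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])            # eigenvalues 2 and 3, distinct

eigenvalues, V = np.linalg.eig(A)     # columns of V are eigenvectors
print(eigenvalues)                    # [2. 3.]
print(np.linalg.matrix_rank(V))       # 2: the eigenvectors span R^2
[/code]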
 
  • #5
Hurkyl said:
Why do you think the eigenvectors are linearly independent?

I think you're thinking of an eigenbasis.

Yeah, you're right. It makes more sense now that Davorak has pointed out the dependency equation.
If they are linearly dependent, then the matrix is deficient, I guess?
 
  • #6
It is not inherent in the derivation at all. Note that although it is obvious in R^2 that two distinct eigenvectors are independent

Not true.

For example, [1, 0] and [2, 0] are both eigenvectors of the identity matrix, and are clearly linearly dependent.

In general, if v is an eigenvector of A, then so is 2v, and {v, 2v} is clearly a linearly dependent set.
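A quick numerical check of this point (my own sketch, with an arbitrary diagonal example matrix):
[code]
# Quick verification (arbitrary example): if A v = 5 v, then A (2v) = 5 (2v),
# so v and 2v are both eigenvectors, yet {v, 2v} is linearly dependent.
import numpy as np

A = np.diag([5.0, 7.0])
v = np.array([1.0, 0.0])              # eigenvector of A with eigenvalue 5

print(np.allclose(A @ (2 * v), 5 * (2 * v)))                 # True
print(np.linalg.matrix_rank(np.column_stack([v, 2 * v])))    # 1: dependent
[/code]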
 
  • #7
Hurkyl said:
Not true.

For example, [1, 0] and [2, 0] are both eigenvectors of the identity matrix, and are clearly linearly dependent.

In general, if v is an eigenvector of A, then so is 2v, and {v, 2v} is clearly a linearly dependent set.
Ok, now I am confused...
Taking the 2x2 identity matrix:
1 0
0 1
subtracting lambda from the diagonal and setting the determinant to zero gives (1 - lambda)^2 = 0, so lambda = 1. But then A - lambda*I turns out to be the 2x2 zero matrix, so how do you find eigenvectors? They cannot be zero, right?
[edit] Oh, I think I got it [/edit]
 
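Here is a sketch of the resolution (assuming SciPy is available; not part of the original post): for A = I and lambda = 1, A - lambda*I is the zero matrix, so every nonzero vector solves (A - I)x = 0, and any basis of R^2 gives two independent eigenvectors.
[code]
# Sketch of the resolution (assumes SciPy): for A = I and lambda = 1,
# A - lambda*I is the zero matrix, so every nonzero vector is an
# eigenvector; the null space is all of R^2.
import numpy as np
from scipy.linalg import null_space

A = np.eye(2)
N = null_space(A - 1.0 * np.eye(2))   # null space of the 2x2 zero matrix
print(N.shape)                        # (2, 2): the eigenspace is all of R^2
[/code]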
  • #8
Hurkyl said:
Not true.

For example, [1, 0] and [2, 0] are both eigenvectors of the identity matrix, and are clearly linearly dependent.
By distinct eigenvectors, I was shortening "eigenvectors for distinct eigenvalues", as I thought was the intent of the original poster. :smile:
 
  • #9
This result is proved from a slightly more sophisticated point of view on page 66 of Sharipov's text and page 10 of mine, both free; they are cited in posts 25 and 26 of the thread "Sharipov's linear algebra textbook" from 2/21/2005.
 

1. What is an eigenspace?

An eigenspace is the set of all eigenvectors associated with a specific eigenvalue of a linear transformation, together with the zero vector; it is a subspace of the underlying vector space.

2. How do you find the eigenspace of a matrix?

To find the eigenspace of a matrix, first find the eigenvalues by solving the characteristic equation det(A-λI)=0. Then, for each eigenvalue λ, find the corresponding eigenvectors by solving the homogeneous system (A-λI)x=0, where A is the matrix; the solution set of that system is the eigenspace.
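A SymPy sketch of this recipe (the matrix below is a made-up example): solve the characteristic equation for the eigenvalues, then compute the null space of A - λI for each one.
[code]
# SymPy sketch of the recipe (example matrix is made up): characteristic
# equation first, then the null space of A - lam*I for each eigenvalue.
import sympy as sp

A = sp.Matrix([[4, 1],
               [2, 3]])

lam = sp.symbols('lam')
char_poly = (A - lam * sp.eye(2)).det()
eigenvalues = sp.solve(char_poly, lam)            # [2, 5]

for ev in eigenvalues:
    basis = (A - ev * sp.eye(2)).nullspace()      # basis of the eigenspace
    print(ev, basis)
[/code]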

3. What is a linear dependency proof?

A linear dependency proof is a mathematical argument used to determine whether a set of vectors is linearly dependent or linearly independent. This is important in linear algebra because a basis must consist of linearly independent vectors; a linearly dependent set contains redundant vectors and cannot be a basis of the space it spans.

4. How do you prove linear dependency of a set of vectors?

To prove linear dependency of a set of vectors, you can use the definition of linear dependence: a set of vectors is linearly dependent if and only if at least one vector in the set can be written as a linear combination of the others. Equivalently, set up the homogeneous system c_1 v_1 + ... + c_k v_k = 0 and show that it has a solution in which not all coefficients are zero.
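For instance, in this SymPy sketch (made-up vectors, with v3 = 2*v1 + v2 by construction), the null space of the column-stacked matrix exhibits the nonzero coefficients directly:
[code]
# Coefficient-solving sketch (vectors are made-up examples): v3 = 2*v1 + v2,
# so a nonzero solution of M u = 0 exists and exhibits the dependency.
import sympy as sp

v1 = sp.Matrix([1, 0, 1])
v2 = sp.Matrix([0, 1, 1])
v3 = sp.Matrix([2, 1, 3])             # = 2*v1 + v2, deliberately dependent

M = sp.Matrix.hstack(v1, v2, v3)
print(M.nullspace())                  # [Matrix([[-2], [-1], [1]])]: -2*v1 - v2 + v3 = 0
[/code]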

5. Can a set of linearly dependent vectors form a basis for a vector space?

No, a set of linearly dependent vectors cannot form a basis for a vector space. This is because a basis must be a linearly independent set of vectors that can span the entire vector space. If a set of vectors is linearly dependent, it means that at least one vector in the set can be written as a linear combination of the other vectors, making it redundant and not necessary for spanning the vector space.
