Subspaces and eigenvalues

  • #1
aliaze1
Given a square matrix, if an eigenvalue is zero, is the matrix invertible?

I am inclined to say it will not be invertible: if one were to take the singular value decomposition of the matrix, the diagonal matrix in that decomposition would have a 0 on its diagonal (so 0 as an eigenvalue of that diagonal factor), and inverting the decomposition would require dividing by that 0, which is not allowed. Am I correct in my way of looking at it?
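(For concreteness, here is a minimal numpy sketch of that idea, using a made-up rank-deficient matrix. One caveat: the diagonal entries in an SVD are singular values rather than eigenvalues, but a square matrix is invertible exactly when none of them is 0, since inverting means dividing by each one.)

[code]
import numpy as np

# Made-up rank-deficient example: the third row is the sum of the first two.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])

# SVD: A = U @ diag(s) @ Vt, with s the singular values (non-negative,
# in decreasing order).
U, s, Vt = np.linalg.svd(A)
print(s)  # the smallest singular value is numerically zero

# A square matrix is invertible iff every singular value is non-zero,
# because inverting requires dividing by each singular value.
print("invertible?", np.all(s > 1e-10))  # False
[/code]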


Also, does anyone know a good way to check whether a given set of vectors (assume we are just handed some arbitrary set of vectors) is linearly dependent or linearly independent, without a calculator?
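(The standard hand method is Gaussian elimination: stack the vectors as rows and row-reduce; the set is independent exactly when no zero row appears. A minimal numpy sketch of the equivalent rank test, with made-up vectors:)

[code]
import numpy as np

# Made-up vectors; the third is v1 + 2*v2, so the set is dependent.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + 2 * v2

# Stack the vectors as rows; they are linearly independent exactly when
# the rank equals the number of vectors (no zero row after row reduction).
V = np.vstack([v1, v2, v3])
print("independent?", np.linalg.matrix_rank(V) == V.shape[0])  # False
[/code]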

Thanks
 

Answers and Replies

  • #2
HallsofIvy
If A has eigenvalue 0 then, by the definition of "eigenvalue", there is a non-zero vector v such that Av = 0. If A were invertible, you would have [itex]v = A^{-1}Av = A^{-1}0 = 0[/itex], a contradiction.
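A quick numerical illustration of that contradiction, with a made-up singular matrix:

[code]
import numpy as np

# Made-up singular matrix: the second row is twice the first,
# so Av = 0 for the non-zero vector v = (2, -1), i.e. 0 is an eigenvalue.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
v = np.array([2.0, -1.0])

print(np.linalg.eigvals(A))  # eigenvalues 0 and 5
print(A @ v)                 # [0. 0.]: a non-zero v with Av = 0

# No inverse can undo Av = 0, and numpy refuses accordingly.
try:
    np.linalg.inv(A)
except np.linalg.LinAlgError as e:
    print("not invertible:", e)
[/code]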
 
  • #3
aliaze1
If A has eigenvalue 0 then, by the definition of "eigenvalue", there is a non-zero vector v such that Av = 0. If A were invertible, you would have [itex]v = A^{-1}Av = A^{-1}0 = 0[/itex], a contradiction.
Hmm. Just to make sure I follow:

So to define what "eigenvalue" and "eigenvector" are: given a square matrix A, the relationship Av = λv can hold for some non-zero vector v. In this relationship, "v" is an eigenvector of A, and "λ" is the eigenvalue corresponding to that eigenvector.

Taking the inverse of a matrix with nonzero eigenvalues, I should be able to do this:
[itex]A^{-1}Av = v[/itex]

So if I understand it correctly, if a matrix is invertible, the relationship [itex]A^{-1}Av = v[/itex] should always hold.

The definition of an eigenvalue λ is one that satisfies the relationship Av = λv, and the inverse matrix must satisfy the relationship [itex]A^{-1}Av = v[/itex].

This is where I am a bit fuzzy. Av is the result of multiplying A by v, right? So when we compute [itex]A^{-1}Av = v[/itex], we are really computing [itex]A^{-1}X = v[/itex], where X = Av = λv.

If this is the case, then if λ = 0, we get X = Av = λv = 0, and the relationship [itex]A^{-1}X = v[/itex] cannot hold, since X = 0.

With X = Av = λv = 0, we have [itex]A^{-1}X = A^{-1}0 = 0[/itex], not [itex]A^{-1}X = v[/itex]; therefore a matrix with an eigenvalue of 0 cannot be invertible.
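That conclusion also drops out of the determinant: det(A) is the product of the eigenvalues, and a matrix is invertible iff det(A) ≠ 0, so one zero eigenvalue is enough. A small numpy check with a made-up matrix:

[code]
import numpy as np

# Made-up singular matrix with eigenvalues 0 and 5.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# det(A) equals the product of the eigenvalues, so one zero eigenvalue
# forces det(A) = 0, which is exactly non-invertibility.
print(np.prod(np.linalg.eigvals(A)))  # ~0
print(np.linalg.det(A))               # ~0, matching
[/code]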

Sorry for this long, drawn-out, and likely inefficient way of doing things... just trying to understand this stuff.
 
  • #4
lavinia
Given a square matrix, if an eigenvalue is zero, is the matrix invertible?

I am inclined to say it will not be invertible: if one were to take the singular value decomposition of the matrix, the diagonal matrix in that decomposition would have a 0 on its diagonal (so 0 as an eigenvalue of that diagonal factor), and inverting the decomposition would require dividing by that 0, which is not allowed. Am I correct in my way of looking at it?

Also, does anyone know a good way to check whether a given set of vectors (assume we are just handed some arbitrary set of vectors) is linearly dependent or linearly independent, without a calculator?

Thanks
A zero eigenvalue means that there is a non-zero subspace of the vector space that is mapped to zero. So the zero vector does not have a unique preimage, and the map cannot be inverted.
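A minimal numpy sketch of that subspace (the kernel), read off from the SVD of a made-up singular matrix:

[code]
import numpy as np

# Made-up matrix with eigenvalue 0; its kernel is the line through (2, -1).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# Rows of Vt whose singular values are ~0 form a basis of the null space.
U, s, Vt = np.linalg.svd(A)
kernel_basis = Vt[s < 1e-10]
print(kernel_basis)        # spans (2, -1)/sqrt(5), up to sign
print(A @ kernel_basis.T)  # ~0: the whole subspace maps to zero
[/code]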
 
  • #5
HallsofIvy
Hmm. Just to make sure I follow:

So to define what "eigenvalue" and "eigenvector" are: given a square matrix A, the relationship Av = λv can hold for some non-zero vector v. In this relationship, "v" is an eigenvector of A, and "λ" is the eigenvalue corresponding to that eigenvector.

Taking the inverse of a matrix with nonzero eigenvalues, I should be able to do this:
[itex]A^{-1}Av = v[/itex]

So if I understand it correctly, if a matrix is invertible, the relationship [itex]A^{-1}Av = v[/itex] should always hold.
Yes, that's pretty much the definition of "inverse": [itex]A^{-1}Av = v[/itex] and [itex]AA^{-1}v = v[/itex] for any vector v.

The definition of an eigenvalue λ is one that satisfies the relationship Av = λv
For some non-zero vector v. Note that A0 = 0 for any matrix, so A0 = λ0 = 0 holds for any matrix and any λ; that is why the definition requires v to be non-zero.

and the inverse matrix must satisfy the relationship [itex]A^{-1}Av = v[/itex].

This is where I am a bit fuzzy. Av is the result of multiplying A by v, right? So when we compute [itex]A^{-1}Av = v[/itex], we are really computing [itex]A^{-1}X = v[/itex], where X = Av = λv.
Remember that matrix multiplication is associative: on the left, [itex]A^{-1}(Av)= (A^{-1}A)v= Iv= v[/itex] while on the right [itex]A^{-1}(\lambda v)= \lambda A^{-1}v[/itex].
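That associativity step also recovers the 1/λ intuition from the opening post: for invertible A, [itex]v = A^{-1}(\lambda v) = \lambda A^{-1}v[/itex], so [itex]A^{-1}v = (1/\lambda)v[/itex] and the eigenvalues of the inverse are the reciprocals of those of A. A small numpy check with a made-up invertible matrix:

[code]
import numpy as np

# Made-up invertible matrix with eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# The eigenvalues of the inverse are the reciprocals 1 and 1/3 --
# exactly why an eigenvalue of 0 would rule out an inverse.
print(np.sort(np.linalg.eigvals(A)))                 # [1. 3.]
print(np.sort(np.linalg.eigvals(np.linalg.inv(A))))  # [0.333... 1.]
[/code]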

If this is the case, then if λ = 0, we get X = Av = λv = 0, and the relationship [itex]A^{-1}X = v[/itex] cannot hold, since X = 0.
That's one way of looking at it: if Av = 0 (A has 0 as an eigenvalue), then [itex]A^{-1}Av = A^{-1}0 = 0[/itex] cannot equal v, because v is not 0.

Another way of looking at it: if there exists a non-zero vector v such that Av = 0, then, because A0 = 0 as well, A is not one-to-one and so is not invertible. If Av = 0 and A0 = 0, what would [itex]A^{-1}0[/itex] be, as lavinia says?

With X = Av = λv = 0, we have [itex]A^{-1}X = A^{-1}0 = 0[/itex], not [itex]A^{-1}X = v[/itex]; therefore a matrix with an eigenvalue of 0 cannot be invertible.

Sorry for this long, drawn-out, and likely inefficient way of doing things... just trying to understand this stuff.
 
