Does a Zero Eigenvalue Imply a Non-Invertible Matrix?

  • Context: Undergrad
  • Thread starter: aliaze1
  • Tags: Eigenvalues, Subspaces

Discussion Overview

The discussion centers on whether a square matrix with a zero eigenvalue is necessarily non-invertible. Participants work from the definitions of eigenvalues and eigenvectors to this conclusion, via the singular value decomposition, a direct contradiction argument, and the failure of injectivity.

Discussion Character

  • Technical explanation
  • Debate/contested

Main Points Raised

  • Some participants argue that if a matrix has a zero eigenvalue, it cannot be invertible, as this implies the existence of a non-zero vector that is mapped to zero by the matrix.
  • One participant references the singular value decomposition, noting that the diagonal factor of the decomposition would contain a zero, which cannot be inverted, leading to the conclusion that the original matrix is non-invertible.
  • Another participant works through the definitions of eigenvalues and eigenvectors: the relationship Av = λv must hold for a non-zero vector v, and λ = 0 leads to a contradiction with invertibility.
  • Some participants use matrix multiplication and the associative property, emphasizing that if Av = 0 for a non-zero vector v, then the matrix cannot be one-to-one, which is a requirement for invertibility.
  • There are repeated assertions that with a zero eigenvalue the zero vector lacks a unique preimage, reinforcing the idea that the matrix is non-invertible.

Areas of Agreement / Disagreement

Participants generally agree that a zero eigenvalue implies non-invertibility, but they arrive at it by different routes and with differing levels of rigor: the singular value decomposition, a direct contradiction from the definition of eigenvalue, and the failure of injectivity.

Contextual Notes

Some participants express uncertainty about the definitions and relationships involved, particularly about what a zero eigenvalue implies for matrix operations. Linear dependence and independence are also raised, but those points remain less developed in the discussion.

aliaze1
Given a square matrix, if an eigenvalue is zero, is the matrix invertible?

I am inclined to say it will not be invertible, since if one were to do a singular value decomposition of the matrix, we would have a diagonal matrix as part of the decomposition, and this diagonal matrix would have 0 as an eigenvalue, and 1/0 is not allowed. Am I correct in my way of looking at it?

Also, does anyone know a good way to check whether a given set of vectors (assume we just know we have a set, not their values) is linearly dependent or linearly independent without a calculator?
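A minimal numerical sketch of that check, assuming NumPy is available (the rank-1 matrix below is a made-up example): a zero eigenvalue and a zero singular value show up together, and inversion fails.

```python
import numpy as np

# Made-up 2x2 example with linearly dependent rows, so rank 1.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print("eigenvalues:", np.linalg.eigvals(A))                    # approx [0, 5]
print("singular values:", np.linalg.svd(A, compute_uv=False))  # approx [5, 0]

# Inverting a singular matrix raises LinAlgError.
try:
    np.linalg.inv(A)
except np.linalg.LinAlgError as err:
    print("inversion failed:", err)
```

For the second question, the usual by-hand check is to row-reduce the matrix whose columns are the given vectors: the set is linearly independent exactly when every column ends up with a pivot.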

Thanks
 
If A has eigenvalue 0 then, by definition of "eigenvalue", there exists a non-zero vector v such that Av = 0. If A were invertible, you would have A^{-1}Av = v = A^{-1}0 = 0, a contradiction.
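A concrete 2×2 instance of this contradiction (a made-up example, written as a LaTeX fragment):

```latex
% Made-up rank-1 matrix with eigenvalue 0 and eigenvector v = (1, -1)^T:
A = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix},
\qquad
A \begin{pmatrix} 1 \\ -1 \end{pmatrix}
  = \begin{pmatrix} 0 \\ 0 \end{pmatrix} .
% If A^{-1} existed, applying it to Av = 0 would force
% (1, -1)^T = A^{-1} 0 = (0, 0)^T, which is false.
```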
 
HallsofIvy said:
If A has eigenvalue 0 then, by definition of "eigenvalue", there exists a non-zero vector v such that Av = 0. If A were invertible, you would have A^{-1}Av = v = A^{-1}0 = 0, a contradiction.

Hmm. Just to make sure I follow:

So to define what "eigenvalue" and "eigenvector" are: given a square matrix A, the relationship Av = λv can hold. In this relationship, "v" is an eigenvector of A, and "λ" is an eigenvalue corresponding to that eigenvector.

Taking the inverse of a matrix with nonzero eigenvalues, I should be able to do this:
A^{-1}Av = v

So if I understand it correctly, if a matrix is invertible, the relationship A^{-1}Av = v should always hold.

The definition of an eigenvalue λ is one that satisfies the relationship Av = λv, and the inverse matrix must satisfy the relationship A^{-1}Av = v.

This is where I am a bit fuzzy. Av is the result of multiplying A by v, right? So when we write A^{-1}Av = v, we are actually computing A^{-1}X = v, where X = Av = λv.

If this is the case, then if λ = 0, we get X = Av = λv = 0, and the relationship A^{-1}X = v, which is equivalent to A^{-1}Av = v, will not hold, since X = Av = λv = 0.

With X = Av = λv = 0, we have A^{-1}X = 0, i.e. A^{-1}Av = 0, i.e. A^{-1}0 = 0, not A^{-1}X = v;
therefore a matrix with an eigenvalue of 0 cannot be invertible.

Sorry for this long, drawn out, and likely inefficient way of doing things... just trying to understand this stuff
 
aliaze1 said:
Given a square matrix, if an eigenvalue is zero, is the matrix invertible?

I am inclined to say it will not be invertible, since if one were to do a singular value decomposition of the matrix, we would have a diagonal matrix as part of the decomposition, and this diagonal matrix would have 0 as an eigenvalue, and 1/0 is not allowed. Am I correct in my way of looking at it?

Also, does anyone know a good way to check whether a given set of vectors (assume we just know we have a set, not their values) is linearly dependent or linearly independent without a calculator?

Thanks

A zero eigenvalue means that there is a non-zero subspace of the vector space that is mapped to zero. So the zero vector does not have a unique preimage.
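A minimal sketch of this point, assuming SciPy is available (the same made-up rank-1 matrix as above): scipy.linalg.null_space returns a basis for the subspace mapped to zero, and every vector in it is a preimage of 0, so the preimage is not unique.

```python
import numpy as np
from scipy.linalg import null_space

# Same made-up rank-1 example: the rows are linearly dependent.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# Orthonormal basis for the kernel {v : Av = 0}; one-dimensional here.
kernel = null_space(A)
v = kernel[:, 0]

# Both v and 3v (and 0 itself) map to the zero vector,
# so A is not one-to-one and 0 has no unique preimage.
print(A @ v)        # approx [0, 0]
print(A @ (3 * v))  # approx [0, 0]
```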
 
aliaze1 said:
Hmm. Just to make sure I follow:

So to define what "eigenvalue" and "eigenvector" are: given a square matrix A, the relationship Av = λv can hold. In this relationship, "v" is an eigenvector of A, and "λ" is an eigenvalue corresponding to that eigenvector.

Taking the inverse of a matrix with nonzero eigenvalues, I should be able to do this:
A^{-1}Av = v

So if I understand it correctly, if a matrix is invertible, the relationship A^{-1}Av = v should always hold.
Yes, that's pretty much the definition of "inverse": A^{-1}Av = v and AA^{-1}v = v for any vector v.

The definition of an eigenvalue λ is one that satisfies the relationship Av = λv
For some non-zero vector v. A0 = 0 for any matrix, so A0 = λ0 = 0 for any matrix and any λ; that is why eigenvectors are required to be non-zero.

and the inverse matrix must satisfy the relationship A^{-1}Av = v.

This is where I am a bit fuzzy. Av is the result of multiplying A by v, right? So when we write A^{-1}Av = v, we are actually computing A^{-1}X = v, where X = Av = λv.
Remember that matrix multiplication is associative: on the left, A^{-1}(Av) = (A^{-1}A)v = Iv = v, while on the right, A^{-1}(λv) = λA^{-1}v.
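Combining the two sides when λ ≠ 0 makes the earlier 1/0 intuition precise (a short derivation, written as a LaTeX fragment):

```latex
% Apply A^{-1} to both sides of Av = \lambda v and use associativity:
v = A^{-1}(Av) = A^{-1}(\lambda v) = \lambda\, A^{-1} v .
% For \lambda \neq 0 this gives
A^{-1} v = \tfrac{1}{\lambda}\, v ,
% i.e. A^{-1} has eigenvalue 1/\lambda; at \lambda = 0 no such value exists.
```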

If this is the case, then if λ = 0, we get X = Av = λv = 0, and the relationship A^{-1}X = v, which is equivalent to A^{-1}Av = v, will not hold, since X = Av = λv = 0.
That's one way of looking at it: if Av = 0 (A has 0 as an eigenvalue), then A^{-1}Av cannot equal v, because A^{-1}0 = 0 and v is not 0.

Another way of looking at it: if there exists a non-zero vector v such that Av = 0 then, because A0 = 0 as well, A is not "one to one" and so is not invertible. If Av = 0 and A0 = 0, what would A^{-1}0 be? That is lavinia's point.

With X = Av = λv = 0, we have A^{-1}X = 0, i.e. A^{-1}Av = 0, i.e. A^{-1}0 = 0, not A^{-1}X = v;
therefore a matrix with an eigenvalue of 0 cannot be invertible.

Sorry for this long, drawn out, and likely inefficient way of doing things... just trying to understand this stuff
 
