Subspaces and eigenvalues

In summary: a square matrix with a zero eigenvalue is not invertible, because there is a non-zero vector v with Av = 0, and an inverse would have to send 0 back to that non-zero v, while any matrix sends 0 to 0. For the side question, a set of vectors can be checked for linear dependence by hand by writing them as the rows of a matrix and row reducing: a row of zeros (or, for a square matrix, a zero determinant) signals dependence.
  • #1
aliaze1
Given a square matrix, if an eigenvalue is zero, is the matrix invertible?

I am inclined to say it will not be invertible, since if one were to do the singular value decomposition of the matrix, we would get a diagonal matrix as part of the decomposition, this diagonal matrix would have 0 as an eigenvalue, and 1/0 is not allowed. Am I correct in my way of looking at it? Also, does anyone know a good way to check whether a given set of vectors (assume we are just given the vectors themselves, nothing precomputed) is linearly dependent or linearly independent without a calculator?

Thanks
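For the second question, the standard by-hand method is to write the vectors as the rows of a matrix and row reduce; a row of zeros signals dependence. As an illustrative sketch (not from the thread), the same check can be automated in Python with NumPy by comparing the rank of that matrix to the number of vectors; the vectors used here are made up for the example:

[code]
# Sketch: vectors are linearly independent exactly when the matrix formed
# from them has rank equal to the number of vectors.
import numpy as np

vectors = np.array([
    [1.0, 2.0, 3.0],
    [2.0, 4.0, 6.0],   # twice the first row, so the set is dependent
    [0.0, 1.0, 1.0],
])

rank = np.linalg.matrix_rank(vectors)
if rank == vectors.shape[0]:
    print("linearly independent")
else:
    print(f"linearly dependent (rank {rank} < {vectors.shape[0]} vectors)")
[/code]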
 
  • #2
If A has eigenvalue 0 then, by definition of "eigenvalue", there is a non-zero vector v such that Av = 0. If A were invertible, you would have [itex]A^{-1}Av= v= A^{-1}0= 0[/itex], a contradiction.
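As a concrete numeric check of this argument (an illustrative sketch, not part of the thread; the matrix is made up for the example), a matrix with a zero eigenvalue cannot be inverted:

[code]
# Sketch: a matrix with eigenvalue 0 is singular, so inversion fails.
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # second row = 2 * first row

print(np.linalg.eig(A)[0])          # eigenvalues: one of them is (numerically) 0

try:
    np.linalg.inv(A)
except np.linalg.LinAlgError as err:
    print("not invertible:", err)   # raises "Singular matrix"
[/code]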
 
  • #3
HallsofIvy said:
If A has eigenvalue 0 then, by definition of "eigenvalue", there is a non-zero vector v such that Av = 0. If A were invertible, you would have [itex]A^{-1}Av= v= A^{-1}0= 0[/itex], a contradiction.

Hmm. Just to make sure I follow:

So, to define what "eigenvalue" and "eigenvector" are: given a square matrix A, the relationship Av = λv can hold. In this relationship, v is an eigenvector of A, and λ is the eigenvalue corresponding to that eigenvector.

Taking the inverse of a matrix with nonzero eigenvalues, I should be able to do this:
A^{-1}Av = v

So if I understand it correctly, if a matrix is invertible, the relationship A^{-1}Av = v should always hold.

The definition of an eigenvalue λ is that it satisfies the relationship Av = λv, and the inverse matrix must satisfy the relationship A^{-1}Av = v.

This is where I am a bit fuzzy. Av is just the product of A and v, right? So when we write A^{-1}Av = v, we are actually computing A^{-1}X = v, where X = Av = λv.

If this is the case, then for λ = 0 we have X = Av = λv = 0, and the relationship A^{-1}X = v, which is equivalent to A^{-1}Av = v, cannot hold, since X = Av = λv = 0.

With X = Av = λv = 0, we have A^{-1}X = 0, i.e. A^{-1}Av = 0, i.e. A^{-1}0 = 0, not A^{-1}X = v;
therefore a matrix with an eigenvalue of 0 cannot be invertible.

Sorry for this long, drawn out, and likely inefficient way of doing things... just trying to understand this stuff
 
  • #4
aliaze1 said:
Given a square matrix, if an eigenvalue is zero, is the matrix invertible?

I am inclined to say it will not be invertible, since if one were to do the singular value decomposition of the matrix, we would get a diagonal matrix as part of the decomposition, this diagonal matrix would have 0 as an eigenvalue, and 1/0 is not allowed. Am I correct in my way of looking at it? Also, does anyone know a good way to check whether a given set of vectors (assume we are just given the vectors themselves, nothing precomputed) is linearly dependent or linearly independent without a calculator?

Thanks

A zero eigenvalue means that there is a nonzero subspace of the vector space that is mapped to zero, so the zero vector does not have a unique preimage and the matrix cannot be inverted.
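To make that subspace concrete, here is an illustrative sketch (not from the thread) that computes the null space of a singular matrix with SciPy and exhibits two different vectors with the same image, so the map cannot be undone:

[code]
# Sketch: a zero eigenvalue means a nonzero null space, so A is not one-to-one.
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

kernel = null_space(A)         # orthonormal basis of the subspace mapped to 0
print(kernel)                  # one nonzero basis vector

x = np.array([1.0, 1.0])
v = kernel[:, 0]
print(A @ x, A @ (x + v))      # two different inputs, same output
[/code]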
 
  • #5
aliaze1 said:
Hmm. Just to make sure I follow:

So, to define what "eigenvalue" and "eigenvector" are: given a square matrix A, the relationship Av = λv can hold. In this relationship, v is an eigenvector of A, and λ is the eigenvalue corresponding to that eigenvector.

Taking the inverse of a matrix with nonzero eigenvalues, I should be able to do this:
A^{-1}Av = v

So if I understand it correctly, if a matrix is invertible, the relationship A^{-1}Av = v should always hold.
Yes, that's pretty much the definition of "inverse": A^{-1}Av = v and AA^{-1}v = v for any vector v.

The definition of an eigenvalue λ is that it satisfies the relationship Av = λv
For some non-zero vector v. Note that A0 = 0 for any matrix, so A0 = λ0 = 0 holds for any matrix and any λ; that is why eigenvectors are required to be non-zero.

and the inverse matrix must satisfy the relationship A^{-1}Av = v.

This is where I am a bit fuzzy. Av is just the product of A and v, right? So when we write A^{-1}Av = v, we are actually computing A^{-1}X = v, where X = Av = λv.
Remember that matrix multiplication is associative: on the left, [itex]A^{-1}(Av)= (A^{-1}A)v= Iv= v[/itex] while on the right [itex]A^{-1}(\lambda v)= \lambda A^{-1}v[/itex].

If this is the case, then for λ = 0 we have X = Av = λv = 0, and the relationship A^{-1}X = v, which is equivalent to A^{-1}Av = v, cannot hold, since X = Av = λv = 0.
That's one way of looking at it: if Av = 0 (A has 0 as an eigenvalue), then A^{-1}Av cannot be equal to v, because v is not 0.

Another way of looking at it: if there exists a non-zero vector v such that Av = 0, then, because A0 = 0 as well, A is not "one to one" and so is not invertible. If Av = 0 and A0 = 0, what would A^{-1}0 be? This is lavinia's point.

With X = Av = λv = 0, we have A^{-1}X = 0, i.e. A^{-1}Av = 0, i.e. A^{-1}0 = 0, not A^{-1}X = v;
therefore a matrix with an eigenvalue of 0 cannot be invertible.

Sorry for this long, drawn out, and likely inefficient way of doing things... just trying to understand this stuff
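The associativity remark above also shows what a zero eigenvalue would do to the inverse: if A is invertible and Av = λv, then v = A^{-1}(Av) = λA^{-1}v, so A^{-1}v = (1/λ)v, which only makes sense when λ ≠ 0. An illustrative numeric sketch (not from the thread, with a made-up matrix):

[code]
# Sketch: for an invertible A, the eigenvalues of A^{-1} are the reciprocals
# of the eigenvalues of A -- exactly why lambda = 0 must be excluded.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])          # invertible, no zero eigenvalue

vals = np.linalg.eigvals(A)
inv_vals = np.linalg.eigvals(np.linalg.inv(A))

print(np.sort(vals))                # eigenvalues of A
print(np.sort(1.0 / inv_vals))      # reciprocals of A^{-1}'s eigenvalues: the same
[/code]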
 

1. What is a subspace in linear algebra?

A subspace in linear algebra is a subset of a vector space that satisfies all the properties of a vector space. This means that it is closed under vector addition and scalar multiplication, and contains the zero vector. Subspaces can be thought of as smaller vector spaces that exist within a larger vector space.
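As an illustrative sketch (not part of the original answer), the plane x + y + z = 0 inside R^3 is a typical subspace, and the three defining properties can be spot-checked numerically (a real proof would argue symbolically):

[code]
# Sketch: spot-checking the subspace properties for the plane x + y + z = 0 in R^3.
import numpy as np

def in_plane(v, tol=1e-12):
    return abs(v.sum()) < tol        # membership test for x + y + z = 0

u = np.array([1.0, -2.0, 1.0])
w = np.array([3.0,  0.0, -3.0])

print(in_plane(np.zeros(3)))         # contains the zero vector
print(in_plane(u + w))               # closed under vector addition
print(in_plane(2.5 * u))             # closed under scalar multiplication
[/code]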

2. How do you determine if a set of vectors forms a basis for a subspace?

In order for a set of vectors to form a basis for a subspace, they must be linearly independent and span the entire subspace. This means that no vector in the set can be written as a linear combination of the other vectors, and every vector in the subspace can be written as a linear combination of the basis vectors.
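A minimal sketch of that test (illustrative, not from the original answer): three made-up vectors form a basis of R^3 exactly when the 3x3 matrix built from them has a nonzero determinant, which captures independence and span at once:

[code]
# Sketch: checking whether three vectors form a basis of R^3 via the determinant.
import numpy as np

candidate = np.array([[1.0, 0.0, 1.0],
                      [0.0, 1.0, 1.0],
                      [1.0, 1.0, 0.0]])

is_basis = not np.isclose(np.linalg.det(candidate), 0.0)
print(is_basis)                      # True: independent and spanning
[/code]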

3. What is an eigenvalue and eigenvector?

An eigenvalue is a scalar value that represents how a linear transformation affects a particular vector. An eigenvector is a non-zero vector that, when multiplied by the linear transformation, results in a vector parallel to the original eigenvector, with the eigenvalue as its scale factor.
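A short sketch (illustrative, not from the original answer) verifying the defining relation Av = λv for the eigenpairs NumPy returns, using a made-up matrix:

[code]
# Sketch: each returned eigenpair satisfies A v = lambda v.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

values, vectors = np.linalg.eig(A)
for lam, v in zip(values, vectors.T):    # eigenvectors are the columns
    print(np.allclose(A @ v, lam * v))   # True for every eigenpair
[/code]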

4. How do you find the eigenvalues and eigenvectors of a matrix?

To find the eigenvalues and eigenvectors of a matrix, you first find the characteristic polynomial of the matrix; its roots are the eigenvalues. Then, for each eigenvalue λ, you find the corresponding eigenvectors by solving (A - λI)v = 0.
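As an illustrative sketch (not from the original answer), SymPy can carry out both steps exactly on a made-up matrix: form the characteristic polynomial, take its roots as the eigenvalues, then solve (A - λI)v = 0 for each root:

[code]
# Sketch: eigenvalues from the characteristic polynomial, eigenvectors from
# the null space of (A - lambda*I), computed exactly with SymPy.
import sympy as sp

A = sp.Matrix([[2, 1],
               [1, 2]])
lam = sp.symbols('lambda')

char_poly = (A - lam * sp.eye(2)).det()    # characteristic polynomial in lambda
eigenvalues = sp.solve(char_poly, lam)     # its roots: [1, 3]

for ev in eigenvalues:
    eigenvectors = (A - ev * sp.eye(2)).nullspace()
    print(ev, eigenvectors)
[/code]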

5. What is the significance of eigenvalues and eigenvectors in linear algebra?

Eigenvalues and eigenvectors are important in linear algebra because they provide insight into the behavior of linear transformations and matrices. They allow us to simplify complex calculations and understand the geometric properties of a given transformation. They are also used in a variety of applications, such as data analysis, image processing, and machine learning.
