Prove that a square matrix is not invertible iff 0 is an eigenvalue of A

In summary, you can prove that a square matrix is not invertible if and only if 0 is an eigenvalue of the matrix.
  • #1
zeion

Homework Statement



Prove that a square matrix is not invertible if and only if 0 is an eigenvalue of A.

Homework Equations


The Attempt at a Solution



Given:
[tex]

A\vec{x} = \lambda\vec{x} \Rightarrow

A\vec{x} - \lambda\vec{x} = \vec{0} \Rightarrow

(A - \lambda I)\vec{x} = \vec{0}

[/tex]
By definition x ≠ 0.
If [tex] \lambda = 0 \Rightarrow A\vec{x} = \vec{0}[/tex]

Since x ≠ 0, A is not linearly independent, therefore not invertible. I suck at doing proves. Do I need to show it with general arbitrary variables..?
 
  • #2
zeion said:

Homework Statement



Prove that a square matrix is not invertible if and only if 0 is an eigenvalue of A.


Homework Equations





The Attempt at a Solution



Given:
[tex]

A\vec{x} = \lambda\vec{x} \Rightarrow

A\vec{x} - \lambda\vec{x} = \vec{0} \Rightarrow

(A - \lambda I)\vec{x} = \vec{0}

[/tex]
By definition x ≠ 0.
If [tex] \lambda = 0 \Rightarrow A\vec{x} = \vec{0}[/tex]

Since x ≠ 0, A is not linearly independent, therefore not invertible.
Now you are "waving your arms." What does it mean to say that a single matrix is linearly independent?

You are given that 0 is an eigenvalue, so it must be that Ax = 0, for x ≠ 0. What about |Ax|? Can you do something with that?
zeion said:
I suck at doing proves. Do I need to show it with general arbitrary variables..?

You get better at doing proofs (not proves) by doing proofs. Prove is a verb; proof is a noun.

BTW, problems like this really should go into the Calculus & Beyond section.
 
  • #3
I'm not sure I follow what you mean.. maybe I can say something about the inverse of the matrix..? Like if the determinant of the matrix is 0 then it is not invertible?
 
  • #5
Okay, so since the columns of the matrix are not linearly independent (because the nullspace of the matrix does not contain only the zero vector), row reduction will produce a zero row, and therefore the determinant of the matrix will be 0, so the matrix is not invertible?
 
  • #6
You can do it much more simply. Since Ax = 0x, then (A - 0)x = 0. What does that say about the determinant of A - 0? Here 0 represents the nxn zero matrix.
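To make that hint concrete (a sketch of one way to finish this step, not necessarily the wording intended here): (A - 0)x = 0 having a solution x ≠ 0 means the homogeneous system has a nontrivial solution, and a square homogeneous system has a nontrivial solution only when the determinant of its coefficient matrix is 0, so
[tex]
\det(A - 0) = \det(A) = 0.
[/tex]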
 
  • #7
The determinant of (A - 0) is 0 since it is not linearly independent..?
Or can I just say it is 0 because x is not 0?
 
  • #8
I don't think you understand what linear independence means. You have reverted to what you said in post #1.

Here is a matrix, A:
[0 1]
[0 0]
Here is a vector x:
[1]
[0]
Clearly A is not the zero matrix, and x is not the zero vector, yet Ax = 0, right?

Would you describe A as linearly dependent, linearly independent, or neither?
 
  • #9
Can't I just look at the columns of the matrix and see if they can be written as some linear combination of the others..? So A is linearly dependent since 0(1,0) = (0,0)?
 
  • #10
What if you're not given any details about the columns of the matrix?

You are misusing or misunderstanding the concept of linear independence/dependence. You can describe the columns of a matrix as being linearly dependent or linearly independent, but you wouldn't describe matrices this way unless you're treating them as vectors.

Using the definition of linear independence, the matrix as I have defined it is linearly independent. The equation cA = 0 has only a single solution, namely c = 0.

What I have been trying to get you to realize is that |A - 0| = 0. That says something important about A, the nullspace of A, the rows of A, the columns of A, and also about the invertibility (or not) of A. Haven't you seen any theorem about the determinant of a square matrix and its invertibility?
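For reference, the theorem being alluded to is presumably the standard set of equivalences (stated here without proof) that for an n x n matrix A,
[tex]
A \text{ is invertible} \;\Longleftrightarrow\; \det(A) \neq 0 \;\Longleftrightarrow\; \operatorname{null}(A) = \{\vec{0}\} \;\Longleftrightarrow\; \text{the columns of } A \text{ are linearly independent.}
[/tex]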
 
  • #11
I'm not exactly sure, but I remember that finding the inverse of a matrix involves multiplying by 1 over the determinant of the matrix, so if the determinant of the matrix is 0, then its inverse is not defined..?
 
  • #12
Just a note - remember that the determinant of a square matrix equals the product of its eigenvalues.
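As a quick numerical illustration of that fact (an added sketch, assuming NumPy is available; not part of the original exchange):
[code]
import numpy as np

# det(A) should equal the product of the eigenvalues of A.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

eigvals = np.linalg.eigvals(A)    # eigenvalues (possibly complex)
print(np.linalg.det(A))           # determinant of A
print(np.prod(eigvals).real)      # product of eigenvalues -- same value up to rounding

# A matrix with 0 as an eigenvalue has determinant 0 and is singular.
B = np.array([[1.0, 1.0],
              [1.0, 1.0]])
print(np.linalg.eigvals(B))       # eigenvalues are 2 and 0
print(np.linalg.det(B))           # 0 (up to rounding), so B is not invertible
[/code]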
 
  • #13
IIRC, that's a technique for inverting 2 x 2 matrices. I don't think it applies to larger matrices.

Let's try another tack: You have Ax = 0 for nonzero x. What do you know about the nullspace of A?
 
  • #14
Umm, for nonzero x the nullspace of A contains more than just the zero vector, so it has dimension greater than 0?
 
  • #15
Yes, so what does that say about the invertibility of A?
 
  • #16
If the dimension of the nullspace is > 0, that means there is at least one free parameter, so the rank of A ≠ the dimension of A, and so A is not invertible..?
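Presumably the "parameter" count here is the rank-nullity theorem; as a sketch of how it closes this direction:
[tex]
\operatorname{rank}(A) + \dim(\operatorname{null}(A)) = n,
[/tex]
so dim(null(A)) > 0 forces rank(A) < n, and a square matrix of rank less than n is not invertible.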
 
  • #17
Or in short, if dim(null(A)) > 0, then A is not invertible.

Going back to the OP, you have established that for an n X n matrix A, if 0 is an eigenvalue of A, then A is not invertible.

Now go the other way to show that A being non-invertible implies that 0 is an eigenvalue of A.
 
  • #18
Okay.. not sure how to do this haha

Given A not invertible, then dim(null(A)) > 0, and
[tex]
A = (A - 0) = (A - \lambda I) \quad \text{iff} \quad \lambda = 0
[/tex]
 
  • #19
You need to flesh this out a bit. How does it use the fact that A is not invertible?
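One way to flesh out that direction (a sketch under the standard definitions, not necessarily the argument intended in the thread): if A is not invertible, then the system Ax = 0 has a nontrivial solution, i.e. there is some x ≠ 0 with
[tex]
A\vec{x} = \vec{0} = 0\,\vec{x},
[/tex]
which is exactly the statement that 0 is an eigenvalue of A with eigenvector x.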
 
  • #20
Once again - you folks are taking an incredibly difficult path to this:

Determinant = product of eigenvalues

What do you know about the invertibility of the matrix in relation to the determinant?
What must be true about the eigenvalues if the determinant is zero?
 
  • #21
statdad said:
Once again - you folks are taking an incredibly difficult path to this:

Determinant = product of eigenvalues
I'm not sure that the OP knows this.
statdad said:
What do you know about the invertibility of the matrix in relation to the determinant?
What must be true about the eigenvalues if the determinant is zero?
 
  • #22
If A is not invertible, then there is at least one zero on the diagonal, and so unless lambda were zero, A would no longer be non-invertible after subtracting lambda I.
 
  • #23
That's not true. Here's a counterexample for A:
[1 1]
[1 1]

It's not invertible, but there are no zeroes along the main diagonal.
 
  • #24
statdad said:
Once again - you folks are taking an incredibly difficult path to this:

Determinant = product of eigenvalues

What do you know about the invertibility of the matrix in relation to the determinant?
What must be true about the eigenvalues if the determinant is zero?

Does this have something to do with that thing where if I use the eigenvectors as the basis for a transformation matrix of a linear operator then the matrix is just a diagonal matrix with the corresponding eigenvalues along the diagonal?
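Diagonalization is one way to see it, but the identity holds for any square matrix, diagonalizable or not. A sketch via the characteristic polynomial, factored over the complex numbers with eigenvalues listed according to multiplicity:
[tex]
\det(A - \lambda I) = (\lambda_1 - \lambda)(\lambda_2 - \lambda)\cdots(\lambda_n - \lambda),
[/tex]
and setting λ = 0 gives det(A) = λ_1 λ_2 ⋯ λ_n, the product of the eigenvalues. In particular, if 0 is an eigenvalue, the determinant is 0.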
 

1. Can a square matrix be invertible if 0 is an eigenvalue?

No, a square matrix is not invertible if 0 is an eigenvalue. This is because the inverse of a matrix only exists if all of its eigenvalues are non-zero.

2. How can we prove that a square matrix is not invertible if 0 is an eigenvalue?

We can prove this by contradiction. Assume that the matrix is invertible and has 0 as an eigenvalue. This means there exists a non-zero vector x such that Ax = 0. However, since the matrix is invertible, we can multiply both sides of this equation by A^-1. This gives A^-1Ax = A^-1·0, which simplifies to x = 0. This contradicts the fact that x is a non-zero vector (eigenvectors are non-zero by definition), so the assumption that the matrix is invertible must be false: a matrix with 0 as an eigenvalue is not invertible.
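In symbols, the same argument (with A assumed n x n and x an eigenvector for the eigenvalue 0, so x is non-zero by definition):
[tex]
A\vec{x} = \vec{0} \quad\Rightarrow\quad \vec{x} = A^{-1}A\vec{x} = A^{-1}\vec{0} = \vec{0},
[/tex]
contradicting x ≠ 0.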

3. What is the relationship between eigenvalues and invertibility of a matrix?

The eigenvalues of a matrix are crucial in determining its invertibility. If all the eigenvalues are non-zero, then the matrix is invertible. However, if any of the eigenvalues are 0, then the matrix is not invertible.

4. Can a square matrix have multiple eigenvalues of 0?

Yes, a square matrix can have 0 as a repeated eigenvalue. The geometric multiplicity of the eigenvalue 0 (the number of linearly independent eigenvectors associated with it) equals the dimension of the null space of the matrix; the algebraic multiplicity can be even larger. For example, the matrix [0 1; 0 0] from post #8 has characteristic polynomial λ², so 0 has algebraic multiplicity 2, but its null space is only one-dimensional.

5. What is the significance of 0 being an eigenvalue of a matrix?

0 being an eigenvalue of a matrix signifies that the matrix is singular, meaning it is not invertible. This can have important implications in applications such as solving systems of linear equations or finding the inverse of a matrix.
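As a small illustration of that point (an added sketch using NumPy, not part of the original FAQ): trying to invert a matrix that has 0 as an eigenvalue fails, because its determinant is 0.
[code]
import numpy as np

# The singular matrix from post #23: its eigenvalues are 2 and 0.
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])

try:
    np.linalg.inv(A)                    # fails: det(A) = 0
except np.linalg.LinAlgError as err:
    print("inversion failed:", err)     # reports that the matrix is singular
[/code]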
