Prove that a square matrix A is not invertible iff 0 is an eigenvalue of A

Summary
A square matrix A is not invertible if and only if 0 is an eigenvalue of A. One direction is established by showing that if 0 is an eigenvalue, then there exists a nonzero vector x such that Ax = 0, so the columns of A are linearly dependent and A is not invertible. Conversely, if A is not invertible, the dimension of its nullspace is greater than zero, so some nonzero x satisfies Ax = 0 = 0x, making 0 an eigenvalue. Equivalently, the determinant of A, which equals the product of its eigenvalues, is zero exactly when A is not invertible. The relationship between eigenvalues, determinants, and linear independence is central to the proof.
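In symbols, the claim and the route the thread takes (a compact sketch):

$$0 \text{ is an eigenvalue of } A \;\iff\; \exists\, \vec{x} \neq \vec{0}:\ A\vec{x} = 0\,\vec{x} = \vec{0} \;\iff\; \operatorname{null}(A) \neq \{\vec{0}\} \;\iff\; A \text{ is not invertible.}$$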
zeion

Homework Statement



Prove that a square matrix A is not invertible if and only if 0 is an eigenvalue of A.

Homework Equations


The Attempt at a Solution



Given:

$$A\vec{x} = \lambda\vec{x} \;\Rightarrow\; A\vec{x} - \lambda\vec{x} = \vec{0} \;\Rightarrow\; (A - \lambda I)\vec{x} = \vec{0}$$

By definition, $\vec{x} \neq \vec{0}$.
If $\lambda = 0$, then $A\vec{x} = \vec{0}$.

Since $\vec{x} \neq \vec{0}$, A is not linearly independent, therefore not invertible.

I suck at doing proves. Do I need to show it with general arbitrary variables..?
 
zeion said:
Since $\vec{x} \neq \vec{0}$, A is not linearly independent, therefore not invertible.
Now you are "waving your arms." What does it mean to say that a single matrix is linearly independent?

You are given that 0 is an eigenvalue, so it must be that Ax = 0 for some x ≠ 0. What about |A|? Can you do something with that?
zeion said:
I suck at doing proves. Do I need to show it with general arbitrary variables..?

You get better at doing proofs (not proves) by doing proofs. Prove is a verb; proof is a noun.

BTW, problems like this really should go into the Calculus & Beyond section.
 
I'm not sure I follow what you mean.. maybe I can say something about the inverse of the matrix..? Like if the determinant of the matrix is 0 then it is not invertible?
 
Okay so since the columns of the matrix are not linearly independent (because the nullspace of the matrix does not only contain the zero vector), there will be a zero column and therefore the determinant of the matrix will be 0, therefore the matrix is not invertible?
 
You can do it much more simply. Since Ax = 0x, then (A - 0)x = 0. What does that say about the determinant of A - 0? Here 0 represents the n × n zero matrix.
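Spelled out, the step being hinted at here (a sketch): since $A - 0 = A$, the determinant in question is just $\det A$, and a nonzero solution of $A\vec{x} = \vec{0}$ forces it to vanish:

$$\text{if } \det A \neq 0,\ \text{then } A^{-1} \text{ exists and } A\vec{x} = \vec{0} \;\Rightarrow\; \vec{x} = A^{-1}\vec{0} = \vec{0},$$

contradicting $\vec{x} \neq \vec{0}$; hence $\det A = 0$.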
 
The determinant of (A - 0) is 0 since it is not linearly independent..?
Or can I just say it is 0 because x is not 0?
 
I don't think you understand what linear independence means. You have reverted back to what you said in post 1.

Here is a matrix, A:
[0 1]
[0 0]
Here is a vector x:
[1]
[0]
Clearly A is not the zero matrix, and x is not the zero vector, yet Ax = 0, right?

Would you describe A as linearly dependent, linearly independent, or neither?
 
Can't I just look at the columns of the matrix and see if they can be written as some linear combination of the others..? So A is linearly dependent since 0(1,0) = (0,0)?
 
  • #10
What if you're not given any details about the columns of the matrix?

You are misusing or misunderstanding the concept of linear independence/dependence. You can describe the columns of a matrix as being linearly dependent or linearly independent, but you wouldn't describe matrices this way unless you're treating them as vectors.

Using the definition of linear independence, the matrix as I have defined it is linearly independent. The equation cA = 0 has only a single solution, namely c = 0.

What I have been trying to get you to realize is that |A - 0| = 0. That says something important about A, the nullspace of A, the rows of A, the columns of A, and also about the invertibility (or not) of A. Haven't you seen any theorem about the determinant of a square matrix and its invertibility?
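For reference, one standard form of the theorem being alluded to (part of the invertible matrix theorem): for a square matrix $A$,

$$A \text{ is invertible} \;\iff\; \det A \neq 0 \;\iff\; \operatorname{null}(A) = \{\vec{0}\} \;\iff\; \text{the columns of } A \text{ are linearly independent.}$$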
 
  • #11
I'm not exactly sure, but I remember finding the inverse of a matrix involves multiplying by 1/(the determinant of the matrix), so if the determinant of the matrix is 0 then its inverse is not defined..?
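What is being half-remembered here is presumably the adjugate (classical adjoint) formula, which in fact works for any $n \times n$ matrix but is only defined when the determinant is nonzero:

$$A^{-1} = \frac{1}{\det A}\,\operatorname{adj}(A), \qquad \det A \neq 0.$$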
 
  • #12
Just a note - remember that the determinant of a square matrix equals the product of its eigenvalues.
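For reference, this fact follows from evaluating the characteristic polynomial at $\lambda = 0$ (over $\mathbb{C}$, with eigenvalues $\lambda_1, \dots, \lambda_n$ counted with multiplicity):

$$\det(A - \lambda I) = (\lambda_1 - \lambda)(\lambda_2 - \lambda)\cdots(\lambda_n - \lambda) \;\Rightarrow\; \det A = \lambda_1 \lambda_2 \cdots \lambda_n.$$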
 
  • #13
IIRC, that's usually taught as a shortcut for inverting 2 x 2 matrices; either way, it already presupposes that the determinant is nonzero, which is the point at issue.

Let's try another tack: You have Ax = 0 for nonzero x. What do you know about the nullspace of A?
 
  • #14
Umm for nonzero x the nullspace of A has more than just the zero vector so it has dimension more than 0?
 
  • #15
Yes, so what does that say about the invertibility of A?
 
  • #16
If the dimension of the nullspace is > 0, that means there is at least one free parameter, so rank(A) < n, and so A is not invertible..?
 
  • #17
Or in short, if dim(null(A)) > 0, then A is not invertible.
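Written out via the rank–nullity theorem for an $n \times n$ matrix, the chain is:

$$\dim\operatorname{null}(A) > 0 \;\Rightarrow\; \operatorname{rank}(A) = n - \dim\operatorname{null}(A) < n \;\Rightarrow\; A \text{ is not invertible.}$$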

Going back to the OP, you have established that for an n × n matrix A, if 0 is an eigenvalue of A, then A is not invertible.

Now go the other way to show that A being non-invertible implies that 0 is an eigenvalue of A.
 
  • #18
Okay.. not sure how to do this haha

Given that A is not invertible, then dim(null(A)) > 0, and

$$A = A - 0 = A - \lambda I \quad \text{iff} \quad \lambda = 0$$
 
  • #19
You need to flesh this out a bit. How does it use the fact that A is not invertible?
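One way to flesh it out (a sketch of the direction the thread leaves unfinished): non-invertibility puts a nonzero vector in the nullspace, and that vector is exactly an eigenvector for 0:

$$A \text{ not invertible} \;\Rightarrow\; \operatorname{rank}(A) < n \;\Rightarrow\; \dim\operatorname{null}(A) > 0 \;\Rightarrow\; \exists\, \vec{x} \neq \vec{0}:\ A\vec{x} = \vec{0} = 0\,\vec{x},$$

so 0 is an eigenvalue of $A$.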
 
  • #20
Once again - you folks are taking an incredibly difficult path to this:

Determinant= product of eigenvalues

What do you know about the invertability of the matrix in relation to the determinant?
what must be true about the eigen values if the determinant is zero?
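The suggested route, in one line (a sketch using the determinant–eigenvalue fact above):

$$A \text{ not invertible} \;\iff\; \det A = 0 \;\iff\; \lambda_1 \lambda_2 \cdots \lambda_n = 0 \;\iff\; \lambda_i = 0 \text{ for some } i.$$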
 
  • #21
statdad said:
Once again - you folks are taking an incredibly difficult path to this:

Determinant = product of eigenvalues
I'm not sure that the OP knows this.
 
  • #22
If A is not invertible then there is at least one zero on the diagonal, and so unless lambda were zero, A would no longer be non-invertible after subtracting lambda I.
 
  • #23
That's not true. Here's a counterexample for A:
[1 1]
[1 1]

It's not invertible, but there are no zeroes along the main diagonal.
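As a quick check on this counterexample:

$$\det\begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix} = 1 \cdot 1 - 1 \cdot 1 = 0, \qquad \det(A - \lambda I) = (1 - \lambda)^2 - 1 = \lambda(\lambda - 2),$$

so the eigenvalues are 0 and 2: no zero on the diagonal, yet 0 is an eigenvalue and $A$ is not invertible.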
 
  • #24
statdad said:
Once again - you folks are taking an incredibly difficult path to this:

Determinant= product of eigenvalues

What do you know about the invertability of the matrix in relation to the determinant?
what must be true about the eigen values if the determinant is zero?

Does this have something to do with that thing where if I use the eigenvectors as the basis for a transformation matrix of a linear operator then the matrix is just a diagonal matrix with the corresponding eigenvalues along the diagonal?
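In the diagonalizable case that the question describes, the determinant–eigenvalue fact is a one-line computation (the general case goes through the characteristic polynomial instead):

$$A = PDP^{-1} \;\Rightarrow\; \det A = \det P \,\det D \,\det P^{-1} = \det D = \lambda_1 \lambda_2 \cdots \lambda_n.$$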
 
