# Homework Help: Prove that a square matrix A is not invertible iff 0 is an eigenvalue of A

1. Feb 16, 2010

### zeion

1. The problem statement, all variables and given/known data

Prove that a square matrix A is not invertible if and only if 0 is an eigenvalue of A.

2. Relevant equations

3. The attempt at a solution

Given:
$$A\vec{x} = \lambda\vec{x} \Rightarrow A\vec{x} - \lambda\vec{x} = \vec{0} \Rightarrow (A - \lambda I)\vec{x} = \vec{0}$$
By definition, $$\vec{x} \neq \vec{0}$$.
If $$\lambda = 0$$, then $$A\vec{x} = \vec{0}$$.

Since $$\vec{x} \neq \vec{0}$$, A is not linearly independent, therefore not invertible.

I suck at doing proves. Do I need to show it with general arbitrary variables..?

2. Feb 16, 2010

### Staff: Mentor

Now you are "waving your arms." What does it mean to say that a single matrix is linearly independent?

You are given that 0 is an eigenvalue, so it must be that Ax = 0, for x != 0. What about |Ax|? Can you do something with that?
You get better at doing proofs (not proves) by doing proofs. Prove is a verb; proof is a noun.

BTW, problems like this really should go into the Calculus & Beyond section.

3. Feb 16, 2010

### zeion

I'm not sure I follow what you mean.. maybe I can say something about the inverse of the matrix..? Like if the determinant of the matrix is 0 then it is not invertible?

4. Feb 16, 2010

### Staff: Mentor

Yes, exactly.

5. Feb 16, 2010

### zeion

Okay so since the columns of the matrix are not linearly independent (because the nullspace of the matrix does not only contain the zero vector), there will be a zero column and therefore the determinant of the matrix will be 0, therefore the matrix is not invertible?

6. Feb 16, 2010

### Staff: Mentor

You can do it much more simply. Since Ax = 0x, then (A - 0)x = 0. What does that say about the determinant of A - 0? Here 0 represents the nxn zero matrix.
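The step above is easy to check numerically. A minimal NumPy sketch (the matrix below is a made-up singular example, not from the thread): since 0 is an eigenvalue, $$det(A - 0I) = det(A) = 0$$.

```python
import numpy as np

# Made-up 2x2 example with 0 as an eigenvalue: the second column
# is twice the first, so Ax = 0 has a nonzero solution x.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

eigenvalues = np.linalg.eigvals(A)   # one of these is (numerically) 0
det_A = np.linalg.det(A)             # det(A - 0*I) = det(A), numerically 0
```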

7. Feb 16, 2010

### zeion

The determinant of (A - 0) is 0 since it is not linearly independent..?
Or can I just say it is 0 because x is not 0?

8. Feb 16, 2010

### Staff: Mentor

I don't think you understand what linear independence means. You have reverted to what you said in post 1.

Here is a matrix, A:
[0 1]
[0 0]
Here is a vector x:
[1]
[0]
Clearly A is not the zero matrix, and x is not the zero vector, yet Ax = 0, right?

Would you describe A as linearly dependent, linearly independent, or neither?
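The example in this post can be verified directly; a quick NumPy check, with the values taken from the post above:

```python
import numpy as np

# The matrix A and vector x from the post above.
A = np.array([[0, 1],
              [0, 0]])
x = np.array([1, 0])

Ax = A @ x  # matrix-vector product; comes out as the zero vector
```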

9. Feb 16, 2010

### zeion

Can't I just look at the columns of the matrix and see if they can be written as some linear combination of the others..? So A is linearly dependent since 0(1,0) = (0,0)?

10. Feb 16, 2010

### Staff: Mentor

What if you're not given any details about the columns of the matrix?

You are misusing or misunderstanding the concept of linear independence/dependence. You can describe the columns of a matrix as being linearly dependent or linearly independent, but you wouldn't describe matrices this way unless you're treating them as vectors.

Using the definition of linear independence, the matrix as I have defined it is linearly independent. The equation cA = 0 has only a single solution, namely c = 0.

What I have been trying to get you to realize is that |A - 0| = 0. That says something important about A, the nullspace of A, the rows of A, the columns of A, and also about the invertibility (or not) of A. Haven't you seen any theorem about the determinant of a square matrix and its invertibility?
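The theorem in question says a square matrix is invertible if and only if its determinant is nonzero. A quick NumPy check using the matrix from post 8: the determinant is 0, and attempting to invert raises an error.

```python
import numpy as np

# The singular matrix from post 8: |A - 0| = |A| = 0.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

det_A = np.linalg.det(A)

try:
    np.linalg.inv(A)          # fails, since A is singular
    invertible = True
except np.linalg.LinAlgError:
    invertible = False
```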

11. Feb 16, 2010

### zeion

I'm not exactly sure, but I remember that finding the inverse of a matrix involves multiplying by 1 over the determinant of the matrix, so if the determinant of the matrix is 0 then its inverse is not defined..?

12. Feb 16, 2010

Just a note - remember that the determinant of a square matrix equals the product of its eigenvalues.
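This identity is easy to check numerically; a small NumPy sketch (the 3x3 matrix below is a made-up example):

```python
import numpy as np

# Made-up 3x3 example (symmetric, so the eigenvalues are real).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

det_A = np.linalg.det(A)
product_of_eigenvalues = np.prod(np.linalg.eigvals(A))
# The two quantities agree (both equal 8 for this matrix).
```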

13. Feb 16, 2010

### Staff: Mentor

IIRC, that's a technique for inverting 2 x 2 matrices. I don't think it applies to larger matrices.

Let's try another tack: You have Ax = 0 for nonzero x. What do you know about the nullspace of A?

14. Feb 16, 2010

### zeion

Umm for nonzero x the nullspace of A has more than just the zero vector so it has dimension more than 0?

15. Feb 16, 2010

### Staff: Mentor

Yes, so what does that say about the invertibility of A?

16. Feb 16, 2010

### zeion

If the dimension of the nullspace is > 0, that means there is at least one free parameter, so the rank of A is not equal to n, and so A is not invertible..?
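The rank argument can be illustrated with NumPy. By the rank-nullity theorem, rank(A) + dim(null(A)) = n, so a nontrivial nullspace forces rank(A) < n (the matrix below is a made-up singular example):

```python
import numpy as np

# Made-up singular 3x3 example: the third row is the sum of the first two.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

n = A.shape[0]
rank = np.linalg.matrix_rank(A)
nullity = n - rank  # rank-nullity: dim(null(A)) = n - rank(A)
# Here rank = 2 < n = 3, so A is not invertible.
```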

17. Feb 16, 2010

### Staff: Mentor

Or in short, if dim(null(A)) > 0, then A is not invertible.

Going back to the OP, you have established that for an n x n matrix A, if 0 is an eigenvalue of A, then A is not invertible.

Now go the other way to show that A being non-invertible implies that 0 is an eigenvalue of A.

18. Feb 16, 2010

### zeion

Okay.. not sure how to do this haha

Given A not invertible then dim(null(A)) > 0, and

$$A = (A - 0) = (A - \lambda I)$$ iff $$\lambda = 0$$

19. Feb 17, 2010

### Staff: Mentor

You need to flesh this out a bit. How does it use the fact that A is not invertible?

20. Feb 17, 2010

Once again - you folks are taking an incredibly difficult path to this:

Determinant = product of eigenvalues.

What do you know about the invertibility of the matrix in relation to the determinant?
What must be true about the eigenvalues if the determinant is zero?

21. Feb 17, 2010

### Staff: Mentor

I'm not sure that the OP knows this.

22. Feb 17, 2010

### zeion

If A is not invertible then there is at least one zero in the diagonal and so unless lambda was zero, A would no longer be not invertible after subtracting lambda I.

23. Feb 17, 2010

### Staff: Mentor

That's not true. Here's a counterexample for A:
[1 1]
[1 1]

It's not invertible, but there are no zeroes along the main diagonal.
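A quick NumPy check of this counterexample: the diagonal has no zeros, yet the determinant is 0 and 0 is an eigenvalue.

```python
import numpy as np

# The counterexample matrix from the post above.
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])

det_A = np.linalg.det(A)            # 0: A is singular
eigenvalues = np.linalg.eigvals(A)  # 0 and 2
```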

24. Feb 17, 2010

### zeion

Does this have something to do with that thing where if I use the eigenvectors as the basis for a transformation matrix of a linear operator then the matrix is just a diagonal matrix with the corresponding eigenvalues along the diagonal?
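What the last post describes is diagonalization: if the eigenvectors of A form a basis, then P^{-1}AP is a diagonal matrix with the eigenvalues of A along the diagonal, where the columns of P are the eigenvectors. A quick NumPy check using the counterexample matrix from post 23:

```python
import numpy as np

# The singular counterexample from post 23.
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])

# Columns of P are eigenvectors of A; eigenvalues are 2 and 0.
eigenvalues, P = np.linalg.eig(A)

# Changing to the eigenvector basis diagonalizes A.
D = np.linalg.inv(P) @ A @ P  # diagonal, with the eigenvalues on the diagonal
```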