# Homework Help: Eigenvalue problem

1. Dec 9, 2016

### Mr Davis 97

1. The problem statement, all variables and given/known data
Let $A$ be an $n \times n$ matrix. Show that if $A^2$ is the zero matrix, then the only eigenvalue of $A$ is 0.

2. Relevant equations

3. The attempt at a solution
All eigenvalues and eigenvectors must satisfy the equation $A\vec{v} = \lambda \vec{v}$. Multiplying both sides by $A$, we have that $A^2 \vec{v} = \lambda A \vec{v} = \vec{0}$. Is this the correct direction? I am not sure where to go from here...
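As a quick numerical sanity check of the claim being proved (this code is an editorial illustration, not part of the original thread), here is a small nilpotent matrix with $A^2 = 0$, whose eigenvalues all come out as $0$:

```python
import numpy as np

# A strictly upper-triangular matrix is a simple example with A^2 = 0
# (this particular matrix is my own choice, not from the thread).
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

A_squared = A @ A                   # the zero matrix
eigenvalues = np.linalg.eigvals(A)  # both eigenvalues should be 0

print(A_squared)      # [[0. 0.] [0. 0.]]
print(eigenvalues)    # [0. 0.]
```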

3. Dec 9, 2016

### Staff: Mentor

This is the correct direction.
What exactly do you get when you apply $A$ to both sides?

4. Dec 9, 2016

### Mr Davis 97

Cocleia's post was helpful. I see that we get $\lambda^2 \vec{v} = 0$. But since eigenvectors can't be zero, $\lambda = 0$. Does this show that A must have an eigenvalue of 0, or does it show that if A has eigenvalues then the only eigenvalue would be 0?

5. Dec 9, 2016

### Staff: Mentor

The latter. But what is $A(w)$, with $w=A(v)$?

6. Dec 9, 2016

### Mr Davis 97

The zero vector, since $A(A(v)) = A^2(v) = 0 \cdot v = \vec{0}$.

7. Dec 9, 2016

### FactChecker

Use it to show what you need to show to prove the statement of the problem. Start with an arbitrary eigenvalue of A and show that it must be 0.

8. Dec 9, 2016

### Staff: Mentor

Yes. And this means that $A(w) = 0 \cdot w$ for all $w=A(v)$. So if $V$ itself isn't the zero space, there will be eigenvectors for the eigenvalue zero.
And as you've mentioned above, other eigenvalues aren't possible - over a field. If you consider modules over a ring that has nonzero elements with $\lambda^2=0$, then the proof fails.
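The step above, that every $w = A(v)$ satisfies $A(w) = 0 \cdot w$, can be checked concretely (again an editorial sketch with a matrix of my own choosing, not from the thread):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])   # satisfies A^2 = 0

v = np.array([3.0, 5.0])     # arbitrary vector with A v != 0
w = A @ v                    # w = A(v) lies in the image of A

# Since A^2 = 0, w is in the kernel: A w = 0 = 0 * w,
# i.e. the nonzero vector w is an eigenvector for the eigenvalue 0.
print(w)       # [5. 0.]
print(A @ w)   # [0. 0.]
```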

9. Dec 9, 2016

### Mr Davis 97

So if $V$ isn't itself zero, this means that A must have eigenvalues. But in the case where $A^2 = 0$, the only possible eigenvalue is 0? So then A must have an eigenvalue of 0.

Does a matrix ever not have eigenvalues when V isn't zero?

10. Dec 9, 2016

### Staff: Mentor

We need $V \neq \{0\}$ because eigenvectors are by definition nonzero. Since $A(0)=0$ is always the case, it wouldn't make much sense to allow $\vec{v}=0$ as an eigenvector.

Here we have $A^2=0$, which means that the entire image of the linear mapping that is represented by $A$ is contained in its kernel:
$\operatorname{im}(A) = A(V) = \{w \in V \mid w=A(v) \text{ for some } v \in V\} \subseteq \ker(A) = \{w \in V \mid A(w)=0\}$.
So every $A(v)\neq 0$ is an eigenvector for the eigenvalue $0$. This proves existence.

Your argument $0=A^2(v)=A(A(v))=A(\lambda v)=\lambda A(v) = \lambda^2 v$ shows, as you've said, that $\lambda = 0$ is the only possibility, which proves uniqueness.
In general it isn't guaranteed that a matrix has eigenvalues. E.g. a real matrix can have only complex eigenvalues: the characteristic polynomial of $\begin{bmatrix}1&-2\\1&1\end{bmatrix}$ is $(1-\lambda)^2 + 2$, which has no real roots. So one doesn't say this $A$ has eigenvalues, if only the reals are considered.
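For the matrix mentioned above, a quick NumPy computation (editorial illustration, not from the thread) confirms that its eigenvalues are genuinely complex, $1 \pm i\sqrt{2}$:

```python
import numpy as np

B = np.array([[1.0, -2.0],
              [1.0,  1.0]])

# Characteristic polynomial (1 - x)^2 + 2 has roots 1 +/- i*sqrt(2),
# so this real matrix has no real eigenvalues at all.
eigenvalues = np.linalg.eigvals(B)
print(eigenvalues)   # approximately [1.+1.414j, 1.-1.414j]
```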

11. Dec 9, 2016

### Mr Davis 97

So in doing these types of problems, can we not necessarily start off by assuming that $A \vec{x} = \lambda \vec{x}$ is true for some $\lambda$ and nonzero $\vec{x}$? Do we first have to show that there exists some $\lambda$ and nonzero $\vec{x}$ such that $A \vec{x} = \lambda \vec{x}$?

For example, if we were trying to show that the eigenvalues of $A^{-1}$ are the multiplicative inverses of the eigenvalues of $A$, then before writing down $Ax = \lambda x$ and deriving $A^{-1} x = (1 / \lambda) x$, do we first have to show that such $\lambda$ and nonzero $x$ exist?

Last edited: Dec 9, 2016
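The inverse-eigenvalue relation raised in the previous post can be checked numerically. This is an editorial sketch using an invertible matrix of my own choosing (not from the thread), whose eigenvalues are real:

```python
import numpy as np

# Upper-triangular, so its eigenvalues are the diagonal entries 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eig_A = np.sort(np.linalg.eigvals(A))                     # [2. 3.]
eig_Ainv = np.sort(np.linalg.eigvals(np.linalg.inv(A)))   # [1/3, 1/2]

# Eigenvalues of A^{-1} are the reciprocals of the eigenvalues of A.
print(eig_A, eig_Ainv)
```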
12. Dec 9, 2016

### Staff: Mentor

Oh dear, that's a question about fine points of logic, and it's pretty late here ...
I guess you're right. If there aren't any eigenvalues of the required kind, then a vacuous statement like "the only eigenvalue is a tree" can be considered true. However, I find it more meaningful to show that there actually is an eigenvalue of the required kind, instead of only showing: "If there is an eigenvalue, then it has to be a tree."

In our case, $A^2=0$, it's pretty obvious that both are true: $0$ exists in the field, and there actually are eigenvectors for zero.
In the example I gave above, an invertible matrix with no real eigenvectors at all, we would have a case like the one with the tree (which I chose to emphasize that it names a non-existing number): "If $\lambda$ is an eigenvalue of $A$, then $\lambda^{-1}$ is an eigenvalue of $A^{-1}$." is a true statement, although neither exists in the reals.

13. Dec 9, 2016

### FactChecker

In the problem as stated, you do not have to prove that an eigenvalue exists. You can start by supposing (not proving) that there is an eigenvalue and showing that it must have the desired property ($\lambda = 0$).