# Eigenvalue Problem: Show 0 is the Only Eigenvalue of A When A^2=0

• Mr Davis 97
In summary: since ##A^2=0##, the image of ##A## is contained in its kernel, ##im(A) \subseteq ker(A) = \{w \in V\,\vert \,A(w)=0\}##, so every nonzero vector of the form ##w=A(v)## is an eigenvector for the eigenvalue ##0##.

## Homework Statement

Let ##A## be an ##n \times n## matrix. Show that if ##A^2## is the zero matrix, then the only eigenvalue of ##A## is 0.

## The Attempt at a Solution

All eigenvalues and eigenvectors must satisfy the equation ##A\vec{v} = \lambda \vec{v}##. Multiplying both sides by ##A##, we have that ##A^2 \vec{v} = \vec{0}##. Is this the correct direction? I am not sure where to go from here...

Mr Davis 97 said:

## Homework Statement

Let ##A## be an ##n \times n## matrix. Show that if ##A^2## is the zero matrix, then the only eigenvalue of ##A## is 0.

## The Attempt at a Solution

All eigenvalues and eigenvectors must satisfy the equation ##A\vec{v} = \lambda \vec{v}##. Multiplying both sides by ##A##, we have that ##A^2 \vec{v} = \vec{0}##. Is this the correct direction? I am not sure where to go from here...
This is the correct direction.

What exactly do you get, when you apply ##A## to both sides?

fresh_42 said:
This is the correct direction.

What exactly do you get, when you apply ##A## to both sides?
Cocleia's post was helpful. I see that we get ##\lambda^2 \vec{v} = 0##. But since eigenvectors can't be zero, ##\lambda = 0##. Does this show that A must have an eigenvalue of 0, or does it show that if A has eigenvalues then the only eigenvalue would be 0?
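The computation ##\lambda^2 \vec{v} = 0 \Rightarrow \lambda = 0## can be checked on a concrete nilpotent matrix. The 2×2 matrix below is my own illustrative choice (not from the thread), kept in pure Python:

```python
# Concrete check of the argument above, on the nilpotent matrix
# N = [[0, 1], [0, 0]] (an illustrative choice).

def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

N = [[0, 1], [0, 0]]
assert matmul(N, N) == [[0, 0], [0, 0]]  # N^2 is the zero matrix

# For a 2x2 matrix the eigenvalues are the roots of
# lambda^2 - trace*lambda + det = 0; here trace = 0 and det = 0,
# so the equation reduces to lambda^2 = 0 and 0 is the only eigenvalue.
trace = N[0][0] + N[1][1]
det = N[0][0] * N[1][1] - N[0][1] * N[1][0]
print(trace, det)  # 0 0
```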

Mr Davis 97 said:
Cocleia's post was helpful. I see that we get ##\lambda^2 \vec{v} = 0##. But since eigenvectors can't be zero, ##\lambda = 0##. Does this show that A must have an eigenvalue of 0, or does it show that if A has eigenvalues then the only eigenvalue would be 0?
The latter. But what is ##A(w)##, with ##w=A(v)##?

fresh_42 said:
The latter. But what is ##A(w)##, with ##w=A(v)##?
The zero vector, since ##A(A(v)) = A^2 (v) = 0 (v) = 0##

Mr Davis 97 said:
Cocleia's post was helpful. I see that we get ##\lambda^2 \vec{v} = 0##. But since eigenvectors can't be zero, ##\lambda = 0##. Does this show that A must have an eigenvalue of 0, or does it show that if A has eigenvalues then the only eigenvalue would be 0?
Use that fact to prove the statement of the problem: start with an arbitrary eigenvalue of ##A## and show that it must be ##0##.

Mr Davis 97 said:
The zero vector, since ##A(A(v)) = A^2 (v) = 0 (v) = 0##
Yes. And this means that ##A(w) = 0 \cdot w## for all ##w=A(v)##. So if ##V## itself isn't zero, there will be eigenvectors for the eigenvalue zero.
And as you've mentioned above, other eigenvalues aren't possible - over a field. If you consider modules over a ring that has nonzero elements ##\lambda## with ##\lambda^2=0##, then the proof fails.
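The claim that ##A(w) = 0 \cdot w## for every ##w = A(v)## can be spot-checked directly. The 2×2 matrix and the sample vectors below are my own illustrative choices:

```python
# Pure-Python check of the claim above: with A^2 = 0, every
# w = A(v) satisfies A(w) = 0, i.e. im(A) is contained in ker(A).

def matvec(A, v):
    """Apply a 2x2 matrix (nested lists) to a length-2 vector."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[0, 1], [0, 0]]  # satisfies A^2 = 0

for v in ([1, 0], [0, 1], [3, -2]):
    w = matvec(A, v)                  # w lies in the image of A ...
    assert matvec(A, w) == [0, 0]     # ... and also in the kernel of A
print("im(A) is contained in ker(A) on the sample vectors")
```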

Mr Davis 97
fresh_42 said:
Yes. And this means that ##A(w) = 0 \cdot w## for all ##w=A(v)##. So if ##V## itself isn't zero, there will be eigenvectors for the eigenvalue zero.
And as you've mentioned above, other eigenvalues aren't possible - over a field. If you consider modules over a ring that has nonzero elements ##\lambda## with ##\lambda^2=0##, then the proof fails.
So if ##V## isn't itself zero, this means that A must have eigenvalues. But in the case where ##A^2 = 0##, the only possible eigenvalue is 0? So then A must have an eigenvalue of 0.

Does a matrix ever not have eigenvalues when V isn't zero?

Mr Davis 97 said:
So if ##V## isn't itself zero, this means that A must have eigenvalues. But in the case where ##A^2 = 0##, the only possible eigenvalue is 0? So then A must have an eigenvalue of 0.
We need ##V \neq \{0\}## because eigenvectors are defined to be nonzero. Since ##A(0)=0## holds for every matrix, it wouldn't make much sense to allow ##\vec{v}=0## as an eigenvector.

Here we have ##A^2=0##, which means that the entire image of the linear mapping that is represented by ##A## is contained in its kernel:
##im(A) = A(V) = \{w \in V \,\vert \, w=A(v) \text{ for some } v \in V\} \subseteq ker(A) = \{w \in V\,\vert \,A(w)=0\}##.
So every nonzero vector ##w=A(v)## is an eigenvector for the eigenvalue ##0##. This proves existence.

Your argument ##0=A^2(v)=A(A(v))=A(\lambda v)=\lambda^2 v## shows, as you've said, that ##\lambda = 0## is the only possibility, which proves uniqueness.
Does a matrix ever not have eigenvalues when V isn't zero?
In general it isn't guaranteed that a matrix has eigenvalues. E.g. the eigenvalues could be complex although the matrix (and the scalar field) are real: ##\begin{bmatrix}1&-2\\1&1\end{bmatrix}##. So one doesn't say ##A## has eigenvalues here if only the reals are considered.
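For the matrix given above one can verify the absence of real eigenvalues by computing the discriminant of its characteristic polynomial; the short check below is my own illustration:

```python
# Check that the 2x2 matrix from the post has no real eigenvalues:
# its characteristic polynomial lambda^2 - trace*lambda + det
# has negative discriminant, so both roots are complex.
import cmath

M = [[1, -2], [1, 1]]
trace = M[0][0] + M[1][1]                      # 2
det = M[0][0] * M[1][1] - M[0][1] * M[1][0]    # 1 - (-2) = 3
disc = trace ** 2 - 4 * det                    # 4 - 12 = -8 < 0

# Quadratic formula: the eigenvalues are 1 +/- i*sqrt(2).
roots = [(trace + s * cmath.sqrt(disc)) / 2 for s in (1, -1)]
print(disc, roots)
```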

Mr Davis 97
fresh_42 said:
We need ##V \neq \{0\}## because eigenvectors are defined to be nonzero. Since ##A(0)=0## holds for every matrix, it wouldn't make much sense to allow ##\vec{v}=0## as an eigenvector.

Here we have ##A^2=0##, which means that the entire image of the linear mapping that is represented by ##A## is contained in its kernel:
##im(A) = A(V) = \{w \in V \,\vert \, w=A(v) \text{ for some } v \in V\} \subseteq ker(A) = \{w \in V\,\vert \,A(w)=0\}##.
So every nonzero vector ##w=A(v)## is an eigenvector for the eigenvalue ##0##. This proves existence.

Your argument ##0=A^2(v)=A(A(v))=A(\lambda v)=\lambda^2 v## shows, as you've said, that ##\lambda = 0## is the only possibility, which proves uniqueness.

In general it isn't guaranteed that a matrix has eigenvalues. E.g. the eigenvalues could be complex although the matrix (and the scalar field) are real: ##\begin{bmatrix}1&-2\\1&1\end{bmatrix}##. So one doesn't say ##A## has eigenvalues here if only the reals are considered.

So in doing these types of problems, can we not necessarily start off by assuming that ##A \vec{x} = \lambda \vec{x}## is true for some ##\lambda## and nonzero ##\vec{x}##? Do we first have to show that there exists some ##\lambda## and nonzero ##\vec{x}## such that ##A \vec{x} = \lambda \vec{x}##?

For example, if we were trying to show that the eigenvalues of ##A^{-1}## are the multiplicative inverses of the eigenvalues of ##A##, then before writing down ##Ax = \lambda x## and deriving ##A^{-1} x = (1 / \lambda) x##, do we first have to show that such a ##\lambda## and ##x## exist?
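The inverse-eigenvalue fact in this question can be illustrated on a small concrete example. The triangular 2×2 matrix below is my own choice; triangular matrices expose their eigenvalues on the diagonal, which keeps the check dependency-free:

```python
# Hedged illustration: if Ax = lambda*x with A invertible, then
# A^{-1} x = (1/lambda) x.  Exact arithmetic via Fraction avoids round-off.
from fractions import Fraction

A = [[Fraction(2), Fraction(0)],
     [Fraction(1), Fraction(3)]]

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]      # 6, so A is invertible
A_inv = [[ A[1][1] / det, -A[0][1] / det],
         [-A[1][0] / det,  A[0][0] / det]]

# Both matrices are lower triangular, so eigenvalues sit on the diagonal.
eig_A = [A[0][0], A[1][1]]              # [2, 3]
eig_A_inv = [A_inv[0][0], A_inv[1][1]]  # [1/2, 1/3]

assert eig_A_inv == [1 / e for e in eig_A]
print(eig_A, eig_A_inv)
```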

Mr Davis 97 said:
So in doing these types of problems, can we not necessarily start off by assuming that ##A \vec{x} = \lambda \vec{x}## is true for some ##\lambda## and nonzero ##\vec{x}##? Do we first have to show that there exists some ##\lambda## and nonzero ##\vec{x}## such that ##A \vec{x} = \lambda \vec{x}##?

For example, if we were trying to show that the eigenvalues of ##A^{-1}## are the multiplicative inverses of the eigenvalues of ##A##, then before writing down ##Ax = \lambda x## and deriving ##A^{-1} x = (1 / \lambda) x##, do we first have to show that such a ##\lambda## and ##x## exist?
Oh dear, that's a question about fine points of logic, and it's pretty late here ...
Mr Davis 97 said:
Show that if ##A^2## is the zero matrix, then the only eigenvalue of ##A## is ##0##.
I guess you're right. If there aren't any eigenvalues of the required kind, then an expression like "the only eigenvalue is a tree" can be considered true. However, I find it more meaningful to see that there actually is an eigenvalue of the required kind, instead of only showing: "If there is an eigenvalue, then it has to be a tree."

In our case, ##A^2=0##, it's pretty obvious that both are true: ##0## exists in the field, and there actually are eigenvectors for zero.
In the example I gave above, which is an invertible matrix with no real eigenvalues at all, we would have a case like the one with the tree (which I chose to emphasize that it is a non-existing object): "If ##\lambda## is an eigenvalue of ##A##, then ##\lambda^{-1}## is an eigenvalue of ##A^{-1}##." is a true statement although neither exists in the reals.

Mr Davis 97 said:
So in doing these types of problems, can we not necessarily start off by assuming that ##A \vec{x} = \lambda \vec{x}## is true for some ##\lambda## and nonzero ##\vec{x}##? Do we first have to show that there exists some ##\lambda## and nonzero ##\vec{x}## such that ##A \vec{x} = \lambda \vec{x}##?
In the problem as stated, you do not have to prove that an eigenvalue exists. You can start by supposing (not proving) an eigenvalue and showing that it would have the properties desired ( = 0 ).

## 1. What is an eigenvalue problem?

An eigenvalue problem asks, for a given matrix ##A##, which scalars ##\lambda## (the eigenvalues) admit a nonzero vector ##v## (an eigenvector) satisfying ##Av = \lambda v##.

## 2. What does it mean when the eigenvalue is zero?

When an eigenvalue is zero, the matrix maps the corresponding eigenvector to the zero vector; in other words, the eigenvector lies in the kernel (null space) of the matrix. The eigenvector itself is still nonzero, since eigenvectors are nonzero by definition.

## 3. How do you show that 0 is the only eigenvalue of A when A^2=0?

To show that 0 is the only eigenvalue of A when A^2=0, we can use the definition of eigenvalues and eigenvectors. Suppose ##Av = \lambda v## for some nonzero vector ##v##. Applying ##A## again gives ##0 = A^2 v = \lambda^2 v##, and since ##v \neq 0## we get ##\lambda^2 = 0##, so ##\lambda = 0## is the only possible eigenvalue.
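As a numerical sanity check of this answer (assuming NumPy is available; the 3×3 matrix is my own example), every computed eigenvalue of a matrix with ##N^2 = 0## comes out as 0:

```python
# Numerical check: a 3x3 matrix with N^2 = 0 has only the eigenvalue 0
# (up to floating-point round-off).
import numpy as np

N = np.array([[0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])

assert np.allclose(N @ N, np.zeros((3, 3)))  # N^2 is the zero matrix
eigenvalues = np.linalg.eigvals(N)
assert np.allclose(eigenvalues, 0.0)         # every eigenvalue is 0
print(eigenvalues)
```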

## 4. What does it mean when A^2=0 in the context of eigenvalues?

When A^2=0, the matrix A is nilpotent: some power of it (here the second) is the zero matrix. A nilpotent matrix is never invertible, which already shows that 0 is an eigenvalue; nilpotency further guarantees that 0 is the *only* eigenvalue.

## 5. Why is it important to show that 0 is the only eigenvalue of A when A^2=0?

Showing that 0 is the only eigenvalue of A when A^2=0 reveals important properties of the matrix: it is not invertible, and its entire image is contained in its kernel. Such facts are useful in other mathematical problems and in applications in fields such as physics and engineering.
