# Eigenvalue Proof Help

1. May 4, 2012

### AsprngMathGuy

Hey everyone,

I have a problem with overthinking things quite often, so I once again need help haha.
How would you go about proving this:

$\lambda = 0$ is the only eigenvalue of $A$ $\Rightarrow$ $Ax = 0$ $\forall x$

Any help would be appreciated!
Thanks

2. May 4, 2012

### robertsj

Are you sure it's true? The matrix
$$A = \left [\begin{array}{ccc} 0 & 1 & 1 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \\ \end{array} \right ]$$
has $\lambda = 0$ as its only eigenvalue (with algebraic multiplicity 3). Multiplying $A$ by the ones vector does not, however, yield the zero vector.

Perhaps you left out some constraints on $A$?
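robertsj's counterexample is easy to check numerically. A quick sketch, assuming NumPy is available:

```python
import numpy as np

# robertsj's strictly upper-triangular (nilpotent) counterexample
A = np.array([[0., 1., 1.],
              [0., 0., 1.],
              [0., 0., 0.]])

# A triangular matrix's eigenvalues sit on its diagonal, so every
# eigenvalue of A is 0.
print(np.linalg.eigvals(A))  # all (numerically) zero

# Yet A times the ones vector is not the zero vector:
x = np.ones(3)
print(A @ x)  # [2. 1. 0.]
```

So "only eigenvalue is 0" does not force $Ax = 0$ for every $x$; it only forces it on the null space of $A$.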

3. May 4, 2012

### chiro

Hey AsprngMathGuy and welcome to the forums.

The result can be shown from the definition of the eigenvalue problem where you have:

$Ax = \lambda x$, which implies $(A - \lambda I)x = 0$.

If all your $\lambda$'s are zero, then the above reduces to $Ax - 0 \cdot Ix = 0$, which gives us $Ax = 0$. This only uses the definition of the eigenvalue problem: you simply plug in the value of $\lambda$ to get your equation.

4. May 4, 2012

### chiro

That is not correct.

You have to work with the matrix
$$A - \lambda I = \left [\begin{array}{ccc} -\lambda & 1 & 1 \\ 0 & -\lambda & 1 \\ 0 & 0 & -\lambda \\ \end{array} \right ]$$ when you factor in the $-\lambda I$ term.

Its determinant still comes out to zero (like you said above), but substituting $\lambda = 0$ into your equation will give you $Ax = 0$.

You have to remember that you are solving an eigenvalue problem, and in doing this you usually want to assume that your $A$ is a non-singular matrix. In your example, $A$ is a singular matrix, and this will be useless.

The idea with an eigenvalue problem is that you want to find which vectors $x$ are simply scaled by your matrix $A$ (i.e. $Ax$ is linearly dependent with $x$), which helps you decompose $A$ in terms of its eigenvectors by analyzing where that scaling happens.

5. May 4, 2012

### HallsofIvy

Perhaps chiro is misunderstanding the question. The matrix $A$ given by robertsj is, in fact, an example of a matrix having only $0$ as an eigenvalue but such that $Ax$ is not always $0$.

IF there exists a basis for the vector space consisting entirely of eigenvectors (a "complete set of eigenvectors", so that $A$ is diagonalizable), then $Ax = 0$ for all $x$. But in general, there may not exist such a set of eigenvectors. There will be some subspace such that $Ax = 0$ for all $x$ in that subspace. In robertsj's example, that would be the subspace spanned by $\langle 1, 0, 0 \rangle$.
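Both halves of this can be checked numerically. A sketch, assuming NumPy: in robertsj's example the eigenspace for $\lambda = 0$ is only one-dimensional, so there is no complete set of eigenvectors, while a diagonalizable matrix whose only eigenvalue is $0$ must be the zero matrix.

```python
import numpy as np

A = np.array([[0., 1., 1.],
              [0., 0., 1.],
              [0., 0., 0.]])

# Ax = 0 does hold on the subspace spanned by <1, 0, 0>:
e1 = np.array([1., 0., 0.])
print(A @ e1)  # [0. 0. 0.]

# But that subspace is all there is: rank(A) = 2, so by rank-nullity the
# null space (the eigenspace for lambda = 0) is 1-dimensional.  A would
# need three independent eigenvectors to be diagonalizable.
print(np.linalg.matrix_rank(A))  # 2

# Contrast: if A WERE diagonalizable with 0 as its only eigenvalue, then
# A = P @ diag(0, 0, 0) @ inv(P) = 0, the zero matrix, so Ax = 0 for all x.
```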

6. May 4, 2012

### chiro

True, HallsofIvy. I can't see the point of doing an eigendecomposition when you have no complete set of eigenvectors or can't find them.

7. May 4, 2012

### robertsj

chiro: you'd be absolutely right if $A$ were invertible, but that would be an unstated constraint on $A$. As for assuming a nonsingular matrix: I work all the time with a method dealing with singular operators.

8. May 4, 2012

### chiro

What kind of problems? Just curious.


10. May 4, 2012

### AsprngMathGuy

Thank you so much for your input. I actually misread the problem slightly. At the beginning it states to prove OR disprove the problem. So of course my mind goes right to attempting to prove it haha. Thanks!
