Prove or Disprove: If λ=0 Is the Only Eigenvalue of A, Then Ax=0 for All x

In summary, the original problem is to prove or disprove that if λ=0 is the only eigenvalue of A, then Ax=0 for all x. A counterexample is presented: a matrix A whose only eigenvalue is λ=0 but for which Ax is not zero for every x. The claimed implication is therefore disproved.
  • #1
AsprngMathGuy
Hey everyone,

I have a problem with overthinking things quite often, so I once again need help haha.
How would you go about proving this:

λ=0 is the only eigenvalue of A [itex]\Rightarrow[/itex] Ax=0 [itex]\forall[/itex]x

Any help would be appreciated!
Thanks
 
  • #2
Are you sure it's true? The matrix
[tex]
A = \left [\begin{array}{ccc}
0 & 1 & 1 \\
0 & 0 & 1 \\
0 & 0 & 0 \\
\end{array} \right ]
[/tex]
has [itex] \lambda = 0 [/itex] with multiplicity 3. Multiplying [itex] A [/itex] by the ones vector does not, however, yield the zero vector.
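For instance, a quick numerical check (a numpy sketch of the matrix above; the eigenvalue output is zero only up to rounding) shows this:

[code]
import numpy as np

# The strictly upper-triangular matrix A above: its only eigenvalue is 0.
A = np.array([[0., 1., 1.],
              [0., 0., 1.],
              [0., 0., 0.]])

print(np.linalg.eigvals(A))   # all zeros (up to rounding)

# Yet A applied to the ones vector is not the zero vector:
print(A @ np.ones(3))         # -> [2. 1. 0.]
[/code]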

Perhaps you left out some constraints on [itex]A[/itex]?
 
  • #3
AsprngMathGuy said:
Hey everyone,

I have a problem with overthinking things quite often, so I once again need help haha.
How would you go about proving this:

λ=0 is the only eigenvalue of A [itex]\Rightarrow[/itex] Ax=0 [itex]\forall[/itex]x

Any help would be appreciated!
Thanks

Hey AsprngMathGuy and welcome to the forums.

The result can be shown from the definition of the eigenvalue problem where you have:

[itex]Ax = \lambda x[/itex], which implies [itex](A - \lambda I)x = 0[/itex].

If all your λ's are zero, then the above reduces to [itex]Ax - 0 \cdot Ix = 0[/itex], which gives us [itex]Ax = 0[/itex]. This only uses the definition of the eigenvalue problem; you simply plug in the value for λ to get your equation.
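Written out for an eigenvector x, the substitution is simply:

[tex]
Ax = \lambda x = 0 \cdot x = 0.
[/tex]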
 
  • #4
robertsj said:
Are you sure it's true? The matrix
[tex]
A = \left [\begin{array}{ccc}
0 & 1 & 1 \\
0 & 0 & 1 \\
0 & 0 & 0 \\
\end{array} \right ]
[/tex]
has [itex] \lambda = 0 [/itex] with multiplicity 3. Multiplying [itex] A [/itex] by the ones vector does not, however, yield the zero vector.

Perhaps you left out some constraints on [itex]A[/itex]?

That is not correct.

You have to change your matrix to this:

[tex]
A = \left [\begin{array}{ccc}
-λ & 1 & 1 \\
0 & -λ & 1 \\
0 & 0 & -λ \\
\end{array} \right ]
[/tex] when you factor in the -λI term.

It still gives you zero (like you said above), but substituting that λ into your equation will give you Ax = 0.

You have to remember that you are solving an eigenvalue problem, and in doing this you usually want to assume that A is a non-singular matrix. In your example A is a singular matrix, so this will be useless.

The idea with an eigenvalue problem is that you want to find which vectors x get 'scaled' (mapped to a linearly dependent vector) by your matrix A, which helps you decompose A into its eigenvectors by analyzing where the scaling happens.
 
  • #5
Perhaps chiro is misunderstanding the question. The matrix A given by robertsj is, in fact, an example of a matrix having only λ = 0 as an eigenvalue but such that Ax is not always 0.

If there exists a basis for the vector space consisting of eigenvectors (a "complete set of eigenvectors", so that A is diagonalizable), then Ax = 0 for all x. But in general, there may not exist such a set of eigenvectors. There will be some subspace such that Ax = 0 for all x in that subspace. In robertsj's example, that would be the subspace spanned by <1, 0, 0>.
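That subspace is easy to check numerically (a sketch using numpy and scipy's null_space on robertsj's matrix):

[code]
import numpy as np
from scipy.linalg import null_space

A = np.array([[0., 1., 1.],
              [0., 0., 1.],
              [0., 0., 0.]])

# The null space of A (all x with Ax = 0) is one-dimensional,
# spanned by <1, 0, 0> -- far from being all of R^3.
print(null_space(A))                # one column, a multiple of [1, 0, 0]

# So Ax = 0 holds only on that span:
print(A @ np.array([3., 0., 0.]))   # -> [0. 0. 0.]
print(A @ np.array([0., 1., 0.]))   # -> [1. 0. 0.]
[/code]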
 
  • #6
True, HallsofIvy. I can't see the point of doing an eigendecomposition when you have no eigenvectors or can't find them.
 
  • #7
chiro: you're absolutely right if A were to be invertible, but that would be an unspecified constraint on A. As for assuming a nonsingular matrix, I work all the time with a method dealing with singular operators.
 
  • #8
robertsj said:
chiro: you're absolutely right if A were to be invertible, but that would be an unspecified constraint on A. As for assuming a nonsingular matrix, I work all the time with a method dealing with singular operators.

What kind of problems? Just curious.
 
  • #9
robertsj said:
chiro: you're absolutely right if A were to be invertible, but that would be an unspecified constraint on A.


*** Not only "unspecified constraint" but also nonsensical: a (square, of course) matrix is singular iff zero is one of its eigenvalues, so in your problem A must be singular and, in fact, nilpotent.

DonAntonio ***

As for assuming a nonsingular matrix, I work all the time with a method dealing with singular operators.
...
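Both claims can be checked on robertsj's matrix (a minimal numpy sketch):

[code]
import numpy as np

A = np.array([[0., 1., 1.],
              [0., 0., 1.],
              [0., 0., 0.]])

# Zero is an eigenvalue exactly when A is singular: here det(A) = 0.
print(np.linalg.det(A))               # -> 0.0

# And A is indeed nilpotent: A^3 is the zero matrix.
print(np.linalg.matrix_power(A, 3))   # -> 3x3 zero matrix
[/code]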
 
  • #10
Thank you so much for your input. I actually misread the problem slightly. At the beginning it states to prove OR disprove the problem. So of course my mind goes right to attempting to prove it haha. Thanks!
 

1. What does it mean for an eigenvalue to equal 0?

An eigenvalue of a matrix A is a number λ for which there is a nonzero vector x (an eigenvector) with Ax = λx; it describes how A stretches or compresses that vector. An eigenvalue of 0 means there is a nonzero vector that A sends to the zero vector, so the matrix has a nontrivial null space and is therefore singular.
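For instance, the matrix in post #2 sends [itex]\langle 1, 0, 0 \rangle[/itex] to the zero vector:

[tex]
A \left[\begin{array}{c} 1 \\ 0 \\ 0 \end{array}\right]
= \left[\begin{array}{c} 0 \\ 0 \\ 0 \end{array}\right]
= 0 \cdot \left[\begin{array}{c} 1 \\ 0 \\ 0 \end{array}\right],
[/tex]

so [itex]\langle 1, 0, 0 \rangle[/itex] is an eigenvector for the eigenvalue 0.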

2. Why is it important to know whether λ=0 being the only eigenvalue forces Ax=0 for all x?

The condition Ax=0 for all x means that A is the zero matrix. The question therefore asks how much the eigenvalues alone tell you about a matrix: a matrix whose only eigenvalue is 0 always has a nontrivial null space, but, as the thread shows, it need not be the zero matrix. Keeping eigenvalues and null spaces distinct in this way is a fundamental point in linear algebra, with applications in fields such as physics, engineering, and computer science.

3. How do you prove or disprove that λ=0 being the only eigenvalue of A implies Ax=0 for all x?

The implication is false in general, and a single counterexample settles it: the nilpotent matrix in post #2 has λ=0 as its only eigenvalue (with multiplicity 3), yet it does not send every vector to the zero vector. The implication does hold when A is diagonalizable: writing A = PDP⁻¹ with D the diagonal matrix of eigenvalues, D = 0 forces A = 0, and the zero matrix sends every x to 0.
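In the diagonalizable case this is a one-line computation:

[tex]
A = PDP^{-1}, \qquad D = 0 \;\Longrightarrow\; A = P \, 0 \, P^{-1} = 0 \;\Longrightarrow\; Ax = 0 \text{ for all } x.
[/tex]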

4. Are there matrices for which the implication does hold?

Yes, and that is the point of the thread. Over the complex numbers, the matrices whose only eigenvalue is 0 are exactly the nilpotent matrices, and a nonzero nilpotent matrix (such as the one in post #2) satisfies Ax ≠ 0 for some x. The implication holds precisely when such an A is the zero matrix, which is the case exactly when A is diagonalizable (for example, when A is symmetric or normal).

5. What real-world applications do these ideas have?

Eigenvalues and null spaces appear throughout physics, engineering, and computer science. In physics, matrices are used to represent transformations in space, and eigenvalues describe how those transformations affect physical quantities such as energy and momentum. In engineering, matrices are used to model systems, and zero or repeated eigenvalues are important when analyzing the stability and behavior of those systems. In computer science, matrices are used in many algorithms, and understanding eigenvalues is crucial in optimizing those algorithms for speed and accuracy.
