Eigenvalue Proof Help

  1. May 4, 2012 #1
    Hey everyone,

I have a problem with overthinking things quite often, so I once again need help haha.
    How would you go about proving this:

[itex]\lambda = 0[/itex] is the only eigenvalue of [itex]A[/itex] [itex]\Rightarrow[/itex] [itex]Ax = 0 \ \forall x[/itex]

    Any help would be appreciated!
    Thanks
     
  3. May 4, 2012 #2
    Are you sure it's true? The matrix
    [tex]
    A = \left [\begin{array}{ccc}
    0 & 1 & 1 \\
    0 & 0 & 1 \\
    0 & 0 & 0 \\
    \end{array} \right ]
    [/tex]
    has [itex] \lambda = 0 [/itex] with multiplicity 3. Multiplying [itex] A [/itex] by the ones vector does not, however, yield the zero vector.
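    For reference, a short sketch of both checks spelled out (the characteristic polynomial and the product with the ones vector):

    [tex]
    \det(A - \lambda I) = (-\lambda)^3 = -\lambda^3 \;\Rightarrow\; \lambda = 0 \text{ with multiplicity } 3,
    \qquad
    A \left[ \begin{array}{c} 1 \\ 1 \\ 1 \end{array} \right]
    = \left[ \begin{array}{c} 2 \\ 1 \\ 0 \end{array} \right] \neq 0.
    [/tex]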

    Perhaps you left out some constraints on [itex]A[/itex]?
     
  4. May 4, 2012 #3

    chiro (Science Advisor)

    Hey AsprngMathGuy and welcome to the forums.

    The result can be shown from the definition of the eigenvalue problem where you have:

    [itex]Ax = \lambda x[/itex], which implies [itex](A - \lambda I)x = 0[/itex].

    If all your [itex]\lambda[/itex]'s are zero, then the above reduces to [itex]Ax - 0 \cdot Ix = 0[/itex], which gives us [itex]Ax = 0[/itex]. This uses only the definition of the eigenvalue problem: you simply plug in the value of [itex]\lambda[/itex] to get your equation.
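    Spelled out, the substitution reads, for an eigenvector [itex]x[/itex] of [itex]A[/itex]:

    [tex]
    Ax = \lambda x = 0 \cdot x = 0.
    [/tex]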
     
  5. May 4, 2012 #4

    chiro (Science Advisor)

    That is not correct.

    You have to change your matrix to this:

    [tex]
    A - \lambda I = \left [\begin{array}{ccc}
    -\lambda & 1 & 1 \\
    0 & -\lambda & 1 \\
    0 & 0 & -\lambda \\
    \end{array} \right ]
    [/tex] when you factor in the [itex]-\lambda I[/itex] term.
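    Spelled out, the determinant of this upper triangular matrix is the product of its diagonal entries:

    [tex]
    \det(A - \lambda I) = (-\lambda)(-\lambda)(-\lambda) = -\lambda^3 = 0 \;\Rightarrow\; \lambda = 0.
    [/tex]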

    Its determinant is still zero (like you said above), but substituting [itex]\lambda = 0[/itex] back into the equation gives you [itex]Ax = 0[/itex].

    You have to remember that you are solving an eigenvalue problem, and in doing this you usually want to assume that A is a non-singular matrix. In your example A is a singular matrix, and then the decomposition isn't much use.

    The idea of an eigenvalue problem is to find which vectors x are simply 'scaled' by the matrix A, which lets you decompose A in terms of its eigenvectors by analyzing where that scaling happens.
     
  6. May 4, 2012 #5

    HallsofIvy (Staff Emeritus, Science Advisor)

    Perhaps Chiro is misunderstanding the question. The matrix A given by robertsj is, in fact, an example of a matrix having only 0 as an eigenvalue but such that Ax is not always 0.

    IF there exists a basis for the vector space consisting entirely of eigenvectors (a "complete set of eigenvectors", so that A is diagonalizable), then Ax = 0 for all x. But in general there may not exist such a set of eigenvectors. There will be some subspace such that Ax = 0 for all x in that subspace. In robertsj's example, that is the subspace spanned by <1, 0, 0>.
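    For completeness, a sketch of the diagonalizable case: if [itex]P[/itex] is a matrix whose columns form a basis of eigenvectors, then

    [tex]
    A = P D P^{-1} \quad \text{with} \quad D = \operatorname{diag}(0, \ldots, 0) = 0,
    \qquad \text{so} \qquad
    A = P \, 0 \, P^{-1} = 0,
    [/tex]

    and hence [itex]Ax = 0[/itex] for every [itex]x[/itex].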
     
  7. May 4, 2012 #6

    chiro (Science Advisor)

    True, HallsofIvy. I can't see the point of doing an eigendecomposition when you don't have a complete set of eigenvectors or can't find them.
     
  8. May 4, 2012 #7
    chiro: you'd be absolutely right if A were invertible, but that would be an unspecified constraint on A. As for assuming a nonsingular matrix, I work all the time with methods that deal with singular operators.
     
  9. May 4, 2012 #8

    chiro (Science Advisor)

    What kind of problems? Just curious.
     
  10. May 4, 2012 #9
    ....
     
  11. May 4, 2012 #10
    Thank you so much for your input. I actually misread the problem slightly: at the beginning it says to prove OR disprove the statement, so of course my mind went right to attempting to prove it haha. Thanks!
     