
Extracting eigenvectors from a matrix

  1. Jan 2, 2014 #1
    Hello,

    1. The problem statement, all variables and given/known data
    I want to show that a real symmetric matrix will have real eigenvalues and orthogonal eigenvectors.

    $$
    \begin{pmatrix}
    A & H\\
    H & B
    \end{pmatrix}
    $$

    3. The attempt at a solution
    For the matrix shown above it's clear that the characteristic equation will be
    ##\lambda^2-\lambda(A+B)+AB-H^2=0##

    I can show that the discriminant of the quadratic equation is greater than or equal to 0, implying that the eigenvalues must be real.
    ##b^2-4ac=(A+B)^2-4(AB-H^2)=A^2+2AB+B^2-4AB+4H^2##
    ##=(A-B)^2+4H^2##
    Since ##A, B, H \in \mathbb{R}##, ##(A-B)^2+4H^2 \geq 0##

    Hence ##\lambda## must be real for this matrix.
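
    As a quick numerical sanity check (not part of the proof), here's a short NumPy sketch, with made-up values for ##A##, ##B##, ##H##, comparing the roots of the characteristic quadratic against a library eigenvalue routine:

    [code]
    import numpy as np

    # Arbitrary real entries (illustrative values, not from the problem statement)
    A, B, H = 2.0, -1.0, 3.0
    M = np.array([[A, H],
                  [H, B]])

    # Roots of lambda^2 - (A+B)*lambda + (A*B - H^2) = 0 via the quadratic formula
    disc = (A - B)**2 + 4 * H**2                       # discriminant, always >= 0
    roots = ((A + B) + np.array([1.0, -1.0]) * np.sqrt(disc)) / 2

    # eigvalsh handles symmetric matrices and returns real eigenvalues
    print(np.sort(roots))
    print(np.linalg.eigvalsh(M))
    [/code]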

    My only problem now is to show that the eigenvectors are orthogonal.

    The matrix has eigenvalues ##\lambda_1## and ##\lambda_2##, and hence eigenvectors ##\lambda_1v_1## and ##\lambda_2v_2##.

    How can I show,

    ##\lambda_1\lambda_2x_1x_2+\lambda_1\lambda_2y_1y_2=0##?

    I know ##\lambda_1\lambda_2=det(M)##

    It could become,

    ##det(M)(x_1x_2+y_1y_2)=0##

    Then it's clear the vectors are orthogonal because ##det(M)## cannot be 0. But the problem is that this is not a proof, because I explicitly assumed the dot product is 0 in the first place.

    I tried substituting the roots of the quadratic equation back into the matrix as if I knew the values of ##\lambda##, but the matrix could not be eliminated in any simple manner and I got a mess real quick.
     
  2. Jan 2, 2014 #2

    vela


    The eigenvectors are ##v_1## and ##v_2##, not ##\lambda_1 v_1## and ##\lambda_2 v_2##. This matters because the eigenvalue could be 0 and ##\vec{0}## can't be an eigenvector by definition.

    det(M) could be 0 if either of the eigenvalues is 0.


    Assume ##v_1## and ##v_2## are eigenvectors corresponding to distinct eigenvalues, and then consider the dot products ##v_1 \cdot M v_2## and ##(M v_1)\cdot v_2##. Using the fact that M is symmetric, you can show the two products are equal.
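
    For instance, here is a quick NumPy illustration of that hint (the example matrix is my own, not from the thread):

    [code]
    import numpy as np

    # Symmetric example matrix (arbitrary numbers): M == M.T
    M = np.array([[2.0, 3.0],
                  [3.0, -1.0]])

    evals, evecs = np.linalg.eigh(M)   # columns of evecs are the eigenvectors
    v1, v2 = evecs[:, 0], evecs[:, 1]

    # Symmetry makes the two products equal: v1 . (M v2) == (M v1) . v2
    print(np.dot(v1, M @ v2), np.dot(M @ v1, v2))

    # Distinct eigenvalues => orthogonal eigenvectors (dot product ~ 0)
    print(np.dot(v1, v2))
    [/code]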
     
  3. Jan 2, 2014 #3
    OK, I'll keep that important point about notation in mind; it never occurred to me that ##\vec{0}## is not a valid eigenvector.

    I also forgot about the case where an eigenvalue is 0.

    So in matrix notation ##v_1 \cdot M v_2## can be written as ##v_1^{\top}Mv_2## and ##(M v_1)\cdot v_2## as ##(Mv_1)^{\top}v_2##.

    By the transpose theorem and the fact that ##M## is symmetric, ##(Mv_1)^{\top}v_2## equals ##v_1^{\top}Mv_2##. Because ##v_2## is an eigenvector, we get the equality ##v_1^{\top}v_2=v_1^{\top}v_2##, which implies that the dot product is zero. Is this correct?

    Edit: I forgot that applying the transformation matrix results in an unknown factor of ##\lambda## rather than just 1.

    So because ##v_1## and ##v_2## are eigenvectors, applying the transformation matrix gives ##\lambda_1v_1^{\top}v_2## and ##\lambda_2v_1^{\top}v_2##, hence ##\lambda_1v_1^{\top}v_2=\lambda_2v_1^{\top}v_2##, and that implies the dot product is 0 because the two eigenvalues are distinct.
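
    Written out as one chain, that argument is

    $$
    \lambda_1\,v_1^{\top}v_2 = (Mv_1)^{\top}v_2 = v_1^{\top}Mv_2 = \lambda_2\,v_1^{\top}v_2
    \;\Longrightarrow\; (\lambda_1-\lambda_2)\,v_1^{\top}v_2 = 0,
    $$

    so ##\lambda_1 \neq \lambda_2## forces ##v_1^{\top}v_2 = 0##.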
     
    Last edited: Jan 2, 2014
  4. Jan 2, 2014 #4

    HallsofIvy


    Actually, that's a matter of some debate, and I take the other side. Yes, the definition of "eigenvalue" is: [itex]\lambda[/itex] is an eigenvalue of a linear operator A if and only if there exists a non-zero vector, v, such that [itex]Av= \lambda v[/itex].

    But some textbooks define "eigenvector corresponding to eigenvalue [itex]\lambda[/itex]" as "a non-zero vector, v, such that [itex]Av= \lambda v[/itex]", while other textbooks do NOT require "non-zero". I prefer the latter because with the former you have to keep saying "and the 0 vector" in statements about eigenvectors. For example, I think it is preferable to be able to say "the set of all eigenvectors corresponding to eigenvalue [itex]\lambda[/itex] forms a vector space" rather than "the set of all eigenvectors corresponding to eigenvalue [itex]\lambda[/itex], together with the zero vector, forms a vector space".

    In practice, of course, it doesn't make any difference. We still need to use non-zero eigenvectors to form a basis of that subspace.
     
  5. Jan 2, 2014 #5
    I hope I can fully appreciate this difference in definition as I go further in my studies; it hasn't fully sunk in yet. Maybe that's because my textbook is not aimed at a rigorous study of linear algebra.

    Additionally, if one of the mentors is still reading this thread, I'd like to know whether this method of comparing the dot products ##v_1 \cdot M v_2## and ##(M v_1)\cdot v_2## can be used to prove facts about eigenvectors in other cases, e.g. when M is real but not symmetric (where I would want to show the two products are not equal), or when M is Hermitian.
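
    To make the question concrete, here is a small numerical experiment (the example matrices are my own):

    [code]
    import numpy as np

    # Real but NOT symmetric: the two products need not agree, and
    # eigenvectors for distinct eigenvalues need not be orthogonal.
    N = np.array([[1.0, 2.0],
                  [0.0, 3.0]])
    evals, evecs = np.linalg.eig(N)
    v1, v2 = evecs[:, 0], evecs[:, 1]
    print(np.dot(v1, N @ v2), np.dot(N @ v1, v2))  # generally different
    print(np.dot(v1, v2))                          # generally nonzero

    # Hermitian case: the same trick works with the conjugate transpose and
    # the complex inner product; np.vdot conjugates its first argument.
    Hm = np.array([[2.0, 1.0 - 1.0j],
                   [1.0 + 1.0j, 3.0]])
    evals, evecs = np.linalg.eigh(Hm)
    v1, v2 = evecs[:, 0], evecs[:, 1]
    print(np.vdot(v1, Hm @ v2), np.vdot(Hm @ v1, v2))  # equal
    print(np.vdot(v1, v2))                             # ~ 0
    [/code]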
     
  6. Jan 2, 2014 #6

    vela


    You've shown that if the eigenvalues are distinct, the eigenvectors are orthogonal. Now you have to deal with the case where ##\lambda_1 = \lambda_2##. The first thing you need to consider is whether you can find two independent eigenvectors in this case.
     
  7. Jan 2, 2014 #7
    If ##\lambda_1 = \lambda_2##, then ##H## must be 0 and ##A=B##, since the discriminant ##(A-B)^2+4H^2## must vanish. The matrix is then ##kI##, where ##k## is a constant and ##I## is the unit matrix, with ##\lambda_1 = \lambda_2=k##. Because the matrix is a multiple of the unit matrix, every nonzero vector in the space is an eigenvector. In the two-dimensional case we can therefore pick two orthogonal basis vectors such as ##(1,0)## and ##(0,1)##.
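
    A quick numerical check of this degenerate case (##k## is an arbitrary choice):

    [code]
    import numpy as np

    k = 2.5                        # arbitrary constant
    M = k * np.eye(2)              # the case A = B = k, H = 0
    print(np.linalg.eigvalsh(M))   # both eigenvalues equal k

    # Every vector is an eigenvector; (1,0) and (0,1) are an orthogonal pair
    for v in (np.array([1.0, 0.0]), np.array([0.0, 1.0])):
        print(M @ v, k * v)        # M v == k v
    [/code]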
     