
Simple Proof for the existence of eigenvector

  Mar 21, 2014 #1
    Hello,

    My question is this: is it possible to prove that an eigenvector exists for a symmetric matrix without discussing what eigenvalues are and without going into the details of characteristic equations, determinants, and so on? Here is my short proof (the only assumption is that ##A## is symmetric):

    Suppose there does not exist any vector ##v## such that ##Av = \lambda v##. Then ##Av = b_{1}## and ##A^{T}v = b_{2}##. Clearly ##b_{1} \neq b_{2}##, and thus ##Av \neq A^{T}v##. But this implies ##A^{T} \neq A##, whereas we assumed ##A## is symmetric. Thus ##v## must exist, as required.

    Is this proof legitimate? It may be too simple, but I'm not certain. The alternative would be to show that if ##\det(A - \lambda I) = 0## then ##v## must exist, but then the symmetry of the matrix never gets used.
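    As a sanity check on the claim itself, here is a minimal numerical sketch (assuming Python with NumPy is available; this is only an illustration, not a proof, and the two matrices are my own arbitrary choices). A symmetric matrix comes out with real eigenpairs, while a rotation matrix, which is not symmetric, has no real eigenvector at all:

[code=python]
import numpy as np

# Symmetric case: np.linalg.eigh is the solver for symmetric (Hermitian) matrices.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
vals, vecs = np.linalg.eigh(A)
print(vals)                                                # real eigenvalues
print(np.allclose(A @ vecs[:, 0], vals[0] * vecs[:, 0]))   # True: Av = lambda*v

# Rotation by 90 degrees: not symmetric, and it has no real eigenvector,
# since it turns every nonzero vector off its own line.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
print(np.linalg.eigvals(R))                                # purely imaginary pair +-1j
[/code]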
     
  Mar 21, 2014 #2

    HallsofIvy

    Staff Emeritus
    Science Advisor

    Why does ##Av = b_1##, ##A^{T}v = b_2## immediately imply that ##b_1 \neq b_2##? What does that have to do with eigenvalues?


    I think the difficulty is that you are not really clear on what you are trying to prove! If you are thinking of the vector space over the complex numbers, every matrix has an eigenvalue because of the "fundamental theorem of algebra": every polynomial equation has at least one (complex) solution. What is true here is that the eigenvalues of a symmetric matrix are real numbers.

    Without going into "detail" about determinants: if ##A## is an ##n \times n## matrix, then ##\det(A - \lambda I) = 0## is a polynomial equation of degree ##n## and so has ##n## (not necessarily distinct) complex roots.
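    To make that concrete, here is a small sketch (assuming NumPy; np.poly returns the characteristic polynomial coefficients of a matrix, and the rotation matrix is my own choice, picked so the complex roots actually show up):

[code=python]
import numpy as np

# A rotation matrix, which has no real eigenvalues.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

coeffs = np.poly(A)        # coefficients of det(lambda*I - A)
print(coeffs)              # [1. 0. 1.], i.e. lambda^2 + 1
print(np.roots(coeffs))    # [0.+1.j 0.-1.j]: n complex roots, as promised
[/code]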

    I presume you are talking about a space over the real numbers, since otherwise we are done: every matrix has at least one complex eigenvalue. To show that the eigenvalues of a symmetric matrix are real numbers, let ##\lambda## be a (possibly complex) eigenvalue of matrix ##A##. Then there exists a unit vector ##v## such that ##Av = \lambda v##.

    Now, writing ##\langle u, v\rangle## for the inner product of vectors ##u## and ##v##, and using ##\langle Av, v\rangle = \langle v, A^{T}v\rangle = \langle v, Av\rangle## since ##A## is symmetric: ##\lambda = \lambda\cdot 1 = \lambda\langle v, v\rangle = \langle \lambda v, v\rangle = \langle Av, v\rangle = \langle v, Av\rangle = \langle v, \lambda v\rangle = \overline{\langle \lambda v, v\rangle} = \overline{\lambda}\langle v, v\rangle = \overline{\lambda}##. So ##\lambda = \overline{\lambda}##, which means ##\lambda## is real.
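    The computation is easy to check numerically as well (again only a sketch, assuming NumPy, with a random matrix symmetrized by adding its transpose):

[code=python]
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
S = M + M.T                          # force symmetry: S^T = S

vals = np.linalg.eigvals(S)          # general solver, so complex output is allowed
print(np.max(np.abs(vals.imag)))     # ~1e-15: every eigenvalue is (numerically) real
[/code]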
     
  Mar 21, 2014 #3
    Yeah, you're right, I don't think I'm sure myself what I'm trying to prove. I already have a proof similar to yours showing that a symmetric matrix has real eigenvalues. The thing is, I don't know whether proving that also shows that an eigenvector exists. Now I'm sure the two are equivalent, because clearly if a real ##\lambda## doesn't exist then a suitable ##v## cannot exist either.
     
  Mar 25, 2014 #4

    HallsofIvy

    Staff Emeritus
    Science Advisor

    You already know the proof that eigenvalues exist, but you don't know about eigenvectors?

    How can you possibly have an eigenvalue without a corresponding eigenvector? That's pretty much what the definition of "eigenvalue" says, isn't it?
     
  Mar 25, 2014 #5

    pasmith

    Homework Helper

    The whole point of the concept is that ##\lambda## is an eigenvalue of ##A## if and only if there exists ##v \neq 0## such that ##Av = \lambda v##.

    Thus eigenvalues exist if and only if eigenvectors exist.
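    And the passage from an eigenvalue back to an eigenvector is completely mechanical: any nonzero vector in the null space of ##A - \lambda I## works. A sketch (assuming NumPy and SciPy's null_space; the matrix and eigenvalue are my own example):

[code=python]
import numpy as np
from scipy.linalg import null_space

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 3.0                             # an eigenvalue of A (the roots of det(A - xI) are 1 and 3)

v = null_space(A - lam * np.eye(2))   # basis for the kernel of A - lam*I
print(v.ravel())                      # a unit eigenvector, here (1,1)/sqrt(2)
print(np.allclose(A @ v, lam * v))    # True: Av = lambda*v
[/code]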
     
  Mar 25, 2014 #6
    Yes, I'm sorry, I didn't analyze the definition carefully.
     
  Mar 27, 2014 #7

    mathwonk

    Science Advisor
    Homework Helper

    Here is a standard proof from my web page:

    "Spectral theorem" (real symmetric matrices are orthogonally diagonalizable). Thm: If ##k = \mathbb{R}## and ##A = A^{*}##, then ##\mathbb{R}^n## has a basis of mutually orthogonal eigenvectors of ##A##.

    Pf: The real-valued function ##f(x) = Ax \cdot x## has a maximum on the unit sphere in ##\mathbb{R}^n##, at some point ##y## where the gradient ##df## of ##f## is "zero", i.e. ##df(y)## is perpendicular to the tangent space of the sphere at ##y##. The tangent space at ##y## is the subspace of vectors in ##\mathbb{R}^n## perpendicular to ##y##, and ##df(y) = 2Ay##. Hence ##Ay## is perpendicular to the tangent space at ##y##, i.e. ##Ay = 0## or ##Ay## is parallel to ##y##, so ##Ay = cy## for some ##c##, and ##y## is an eigenvector of ##A##.

    Now restrict ##A## to the subspace ##V## of vectors orthogonal to ##y##. If ##v \cdot y = 0##, then ##Av \cdot y = v \cdot Ay = v \cdot cy = c(v \cdot y) = 0##. Hence ##A## preserves ##V##. ##A## still has the property ##Av \cdot x = v \cdot Ax## on ##V##, so the restriction of ##A## to ##V## has an eigenvector in ##V##. (Although ##V## has no natural representation as ##\mathbb{R}^{n-1}##, the argument for producing an eigenvector depended only on the symmetry property ##Av \cdot x = v \cdot Ax##.) Repeating, ##A## has an eigenbasis. QED.
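    The maximization in this proof can even be run numerically: projected gradient ascent of ##f(x) = Ax \cdot x## on the unit sphere settles on a point ##y## with ##Ay## parallel to ##y##. A minimal sketch of my own (assuming NumPy; the step size eta and iteration count are arbitrary small choices, not part of the proof):

[code=python]
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
A = M + M.T                           # symmetric, as the theorem requires

y = rng.standard_normal(4)
y /= np.linalg.norm(y)                # start somewhere on the unit sphere
eta = 0.05                            # small step size (the factor 2 in df = 2Ay is absorbed here)
for _ in range(5000):
    y = y + eta * (A @ y)             # step along the gradient of f(x) = Ax.x
    y /= np.linalg.norm(y)            # project back onto the sphere

c = y @ A @ y                         # the maximum value f(y) = Ay.y
print(np.linalg.norm(A @ y - c * y))  # ~0: Ay = cy, so y is an eigenvector
[/code]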
     