Must every linear operator have eigenvalues? If so, why?

  1. Aug 19, 2009 #1
    It seems to me that the Schur decomposition (http://en.wikipedia.org/wiki/Schur_decomposition) relies on the fact that every linear operator must have at least one eigenvalue... but how do we know this is true?

    I have yet to find a linear operator without eigenvalues, so I believe every linear operator does have at least one eigenvalue.

    Still how does one prove it?

    Since we are looking for solutions to (A - aI)|v> = |0>, wouldn't it be possible that A - aI is always nonsingular and that the equation has only trivial solutions?
     
    Last edited by a moderator: May 4, 2017
  3. Aug 19, 2009 #2

    Ben Niehoff

    User Avatar
    Science Advisor
    Gold Member

    On a finite-dimensional vector space V over the complex numbers, it should be obvious that any linear operator must have eigenvalues, although some or all of those eigenvalues might be zero. Since the operator is a map from V to itself (or a subset), one can use the pigeonhole principle to show that at least one vector in V must be parallel to its image.

    Over the real numbers, some operators do not have eigenvalues (e.g. rotation matrices in R^2), because the eigenvalues happen to be complex. But I don't know if this really counts.
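
    A quick numerical sketch of the rotation example:

        import numpy as np

        # 90-degree rotation of the plane: a real matrix whose eigenvalues are +i and -i
        R = np.array([[0.0, -1.0],
                      [1.0,  0.0]])
        print(np.linalg.eigvals(R))   # both eigenvalues are purely imaginary: no real eigenvalue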

    On an infinite-dimensional vector space, there may be other subtleties that allow an exception. For example, in quantum mechanics, the momentum operator is linear, but its eigenstates are not normalizable, and hence not technically part of the Hilbert space. That is, the eigenvectors are not actually members of the vector space, and so it might be reasonable to say that the corresponding eigenvalues do not actually exist.
     
  4. Aug 19, 2009 #3
    Point for Clarification: How does one use the pigeonhole principle in this case? It seems like you could be mapping an uncountably infinite number of vectors (though from a space of finite dimension).

    What are the pigeonholes? (the vector directions? I don't really trust intuitions based on the arrow representations of vectors)

    ...and how do you assign the images of the operator to them?

    ---------

    I just thought of something. If we can prove that a complex polynomial must have complex roots, then we can apply it to the characteristic polynomial.

    ...and this is, I believe (for non-constant polynomials), The Fundamental Theorem of Algebra.

    http://en.wikipedia.org/wiki/Fundamental_theorem_of_algebra

    Because of the diagonal entries of |A-aI|, I believe the determinant will have to be a non-constant polynomial.
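
    A rough numpy sketch of that route, for a concrete matrix (the characteristic polynomial of an NxN matrix has degree N, so the fundamental theorem of algebra gives it a complex root, and those roots are the eigenvalues):

        import numpy as np

        A = np.array([[1.0, 2.0],
                      [3.0, 4.0]])

        coeffs = np.poly(A)            # coefficients of det(xI - A), highest degree first
        print(len(coeffs) - 1)         # degree = N = 2, so the polynomial is not constant
        print(np.roots(coeffs))        # its complex roots...
        print(np.linalg.eigvals(A))    # ...agree with the eigenvalues of A (up to ordering)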
     
    Last edited: Aug 19, 2009
  5. Aug 19, 2009 #4
    So then the next question becomes:

    For every Eigenvalue must there be a non-trivial eigenvector? If so, why?
     
  6. Aug 19, 2009 #5

    Reb

    User Avatar

    Re: For every Eigenvalue must there be a non-trivial eigenvector? If so, why?


    I think you cannot have one without the other. It is in the definition.
     
  7. Aug 19, 2009 #6
    Re: For every Eigenvalue must there be a non-trivial eigenvector? If so, why?

    I wonder why the science advisors and PF mentors have so far not replied to this thread. Too basic?

    I am actually not sure I answered my first question fully, since I did not rigorously prove that the characteristic polynomial is non-constant.

    Is it? I think it's slightly more subtle. In Ax = ax, x could be trivial, that is |0>, even if there is a non-trivial a such that |A-aI| = 0.

    The solution to (A - aI)x = |0> could still be just the trivial x = |0>, no?
     
  8. Aug 19, 2009 #7

    Ben Niehoff

    User Avatar
    Science Advisor
    Gold Member

    No, part of the definition is that Ax = ax holds for some nontrivial x.

    The characteristic polynomial cannot be constant. This is easy to prove by expanding the determinant |A-aI| by minors. Specifically, if A is an NxN matrix, then what is the coefficient of a^N?
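
    (Spelling out the hint: in the cofactor expansion, the only term that contains a^N is the product of the diagonal entries, so

    [tex]\det(A - aI) = \prod_{i=1}^{N}(A_{ii} - a) + (\text{terms of degree at most } N-2 \text{ in } a) = (-1)^N a^N + \cdots,[/tex]

    a polynomial of degree exactly N, hence never constant.)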

    What are we, chopped liver? And as far as I know, there is nobody being paid to sit at their desk and read everything that happens on PF so that they can help everybody who asks.
     
  9. Aug 19, 2009 #8
    http://golem.ph.utexas.edu/category/2007/05/linear_algebra_done_right.html

    The pigeonholes are the dimensions of the space. We've only got n dimensions. If we apply a transformation T over and over again to a point, we'll produce a linearly dependent set of vectors. I only have a rough idea of how the rest of the argument works, but I got that much of it. The article explains the rest.
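
    A sketch of how the standard determinant-free version of that argument finishes: pick any nonzero vector [itex]v[/itex]. The [itex]n+1[/itex] vectors [itex]v, Tv, T^2v, \dots, T^nv[/itex] live in an [itex]n[/itex]-dimensional space, so they are linearly dependent:

    [tex]a_0 v + a_1 Tv + \cdots + a_n T^n v = 0,[/tex]

    with the [itex]a_i[/itex] not all zero. The polynomial [itex]a_0 + a_1 x + \cdots + a_n x^n[/itex] is not constant (if only [itex]a_0[/itex] were nonzero, then [itex]a_0 v = 0[/itex] would force [itex]a_0 = 0[/itex]), so over the complex numbers it factors into linear factors, giving

    [tex]c\,(T - \lambda_1 I)(T - \lambda_2 I)\cdots(T - \lambda_m I)\,v = 0, \quad c \neq 0.[/tex]

    If every factor [itex]T - \lambda_j I[/itex] were injective, this product applied to the nonzero [itex]v[/itex] could not vanish. Hence some [itex]T - \lambda_j I[/itex] kills a nonzero vector, and that [itex]\lambda_j[/itex] is an eigenvalue of T.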
     
  10. Aug 19, 2009 #9
    Well then the proof by the fundamental theorem of algebra falls short of proving the existence of an eigenvalue, doesn't it?

    Good point!

    I was just curious, but I appreciate yours and the others' help.

    I'll try and decipher this after work.
     
  11. Aug 19, 2009 #10
    Are we talking in general, or just numerical matrices over the reals?
     
  12. Aug 19, 2009 #11
    General complex linear operators.

    ---
    Silly me. If |A-aI| = 0 then A - aI is singular, and therefore not of full rank, and therefore has a non-trivial null space, which means A has an eigenvector.
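
    A numerical sketch of exactly that chain of reasoning, using numpy's SVD to extract a null-space vector:

        import numpy as np

        A = np.array([[2.0, 1.0],
                      [0.0, 3.0]])

        a = np.roots(np.poly(A))[0]           # one root of the characteristic polynomial det(xI - A)
        M = A - a * np.eye(2)                 # |A - aI| = 0, so M is singular

        _, s, Vh = np.linalg.svd(M)
        v = Vh[-1]                            # right-singular vector for the (numerically) zero singular value
        print(s[-1])                          # ~0: M is rank-deficient, its null space is non-trivial
        print(np.allclose(A @ v, a * v))      # True: v is a non-trivial eigenvector for a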

    Forgive me, it has been 13 years since I took linear algebra, and 12 years since Complex Analysis.
     
    Last edited: Aug 19, 2009
  13. Aug 20, 2009 #12

    morphism

    User Avatar
    Science Advisor
    Homework Helper

    By the way, the question of whether every linear operator (on a finite-dimensional F-vector space) has an eigenvalue is actually equivalent to the question of whether every polynomial with coefficients in F has a root in F.

    I.e. if you can prove what the OP is asking for (over [itex]\mathbb{C}[/itex]) without using the fundamental theorem of algebra, then you can actually use your argument to prove the FTA!
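
    One direction of that equivalence can be made concrete with a companion matrix: the companion matrix of a monic polynomial has that polynomial as its characteristic polynomial, so an eigenvalue of the companion matrix is a root of the polynomial. A small sketch (the companion helper below is just for illustration):

        import numpy as np

        def companion(c):
            # companion matrix of the monic polynomial x^n + c[n-1] x^(n-1) + ... + c[1] x + c[0]
            n = len(c)
            C = np.zeros((n, n))
            C[1:, :-1] = np.eye(n - 1)       # ones on the subdiagonal
            C[:, -1] = -np.asarray(c)        # last column holds minus the coefficients
            return C

        # x^3 - 6x^2 + 11x - 6 = (x-1)(x-2)(x-3)
        C = companion([-6.0, 11.0, -6.0])
        print(np.linalg.eigvals(C))          # approximately 1, 2, 3 -- the roots of the polynomial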
     
    Last edited: Aug 20, 2009
  14. Aug 29, 2009 #13
    For a finite-dimensional vector space there is a classic theorem. It shows that your question boils down to whether a certain polynomial has roots in the underlying scalar field of the vector space. If the field is algebraically closed (e.g. the complex numbers), then there is always a root.

    If the field is not algebraically closed then there may not be eigenvalues. For instance rotation of the plane by 90 degrees has no real eigenvalues.
     
    Last edited by a moderator: May 4, 2017
  15. Aug 31, 2009 #14
    I started to think about this problem with the following quasi-informal statement

    "Every linear operator can be represented as a matrix..."

    which is, for my taste, a very, very beautiful theorem. By the way, I originally misstated it. The fields and the spaces are left out for a good reason: to focus on just the necessary part.
     
  16. Aug 31, 2009 #15

    HallsofIvy

    User Avatar
    Staff Emeritus
    Science Advisor

    Every linear operator on a finite-dimensional vector space V can be represented by a matrix, given a specific basis on that vector space. (The matrix depends upon the choice of basis.) Given that, you can write the "characteristic equation" for the matrix as a polynomial equation and then use the fundamental theorem of algebra: every such equation over the complex numbers has a solution.
     
  17. Sep 2, 2009 #16
    Rotation by 90 degrees in the plane has no real eigenvalues, only complex ones.
    Why do you think that every linear operator has at least one eigenvalue? Can you show me your argument?
     
    Last edited by a moderator: May 4, 2017
  18. Sep 2, 2009 #17
    In my opinion, the most revealing way to see this theorem for finite dimensional vector spaces is the following classical argument.

    If L: V -> V is a linear map of the n-dimensional vector space V, then the n+1 vectors L^n(v), L^(n-1)(v), ..., L(v), v are linearly dependent (there are more of them than the dimension), so some non-trivial linear combination of them is zero. This linear combination says that some polynomial in L sends v to zero. Consider the ideal generated by all such polynomials.

    Since the ring of polynomials over a field is a principal ideal domain, there is a polynomial that divides all of the others in this ideal. This polynomial must be zero on the entire vector space, since the above construction using the vector v can be performed on any vector.

    If this polynomial has a root a over F, then it factors as (L - a)P(L). The linear factor (L - a) must be zero on some non-zero vector, so a is an eigenvalue and that vector is an eigenvector.
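
    One way to fill in that last step (a sketch, using the polynomial constructed above): write it as [itex]p(x) = (x - a)q(x)[/itex]. Then [itex](L - a)q(L) = 0[/itex] on all of V, but [itex]q(L) \neq 0[/itex], since q has smaller degree and so is not in the ideal. Pick a vector w with [itex]u = q(L)w \neq 0[/itex]; then [itex](L - a)u = 0[/itex], so a is an eigenvalue with eigenvector u.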
     