
Eigenvectors and orthogonal basis

  1. Feb 19, 2017 #1
    1. The problem statement, all variables and given/known data
    I have a linear transformation ##\mathbb{R}^3 \rightarrow \mathbb{R}^3##. I've already solved the part that asks for a basis of eigenvectors; the eigenvectors I found are ##(1,-3,0), (1,0,3), (\frac{1}{2}, \frac{1}{2},1)##. Now the exercise wants me to show that there is no orthogonal basis of eigenvectors for this particular linear transformation.

    How do I show it?

    The exercise doesn't ask this, but what's the implication of eigenvectors forming an orthogonal basis?

    2. Relevant equations


    3. The attempt at a solution

    With the Euclidean inner product I can clearly see that the eigenvectors are not orthogonal to each other, but I'm not sure whether computing all the pairwise dot products is the right way to show it.
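    Here is the quick numerical version of that check, in case it helps (a minimal NumPy sketch using the three eigenvectors listed above; all three pairwise dot products would have to vanish for an orthogonal basis):

    ```python
    import numpy as np

    # The three eigenvectors found in the earlier part of the exercise.
    v1 = np.array([1.0, -3.0, 0.0])
    v2 = np.array([1.0, 0.0, 3.0])
    v3 = np.array([0.5, 0.5, 1.0])

    # All three pairwise dot products must be zero for the set
    # to be an orthogonal basis; none of them are.
    print(np.dot(v1, v2))  # 1.0
    print(np.dot(v1, v3))  # -1.0
    print(np.dot(v2, v3))  # 3.5
    ```

    Note that if the three eigenvalues are distinct, each eigenspace is one-dimensional, so every eigenvector is a scalar multiple of one of these three; rescaling doesn't change whether a dot product is zero, so in that case the three pairwise checks really do settle the question.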

    I thought about Gram-Schmidt, but applying it would leave me with vectors that are no longer eigenvectors.
     
    Last edited: Feb 19, 2017
  2. Feb 19, 2017 #2

    PeroK

    Science Advisor, Homework Helper, Gold Member

    If you have got three eigenvectors, then either they are mutually orthogonal or they are not.
     
  3. Feb 19, 2017 #3

    StoneTemplePython

    Gold Member

    I suppose you could do a Schur decomposition and show that your matrix factors as ##\mathbf{A} = \mathbf{UTU^*}##, where ##\mathbf{T}## has the eigenvalues along its diagonal but is only upper triangular (i.e. it is not both upper and lower triangular, hence not diagonal). Out of curiosity, what are the eigenvalues here?
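    In case it's useful, here is what that check looks like numerically. This is a sketch only: the matrix ##\mathbf{A}## below is a hypothetical stand-in, since the actual matrix wasn't posted in the thread.

    ```python
    import numpy as np
    from scipy.linalg import schur

    # Hypothetical stand-in for the matrix of the transformation
    # (the actual matrix wasn't posted in the thread).
    A = np.array([[2.0, 1.0, 0.0],
                  [0.0, 3.0, 1.0],
                  [0.0, 0.0, 4.0]])

    # Real Schur decomposition: A = U T U^T with U orthogonal and
    # T upper triangular, eigenvalues on the diagonal of T.
    T, U = schur(A)

    # For real eigenvalues, A has an orthonormal basis of eigenvectors
    # iff T is diagonal; a nonzero strictly-upper part rules it out.
    strictly_upper = np.triu(T, k=1)
    print("eigenvalues:", np.diag(T))
    print("max |strictly upper entry|:", np.abs(strictly_upper).max())
    ```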

    More importantly:

    Orthogonal (and unitary, if complex) matrices are extremely pleasant to work with. From a geometric standpoint, they are length preserving. Consider some real-valued vector ##\mathbf{x}## and an orthogonal matrix ##\mathbf{U}##: ##||\mathbf{Ux}||_2^2 = \mathbf{x^TU^TUx} = \mathbf{x^TIx} = \mathbf{x^Tx} = ||\mathbf{x}||_2^2##. Having mutually orthonormal eigenvectors is immensely useful in manipulating things like quadratic forms -- e.g. maximizing or minimizing ##\mathbf{x^T A x}##, which comes up all the time (e.g. the Hessian matrix of second derivatives in calculus).
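    A quick numerical illustration of the length preservation (a sketch; the orthogonal ##\mathbf{U}## here is just a random one obtained from a QR factorization):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # A QR factorization of a generic square matrix yields an
    # orthogonal factor Q, which we use as U here.
    U, _ = np.linalg.qr(rng.standard_normal((3, 3)))

    x = rng.standard_normal(3)

    # U^T U = I, so ||Ux||_2 = ||x||_2.
    print(np.allclose(U.T @ U, np.eye(3)))           # True
    print(np.linalg.norm(x), np.linalg.norm(U @ x))  # equal norms
    ```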

    On top of all this, orthogonal matrices are very nice for preserving numerical stability if you're doing serious computational work.
     