
Eigenvectors and their inverses

  1. Jan 15, 2008 #1
    [SOLVED] Eigenvectors and their inverses

    1. The problem statement, all variables and given/known data
    After submitting the first question, I thought of a new one - so there are two questions:

    1) I have an n x n matrix A and it has n (not necessarily distinct) eigenvalues. I can write the matrix A as the product

    [tex]A = SDS^{-1}[/tex]

    where D is the diagonal matrix which has the eigenvalues as its entries, S contains the eigenvectors (in the same order as the eigenvalues are written in D) and S^-1 is the inverse of S.

    In some places I see it written as S^(-1)*D*S, and in others as S*D*S^(-1). Is it always the matrix to the left of D that contains the eigenvectors?

    2) If I have a matrix A that represents a transformation L from R^4 -> R given by [1 -1 3 0], then how can I determine from A whether L is linear?

    Thanks in advance,

    sincerely Niles.
  3. Jan 15, 2008 #2
    Comment on 1)
    If A is nxn and it does not have n distinct eigenvalues (as you are saying) then you cannot diagonalize it. This is a theorem.

    For instance, if you have a matrix (A B; C D) and it has eigenvalues E, F, then you can diagonalize it (assuming you find the eigenvectors). The diagonalized matrix would be of the form (E 0; 0 F).

    For 2)
    Use the definition of a linear transformation: it must respect addition and scalar multiplication, i.e. L(u + v) = L(u) + L(v) and L(cu) = cL(u). If both hold, it is a linear transformation; a quick numerical check is sketched below.
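
    A quick numerical sanity check of those two properties, for the matrix in the question (a numpy sketch; the test vectors u, v and the scalar c are arbitrary values made up for this check):

[code]
import numpy as np

# The 1x4 matrix representing L: R^4 -> R
A = np.array([[1.0, -1.0, 3.0, 0.0]])

# Arbitrary test vectors and scalar (assumed values, just for this check)
u = np.array([1.0, 2.0, -1.0, 4.0])
v = np.array([0.5, -3.0, 2.0, 1.0])
c = 2.5

# Additivity: L(u + v) = L(u) + L(v)
print(np.allclose(A @ (u + v), A @ u + A @ v))  # True

# Homogeneity: L(c*u) = c*L(u)
print(np.allclose(A @ (c * u), c * (A @ u)))  # True
[/code]

    Of course this only tests particular u, v, c; the general statement follows from the distributivity of matrix multiplication, which holds for arbitrary vectors.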


  4. Jan 15, 2008 #3



    I was a bit puzzled by "eigenvectors and their inverses"!

    S contains the eigenvectors corresponding to the eigenvalues of D, in that order.
    Also, those eigenvectors can be chosen so that S is orthonormal.

    Yes, it is the left matrix, S, in SDS^-1 = A, that contains the eigenvectors of A as columns. S^-1 then contains eigenvectors of A as rows- though not necessarily the same eigenvectors. Of course, if the eigenvectors are chosen so that S is orthonormal, then S = S^-1.
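
    For what it's worth, numpy's eig follows exactly this convention: the eigenvectors come out as the columns of S, and A is recovered with S to the left of D (a minimal sketch; the example matrix is made up):

[code]
import numpy as np

# An arbitrary diagonalizable matrix, chosen just for illustration
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, S = np.linalg.eig(A)  # columns of S are the eigenvectors of A
D = np.diag(eigvals)           # eigenvalues on the diagonal, same order

# The eigenvector matrix sits to the LEFT of D:
print(np.allclose(S @ D @ np.linalg.inv(S), A))  # True
[/code]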

    ?? Matrix multiplication is always linear, so A automatically determines a linear transformation. A transformation is linear if and only if it can be represented by a matrix.

  5. Jan 15, 2008 #4
    First, thanks to both of you.

    Mr. HallsofIvy:

    "Also, those eigenvectors can be chosen so that S is orthonormal." - that is only if A is symmetric, and by using Gram-Schmidt on the eigenvectors belonging to distinct eigenvalues, right? I believe that's what you told me in a previous post.

    In S^(-1)*D*S = A, is it the right matrix, S, that contains the eigenvectors as columns?
  6. Jan 15, 2008 #5



    No, there is no such theorem. An nxn matrix is diagonalizable if and only if there exists a "complete set" of eigenvectors- n independent eigenvectors. It does not matter whether the eigenvalues are distinct- as long as there are enough independent eigenvectors corresponding to each multiple eigenvalue. Then we can choose those eigenvectors as a basis for the space and the matrix, in that basis, is diagonal. For example, the matrix
    [tex]\left[\begin{array}{ccc} 1 & 0 & 0 \\ 0 & 1 & 0\\ 0 & 0 & 1\end{array}\right][/tex]
    is trivially "diagonalizable" but has only 1 as eigenvalue. Of course, the independent vectors <1, 0, 0>, <0, 1, 0> and <0, 0, 1> are all eigenvectors corresponding to eigenvalue 1.
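
    To see the contrast numerically, compare the identity (triple eigenvalue 1, three independent eigenvectors) with a Jordan block (double eigenvalue 1, but only one independent eigenvector). A sketch, with both matrices chosen just for illustration:

[code]
import numpy as np

# Repeated eigenvalue, but a full set of independent eigenvectors:
_, S = np.linalg.eig(np.eye(3))
print(np.linalg.matrix_rank(S))  # 3 -> diagonalizable

# Repeated eigenvalue with only ONE independent eigenvector (Jordan block);
# eig returns two numerically almost-parallel eigenvectors here:
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
_, S = np.linalg.eig(J)
print(np.linalg.matrix_rank(S))  # 1 -> not diagonalizable
[/code]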

    But the question was about whether the matrix, S, "containing" the eigenvectors (I assume that means "having the eigenvectors as columns") should be on the left or right: is it A = SDS^-1 or A = S^-1DS- and you don't answer that.

    And, of course, a matrix multiplication always does satisfy those.

  7. Jan 15, 2008 #6

    D H


    If S is orthonormal then S^-1 = S^T, not S.
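
    For a symmetric A this is easy to verify numerically, e.g. with numpy's eigh, which returns orthonormal eigenvectors for symmetric matrices (a sketch; the example matrix is made up):

[code]
import numpy as np

# A symmetric matrix, chosen just for illustration
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, S = np.linalg.eigh(A)  # eigh: for symmetric/Hermitian matrices

# S is orthogonal, so its inverse is its transpose:
print(np.allclose(S.T @ S, np.eye(2)))             # True
print(np.allclose(S @ np.diag(eigvals) @ S.T, A))  # True
[/code]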
  8. Jan 15, 2008 #7



    Yes, A must be symmetric. I am so used to working with symmetric matrices (the easy situation!) that I didn't think. Sorry.

    No, I thought I said that it had to be the left matrix. Of course, a more common problem is, given A, find D. In that case, multiplying SDS^-1 = A on the left by S^-1 and on the right by S, we get D = S^-1AS. Going that way, S is "on the right" and S^-1 is on the left.
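
    In code, going that way looks like this (a sketch, reusing a made-up matrix):

[code]
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, S = np.linalg.eig(A)

# Diagonalize: S "on the right" of A, S^-1 on the left
D = np.linalg.inv(S) @ A @ S
print(np.allclose(D, np.diag(eigvals)))  # True
[/code]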
  9. Jan 15, 2008 #8



    Yes, I meant to say "orthogonal" or "Hermitian".