Linear Algebra: Diagonalization, Transpose, and Distinct Eigenvectors.

  1. 1. The problem statement, all variables and given/known data

    Show that if an n×n matrix A has n linearly independent eigenvectors, then so does A^T.

    3. The attempt at a solution

    Well, I understand the following:

    (1) A is diagonalizable.

    (2) A = PDP^-1, where the columns of P are those n independent eigenvectors and D is the diagonal matrix of the corresponding eigenvalues.

    (3) P is invertible, so it has linearly independent columns. (A itself need not be invertible, since 0 can be an eigenvalue.)

    (4) A^T has the same eigenvalues as A.


    So, based mainly on (4), I can prove the "easy" case, in which there are n distinct eigenvalues.

    I.e.: if A has n distinct eigenvalues, then A^T has those same n distinct eigenvalues. Since eigenvectors belonging to distinct eigenvalues are linearly independent, lambda_1 through lambda_n give n linearly independent eigenvectors v_1 through v_n for A, and likewise n linearly independent eigenvectors for A^T.

    In this case, the eigenvectors of A^T could be the same as those of A (for instance if A = A^T), but don't have to be.
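    (If it helps to see this easy case with numbers, here is a minimal numpy sketch; the 2x2 matrix is just a made-up example. The eigenvalues of A and A^T come out the same, but the eigenvectors differ.)

```python
import numpy as np

# Made-up example with distinct eigenvalues (2 and 3); A is not symmetric.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

vals_A,  vecs_A  = np.linalg.eig(A)
vals_AT, vecs_AT = np.linalg.eig(A.T)

print(np.sort(vals_A), np.sort(vals_AT))  # same eigenvalues, up to ordering
print(vecs_A)                             # eigenvectors of A ...
print(vecs_AT)                            # ... differ from those of A^T
```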


    My problem!

    What if the eigenvalues are not distinct, i.e. there is some lambda_i with multiplicity k > 1?

    I understand that in the case of A, the system (A - lambda_i*I_n)x = 0 must have k free variables in order to give the (stated) n linearly independent eigenvectors.

    So, how can I prove that a repeated eigenvalue in the A^T case ALSO gives (A^T - lambda_i*I_n)x = 0 with k free variables, and thus that A^T has n linearly independent eigenvectors?
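    (For what it's worth, here is a quick numpy experiment for the repeated-eigenvalue case; the matrix is an arbitrary made-up example, built as PDP^-1 with the eigenvalue 2 repeated. It just counts free variables for A and for A^T, using free variables = n - rank.)

```python
import numpy as np

# Made-up example: diagonalizable A with eigenvalue 2 of multiplicity k = 2.
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])            # any invertible P would do
D = np.diag([2.0, 2.0, 5.0])
A = P @ D @ np.linalg.inv(P)

lam = 2.0
n = A.shape[0]
free_A  = n - np.linalg.matrix_rank(A   - lam * np.eye(n))
free_AT = n - np.linalg.matrix_rank(A.T - lam * np.eye(n))

print(free_A, free_AT)                     # both print 2 (= k free variables)
```

    (The two counts agree because a matrix and its transpose always have the same rank, though the diagonalization route discussed below gets there without needing that fact.)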
     
  3. Hurkyl


    Using just the equations you've written, and the elementary algebraic properties of the transpose, I bet you can diagonalize A^T, and thus get an algebraic formula for its eigenvectors.
     
  4. I still can't piece the last bit together. I realize that A^T has linearly independent columns, and rows that aren't scalar multiples of each other. I also see that the transpose has the same diagonal entries as the original. However, the systems I'm setting up with the transpose to find eigenvectors aren't the same at all. In cases where I try with numbers, they still work, which is pretty cool, but I can't seem to guarantee k free variables in (A^T - lambda*I_n).
     
  5. I think I got it.

    A = PDP^(-1)
    A^T = (P^-1)^T * D^T * P^T
    D^T = D
    (P^-1)^T = (P^T)^-1

    A^T = (P^T)^-1 * D P^T

    Since P is invertible, so are P^T and (P^T)^-1, and any invertible matrix has linearly independent columns.

    So let M = (P^T)^-1 = (P^-1)^T, which is invertible because P is.

    A^T = MDM^-1

    Therefore A^T is diagonalizable.

    Therefore A^T has n linearly independent eigenvectors (the columns of M, i.e. the rows of P^-1).
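    (A quick numerical check of this, with a made-up example matrix: diagonalize A, form M = (P^-1)^T, and confirm that A^T * M = M * D, i.e. the columns of M really are eigenvectors of A^T.)

```python
import numpy as np

# Made-up example; any diagonalizable A works.
A = np.array([[4.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 2.0]])

eigvals, Pmat = np.linalg.eig(A)       # columns of Pmat: eigenvectors of A
D = np.diag(eigvals)
M = np.linalg.inv(Pmat).T              # M = (P^-1)^T = (P^T)^-1

print(np.allclose(A.T @ M, M @ D))     # True: columns of M are eigenvectors of A^T
```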
     