
Small proof

  1. Jun 17, 2009 #1
    1. The problem statement, all variables and given/known data

    Show that A and A^T share the same eigenvalues.

    2. Relevant equations



    3. The attempt at a solution
    Let v be an eigenvector of A with eigenvalue c.
    Av = Icv
    Since A^T v = I^T cv
    and I^T = I,
    A^T v = Icv,
    so A^T v = Icv = Av,
    so A and A^T must have the same eigenvalue.
     
    Last edited: Jun 17, 2009
  3. Jun 17, 2009 #2

    Cyosis

    Homework Helper

    How did you go from the first to the second step?
     
  4. Jun 17, 2009 #3
    If Av = Icv, then the transpose of Icv is Icv, since the transpose of I is I.
    Now (Icv)^T = (Av)^T since Icv = Av.
    Since neither c nor v is a matrix, they get pushed aside to
    give I^T cv = A^T v.
     
  5. Jun 17, 2009 #4

    Cyosis

    Homework Helper

    Yes, but a column vector is an m×1 matrix. When you transpose an m×1 matrix you get a 1×m matrix, called a row vector. My point is that v and v transposed aren't exactly the same.
     
  6. Jun 17, 2009 #5
    Oh okay. So (Av)^T = (Ivc)^T = (Ic)^T (v)^T = c(I)^T (v)^T = cI(v)^T, which has the same eigenvalue
    as A, so A^T and A both have the same eigenvalue.
     
  7. Jun 17, 2009 #6

    Cyosis

    Homework Helper

    I don't quite see how you drew your conclusion. [itex](Av)^T=v^T A^T[/itex]. Just try to compute [itex]I v^T[/itex] with a small vector; you will notice the dimensions are incompatible.
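
    For instance, take a small example vector, say [itex]v = (1,2)^T[/itex]. Then

    [tex]
    v^T = \left( \begin{matrix} 1 & 2 \end{matrix} \right), \qquad I = \left( \begin{matrix} 1&0 \\ 0&1 \end{matrix} \right).
    [/tex]

    Here v^T is a 1x2 matrix and I is 2x2, so the product [itex]I v^T[/itex] isn't even defined: the number of columns of I (two) doesn't match the number of rows of v^T (one).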
     
  8. Jun 17, 2009 #7
    OK, how about this: don't factor out v^T; keep it this way:
    since Av = cIv, (Av)^T = (cIv)^T, and
    factor out c to get c(Iv)^T, which can be transposed.
    Wait, never mind... I'm fixing this.
     
  9. Jun 17, 2009 #8

    Cyosis

    Homework Helper

    Sure, but how do you conclude from that that A^T has eigenvalue c?
     
  10. Jun 17, 2009 #9
    Oh, that's right. Let me reprove this. Let c be at position (j,j) in A. Transposing
    the matrix lets c remain at (j,j). So multiplying this by
    the eigenvector v gives the column matrix Icv. This gives
    the same result as not transposing A in the first place, since c will
    still remain at (j,j) no matter what, and as a result, it would wind up at position (1,j) on the
    column matrix. Now, since Av = A^T v,
    c must be in both matrices A and A^T.
    Take note that we know that eigenvalues are found on the diagonals of
    invertible matrices, which is why (j,j) was chosen.
     
    Last edited: Jun 17, 2009
  11. Jun 17, 2009 #10

    Cyosis

    Homework Helper

    The matrix

    [tex]
    \left( \begin{matrix} 1&1 \\ 1&1 \end{matrix} \right)
    [/tex]

    has eigenvalues 2 and 0, neither of which appears on the diagonal.
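
    For instance, you can check this directly, without determinants:

    [tex]
    \left( \begin{matrix} 1&1 \\ 1&1 \end{matrix} \right)\left( \begin{matrix} 1 \\ 1 \end{matrix} \right)=2\left( \begin{matrix} 1 \\ 1 \end{matrix} \right), \qquad \left( \begin{matrix} 1&1 \\ 1&1 \end{matrix} \right)\left( \begin{matrix} 1 \\ -1 \end{matrix} \right)=0\left( \begin{matrix} 1 \\ -1 \end{matrix} \right).
    [/tex]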

    I would personally use the inner product to prove this.
     
    Last edited: Jun 17, 2009
  12. Jun 17, 2009 #11

    Dick

    Science Advisor
    Homework Helper

    You could also think about using det(A)=det(A^T).
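
    For instance, a rough sketch along those lines (using the fact that c is an eigenvalue of a matrix M exactly when det(M - cI) = 0):

    [tex]
    \det(A^T - cI) = \det\left( (A - cI)^T \right) = \det(A - cI),
    [/tex]

    so A and A^T have the same characteristic polynomial, and hence the same eigenvalues.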
     
  13. Jun 17, 2009 #12
    Oh, one thing: I am using Axler's book as my main text, and he covers eigenvectors
    before inner products, and everything before determinants, so I can't apply those,
    though I do appreciate your suggestions. But given Cyosis's matrix, it makes sense that
    matrices whose entries all equal each other can also have eigenvalues besides 0; I messed
    up there because I didn't consider that.
    So I'll split my proof into three cases. Case 1: A is an invertible matrix. Case 2: A is a matrix
    where all entries are the same. Case 3: A is noninvertible and not all of its entries are the
    same. Is this a good idea? I would use determinants or inner products if I knew what they were. If there is no other way, I won't do this proof until
    later on, when I know more about them. But thank you for the suggestions.

    Take note that when I feel I need to practice a bit more I use other sources, but they vary in topic sequence.
     
    Last edited: Jun 17, 2009
  14. Jun 17, 2009 #13

    Cyosis

    Homework Helper

    I find it hard to believe you are studying eigenvectors/eigenvalues yet have not been exposed to determinants. This would mean you are not capable of calculating eigenvalues of matrices yet, but your posts suggest otherwise. Could you show me perhaps how to calculate the eigenvalues of the matrix I posted earlier?
     
  15. Jun 17, 2009 #14

    Dick

    Science Advisor
    Homework Helper

    You could also work around not having determinants yet. If c is an eigenvalue, then X = A - cI is a singular matrix. Can you prove X is singular iff X^T is singular? How would that prove what you want to prove?
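
    For instance, granting that lemma (X is singular iff X^T is), the argument would run roughly:

    [tex]
    c \mbox{ is an eigenvalue of } A \Leftrightarrow A - cI \mbox{ singular} \Leftrightarrow (A - cI)^T = A^T - cI \mbox{ singular} \Leftrightarrow c \mbox{ is an eigenvalue of } A^T.
    [/tex]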
     
  16. Jun 17, 2009 #15
    I can see three justifications for X being singular. First, if A = cI, then X is the zero
    matrix, 0 is an eigenvalue for every vector, and X is singular for being
    the zero matrix.

    Second, if A - cI is nonzero but singular, then as long as the number of rows of
    the column matrix (representing a vector) equals the number of columns
    in A and cI, Av would equal cIv, so Av - cIv = 0.

    Now, if A - cI were nonsingular, then (A - cI)v would not be zero,
    so we would have Av =/= cIv, so in this case v is not an eigenvector of A.

    I know this isn't the actual proof; I just want to see if I understand the
    significance of X being singular.
     
    Last edited: Jun 17, 2009
  17. Jun 17, 2009 #16

    Dick

    Science Advisor
    Homework Helper

    Singular doesn't mean A - cI equals 0. It means there is a nonzero vector v such that (A - cI)v = 0, which is what your second and third arguments correctly say. You are overcomplicating this already. The question you should be thinking about, instead of trying to find multiple ways to prove the obvious, is: why, if X = A - cI is singular, is X^T also singular?
     
    Last edited: Jun 17, 2009
  18. Jun 17, 2009 #17
    X = A - cI is a jxn matrix with j =/= n.

    Consider the column with elements a_{j,1} to a_{j,n}.
    If j > n then (at least) this column (let's call it column A) becomes a zero column after row reducing,
    which results in a matrix with fewer columns than X, so X is singular.
    Now transposing causes column A to become a row,
    with column A being the one at the bottom. Row reducing X^T
    gives a matrix with at least one fewer row (column A becomes a 0 row) than X^T, so X^T is singular.
    This is a result of A being one of the extra columns (or the only one), so it becomes one of the
    extra rows (or the only one) that get "eliminated" after row reducing.
     
    Last edited: Jun 17, 2009
  19. Jun 17, 2009 #18

    Dick

    Science Advisor
    Homework Helper

    You are thinking in the right direction. You proved row rank=column rank, right? So in the nxn case if rank is <n then both matrices are singular. You really don't have to think about any other case. If the matrix isn't square then the concept of an 'eigenvector' doesn't exist. Why not?
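
    For instance, using row rank = column rank: if X = A - cI is an nxn matrix, then

    [tex]
    \mathrm{rank}(A^T - cI) = \mathrm{rank}\left( (A - cI)^T \right) = \mathrm{rank}(A - cI),
    [/tex]

    so one of them has rank less than n exactly when the other does, i.e. A^T - cI is singular exactly when A - cI is.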
     
  20. Jun 17, 2009 #19

    because I doesn't exist.
     
  21. Jun 17, 2009 #20
    :biggrin: Thank you!
     