
EigenValues & EigenVectors proofs

  1. Jan 7, 2008 #1
    Question 1:
    Prove that if λ is an eigenvalue of [A], then 1/λ is an eigenvalue of [A]^T.


    Question 2:
    Prove that a square matrix [A] and its transpose [A]^T have the same eigenvalues.


    Question 3:
    Show that |det(A)| is the product of the absolute values of the eigenvalues of
    [A]

    Question 4:
    Let A and B be nonsingular n×n matrices. Show that AB^{-1} and B^{-1}A have the same eigenvalues.
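
    A quick numerical sanity check of questions 2, 3 and 4, as a sketch assuming NumPy is available (random matrices of this kind are generically nonsingular):

[code]
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))   # generically nonsingular

def sorted_eigs(M):
    """Eigenvalues of M, sorted so two spectra can be compared entrywise."""
    return np.sort_complex(np.linalg.eigvals(M))

# Question 2: A and A^T have the same eigenvalues.
print(np.allclose(sorted_eigs(A), sorted_eigs(A.T)))                       # expect: True

# Question 3: |det(A)| equals the product of the absolute values of the eigenvalues.
print(np.isclose(abs(np.linalg.det(A)), np.prod(np.abs(sorted_eigs(A)))))  # expect: True

# Question 4: A B^{-1} and B^{-1} A have the same eigenvalues.
Binv = np.linalg.inv(B)
print(np.allclose(sorted_eigs(A @ Binv), sorted_eigs(Binv @ A)))           # expect: True
[/code]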
     
  3. Jan 7, 2008 #2
    I'll do the first one for you. I couldn't read exactly what you asked in the first one. Were you trying to prove that if the eigenvalues of [tex]A[/tex] are [tex]\lambda_i[/tex], then the eigenvalues of [tex]A^{-1}[/tex] are [tex]\frac{1}{\lambda_i}[/tex]? I'll prove that. By the definition of an inverse matrix, [tex]AA^{-1}=A^{-1}A=1[/tex]. Now, for every non singular matrix, [tex]\det A \ne 0[/tex], you can diagonalize it with its eigenvectors: [tex]A=PDP^{-1}[/tex], where

    [tex]
    D = \begin{pmatrix} \lambda_1 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \lambda_n \end{pmatrix}
    [/tex]


    From this it should be obvious that [tex]A^{-1}=PD^{-1}P^{-1}[/tex]. So [tex]A^{-1}A=PD^{-1}DP^{-1}=1[/tex] and [tex]AA^{-1}=PDD^{-1}P^{-1}=1[/tex]. So [tex]DD^{-1}=1[/tex] and [tex]D^{-1}D=1[/tex]. Multiply [tex]D[/tex] on the left and right side by

    [tex]
    B = \begin{pmatrix} \frac{1}{\lambda_1} & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \frac{1}{\lambda_n} \end{pmatrix}
    [/tex]

    and you will get [tex]1[/tex], so

    [tex]
    D^{-1} = \begin{pmatrix} \frac{1}{\lambda_1} & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \frac{1}{\lambda_n} \end{pmatrix}
    [/tex] and this proves that the eigenvalues of [tex]A^{-1}[/tex] are [tex]\frac{1}{\lambda_i}[/tex].
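
    A shorter route to the same conclusion, as a sketch that uses only the invertibility of [tex]A[/tex] (so no diagonalizability is needed): if [tex]Av = \lambda v[/tex] with [tex]v \ne 0[/tex], then

    [tex]v = A^{-1}(\lambda v) = \lambda A^{-1} v \quad\Rightarrow\quad A^{-1} v = \frac{1}{\lambda} v,[/tex]

    so [tex]v[/tex] is an eigenvector of [tex]A^{-1}[/tex] with eigenvalue [tex]1/\lambda[/tex] (and [tex]\lambda \ne 0[/tex] automatically, since [tex]A[/tex] is invertible).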
     
  4. Jan 7, 2008 #3

    Defennder

    Homework Helper

    For question 2 you'll need to use these properties:
    [tex](A-B)^T = A^T - B^T [/tex]
    [tex]\det A^T = \det A[/tex]
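
    For instance, combining the two (a sketch): since the transpose of [tex]\lambda I[/tex] is [tex]\lambda I[/tex],

    [tex]\det(A^T - \lambda I) = \det\left( (A - \lambda I)^T \right) = \det(A - \lambda I),[/tex]

    so [tex]A[/tex] and [tex]A^T[/tex] have the same characteristic polynomial, and hence the same eigenvalues.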

    Question 3 is interesting. Never seen anything like it. No idea how to do it at present.

    For question 4, you start with [tex]B^{-1}Ax = \lambda x[/tex].
    Then you multiply by [tex]A[/tex] on the left on both sides of the equation. What do you notice?
     
  5. Jan 7, 2008 #4

    HallsofIvy

    Staff Emeritus
    Science Advisor

    First, this is clearly homework and so I'm going to move it to the "Calculus and Beyond" homework section. Second, no work was shown at all, and Jim Kata should not have given the complete solution. Fortunately, the proof he gave was simply incorrect! In particular, it is NOT true that "every non singular matrix, A, you can diagonalize it with its eigenvectors". There exist many non-singular matrices that cannot be diagonalized.

    In fact, the first problem should include the condition "[itex]\lambda \ne 0[/itex]" to be true (that is implied in the use of [itex]1/\lambda[/itex], but it should have been said).
    There is a very simple proof that if [itex]Ax= \lambda x[/itex] (and [itex]\lambda \ne 0[/itex]) then [itex]A^T x= (1/\lambda) x[/itex] that does not use "diagonal matrices" etc. but just the fact that [itex]A^TA= I[/itex].
     
    Last edited: Jan 7, 2008
  6. Jan 7, 2008 #5
    Special thanks to Defennder and Jim Kata for your contributions. HallsofIvy, I am not asking for people to do my homework for me. The first 3 exercises are from some notes I downloaded from the internet; the fourth question is from the book "Introductory Linear Algebra with Applications" by Bernard Kolman, and it is not homework as you think.

    I did not post my solutions for the sake of keeping the post short, and as you can see from my post, I am struggling to format the mathematical notation.

    I would appreciate it if you could point me toward theorems or give clues that will lead to the solutions. In your two replies to my posts, it sounds like you are rebuking me, and that is not needed.
     
  7. Jan 7, 2008 #6

    malawi_glenn

    Science Advisor
    Homework Helper

    Jim Kata and Bertrandkis: read and follow the rules, it is simple, just as HallsofIvy said.
     
  8. Jan 7, 2008 #7

    CompuChip

    Science Advisor
    Homework Helper

    Is the fourth question even true?
    In any case it is true that
    [tex] B^{-1} A v = A B^{-1} v + [B^{-1}, A] v = \lambda v + [B^{-1}, A] v[/tex]
    if all the matrix products exist, where [A, B] = AB - BA denotes the commutator. So at first glance I'd only expect the statement to be true if A and B^{-1} commute, or if any eigenvector of B^{-1}A is also an eigenvector of the commutator.
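
    Numerically, though, the two products seem to share their eigenvalues even when A and B^{-1} do not commute; a quick check, as a sketch assuming NumPy:

[code]
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))   # generically nonsingular
Binv = np.linalg.inv(B)

# A and B^{-1} do not commute for this random pair ...
print(np.allclose(A @ Binv, Binv @ A))        # expect: False

# ... yet A B^{-1} and B^{-1} A have the same (sorted) eigenvalues.
print(np.allclose(np.sort_complex(np.linalg.eigvals(A @ Binv)),
                  np.sort_complex(np.linalg.eigvals(Binv @ A))))  # expect: True
[/code]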
     
  9. Jan 7, 2008 #8

    Defennder

    Homework Helper

    What's a commutator? Doesn't [tex]AB^{-1}(Ax) = \lambda(Ax) [/tex] show they have the same eigenvalues?
     
  10. Jan 8, 2008 #9
    In doing that, aren't you assuming [tex]{\mathbf{A}} \in O\left( n \right)[/tex]? I didn't see any mention of that in the question; I think it's just saying [tex]{\mathbf{A}}\in GL\left( n \right)[/tex]. I guess I was wrong in assuming that all nonsingular matrices are diagonalizable by their eigenvectors, but since the question explicitly mentions eigenvalues, I think it's fair to assume that in this case the matrix is diagonalizable by its eigenvectors.
     
  11. Jan 8, 2008 #10

    morphism

    Science Advisor
    Homework Helper

    For 3, it's not too hard to prove that |det(A)| is the absolute value of the constant term of the characteristic polynomial of A. Why does this help?

    For 4, we can note that [tex]AB^{-1} - kI = BB^{-1}AB^{-1} - kBB^{-1} = B(B^{-1}A - kI)B^{-1}[/tex]. Using this, one can prove that [tex]AB^{-1}[/tex] and [tex]B^{-1}A[/tex] have the same characteristic polynomial.
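
    Taking determinants of that identity and using [tex]\det(MN) = \det(M)\det(N)[/tex] gives, as a sketch,

    [tex]\det(AB^{-1} - kI) = \det(B)\,\det(B^{-1}A - kI)\,\det(B^{-1}) = \det(B^{-1}A - kI),[/tex]

    since [tex]\det(B)\det(B^{-1}) = \det(I) = 1[/tex]; the two products therefore have the same characteristic polynomial, and hence the same eigenvalues.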
     
  12. Jan 8, 2008 #11

    HallsofIvy

    Staff Emeritus
    Science Advisor

    True, I misread the problem and then confused [itex]A^T[/itex] with [itex]A^{-1}[/itex]!

    As for your last statement, "I think it's fair to assume that in this case the matrix is diagonalizable by its eigenvectors", no, it's not. Just having eigenvectors is not enough. In order to be diagonalizable, a matrix must have a complete set of eigenvectors, that is, a basis for the space consisting entirely of eigenvectors. A diagonal matrix has its diagonal entries as eigenvalues, each corresponding to the basis vectors (1, 0, 0, ...), (0, 1, 0, ...), etc. If a matrix has a repeated eigenvalue, it may or may not have two (or more) independent eigenvectors corresponding to that eigenvalue.

    A simple example is
    [tex]\left[\begin{array}{cc} 1 & 1 \\ 0 & 1 \end{array}\right][/tex]

    It has a "double" eigenvalue, 1, corresponding to the eigenvector (1, 0) (and multiples of it). Since there is no other independent eigenvector, it is not diagonalizable. Obviously, if we could write it as a diagonal matrix, that diagonal matrix would have to be
    [tex]\left[\begin{array}{cc} 1 & 0 \\ 0 & 1 \end{array}\right][/tex]
    but the only matrix similar to the identity is the identity itself, and the matrix above is not the identity: it differs from it by
    [tex]\left[\begin{array}{cc} 0 & 1 \\ 0 & 0 \end{array}\right][/tex]
    which is not the zero matrix.
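
    A quick numerical way to see the deficiency, as a sketch assuming NumPy: the eigenspace for the eigenvalue 1 is the null space of that difference A - I, so its dimension is 2 minus the rank of A - I.

[code]
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# rank(A - I) = 1, so the eigenspace for the eigenvalue 1 is only 1-dimensional:
# there is no basis of eigenvectors, hence A is not diagonalizable.
print(np.linalg.matrix_rank(A - np.eye(2)))   # expect: 1
[/code]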
     
  13. Jan 9, 2008 #12
    Morphism, your explanation for question 4, [tex]AB^{-1} - kI = BB^{-1}AB^{-1} - kBB^{-1} = B(B^{-1}A - kI)B^{-1}[/tex], has confused me.
    As far as I know, to prove that AB^{-1} and B^{-1}A have the same eigenvalues, one has to show that they have the same characteristic polynomial, viz.
    [tex]\det(\lambda I - AB^{-1}) = \det(\lambda I - B^{-1}A)[/tex]. This is where I get stuck.

    I tried to follow the suggestion by Defennder, starting from [tex]B^{-1}Ax = \lambda x[/tex]
    and multiplying by A on the left on both sides of the equation:
    [tex]AB^{-1}Ax = A\lambda x[/tex], then [tex]AB^{-1}Ax = \lambda Ax[/tex].
    This yields [tex](AB^{-1})Ax = \lambda Ax[/tex], and it can be concluded that
    [tex]AB^{-1} = \lambda[/tex]. This is not the desired conclusion.

    Is there anyone out there who can give a black and white proof of this?
     
  14. Jan 9, 2008 #13

    Defennder

    Homework Helper

    You've got it right up to [tex](AB^{-1})Ax = \lambda Ax[/tex].

    But the conclusion [tex]AB^{-1} = \lambda[/tex] does not follow. [tex]AB^{-1}[/tex] is an n×n matrix, whereas [tex]\lambda[/tex] is a constant, an eigenvalue. In general, you cannot simply cancel Ax on both sides of a matrix equation; this is matrix algebra, not scalar algebra. Furthermore, note that your equation [tex]AB^{-1}Ax = \lambda Ax[/tex] shows that [tex]AB^{-1}[/tex] has [tex]\lambda[/tex] as an eigenvalue, but with [tex]Ax[/tex], rather than [tex]x[/tex], as an associated eigenvector.
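
    Written out, as a sketch (using that [tex]A[/tex] is nonsingular, so [tex]Ax \ne 0[/tex] whenever [tex]x \ne 0[/tex]):

    [tex]B^{-1}Ax = \lambda x \;\Longrightarrow\; (AB^{-1})(Ax) = \lambda (Ax),[/tex]

    so every eigenvalue of [tex]B^{-1}A[/tex] is also an eigenvalue of [tex]AB^{-1}[/tex], with [tex]Ax[/tex] as an eigenvector; running the same argument with the two products swapped (multiply [tex]AB^{-1}y = \lambda y[/tex] on the left by [tex]B^{-1}[/tex]) gives the reverse inclusion.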
     
  15. Jan 9, 2008 #14
    Thanks Defennder, now I am with you. Thank you for your help.
     
  16. Jan 9, 2008 #15

    morphism

    Science Advisor
    Homework Helper

    Use the fact that det(AB)=det(A)det(B).
     
  17. Jan 9, 2008 #16

    Defennder

    Homework Helper

    How do you start to prove that? Evaluating determinants by cofactor expansion for n×n matrices becomes very messy.
     
  18. Jan 9, 2008 #17

    Dick

    Science Advisor
    Homework Helper

    The characteristic polynomial is det(A-xI). Put x=0. You don't have to cofactor anything.
     
  19. Jan 9, 2008 #18

    Defennder

    Homework Helper

    Isn't the characteristic polynomial [tex]\det(\lambda I - A)[/tex]? And how is it possible to put x = 0 (I'm assuming you mean x as a constant, not a column vector) if 0 is not an eigenvalue of A, which holds if A is invertible?
     
  20. Jan 10, 2008 #19

    HallsofIvy

    Staff Emeritus
    Science Advisor

    Are you serious? Dick wrote det(A - xI) and you wrote det([itex]\lambda[/itex]I - A); with x = [itex]\lambda[/itex] those differ only by a factor of (-1)^n, so they have exactly the same roots. The characteristic polynomial for any matrix is det(A - xI), which clearly has constant term det(A). The eigenvalues, [itex]\lambda_1[/itex], [itex]\lambda_2[/itex], etc., are the solutions to the equation det(A - xI) = 0.
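
    To connect this with question 3, as a sketch over the complex numbers: counting multiplicities, the characteristic polynomial factors as

    [tex]\det(A - xI) = (\lambda_1 - x)(\lambda_2 - x)\cdots(\lambda_n - x),[/tex]

    so putting x = 0 gives [itex]\det(A) = \lambda_1 \lambda_2 \cdots \lambda_n[/itex], and taking absolute values, [itex]|\det(A)| = |\lambda_1| |\lambda_2| \cdots |\lambda_n|[/itex].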
     
  21. Jan 10, 2008 #20

    Defennder

    Homework Helper

    Well, actually that's not what I meant, but anyway I think I get morphism's point now. I just can't quite figure out why the absolute value of the constant term of the characteristic polynomial of A equals the product of the absolute values of the eigenvalues of A.
     