
Using diagonalization, prove the matrix equals its square

  1. Nov 26, 2011 #1
    1. The problem statement, all variables and given/known data
    Suppose that A is a 2x2 matrix with eigenvalues 0 and 1. Using diagonalization, show that [itex]A^{2}=A[/itex]


    3. The attempt at a solution
    Let [tex]A=\begin{pmatrix}a&b\\c&d\end{pmatrix}[/tex]

    Av=λv where [tex]v=\begin{pmatrix}x\\y\end{pmatrix}[/tex] and v≠0 (x and y not both zero)
    If λ=0 then [tex]ax+by=0[/tex] and [tex]cx+dy=0[/tex]
    If λ=1 then [tex]ax+by=x[/tex] and [tex]cx+dy=y[/tex]

    so Av-λv=0, then Av-λIv=0, then (A-λI)v=0. Since v≠0, A-λI must be singular, so det(A-λI)=0
    so for λ=0 [tex]\begin{pmatrix}a&b\\c&d\end{pmatrix}[/tex] and [tex]ax+by=0[/tex] and [tex]cx+dy=0[/tex]
    For λ=1 [tex]\begin{pmatrix}a-1&b\\c&d-1\end{pmatrix}[/tex] and [tex](a-1)x+by=0[/tex] and [tex]cx+(d-1)y=0[/tex]

    We must find two linearly independent eigenvectors and build X with the first vector as its first column and the second vector as its second column.

    [tex]X^{-1}AX= \begin{pmatrix}0&0\\0&1\end{pmatrix}[/tex]

    If this is true, then [tex]X^{-1}A^{2}X=(X^{-1}AX)^{2}= \begin{pmatrix}0&0\\0&1\end{pmatrix}^{2}= \begin{pmatrix}0&0\\0&1\end{pmatrix}[/tex]

    The problem is, I'm not quite sure how to prove any of this
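    As a numerical sanity check (not the proof itself), one can build a 2x2 matrix with eigenvalues 0 and 1 by conjugating D = diag(0, 1) with an arbitrary invertible matrix X, and confirm that it squares to itself. The particular X below is a random choice, purely for illustration:

    ```python
    import numpy as np

    # Pick an arbitrary invertible X (a random one is almost surely invertible).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((2, 2))
    assert abs(np.linalg.det(X)) > 1e-8  # confirm X is invertible

    # A = X D X^{-1} has eigenvalues 0 and 1 by construction.
    D = np.diag([0.0, 1.0])
    A = X @ D @ np.linalg.inv(X)

    print(np.allclose(A @ A, A))  # True: A is idempotent
    ```

    Every such A passes this check, which is exactly what the diagonalization argument proves in general.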
     
  3. Nov 26, 2011 #2

    micromass

    Staff Emeritus
    Science Advisor
    Education Advisor
    2016 Award

    You don't need to find your matrix X explicitly. Knowing that such a matrix X exists is enough. So the only thing you need is that there exists a matrix X such that

    [tex]X^{-1}AX[/tex]

    is diagonal. You don't need the specific form of X.
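    The algebra this hint points at can be spelled out in one line: once [itex]X^{-1}AX=D[/itex] with [itex]D=\begin{pmatrix}0&0\\0&1\end{pmatrix}[/itex], so that [itex]A=XDX^{-1}[/itex], then

    [tex]A^{2}=(XDX^{-1})(XDX^{-1})=XD^{2}X^{-1}=XDX^{-1}=A,[/tex]

    since [itex]D^{2}=\mathrm{diag}(0^{2},1^{2})=D[/itex]. Nothing about the entries of X is ever needed.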
     
  4. Nov 26, 2011 #3
    In order for [itex]X^{-1}[/itex] to exist, X must be square and invertible, and since it must multiply A (2x2), X must also be 2x2.

    Since there are two distinct eigenvalues, there must be two linearly independent eigenvectors. These two eigenvectors can form the columns of X.

    I'm still a bit confused on how to prove exactly why there must be two linearly independent eigenvectors.
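    One can see this independence concretely on a sample matrix. The A below is one hypothetical choice with eigenvalues 0 and 1; the eigenvector matrix returned by `np.linalg.eig` has nonzero determinant, which is exactly the linear-independence condition that makes X invertible:

    ```python
    import numpy as np

    # A concrete 2x2 example with eigenvalues 0 and 1 (trace 1, determinant 0).
    A = np.array([[0.5, 0.5],
                  [0.5, 0.5]])

    # eig returns the eigenvalues and a matrix X whose columns are eigenvectors.
    eigvals, X = np.linalg.eig(A)

    print(abs(np.linalg.det(X)) > 1e-12)  # True: eigenvectors are independent
    print(np.allclose(np.linalg.inv(X) @ A @ X, np.diag(eigvals)))  # True
    ```

    The second check confirms that these independent eigenvectors really do diagonalize A, as claimed in the post above.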
     
  5. Nov 26, 2011 #4

    micromass


    You can prove this directly. If v and w are eigenvectors belonging to distinct eigenvalues, then v and w are independent. Thus if [itex]Av=\lambda v[/itex] and if [itex]Aw=\mu w[/itex] and if [itex]\lambda =\mu[/itex], then v and w are independent.

    Try to prove this. Prove it by contradiction. Assume that v and w are dependent. What do you know then??
     
  6. Nov 26, 2011 #5

    HallsofIvy

    Staff Emeritus
    Science Advisor

    You mean "if [itex]\lambda\ne \mu[/itex]".

     
  7. Nov 27, 2011 #6
    If v and w are dependent, then w=cv which could be put into [itex]Av=\lambda v[/itex] and [itex]Aw=\mu cv[/itex] so [itex]\lambda =\mu c[/itex] and they are dependent. But 0 and 1 are independent, so this is a contradiction.
     
  8. Nov 27, 2011 #7

    micromass


    What do you mean, 0 and 1 are independent?? 0 and 1 are numbers, not vectors. Saying that they are independent makes no sense.
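    For completeness, here is the contradiction argument micromass suggested, written out with the correction HallsofIvy noted ([itex]\lambda\ne\mu[/itex]). Suppose [itex]Av=\lambda v[/itex] and [itex]Aw=\mu w[/itex] with [itex]v,w\ne 0[/itex] and [itex]\lambda\ne\mu[/itex], and assume for contradiction that v and w are dependent, so [itex]w=cv[/itex] for some scalar [itex]c\ne 0[/itex]. Then

    [tex]\mu w = Aw = A(cv) = cAv = c\lambda v = \lambda w,[/tex]

    so [itex](\lambda-\mu)w=0[/itex]. Since [itex]w\ne 0[/itex], this forces [itex]\lambda=\mu[/itex], contradicting [itex]\lambda\ne\mu[/itex]. Hence the eigenvectors for the eigenvalues 0 and 1 are linearly independent, so the matrix X built from them is invertible and diagonalizes A.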
     