
Interesting theorem, complex eigenvalues.

  1. Jul 28, 2013 #1
    Take a look at this theorem.

    [theorem.png] Theorem: Let [itex]A[/itex] be a real 2x2 matrix with a complex eigenvalue [itex]\lambda = a - bi[/itex] ([itex]b \neq 0[/itex]) and an associated eigenvector [itex]{\bf v}[/itex] in [itex]\mathbb{C}^2[/itex]. Then [itex]A = PCP^{-1}[/itex], where [itex]P = [\operatorname{Re}\,{\bf v} \ \ \operatorname{Im}\,{\bf v}][/itex] and [itex]C = \begin{bmatrix} a & -b \\ b & a \end{bmatrix}[/itex].

    Is there a way to show this theorem? I would like to show it using the standard approach of diagonalizing a matrix.

    I mean, if [itex]P = [{\bf v}_1 \ \ {\bf v}_2][/itex] and [itex]D = \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix}[/itex], then we have [itex]AP = PD[/itex] even for complex eigenvectors and eigenvalues.

    But the P matrix in this theorem is real, and so is the C matrix. I think they have used the fact that [itex]{\bf v}_1[/itex] and [itex]{\bf v}_2[/itex] are conjugates of each other, and that [itex]\lambda_1[/itex] and [itex]\lambda_2[/itex] are conjugates as well.

    How would you show this theorem? Can you use ordinary diagonalization to show it?
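    As a sanity check on the [itex]AP = PD[/itex] claim above, here is a minimal numerical sketch (assuming NumPy and a sample matrix chosen only for illustration):

    [code]
    import numpy as np

    # A sample real 2x2 matrix with complex eigenvalues (for illustration only)
    A = np.array([[0.5, -0.6],
                  [0.75, 1.1]])

    # Complex diagonalization: eigenvectors are the columns of P, eigenvalues fill the diagonal of D
    eigenvalues, P = np.linalg.eig(A)   # here a conjugate pair of eigenvalues/eigenvectors
    D = np.diag(eigenvalues)

    # AP = PD holds even though P and D are complex
    print(np.allclose(A @ P, P @ D))    # True
    [/code]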
     
    Last edited: Jul 28, 2013
  3. Jul 29, 2013 #2
    I would suggest checking that AP = PC directly: write A with generic matrix entries (say p, q, r, s) and use the definitions.
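    One way to carry that check out (a sketch, writing the eigenvector as [itex]{\bf v} = {\bf u} + i{\bf w}[/itex] with [itex]{\bf u}, {\bf w}[/itex] real, so [itex]P = [{\bf u} \ \ {\bf w}][/itex], rather than naming the entries of [itex]A[/itex]): taking real and imaginary parts of [itex]A{\bf v} = (a - bi){\bf v}[/itex] gives
    [tex]\begin{align} A({\bf u} + i{\bf w}) &= (a - bi)({\bf u} + i{\bf w}) = (a{\bf u} + b{\bf w}) + i(a{\bf w} - b{\bf u}), \\
    A{\bf u} &= a{\bf u} + b{\bf w}, \qquad A{\bf w} = -b{\bf u} + a{\bf w}, \end{align}[/tex]
    which, column by column, says [itex]A[{\bf u} \ \ {\bf w}] = [{\bf u} \ \ {\bf w}]\begin{bmatrix} a & -b \\ b & a \end{bmatrix}[/itex], i.e. [itex]AP = PC[/itex].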
     
  4. Jul 29, 2013 #3
    Before showing the theorem, I'll first establish a lemma: if two real 2x2 matrices give the same result when each is multiplied on the right by [itex]\begin{bmatrix} 1 \\ i \end{bmatrix}[/itex], then the two matrices are equal:

    Let [itex] X= \begin{bmatrix} x_1 & x_2 \\ x_3 & x_4 \end{bmatrix} [/itex] and [itex] Y= \begin{bmatrix} y_1 & y_2 \\ y_3 & y_4 \end{bmatrix} [/itex]. Then if
    [tex] \begin{align}X\begin{bmatrix} 1 \\ i \end{bmatrix} &= Y\begin{bmatrix} 1 \\ i \end{bmatrix}, \\
    \begin{bmatrix} x_1 & x_2 \\ x_3 & x_4 \end{bmatrix}\begin{bmatrix} 1 \\ i \end{bmatrix} &= \begin{bmatrix} y_1 & y_2 \\ y_3 & y_4 \end{bmatrix}\begin{bmatrix} 1 \\ i \end{bmatrix}, \\
    \begin{bmatrix} x_1 + x_2 i \\ x_3 + x_4 i \end{bmatrix} &= \begin{bmatrix} y_1 + y_2 i \\ y_3 + y_4 i \end{bmatrix}. \end{align} [/tex]
    Since all the entries are real, comparing real and imaginary parts of each component gives [itex]x_1=y_1[/itex], [itex]x_2=y_2[/itex], [itex]x_3=y_3[/itex], [itex]x_4=y_4[/itex], and hence [itex]X=Y[/itex].

    Now for the theorem itself:
    Notice that [itex]P \begin{bmatrix} 1 \\ i \end{bmatrix} = {\bf v}[/itex] and that
    [tex]C\begin{bmatrix} 1 \\ i \end{bmatrix}=\begin{bmatrix} a-bi \\ b+ai \end{bmatrix}=\begin{bmatrix} \lambda \\ \lambda i \end{bmatrix} = \lambda\begin{bmatrix} 1 \\ i \end{bmatrix}.[/tex]
    Then, for a real matrix [itex]A[/itex] with complex eigenvalue [itex]\lambda[/itex] and corresponding eigenvector [itex]{\bf v}[/itex], we have
    [tex]\begin{align}A{\bf v} &= \lambda {\bf v} \\
    AP\begin{bmatrix} 1 \\ i \end{bmatrix} &= \lambda P\begin{bmatrix} 1 \\ i \end{bmatrix} \\ &= P \lambda \begin{bmatrix} 1 \\ i \end{bmatrix} \\ &= PC \begin{bmatrix} 1 \\ i \end{bmatrix}\end{align}[/tex]
    But both [itex]AP[/itex] and [itex]PC[/itex] are real, hence
    [tex]AP=PC \quad \text{or} \quad A=PCP^{-1}.[/tex]
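    The construction can also be checked numerically. A minimal sketch, assuming NumPy and a sample matrix chosen only for illustration:

    [code]
    import numpy as np

    # A sample real 2x2 matrix with complex eigenvalues (for illustration only)
    A = np.array([[0.5, -0.6],
                  [0.75, 1.1]])

    # Pick the eigenvalue lambda = a - b*i (negative imaginary part) and its eigenvector v
    eigenvalues, eigenvectors = np.linalg.eig(A)
    k = np.argmin(eigenvalues.imag)
    lam, v = eigenvalues[k], eigenvectors[:, k]
    a, b = lam.real, -lam.imag

    # Build the real matrices from the theorem: P = [Re v  Im v], C = [[a, -b], [b, a]]
    P = np.column_stack([v.real, v.imag])
    C = np.array([[a, -b],
                  [b, a]])

    # Verify A = P C P^{-1}
    print(np.allclose(A, P @ C @ np.linalg.inv(P)))   # True
    [/code]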
     
  5. Jul 30, 2013 #4
    Very nice proof, thank you very much!
     