Interesting theorem, complex eigenvalues.

In summary, the conversation discusses a theorem about real matrices with complex eigenvalues and whether it can be proved by standard diagonalization. A lemma is established and then used to prove the theorem: for a real 2×2 matrix A with complex eigenvalue λ = a − bi and eigenvector v, AP = PC and hence A = PCP^-1, where P = [Re v  Im v] and C = [[a, −b], [b, a]].
  • #1
bobby2k
Take a look at this theorem.

[Attachment: theorem.png — the theorem: let A be a real 2×2 matrix with a complex eigenvalue λ = a − bi (b ≠ 0) and an associated eigenvector v in C². Then A = PCP^-1, where P = [Re v  Im v] and C = [[a, −b], [b, a]].]


Is there a way to prove this theorem? I would like to show it using the standard way of diagonalizing a matrix.

I mean, if P = [v1 v2] and

D = [lambda_1   0
     0          lambda_2],

we have that AP = PD even for complex eigenvectors and eigenvalues.

But the P matrix in this theorem is real, and so is the C matrix. I think they have used the fact that v1 and v2 are conjugates, and so are lambda_1 and lambda_2.

How would you prove this theorem? Can you use ordinary diagonalization to show it?
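
As a quick sanity check of the AP = PD claim, here is a small NumPy sketch (the matrix A is just an arbitrary example with complex eigenvalues, not the one from the attachment):

[code]
import numpy as np

# Arbitrary real 2x2 matrix with complex eigenvalues (example values only)
A = np.array([[0.5, -0.6],
              [0.75, 1.1]])

# Ordinary (complex) diagonalization: columns of P are the eigenvectors
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

# AP = PD holds even though P and D are complex
print(np.allclose(A @ P, P @ D))   # True
[/code]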
 
  • #2
I would suggest: check that AP = PC by writing A with generic matrix entries (say p, q, r, s) and using the definitions.
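
A sketch of that check in SymPy, using generic entries p, q, r, s for A and generic real and imaginary parts for the eigenvector (all symbol names here are just illustrative):

[code]
import sympy as sp

# Generic entries for A and the eigen-data, following the suggestion above
p, q, r, s = sp.symbols('p q r s', real=True)
a, b = sp.symbols('a b', real=True)
x1, x2, y1, y2 = sp.symbols('x1 x2 y1 y2', real=True)

A = sp.Matrix([[p, q], [r, s]])
v = sp.Matrix([x1 + sp.I*y1, x2 + sp.I*y2])    # eigenvector written as Re v + i Im v
lam = a - sp.I*b                               # eigenvalue lambda = a - b*i

P = sp.Matrix([[x1, y1], [x2, y2]])            # P = [Re v  Im v]
C = sp.Matrix([[a, -b], [b, a]])

# The columns of AP - PC are exactly the real and imaginary parts of Av - lambda*v,
# so AP = PC whenever Av = lambda*v
residual = (A*v - lam*v).expand()
cols = sp.Matrix.hstack(residual.applyfunc(sp.re), residual.applyfunc(sp.im))
diff = (A*P - P*C).expand()
print((diff - cols).applyfunc(sp.simplify))    # Matrix([[0, 0], [0, 0]])
[/code]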
 
  • #3
Before showing the theorem, I'll first establish the lemma that if you have two real 2x2 matrices, and if you multiply them on the right by [itex]\begin{bmatrix} 1 \\ i \end{bmatrix}[/itex] and you get the same result for both, then the two original matrices are the same:

Let [itex] X= \begin{bmatrix} x_1 & x_2 \\ x_3 & x_4 \end{bmatrix} [/itex] and [itex] Y= \begin{bmatrix} y_1 & y_2 \\ y_3 & y_4 \end{bmatrix} [/itex]. Then if
[tex] \begin{align}X\begin{bmatrix} 1 \\ i \end{bmatrix} &= Y\begin{bmatrix} 1 \\ i \end{bmatrix}, \\
\begin{bmatrix} x_1 & x_2 \\ x_3 & x_4 \end{bmatrix}\begin{bmatrix} 1 \\ i \end{bmatrix} &= \begin{bmatrix} y_1 & y_2 \\ y_3 & y_4 \end{bmatrix}\begin{bmatrix} 1 \\ i \end{bmatrix}, \\
\begin{bmatrix} x_1 + x_2i \\ x_3 + x_4i \end{bmatrix} &= \begin{bmatrix} y_1 + y_2i \\ y_3 + y_4i \end{bmatrix}. \end{align} [/tex]
Hence [itex]x_1=y_1[/itex], [itex]x_2=y_2[/itex], [itex]x_3=y_3[/itex], [itex]x_4=y_4[/itex], and hence [itex]X=Y[/itex].

Now for the theorem itself:
Notice that [itex]P \begin{bmatrix} 1 \\ i \end{bmatrix} = {\bf v}[/itex] and that
[tex]C\begin{bmatrix} 1 \\ i \end{bmatrix}=\begin{bmatrix} a-bi \\ b+ai \end{bmatrix}=\begin{bmatrix} \lambda \\ \lambda i \end{bmatrix} = \lambda\begin{bmatrix} 1 \\ i \end{bmatrix}.[/tex]
Then, for a real matrix [itex]A[/itex] with complex eigenvalue [itex]\lambda[/itex] and eigenvector [itex]{\bf v}[/itex] as in the theorem, we have
[tex]\begin{align}A{\bf v} &= \lambda {\bf v} \\
AP\begin{bmatrix} 1 \\ i \end{bmatrix} &= \lambda P\begin{bmatrix} 1 \\ i \end{bmatrix} \\ &= P \lambda \begin{bmatrix} 1 \\ i \end{bmatrix} \\ &= PC \begin{bmatrix} 1 \\ i \end{bmatrix}\end{align}[/tex]
But both [itex]AP[/itex] and [itex]PC[/itex] are real matrices, so by the lemma [itex]AP=PC[/itex], and since [itex]P[/itex] is invertible,
[tex]A=PCP^{-1}.[/tex]
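
A quick numerical check of the result (the matrix A below is just an arbitrary example with complex eigenvalues):

[code]
import numpy as np

# Numerical check of AP = PC and A = P C P^{-1} on an example matrix
A = np.array([[0.5, -0.6],
              [0.75, 1.1]])

eigvals, eigvecs = np.linalg.eig(A)

# Pick the eigenvalue lambda = a - b*i (negative imaginary part) and its eigenvector v
k = np.argmin(eigvals.imag)
lam, v = eigvals[k], eigvecs[:, k]
a, b = lam.real, -lam.imag

P = np.column_stack([v.real, v.imag])    # P = [Re v  Im v]
C = np.array([[a, -b],
              [b,  a]])

print(np.allclose(A @ P, P @ C))                   # True:  AP = PC
print(np.allclose(A, P @ C @ np.linalg.inv(P)))    # True:  A = P C P^{-1}
[/code]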
 
  • #4
Very nice proof, thank you very much!
 
  • #5
Yes, it is possible to show this theorem using the standard method of diagonalizing a matrix. The key is that, for a real matrix A, complex eigenvalues and eigenvectors come in conjugate pairs: if [itex]A{\bf v} = \lambda {\bf v}[/itex] with [itex]\lambda = a - bi[/itex] and [itex]b \neq 0[/itex], then also [itex]A\bar{\bf v} = \bar{\lambda}\bar{\bf v}[/itex].

The ordinary diagonalization therefore reads [itex]AQ = QD[/itex], i.e. [itex]A = QDQ^{-1}[/itex], where [itex]Q = [{\bf v} \;\; \bar{\bf v}][/itex] and [itex]D = \operatorname{diag}(\lambda, \bar{\lambda})[/itex].

Next, relate Q to the real matrix [itex]P = [\operatorname{Re}{\bf v} \;\; \operatorname{Im}{\bf v}][/itex]. Since [itex]{\bf v} = \operatorname{Re}{\bf v} + i\operatorname{Im}{\bf v}[/itex] and [itex]\bar{\bf v} = \operatorname{Re}{\bf v} - i\operatorname{Im}{\bf v}[/itex], we have [itex]Q = PS[/itex] with
[tex]S = \begin{bmatrix} 1 & 1 \\ i & -i \end{bmatrix}.[/tex]
Substituting,
[tex]A = QDQ^{-1} = (PS)D(PS)^{-1} = P\,(SDS^{-1})\,P^{-1},[/tex]
and a direct computation gives
[tex]SDS^{-1} = \begin{bmatrix} a & -b \\ b & a \end{bmatrix} = C.[/tex]

Therefore A = PCP^{-1}, which is exactly the theorem: the standard diagonalization gives it after a change of basis that pairs up the conjugate eigenvalues and eigenvectors.
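
A small SymPy sketch verifying the key identity [itex]SDS^{-1} = C[/itex]:

[code]
import sympy as sp

# Symbolic check that S D S^{-1} = C for lambda = a - b*i
a, b = sp.symbols('a b', real=True)
lam = a - sp.I*b

S = sp.Matrix([[1, 1], [sp.I, -sp.I]])       # columns [1, i] and [1, -i]
D = sp.diag(lam, sp.conjugate(lam))
C = sp.Matrix([[a, -b], [b, a]])

print((S*D*S.inv() - C).applyfunc(sp.simplify))   # Matrix([[0, 0], [0, 0]])
[/code]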
 

1. What is an interesting theorem about complex eigenvalues?

One interesting theorem involving complex eigenvalues is the Schur decomposition theorem, which states that any square matrix A can be written as A = QTQ*, where Q is a unitary matrix and T is an upper triangular matrix whose diagonal entries are the eigenvalues of A.
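
For example, a minimal sketch using SciPy's schur routine (assuming SciPy is available; the matrix is just an example):

[code]
import numpy as np
from scipy.linalg import schur

# Complex Schur decomposition A = Z T Z^H, with Z unitary and T upper triangular;
# the eigenvalues of A appear on the diagonal of T
A = np.array([[0.5, -0.6],
              [0.75, 1.1]])

T, Z = schur(A, output='complex')
print(np.diag(T))                            # eigenvalues, here 0.8 ± 0.6i
print(np.allclose(A, Z @ T @ Z.conj().T))    # True
[/code]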

2. What are complex eigenvalues?

Complex eigenvalues are roots of the characteristic equation of a matrix, which is a polynomial equation of the form det(A−λI)=0, where A is the matrix and λ is the eigenvalue. They are eigenvalues with a nonzero imaginary part, and for a real matrix they always occur in conjugate pairs a ± bi.
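
For example, the characteristic equation can be set up and solved symbolically with SymPy (example matrix values only):

[code]
import sympy as sp

# Characteristic equation det(A - lambda*I) = 0 for a small example matrix
lam = sp.symbols('lambda')
A = sp.Matrix([[sp.Rational(1, 2),  sp.Rational(-3, 5)],
               [sp.Rational(3, 4),  sp.Rational(11, 10)]])

char_eq = (A - lam*sp.eye(2)).det()
print(sp.expand(char_eq))        # lambda**2 - 8*lambda/5 + 1
print(sp.solve(char_eq, lam))    # [4/5 - 3*I/5, 4/5 + 3*I/5]
[/code]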

3. How are complex eigenvalues different from real eigenvalues?

Complex eigenvalues are different from real eigenvalues because they have a nonzero imaginary part, while real eigenvalues are real numbers. In a linear dynamical system, complex eigenvalues correspond to rotation or oscillation (with growth or decay governed by their real part), while real eigenvalues correspond to pure growth or decay along fixed directions.

4. What are some applications of complex eigenvalues?

Complex eigenvalues have many applications in fields such as physics, engineering, and computer science. Some examples include using them to solve differential equations, analyzing stability in dynamical systems, and performing image compression in signal processing.

5. How are complex eigenvalues calculated?

Complex eigenvalues can be calculated using various methods, such as solving the characteristic equation directly (practical for small matrices) or using iterative techniques such as the power method and the QR algorithm. In practice, library routines based on the QR algorithm are used to approximate the eigenvalues of larger matrices.
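
A minimal NumPy example (example matrix values only; numpy.linalg.eig calls LAPACK's QR-algorithm-based solver):

[code]
import numpy as np

# Library routines return complex eigenvalues directly
A = np.array([[0.5, -0.6],
              [0.75, 1.1]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)    # approximately 0.8+0.6j and 0.8-0.6j
[/code]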
