Interesting theorem, complex eigenvalues.


Discussion Overview

The discussion revolves around a theorem related to complex eigenvalues and the diagonalization of matrices. Participants explore methods to demonstrate the theorem, particularly focusing on the use of standard diagonalization techniques and the implications of real and complex matrices.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant introduces a theorem and seeks a demonstration using standard diagonalization, noting the role of complex eigenvalues and eigenvectors.
  • Another participant suggests checking the equality of matrices by expressing them in terms of their elements and applying definitions.
  • A third participant establishes a lemma regarding the equality of two real 2x2 matrices based on their action on a specific vector, leading into the proof of the theorem.
  • The proof involves showing that for a real matrix A with complex eigenvalues, certain relationships hold when applying the matrix to a vector formed from complex components.
  • A later reply expresses appreciation for the proof provided, indicating a positive reception of the argument presented.

Areas of Agreement / Disagreement

Participants generally agree on the validity of the proof presented, but the discussion does not resolve whether the theorem can be shown using ordinary diagonalization, which is left as an open question.

Contextual Notes

The discussion includes assumptions about the properties of matrices and eigenvalues, particularly regarding the nature of real and complex components. There are unresolved aspects concerning the application of the theorem in different contexts.

Who May Find This Useful

Readers interested in linear algebra, particularly those studying eigenvalues and eigenvectors, as well as those exploring the properties of complex matrices, may find this discussion relevant.

bobby2k
Messages
126
Reaction score
2
Take a look at this theorem.

theorem.png (attached image with the statement of the theorem; judging from the proof below: for a real 2x2 matrix A with complex eigenvalue \lambda = a - bi, b \neq 0, and eigenvector {\bf v}, setting P = [\operatorname{Re}\,{\bf v} \;\; \operatorname{Im}\,{\bf v}] and C = \begin{bmatrix} a & -b \\ b & a \end{bmatrix} gives A = PCP^{-1})


Is there a way to prove this theorem? I would like to show it using the standard way of diagonalizing a matrix.

I mean, if P = [v1 v2] and D =
[lambda1 0
0 lambda2]

We have that AP = PD even for complex eigenvectors and eigenvalues.

But the P matrix in this theorem is real, and so is the C matrix. I think they have used that v1 and v2 are conjugates, and so are lambda1 and lambda2.

How would you show this theorem? Can you use ordinary diagonalization to show it?
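As a numerical sanity check (not from the original thread; the example matrix A is my own, chosen so that its eigenvalues are complex), NumPy confirms that AP = PD holds even when the eigenvalues and eigenvectors are complex, and that they come in conjugate pairs:

```python
import numpy as np

# Hypothetical example: a real 2x2 matrix whose eigenvalues are complex (2 +/- i).
A = np.array([[1.0, -2.0],
              [1.0,  3.0]])

# Complex diagonalization: the columns of P_c are the (complex) eigenvectors.
eigvals, P_c = np.linalg.eig(A)
D = np.diag(eigvals)

# AP = PD holds even though P_c and D are complex.
assert np.allclose(A @ P_c, P_c @ D)

# For a real matrix, the eigenvalues (and eigenvectors) come in conjugate pairs.
assert np.isclose(eigvals[0], np.conj(eigvals[1]))
```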
 
I would suggest: check that AP = PC by writing A with some matrix elements (say p, q, r, s) and using the definitions.
 
Before showing the theorem, I'll first establish a lemma: if two real 2x2 matrices give the same result when multiplied on the right by \begin{bmatrix} 1 \\ i \end{bmatrix}, then the two matrices are equal:

Let X= \begin{bmatrix} x_1 & x_2 \\ x_3 & x_4 \end{bmatrix} and Y= \begin{bmatrix} y_1 & y_2 \\ y_3 & y_4 \end{bmatrix}. Then if
\begin{align}
X\begin{bmatrix} 1 \\ i \end{bmatrix} &= Y\begin{bmatrix} 1 \\ i \end{bmatrix} \\
\begin{bmatrix} x_1 & x_2 \\ x_3 & x_4 \end{bmatrix}\begin{bmatrix} 1 \\ i \end{bmatrix} &= \begin{bmatrix} y_1 & y_2 \\ y_3 & y_4 \end{bmatrix}\begin{bmatrix} 1 \\ i \end{bmatrix} \\
\begin{bmatrix} x_1 + x_2 i \\ x_3 + x_4 i \end{bmatrix} &= \begin{bmatrix} y_1 + y_2 i \\ y_3 + y_4 i \end{bmatrix}
\end{align}
Hence x_1=y_1, x_2=y_2, x_3=y_3, x_4=y_4, and hence X=Y.
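The lemma can also be checked numerically; a minimal sketch (the random matrix X is an assumption for illustration): multiplying by \begin{bmatrix} 1 \\ i \end{bmatrix} and splitting the result into real and imaginary parts recovers the columns of X, so the map is injective on real 2x2 matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((2, 2))  # an arbitrary real 2x2 matrix

u = np.array([1.0, 1j])          # the vector [1, i]^T
w = X @ u                        # w = [x1 + x2*i, x3 + x4*i]

# The real parts of w give the first column of X, the imaginary parts the
# second, so X is fully determined by X @ u.
assert np.allclose(np.column_stack([w.real, w.imag]), X)
```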

Now for the theorem itself:
Notice that P \begin{bmatrix} 1 \\ i \end{bmatrix} = {\bf v} and that
C\begin{bmatrix} 1 \\ i \end{bmatrix}=\begin{bmatrix} a-bi \\ b+ai \end{bmatrix}=\begin{bmatrix} \lambda \\ \lambda i \end{bmatrix} = \lambda\begin{bmatrix} 1 \\ i \end{bmatrix}.
Then, for any real matrix A with complex eigenvalues, we have
\begin{align}
A{\bf v} &= \lambda {\bf v} \\
AP\begin{bmatrix} 1 \\ i \end{bmatrix} &= \lambda P\begin{bmatrix} 1 \\ i \end{bmatrix} \\
&= P \lambda \begin{bmatrix} 1 \\ i \end{bmatrix} \\
&= PC \begin{bmatrix} 1 \\ i \end{bmatrix}
\end{align}
But both AP and PC are real, hence
AP=PC \quad \text{or} \quad A=PCP^{-1}.
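The whole construction can be verified numerically; a minimal sketch (the example matrix A is my own assumption, with eigenvalues 2 \pm i, so a = 2, b = 1): build P from the real and imaginary parts of an eigenvector of \lambda = a - bi and check that A = PCP^{-1}:

```python
import numpy as np

# Hypothetical example matrix; its eigenvalues are 2 +/- i.
A = np.array([[1.0, -2.0],
              [1.0,  3.0]])

eigvals, eigvecs = np.linalg.eig(A)

# Follow the proof's convention lam = a - b*i with b > 0: pick the eigenvalue
# with negative imaginary part and its eigenvector v.
k = int(np.argmin(eigvals.imag))
lam, v = eigvals[k], eigvecs[:, k]
a, b = lam.real, -lam.imag

# P has the real and imaginary parts of v as columns; C is the
# rotation-scaling matrix [[a, -b], [b, a]].
P = np.column_stack([v.real, v.imag])
C = np.array([[a, -b],
              [b,  a]])

# The theorem: AP = PC with P and C real, equivalently A = P C P^{-1}.
assert np.allclose(A @ P, P @ C)
assert np.allclose(A, P @ C @ np.linalg.inv(P))
```

Note that the check passes for any choice of eigenvector: rescaling v by a complex scalar replaces P by P times a real matrix of the same rotation-scaling form, which commutes with C.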
 
Very nice proof, thank you very much!
 
