Proving Similarity of Non-Diagonalizable Matrices

  • Thread starter: Lily@pie
  • Tags: Matrix
Summary
To prove that a non-diagonalizable 2x2 matrix A is similar to the matrix [[Ω,1],[0,Ω]], it is first established that A and this target matrix share the same single eigenvalue Ω: since A is not diagonalizable, it cannot have two distinct eigenvalues, so both matrices have the same eigenvalue structure. The discussion explores generalized eigenvectors and the Jordan normal form, noting that while the Jordan form may not be used directly, understanding its principles is essential to the proof. The participants then work out how to produce a vector w that, together with the eigenvector v, is linearly independent and forms a basis for R², which confirms the similarity. The thread concludes that although the proof takes some work, it hinges on the relationship between the eigenvalue and the dimension of the corresponding eigenspace.
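As a concrete check of the construction summarized above, here is a minimal numerical sketch (not from the thread; the matrix A = [[2,3],[0,2]] with Ω = 2 and all variable names are chosen purely for illustration): take an eigenvector v, solve (A-ΩI)w = v for a generalized eigenvector w, and verify that P = [v w] conjugates A into the target form.

```python
import numpy as np

# Illustrative non-diagonalizable 2x2 matrix with the single eigenvalue Omega = 2.
Omega = 2.0
A = np.array([[2.0, 3.0],
              [0.0, 2.0]])

# Eigenvector v: (A - Omega*I)v = 0.  Here A - 2I = [[0,3],[0,0]], so v = (1, 0).
v = np.array([1.0, 0.0])

# Generalized eigenvector w: (A - Omega*I)w = v.  Here 3*w[1] = 1, so w = (0, 1/3).
w = np.array([0.0, 1.0 / 3.0])

# P has columns v and w; P^{-1} A P should equal [[Omega, 1], [0, Omega]].
P = np.column_stack([v, w])
print(np.linalg.inv(P) @ A @ P)   # approximately [[2. 1.] [0. 2.]]
```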
  • #31
Sorry, it's quiz time right now! :wink:

If you're interested, look for "Quiz and Trivia" in:
https://www.physicsforums.com/misc.php?do=flashchat&room=1
 
  • #32
Lily@pie said:
But having (A-ΩI)w=0 means w is an eigenvector for A. But we know A cannot have two linearly independent eigenvectors, so w can only be a multiple of v, which shows that v and w are linearly dependent.

Right. This is why you can't have (A-ΩI)w=0: we are hoping to show that v and w are linearly independent.

Lily@pie said:
(A-ΩI)w≠0 means (A-ΩI)w is an eigenvector for A?? So w and v are linearly independent?? don't understand ><

Since (A-ΩI)²w=0 while (A-ΩI)w≠0, the vector (A-ΩI)w is a nonzero vector sent to 0 by A-ΩI, i.e. an eigenvector of A, and hence a multiple of v; after rescaling w we can take (A-ΩI)w=v. Note that you've already shown that w can't be a multiple of v; therefore, w and v are independent.

Lily@pie said:
Besides that, does writing P⁻¹(A-ΩI)P where P=[v w] mean writing the eigenvalues in the diagonal entries??

Among other things, yes, but that's not the whole story. You want to think about how you find the columns of the matrix of a linear transformation relative to the basis {v, w}.
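Here is a sketch of the column computation that reply is pointing at, assuming (as in the summary) that the goal is P⁻¹AP = [[Ω,1],[0,Ω]]: the columns of P⁻¹AP are the coordinates of Av and Aw in the basis {v, w}, and by construction Av = Ωv and Aw = v + Ωw.

```latex
AP = A\begin{bmatrix} v & w \end{bmatrix}
   = \begin{bmatrix} Av & Aw \end{bmatrix}
   = \begin{bmatrix} \Omega v & v + \Omega w \end{bmatrix}
   = \begin{bmatrix} v & w \end{bmatrix}
     \begin{bmatrix} \Omega & 1 \\ 0 & \Omega \end{bmatrix},
\qquad\text{hence}\qquad
P^{-1} A P = \begin{bmatrix} \Omega & 1 \\ 0 & \Omega \end{bmatrix}.
```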
 
  • #33
Thank you both for visiting the quiz! :)

I have great news!
I won the quiz! YAY! :smile: :smile: :smile:

(Last time I finished last! :cry:)
 
  • #34
oh... okay... thanks so much for the help... ^^
 
