Similarity Transformation of Matrices: Decide w/o Eigenvectors

  • Context: Graduate
  • Thread starter: ercan
  • Tags: Matrices
SUMMARY

The discussion focuses on determining the similarity of two matrices without using eigenvectors. It highlights that while matrices A and B share the same characteristic polynomial, they are not similar due to differing Jordan block structures. Specifically, matrix A has multiple Jordan blocks for the double eigenvalue 1, whereas matrix B does not. The minimal polynomial serves as a crucial tool in this analysis, providing insights into the Jordan normal form without requiring eigenvector calculations.

PREREQUISITES
  • Understanding of matrix similarity and Jordan normal form
  • Familiarity with characteristic and minimal polynomials
  • Knowledge of the Cayley-Hamilton theorem
  • Basic linear algebra concepts
NEXT STEPS
  • Study the properties of minimal polynomials in matrix theory
  • Learn about Jordan normal forms and their significance in linear algebra
  • Explore the Cayley-Hamilton theorem in depth
  • Read "Matrix Analysis" by Horn and Johnson for comprehensive insights
USEFUL FOR

Mathematicians, students of linear algebra, and anyone interested in advanced matrix theory and similarity transformations.

ercan
I need help with similarity transformations of matrices.
Does anyone know how I can decide whether two matrices with the same eigenvalues are similar, without using eigenvectors?

For example, the following two matrices have the same characteristic polynomial, but they are not similar: matrix A has multiple Jordan blocks for the double eigenvalue 1, while B has only a single Jordan block for it.

A=[3 -3 -1 2;
4 -2 -3 6;
4 -3 -2 6;
3 0 -3 7]

B=[-4.6 -7 -3 -0.6;
3.4 6 2 0.4;
0.4 0 1 0.4;
-2.4 5 0 3.6]

I wonder how I can show that A and B are not similar without using their eigenvectors and without directly looking at their Jordan (diagonal) forms.
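The Jordan-block claim can in fact be checked without computing any eigenvectors: the number of Jordan blocks for the eigenvalue 1 equals dim ker(M − I) = 4 − rank(M − I). A minimal sketch using NumPy (rank via SVD, not an eigenvector computation):

```python
import numpy as np

A = np.array([[3, -3, -1, 2],
              [4, -2, -3, 6],
              [4, -3, -2, 6],
              [3,  0, -3, 7]], dtype=float)

B = np.array([[-4.6, -7, -3, -0.6],
              [ 3.4,  6,  2,  0.4],
              [ 0.4,  0,  1,  0.4],
              [-2.4,  5,  0,  3.6]])

I = np.eye(4)
# dim ker(M - I) = 4 - rank(M - I) counts the Jordan blocks for eigenvalue 1
print(4 - np.linalg.matrix_rank(A - I))  # 2 blocks for A
print(4 - np.linalg.matrix_rank(B - I))  # 1 block for B
```

Since the counts differ (2 vs. 1), A and B cannot be similar: similar matrices have the same rank of M − λI for every λ.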
 
The minimal polynomial could be what you're looking for...

Basically, given a matrix A, the characteristic polynomial p(x) satisfies p(A)=0 (this is the Cayley-Hamilton theorem). The monic polynomial of smallest degree with this property is called the minimal polynomial, and it always divides the characteristic polynomial.

The minimal polynomial is mostly used to extract information about the Jordan normal form, and it has the advantage that you don't need to compute the (generalized) eigenvectors...
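As a sketch of how this can be done in practice (the helper `min_poly` below is not a library routine, just a brute-force search over divisors of the characteristic polynomial, which by Cayley-Hamilton must contain the minimal polynomial):

```python
from itertools import product
import sympy as sp

x = sp.symbols('x')

def eval_poly_at_matrix(p, M):
    """Evaluate a sympy Poly at a square matrix M by Horner's scheme."""
    n = M.shape[0]
    out = sp.zeros(n, n)
    for c in p.all_coeffs():           # coefficients, highest degree first
        out = out * M + c * sp.eye(n)
    return out

def min_poly(M):
    """Minimal polynomial of M: the lowest-degree monic divisor of the
    characteristic polynomial that annihilates M."""
    n = M.shape[0]
    factors = sp.factor_list(M.charpoly(x).as_expr())[1]  # (factor, multiplicity)
    candidates = []
    for exps in product(*[range(m + 1) for _, m in factors]):
        cand = sp.Poly(sp.prod(f**e for (f, _), e in zip(factors, exps)), x)
        if cand.degree() > 0:
            candidates.append(cand)
    candidates.sort(key=lambda p: p.degree())
    for cand in candidates:            # first annihilating candidate is minimal
        if eval_poly_at_matrix(cand, M) == sp.zeros(n, n):
            return sp.factor(cand.as_expr())

# exact rational entries, so the computation is symbolic rather than numerical
A = sp.Matrix([[3, -3, -1, 2], [4, -2, -3, 6], [4, -3, -2, 6], [3, 0, -3, 7]])
B = sp.Matrix(4, 4, [sp.Rational(s) for s in
    '-4.6 -7 -3 -0.6 3.4 6 2 0.4 0.4 0 1 0.4 -2.4 5 0 3.6'.split()])

# the factor (x - 1) appears to different powers, so A and B are not similar
print(min_poly(A))
print(min_poly(B))
```

Similar matrices have the same minimal polynomial, so a mismatch here settles the question without touching eigenvectors. (Note the converse fails in general: equal characteristic *and* minimal polynomials still do not force similarity from dimension 4 upward.)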
 
Check the book Matrix Analysis by Horn and Johnson; it answers this question exactly, or at least provides sufficient background.
 
