What are the Eigenvalues and Eigenvectors of Similar Matrices

In summary: two matrices A and B are similar when B = P^{-1}AP for some invertible matrix P; that is the only way to get similar matrices. Similar matrices have the same characteristic polynomial, det(A − λI), and therefore the same eigenvalues, but not, in general, the same eigenvectors. To prove the eigenvalue claim, start from Av = λv, insert PP^{-1}, and multiply on the left by P^{-1} to get (P^{-1}AP)(P^{-1}v) = λ(P^{-1}v). As an exercise, take the matrices A and P from the worked example at the end of the thread, calculate P^{-1}, and verify that P^{-1}AP gives the stated B.
  • #1
Jennifer1990

Homework Statement


Let A and B be similar matrices
a)Prove that A and B have the same eigenvalues


Homework Equations


None


The Attempt at a Solution


Firstly, I don't see how this can even be possible unless the matrices are exactly the same :S
 
  • #2
So you think that eigenvalues uniquely characterize a matrix? What about

[tex]\begin{bmatrix}1 & 0 \\ 0 & 1\end{bmatrix}[/tex]

and

[tex]\begin{bmatrix}1 & 1 \\ 0 & 1\end{bmatrix}[/tex]

for example?
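
A quick numerical check of this example, sketched in Python with numpy (an illustration, not a proof): both matrices share the eigenvalue 1 with multiplicity two, yet they are not similar.

[code]
import numpy as np

# The two matrices above: the 2x2 identity and a shear.
I2 = np.array([[1.0, 0.0],
               [0.0, 1.0]])
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Both have the eigenvalue 1 with algebraic multiplicity 2 ...
print(np.linalg.eigvals(I2))  # [1. 1.]
print(np.linalg.eigvals(S))   # [1. 1.]

# ... but P^{-1} @ I2 @ P == I2 for every invertible P, so the identity
# is similar only to itself, and the two matrices are not similar.
[/code]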

You've put 'none' for relevant equations. That isn't true - there's a definition of 'similar' and several equivalent ones for 'eigenvalue'. Try writing them out. HINT: polynomials.
 
  • #3
EDIT: Changed "equation" to "polynomial"

You have to show that A and B=P^-1AP (for some invertible matrix P) have the same characteristic polynomial.
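
One way to fill in that step is to use the multiplicativity of the determinant:

[tex] \det(P^{-1}AP - \lambda I) = \det\left(P^{-1}(A - \lambda I)P\right) = \det(P^{-1})\det(A - \lambda I)\det(P) = \det(A - \lambda I), [/tex]

since [itex]\det(P^{-1})\det(P) = \det(P^{-1}P) = 1[/itex]. The same characteristic polynomial then gives the same eigenvalues.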
 
  • #4
Oh, I think that way is much too complicated!

Do it directly from the equation:
If [itex]Av= \lambda v[/itex] then, for any invertible P, [itex]P^{-1}Av= \lambda P^{-1}v[/itex]. Now define [itex]u= P^{-1}v[/itex].
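
A concrete check of this argument, sketched in Python with numpy (the matrices A and P below are arbitrary choices, not from the thread):

[code]
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 2.0],
              [1.0, 3.0]])          # invertible, det = 1
B = np.linalg.inv(P) @ A @ P        # B is similar to A

lam, vecs = np.linalg.eig(A)
v = vecs[:, 0]                      # eigenvector of A for eigenvalue lam[0]
u = np.linalg.inv(P) @ v            # the vector u = P^{-1} v from the post

# B u = lambda u, so u is an eigenvector of B with the same eigenvalue.
print(np.allclose(B @ u, lam[0] * u))   # True
[/code]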
 
  • #5
HallsofIvy said:
Oh, I think that way is much too complicated!

Do it directly from the equation:
If [itex]Av= \lambda v[/itex] then, for any invertible P, [itex]P^{-1}Av= \lambda P^{-1}v[/itex]. Now define [itex]u= P^{-1}v[/itex].

Your way is too easy. :smile:
 
  • #6
or let [tex] Av = \lambda v [/tex]

then [tex] AP^{-1}Pv = \lambda v [/tex] and go from there
 
  • #7
what do you mean by the same characteristic equation?
 
  • #8
Jennifer1990 said:
what do you mean by the same characteristic equation?


EDIT: changed "equation" to "polynomial"


The characteristic polynomial of matrix A is [tex] det(A- \lambda I). [/tex]

The characteristic polynomial of matrix B is [tex] det(B - \lambda I) = det(PAP^{-1} - \lambda I) [/tex]

so show that [tex] det(A- \lambda I) = det(PAP^{-1} - \lambda I) [/tex]
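
A quick numerical sanity check of that equality, sketched in Python with numpy (A and P below are arbitrary examples):

[code]
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])          # invertible
B = P @ A @ np.linalg.inv(P)        # the B = PAP^{-1} of this post

# np.poly gives the coefficients of the characteristic polynomial
# det(lambda*I - M); similar matrices give the same coefficients.
print(np.poly(A))   # [ 1. -7. 10.]
print(np.poly(B))   # same, up to floating-point error
[/code]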


But there are easier ways as HallsofIvy noted.


Start with [tex] Av = \lambda v[/tex] where v is a nonzero vector

then [tex] AP^{-1}Pv = \lambda v [/tex] since [tex] P^{-1}P = I [/tex]

Do you know what to do next?
 
  • #9
A and B are similar if and only if they both represent the same linear map, with respect to two possibly different bases. Eigenvalues are defined independently of what basis, if any, you choose. QED.
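
In matrix terms, if the columns of P hold the new basis vectors written in the old basis, the same map is represented by A in the old basis and by P^{-1}AP in the new one. A small sketch in Python with numpy (arbitrary example):

[code]
import numpy as np

# Matrix of a linear map in the standard basis (arbitrary example).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Columns of P: the new basis vectors, expressed in the old basis.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Matrix of the same linear map with respect to the new basis.
B = np.linalg.inv(P) @ A @ P

# The eigenvalues describe the map, not the basis, so they agree.
print(np.sort(np.linalg.eigvals(A)))
print(np.sort(np.linalg.eigvals(B)))
[/code]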
 
  • #10
Random Variable said:
The characteristic equation of matrix A is [tex] det(A- \lambda I). [/tex]
Make that "The characteristic equation of matrix A is [tex] det(A- \lambda I) = 0. [/tex]"
For it to be an equation, it at least has to have an equals sign.
 
  • #11
Mark44 said:
Make that "The characteristic equation of matrix A is [tex] det(A- \lambda I) = 0. [/tex]"
For it to be an equation, it at least has to have an equals sign.

Sorry. What I should have said is that they have the same characteristic POLYNOMIAL. :redface:
 
  • #12
Av = lambda v
(AP^-1 P)v = lambda v
Bv = lambda v

i think?
 
  • #13
oh wait...B = P^-1 AP ...so what I said is wrong...

how can I manipulate A P^-1 P to look like P^-1 AP?
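
For reference, the manipulation the earlier hints are pointing toward: the factors cannot simply be reordered (matrix multiplication is not commutative), but multiplying both sides of [itex]Av = \lambda v[/itex] on the left by [itex]P^{-1}[/itex] gives

[tex] P^{-1}A(PP^{-1})v = \lambda P^{-1}v \quad\Rightarrow\quad (P^{-1}AP)(P^{-1}v) = \lambda (P^{-1}v), [/tex]

so [itex]P^{-1}v[/itex] is an eigenvector of [itex]B = P^{-1}AP[/itex] with the same eigenvalue.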
 
  • #14
ohhh I see...

since they have the same eigenvalues, does this mean that the matrices can also have the same eigenvectors?
 
  • #15
Jennifer1990 said:
ohhh I see...

since they have the same eigenvalues, does this mean that the matrices can also have the same eigenvectors?
They can, but it's not likely.
 
  • #16
I just tried several similar matrices but they all share the same eigenvectors o_O
Can I get an example where two similar matrices have different eigenvectors?
 
  • #17
I'm surprised you were able to find similar matrices that had the same eigenvectors!

[tex]A= \begin{bmatrix}2 & 0 \\ 0 & 3\end{bmatrix}[/tex]
has, obviously, 2 and 3 as eigenvalues with corresponding eigenvectors <1, 0> and <0, 1>.

[tex]B= \begin{bmatrix}1 & -1 \\ 2 & 4\end{bmatrix}[/tex]
has the same eigenvalues with corresponding eigenvectors <1, -1> and <1, -2>.

All I did was start with the obvious diagonal matrix, A, choose a simple invertible P:
[tex]P= \begin{bmatrix}2 & 1 \\ 1 & 1\end{bmatrix}[/tex]
and calculate [itex]B= P^{-1}AP[/itex].
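
This example can be checked numerically; a short sketch in Python with numpy:

[code]
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
P = np.array([[2.0, 1.0],
              [1.0, 1.0]])

B = np.linalg.inv(P) @ A @ P
print(B)                      # approximately [[ 1. -1.]
                              #                [ 2.  4.]]

vals, vecs = np.linalg.eig(B)
print(vals)                   # 2 and 3, the eigenvalues of A
print(vecs)                   # columns proportional to (1, -1) and (1, -2)
[/code]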
 

1. What are eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are important concepts in linear algebra used to analyze and understand linear transformations. An eigenvalue is a scalar that describes how much a linear transformation stretches or shrinks certain vectors, and an eigenvector is a corresponding nonzero vector that the transformation merely scales by that eigenvalue.

2. How do you find eigenvalues and eigenvectors?

Eigenvalues and eigenvectors can be found by solving the characteristic equation of a given matrix. The characteristic equation is formed by setting the determinant of the matrix minus a scalar multiple of the identity, det(A − λI), equal to zero. The eigenvalues are the solutions of this equation, and the corresponding eigenvectors are found by substituting each eigenvalue back in and solving the system (A − λI)v = 0 for nonzero v.
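
In practice the computation is usually delegated to a library routine; a minimal sketch in Python with numpy (the matrix here is an arbitrary example):

[code]
import numpy as np

M = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are
# the corresponding (normalized) eigenvectors.
values, vectors = np.linalg.eig(M)
print(values)        # [2. 3.]
print(vectors)       # columns (1, 0) and (0, 1)

# Each pair satisfies M v = lambda v, up to rounding error.
for lam, v in zip(values, vectors.T):
    assert np.allclose(M @ v, lam * v)
[/code]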

3. What is the significance of eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are important because they provide a way to understand how a linear transformation changes a vector. They can also be used to simplify matrix operations and solve systems of differential equations.

4. Can a matrix have more than one eigenvalue?

Yes, a matrix can have more than one eigenvalue. Counted with multiplicity (and allowing complex values), an n×n matrix has exactly n eigenvalues, although some of them may be repeated.
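
For instance (a small sketch in Python with numpy, using an arbitrary matrix), the matrix below has the single eigenvalue 5 with algebraic multiplicity two:

[code]
import numpy as np

M = np.array([[5.0, 1.0],
              [0.0, 5.0]])
print(np.linalg.eigvals(M))   # [5. 5.]  (the eigenvalue 5, repeated)
[/code]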

5. How are eigenvalues and eigenvectors used in data analysis?

Eigenvalues and eigenvectors are commonly used in data analysis, particularly in principal component analysis (PCA). In PCA, the eigenvectors of the data's covariance matrix define a new coordinate system, and the eigenvectors with the largest eigenvalues point along the directions of greatest variance. Projecting the data onto those leading directions allows dimensionality reduction and identification of the most important features in the data.
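
A minimal sketch of that procedure in Python with numpy (random data and a hand-rolled eigendecomposition of the covariance matrix; real analyses typically use a library implementation of PCA):

[code]
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))           # 200 samples, 3 features
X = X - X.mean(axis=0)                  # center the data

cov = np.cov(X, rowvar=False)           # 3x3 covariance matrix
values, vectors = np.linalg.eigh(cov)   # eigh: covariance is symmetric

# Sort directions by decreasing eigenvalue (variance explained).
order = np.argsort(values)[::-1]
components = vectors[:, order]

# Keep the two leading principal directions and project the data.
X_reduced = X @ components[:, :2]
print(X_reduced.shape)                  # (200, 2)
[/code]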
