Linear Algebra - Characteristic polynomials and similar matrices question

AI Thread Summary
The discussion revolves around finding the algebraic and geometric multiplicities of eigenvalues for a given matrix A and its relationship with a similar matrix B. Participants clarify that similar matrices share eigenvalues but not necessarily eigenvectors, emphasizing the need to establish similarity through an invertible matrix P. The process of calculating eigenvalues is debated, particularly the order of operations when dealing with row reduction and the subtraction of lambda. It is confirmed that lambda must be subtracted first before any row reduction to avoid incorrect results. The conversation highlights the complexities involved in eigenvalue calculations, especially for 3x3 matrices.
zeion

Homework Statement



For each matrix A below, let T be the linear operator on R3 that has matrix A relative to the basis A = {(1,0,0), (1,1,0), (1,1,1)}. Find the algebraic and geometric multiplicities of each eigenvalue, and a basis for each eigenspace.

a) A = \begin{bmatrix} 8 & 5 & -5 \\ 5 & 8 & -5 \\ 15 & 15 & -12 \end{bmatrix}

Homework Equations


The Attempt at a Solution



So I tried to find the eigenvalues directly, and it turned out to be pretty hard. I know that similar matrices have the same eigenvalues, so can I just take the eigenvalues of the matrix \begin{bmatrix} 1 & 1 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{bmatrix} since it is similar to A? Or is it similar?
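A quick numeric check (my own sketch with NumPy, not part of the original thread) makes the answer concrete: similar matrices must share eigenvalues, and the spectra of A and the upper-triangular matrix differ, so they cannot be similar.

```python
import numpy as np

A = np.array([[8, 5, -5],
              [5, 8, -5],
              [15, 15, -12]], dtype=float)
U = np.array([[1, 1, 1],
              [0, 1, 1],
              [0, 0, 1]], dtype=float)

# Similar matrices share eigenvalues, so differing spectra rule out similarity.
print(sorted(np.linalg.eigvals(A).real))  # eigenvalues of A: -2, 3, 3
print(sorted(np.linalg.eigvals(U).real))  # [1.0, 1.0, 1.0] for the triangular matrix
```

Since U is triangular, its eigenvalues are its diagonal entries (all 1), while A has eigenvalues 3, 3, and -2.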
 
The eigenvalues of similar matrices are the same but the eigenvectors may be different.
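The relationship between the eigenvectors is also explicit: if B = P^{-1}AP and Bv = \lambda v, then A(Pv) = \lambda(Pv), so P maps eigenvectors of B to eigenvectors of A. A sketch with SymPy (my own illustration, using the matrix A and the basis matrix P from this thread):

```python
import sympy as sp

A = sp.Matrix([[8, 5, -5], [5, 8, -5], [15, 15, -12]])
P = sp.Matrix([[1, 1, 1], [0, 1, 1], [0, 0, 1]])
B = P.inv() * A * P

# If B*v = lam*v, then A*(P*v) = lam*(P*v): P carries eigenvectors of B
# to eigenvectors of A for the same eigenvalue.
for lam_val, mult, vecs in B.eigenvects():
    for v in vecs:
        assert sp.simplify(A * (P * v) - lam_val * (P * v)) == sp.zeros(3, 1)
```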
 
The second matrix isn't similar to A. Two matrices A and B are similar if you can write

B=P^{-1}AP

for some invertible matrix P. You could use your matrix with the basis-vector columns as P.
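As a sketch of that computation (mine, not from the thread), SymPy can form B = P^{-1}AP with the basis vectors as the columns of P and confirm that the characteristic polynomial is unchanged; B's first row comes out as (3, 0, 0), which makes expanding the determinant easy by hand.

```python
import sympy as sp

A = sp.Matrix([[8, 5, -5], [5, 8, -5], [15, 15, -12]])
P = sp.Matrix([[1, 1, 1], [0, 1, 1], [0, 0, 1]])  # basis vectors as columns
B = P.inv() * A * P

lam = sp.symbols('lambda')
# Similarity preserves the characteristic polynomial, so A and B have the
# same eigenvalues with the same algebraic multiplicities.
assert sp.expand(A.charpoly(lam).as_expr() - B.charpoly(lam).as_expr()) == 0

print(B)                                     # first row is (3, 0, 0)
print(sp.factor(B.charpoly(lam).as_expr()))  # (lambda - 3)^2 (lambda + 2)
```

Expanding det(\lambda I - B) along that first row gives (\lambda - 3)(\lambda^2 - \lambda - 6) = (\lambda - 3)^2(\lambda + 2), so the eigenvalues are 3 (algebraic multiplicity 2) and -2.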
 
I calculated B but that doesn't seem to make finding eigenvalues any easier..?
 
You just need to work it out. It's only a 3x3 matrix after all.
 
Okay, nice; it seems it was just because I made a mistake in calculating the inverse of P.
It turns out it's much easier to find the eigenvalues of B.

...or not. I don't understand why there's suddenly such a computational question when nothing else has been this hard.
 
Can I row-reduce a matrix before subtracting lambda and then find the determinant? Or do I have to subtract lambda first?
 
You have to subtract \lambda first. Think about it. You can reduce any invertible matrix to the identity matrix. If you then subtracted \lambda, all the eigenvalues would be 1, which is obviously not the case for every invertible matrix.
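That argument can be seen directly in a small SymPy experiment (my own sketch, assuming the matrix A from this thread): A is invertible, so row reduction turns it into the identity, and subtracting \lambda afterwards yields (1 - \lambda)^3 instead of the true characteristic polynomial.

```python
import sympy as sp

A = sp.Matrix([[8, 5, -5], [5, 8, -5], [15, 15, -12]])
lam = sp.symbols('lambda')

# Correct order: subtract lambda first, then take the determinant.
correct = (A - lam * sp.eye(3)).det()

# Wrong order: A is invertible, so its rref is the identity, and the
# "characteristic polynomial" collapses to (1 - lambda)^3.
wrong = (A.rref()[0] - lam * sp.eye(3)).det()

print(sp.roots(correct, lam))  # roots 3 (twice) and -2
print(sp.roots(wrong, lam))    # root 1 with multiplicity 3
```

Row operations change the determinant of A - \lambda I as a polynomial in \lambda, which is why they must wait until after the subtraction.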
 
