# Linear Algebra - Characteristic polynomials and similar matrices question

1. Feb 17, 2010

### zeion

1. The problem statement, all variables and given/known data

For each matrix A below, let T be the linear operator on R3 that has matrix A relative to the basis A = {(1,0,0), (1,1,0), (1,1,1)}. Find the algebraic and geometric multiplicities of each eigenvalue, and a basis for each eigenspace.

a) A = $$\begin{bmatrix} 8&5&-5\\5&8&-5\\15&15&-12\end{bmatrix}$$

2. Relevant equations

3. The attempt at a solution

I tried to find the eigenvalues directly, and that turned out to be pretty hard. I know that similar matrices have the same eigenvalues, so can I just take the eigenvalues of the matrix $$\begin{bmatrix} 1&1&1\\0&1&1\\0&0&1\end{bmatrix}$$

since it is similar to A? Or is it similar?

2. Feb 17, 2010

### VeeEight

The eigenvalues of similar matrices are the same but the eigenvectors may be different.

3. Feb 17, 2010

### vela

Staff Emeritus
The second matrix isn't similar to A. Two matrices A and B are similar if you can write

$$B=P^{-1}AP$$

for some invertible matrix P. You could use your matrix with the basis-vector columns as P.
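A quick numerical check of this (a NumPy sketch, not part of the assignment; A and the basis vectors are taken from the problem statement):

```python
import numpy as np

# A as given in the problem statement
A = np.array([[8, 5, -5],
              [5, 8, -5],
              [15, 15, -12]], dtype=float)

# P's columns are the basis vectors (1,0,0), (1,1,0), (1,1,1)
P = np.array([[1, 1, 1],
              [0, 1, 1],
              [0, 0, 1]], dtype=float)

# B = P^{-1} A P is similar to A by construction
B = np.linalg.inv(P) @ A @ P

# Similar matrices share eigenvalues (though generally not eigenvectors)
print(np.sort(np.linalg.eigvals(A).real))
print(np.sort(np.linalg.eigvals(B).real))  # same values as for A
```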

4. Feb 17, 2010

### zeion

I calculated B, but that doesn't seem to make finding the eigenvalues any easier..?

5. Feb 17, 2010

### vela

Staff Emeritus
You just need to work it out. It's only a 3x3 matrix after all.
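For reference, the hand computation can be sanity-checked numerically (a NumPy sketch; the entries of A are copied from the problem statement):

```python
import numpy as np

A = np.array([[8, 5, -5],
              [5, 8, -5],
              [15, 15, -12]], dtype=float)

# Characteristic polynomial coefficients of A:
# lambda^3 - 4 lambda^2 - 3 lambda + 18 = (lambda - 3)^2 (lambda + 2)
print(np.poly(A))  # [1, -4, -3, 18] up to rounding

# Geometric multiplicity of lambda = dim ker(A - lambda*I)
#                                  = 3 - rank(A - lambda*I)
for lam in (3.0, -2.0):
    geo = 3 - np.linalg.matrix_rank(A - lam * np.eye(3))
    print(lam, geo)
```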

6. Feb 17, 2010

### zeion

Okay, nice. It seems I just made a mistake when calculating the inverse of P, and it turns out it's much easier to find the eigenvalues of B.

...or not. I don't understand why there is suddenly such a computational question when nothing else has been this hard..

Last edited: Feb 17, 2010
7. Feb 18, 2010

### zeion

Can I row-reduce a matrix before subtracting lambda and then find the determinant? Or do I have to subtract lambda first?

8. Feb 18, 2010

### vela

Staff Emeritus
You have to subtract $\lambda$ first. Think about it. You can reduce any invertible matrix to the identity matrix. If you then subtracted $\lambda$, all the eigenvalues would be 1, which is obviously not the case for every invertible matrix.
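A small numerical illustration of why the order matters (a sketch; that 3 is an eigenvalue of A can be verified directly from the characteristic polynomial):

```python
import numpy as np

A = np.array([[8, 5, -5],
              [5, 8, -5],
              [15, 15, -12]], dtype=float)

# Correct order: subtract lambda*I first, then take the determinant.
# det(A - 3I) = 0, which confirms 3 is an eigenvalue of A.
print(np.linalg.det(A - 3 * np.eye(3)))

# A is invertible (det(A) = -18), so row reduction turns it into the
# identity I. Then det(I - 3I) = (1 - 3)^3 = -8 != 0: the eigenvalue
# information has been destroyed by reducing first.
print(np.linalg.det(np.eye(3) - 3 * np.eye(3)))
```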