# Linear algebra, basis, diagonal matrix

1. Sep 4, 2011

### fluidistic

1. The problem statement, all variables and given/known data
Write the A matrix and the x vector into a basis in which A is diagonal.
$A=\begin{pmatrix} 0&-i&0&0&0 \\ i&0&0&0&0 \\ 0&0&3&0&0 \\ 0&0&0&1&-i \\ 0&0&0&i&-1 \end{pmatrix}$.
$x=\begin{pmatrix} 1 \\ a \\ i \\ b \\ -1 \end{pmatrix}$.

2. Relevant equations
$A = P^{-1}A'P$.

3. The attempt at a solution
I found the eigenvalues (the spectrum, in fact) to be $\sigma (A) = \{ -1,0,0,1,3 \}$.
I'm happy they told me A is diagonalizable; so I can avoid finding the Jordan form of A.
So I know what A' is. I think that in order to find P, I must find the eigenvectors associated with each eigenvalue; P would be the matrix whose columns are those eigenvectors. With $\lambda = -1$, I found $v_1=\begin{pmatrix} -i \\ i \\ 3 \\ 4-i \\ i+1 \end{pmatrix}$. But for $\lambda = 0$ I reach the null vector as the eigenvector, which I know is impossible.
Am I doing something wrong?
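
A hand computation like this is easy to cross-check numerically. A minimal sketch, assuming numpy is available (the loop just confirms that each returned pair satisfies $Av = \lambda v$):

```python
import numpy as np

# The matrix A from the problem statement
A = np.array([
    [0, -1j, 0,  0,   0],
    [1j, 0,  0,  0,   0],
    [0,  0,  3,  0,   0],
    [0,  0,  0,  1, -1j],
    [0,  0,  0, 1j,  -1],
], dtype=complex)

# Numerical eigenvalues and eigenvectors (columns of `vecs`)
vals, vecs = np.linalg.eig(A)

# Every returned pair must satisfy A v = lambda v
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)

print(np.sort(vals.real))  # compare against the spectrum found by hand
```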

2. Sep 4, 2011

### I like Serena

Hi fluidistic!

What you're doing is right.
But for $\lambda=0$ you can still find a vector that is not the null vector.
Actually there should be 2 linearly independent eigenvectors.
They span the kernel of A.

3. Sep 4, 2011

### fluidistic

Ah true, I had made an arithmetic error. If there were not 2 linearly independent eigenvectors for $\lambda=0$, then the matrix A wouldn't be diagonalizable.

The basis I found is $P=B=\begin{pmatrix} 1&0&0&-i&0 \\ -i&0&0&i&0 \\ 0&0&0&3&-1 \\ 0&-i&-i&4-i&1 \\ 0&1&5&1+i& \frac{3}{5} \end{pmatrix}$.
What they ask for is $A'=BAB^{-1}$, which (if I didn't make any errors) should be just a diagonal matrix whose entries are the eigenvalues I found. And $x'=Bx$.
Am I right?
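
One detail worth pinning down here is the convention: textbooks differ on whether the eigenvectors sit in the columns of $P$ or of $P^{-1}$. With eigenvectors as columns of $P$, the relation $AP = PD$ gives $D = P^{-1}AP$, and the new coordinates are $x' = P^{-1}x$. A quick way to avoid the trap is to check numerically which product comes out diagonal; a sketch assuming numpy, using just the top-left 2×2 block of the problem's matrix:

```python
import numpy as np

# The top-left 2x2 block of A from the problem
M = np.array([[0, -1j],
              [1j,  0]], dtype=complex)

# eigh: eigenvalues in ascending order, eigenvectors as COLUMNS of P
w, P = np.linalg.eigh(M)

# With eigenvectors as columns, M P = P D, so P^{-1} M P is diagonal
D = np.linalg.inv(P) @ M @ P
assert np.allclose(D, np.diag(w))

# Coordinates in the eigenbasis: x = P x'  =>  x' = P^{-1} x
x = np.array([1.0, 1.0j])
x_prime = np.linalg.inv(P) @ x
assert np.allclose(P @ x_prime, x)
```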

4. Sep 5, 2011

### I like Serena

Didn't check for mistakes, but yes, you're right.

5. Sep 5, 2011

### lanedance

Also not required, but here are a couple of extra tips:

- As A is block diagonal, you can take a shortcut and find the eigenvalues & eigenvectors of the 3 small blocks that make up A. These give eigenvectors of A with zeroes in the other entries, and they'll be simpler than the ones you have.

- It's always good to check by multiplying the matrix by the eigenvectors you found, and in fact I think you need to do that here. The eigenvectors you have don't look right at a glance.

- If your eigenvectors all turn out to be orthogonal and you normalise them, then finding the inverse of P is simple: it's the conjugate transpose (P is unitary).

- I think A is a normal matrix ($A^*A=AA^*$); if so, that guarantees the eigenspaces corresponding to distinct eigenvalues are orthogonal.
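
The last two tips are easy to verify together; a sketch assuming numpy, whose `eigh` routine returns orthonormal eigenvectors for Hermitian input:

```python
import numpy as np

# The matrix A from the problem statement
A = np.array([
    [0, -1j, 0,  0,   0],
    [1j, 0,  0,  0,   0],
    [0,  0,  3,  0,   0],
    [0,  0,  0,  1, -1j],
    [0,  0,  0, 1j,  -1],
], dtype=complex)

# A is normal (in fact Hermitian): A* A == A A*
assert np.allclose(A.conj().T @ A, A @ A.conj().T)

# eigh returns orthonormal eigenvectors as the columns of P, so P is unitary
w, P = np.linalg.eigh(A)
assert np.allclose(P.conj().T @ P, np.eye(5))

# The inverse of a unitary P is just its conjugate transpose
assert np.allclose(P.conj().T @ A @ P, np.diag(w))
```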

Last edited: Sep 5, 2011

6. Sep 5, 2011

### I like Serena

Note that $A=A^*$ (A is equal to its conjugate transpose).
The so-called spectral theorem tells us that this means A is diagonalizable.

7. Sep 5, 2011

### lanedance

So A is actually a Hermitian matrix, which guarantees real eigenvalues and diagonalisability.

Hermitian matrices are also a subset of normal matrices, which guarantees diagonalisability by a unitary matrix.

Last edited: Sep 5, 2011

8. Sep 5, 2011

### vela

Staff Emeritus
I'll second lanedance's recommendation to check your eigenvectors. You should be able to see by inspection that (0, 0, 1, 0, 0) is an eigenvector of A.

I also found different eigenvalues for A, namely ±1, ±√2, and 3, so you should recheck that calculation first.
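
Both of these checks take only a few lines numerically; a sketch assuming numpy:

```python
import numpy as np

# The matrix A from the problem statement
A = np.array([
    [0, -1j, 0,  0,   0],
    [1j, 0,  0,  0,   0],
    [0,  0,  3,  0,   0],
    [0,  0,  0,  1, -1j],
    [0,  0,  0, 1j,  -1],
], dtype=complex)

# (0, 0, 1, 0, 0) picks out the 1x1 block, so it is an eigenvector for 3
e3 = np.array([0, 0, 1, 0, 0], dtype=complex)
assert np.allclose(A @ e3, 3 * e3)

# Spectrum of the Hermitian A, in ascending order
print(np.linalg.eigvalsh(A))  # ≈ [-1.4142, -1, 1, 1.4142, 3]
```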