Linear algebra, basis, diagonal matrix

Summary
The discussion revolves around diagonalizing a given matrix A and finding its eigenvalues and eigenvectors. The eigenvalues identified are -1, 0, 1, and 3, with a focus on obtaining the corresponding eigenvectors to construct the matrix P. There is a consensus that for the eigenvalue 0, two linearly independent eigenvectors should exist, which is essential for A's diagonalizability. Additional insights suggest checking the eigenvectors for correctness and leveraging the properties of normal and Hermitian matrices to simplify the process. The final goal is to express A in a diagonal form and transform the vector x accordingly.
fluidistic

Homework Statement


Express the matrix A and the vector x in a basis in which A is diagonal.
A=\begin{pmatrix} 0&-i&0&0&0 \\ i&0&0&0&0 \\ 0&0&3&0&0 \\ 0&0&0&1&-i \\ 0&0&0&i&-1 \end{pmatrix}.
x=\begin{pmatrix} 1 \\ a \\ i \\ b \\ -1 \end{pmatrix}.

Homework Equations


A=P^{-1}A'P.


The Attempt at a Solution


I found the eigenvalues (the spectrum, in fact) to be \sigma (A) = \{ -1,0,0,1,3 \}.
I'm glad they told me A is diagonalizable, so I can avoid finding the Jordan form of A.
So I know what A' is. I think that in order to find P, I must find the eigenvectors associated with each eigenvalue. P would be the matrix whose columns are the eigenvectors found. So with \lambda = -1, I found v_1=\begin{pmatrix} -i \\ i \\ 3 \\ 4-i \\ i+1 \end{pmatrix}. But for \lambda =0 I reach the null vector as eigenvector, which I know is impossible.
Am I doing something wrong?
 
Hi fluidistic! :smile:

What you're doing is right.
But for lambda=0 you can still find a vector that is not the null vector.
Actually there should be 2 linearly independent eigenvectors.
They span the kernel of A.
 
I like Serena said:
Hi fluidistic! :smile:

What you're doing is right.
But for lambda=0 you can still find a vector that is not the null vector.
Actually there should be 2 linearly independent eigenvectors.
They span the kernel of A.
Ah, true, I had made an arithmetic error. If there were not 2 linearly independent eigenvectors for lambda=0, then the matrix A wouldn't be diagonalizable.

The basis I found is P=B=\begin{pmatrix} 1&0&0&-i&0 \\ -i&0&0&i&0 \\ 0&0&0&3&-1 \\ 0&-i&-i&4-i&1 \\ 0&1&5&1+i& \frac{3}{5} \end{pmatrix}.
What they ask for is A'=BAB^{-1}, which (if I didn't make any errors) should just be a diagonal matrix whose entries are the eigenvalues I found. And x'=Bx.
Am I right?
 
Didn't check for mistakes, but yes, you're right.
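One thing worth pinning down here is the convention: if the columns of P are the eigenvectors, then AP = PD, so the diagonal form comes out as P^{-1}AP, and the coordinates of x in the eigenbasis are P^{-1}x. A minimal numerical sketch of that check, assuming numpy and the matrix A as posted:

```python
import numpy as np

# Matrix A from the problem statement.
A = np.array([
    [0, -1j, 0, 0,  0],
    [1j, 0,  0, 0,  0],
    [0,  0,  3, 0,  0],
    [0,  0,  0, 1, -1j],
    [0,  0,  0, 1j, -1],
])

# numpy returns the eigenvectors as the *columns* of P.
eigvals, P = np.linalg.eig(A)

# With that convention A P = P D, so the diagonal form is P^{-1} A P.
D = np.linalg.inv(P) @ A @ P

# The off-diagonal entries should vanish (up to floating-point noise),
# and the diagonal should reproduce the eigenvalues.
off_diag = D - np.diag(np.diag(D))
print(np.max(np.abs(off_diag)))              # a value near machine epsilon
print(np.allclose(np.diag(D), eigvals))      # True
```

The same P^{-1} then transforms x into the new basis.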
 
Also not required, but a couple of extra tips:

- As A is block diagonal, you can take a shortcut and find the eigenvalues & eigenvectors of the 3 small blocks that make up A. These give eigenvectors of A with zeros in the other entries, and will lead to simpler eigenvectors than the ones you have.

- It's always good to check by multiplying the matrix by the eigenvector you found, and in fact I think you need to do that here. The eigenvectors you have don't look right at a glance.

- If all your eigenvectors turn out to be orthogonal and you normalise them, then finding the inverse of P is simple: it's the conjugate transpose (P is unitary).

- I think A is a normal matrix (A*A=AA*); if so, it guarantees that eigenspaces corresponding to distinct eigenvalues are orthogonal.
 
Note that A=A* (A is equal to its conjugate transpose).
The so-called spectral theorem tells us that this means that A is diagonalizable.
 
I like Serena said:
Note that A=A* (A is equal to its conjugate transpose).
The so called spectral theorem tells us that this means that A is diagonalizable.
So A is actually a Hermitian matrix, which guarantees real eigenvalues and diagonalisability.

Hermitian matrices are also a subset of normal matrices, which guarantees diagonalisability by a unitary matrix.
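These two claims are easy to verify numerically. A sketch, assuming numpy and the matrix A as posted (for a Hermitian input, `numpy.linalg.eigh` returns real eigenvalues and a unitary eigenvector matrix):

```python
import numpy as np

A = np.array([
    [0, -1j, 0, 0,  0],
    [1j, 0,  0, 0,  0],
    [0,  0,  3, 0,  0],
    [0,  0,  0, 1, -1j],
    [0,  0,  0, 1j, -1],
])

# Hermitian: A equals its own conjugate transpose.
print(np.allclose(A, A.conj().T))  # True

# eigh is the specialised routine for Hermitian matrices: it returns
# real eigenvalues and an eigenvector matrix P that is unitary,
# so P^{-1} is just P* (the conjugate transpose).
eigvals, P = np.linalg.eigh(A)
print(np.allclose(eigvals.imag, 0))             # eigenvalues are real
print(np.allclose(P.conj().T @ P, np.eye(5)))   # P is unitary
```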
 
I'll second lanedance's recommendation to check your eigenvectors. You should be able to see by inspection that (0, 0, 1, 0, 0) is an eigenvector of A.

I also found different eigenvalues for A, namely ±1, ±√2, and 3, so you should recheck that calculation first.
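For the matrix exactly as posted, those figures can be cross-checked numerically (a sketch, assuming numpy; `eigvalsh` returns the eigenvalues of a Hermitian matrix, real and in ascending order):

```python
import numpy as np

A = np.array([
    [0, -1j, 0, 0,  0],
    [1j, 0,  0, 0,  0],
    [0,  0,  3, 0,  0],
    [0,  0,  0, 1, -1j],
    [0,  0,  0, 1j, -1],
])

# Eigenvalues of a Hermitian matrix, sorted ascending.
eigvals = np.linalg.eigvalsh(A)

# The values suggested above: +/-1 from the first 2x2 block,
# +/-sqrt(2) from the last 2x2 block, and 3 from the 1x1 block.
expected = np.array([-np.sqrt(2), -1.0, 1.0, np.sqrt(2), 3.0])
print(np.allclose(eigvals, expected))  # True
```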
 