Linear algebra, basis, diagonal matrix

Homework Help Overview

The discussion revolves around diagonalizing a given matrix A and finding the corresponding basis vectors. The matrix A is a 5x5 complex matrix, and the original poster is tasked with expressing it in a diagonal form using a basis derived from its eigenvalues and eigenvectors.

Discussion Character

  • Exploratory, Assumption checking, Mathematical reasoning

Approaches and Questions Raised

  • The original poster attempts to find the eigenvalues and eigenvectors of matrix A, noting that they have identified the eigenvalues but are struggling with the eigenvector for the eigenvalue of zero. Some participants suggest that there should be two linearly independent eigenvectors associated with the eigenvalue of zero, prompting a reevaluation of the calculations.

Discussion Status

Participants are actively engaging with the original poster's approach, providing affirmations and suggestions for checking calculations. There is a recognition of the need to verify the eigenvectors and eigenvalues, with some participants offering additional tips on simplifying the process and confirming the properties of the matrix A.

Contextual Notes

There is mention of the spectral theorem and properties of hermitian matrices, which are relevant to the discussion of diagonalizability. The original poster's calculations are under scrutiny, particularly regarding the eigenvalues and eigenvectors, with some discrepancies noted by other participants.

fluidistic

Homework Statement


Write the A matrix and the x vector into a basis in which A is diagonal.
[itex]A=\begin{pmatrix} 0&-i&0&0&0 \\ i&0&0&0&0 \\ 0&0&3&0&0 \\ 0&0&0&1&-i \\ 0&0&0&i&-1 \end{pmatrix}[/itex].
[itex]x=\begin{pmatrix} 1 \\ a \\ i \\ b \\ -1 \end{pmatrix}[/itex].

Homework Equations


[itex]A=P^{-1}A'P[/itex].


The Attempt at a Solution


I found out the eigenvalues (spectra in fact) to be [itex]\sigma (A) = \{ -1,0,0,1,3 \}[/itex].
I'm happy they told me A is diagonalizable; so I can avoid finding the Jordan form of A.
So I know what A' is. I think that in order to find P, I must find the eigenvector associated with each eigenvalue. P would be the matrix whose columns are the eigenvectors found. So with [itex]\lambda = -1[/itex], I found [itex]v_1=\begin{pmatrix} -i \\ i \\ 3 \\ 4-i \\ i+1 \end{pmatrix}[/itex]. But for [itex]\lambda =0[/itex] I reach the null vector as eigenvector, which I know is impossible.
Am I doing something wrong?
 
Hi fluidistic! :smile:

What you're doing is right.
But for lambda=0 you can still find a vector that is not the null vector.
Actually there should be 2 linearly independent eigenvectors.
They span the kernel of A.
 
I like Serena said:
Hi fluidistic! :smile:

What you're doing is right.
But for lambda=0 you can still find a vector that is not the null vector.
Actually there should be 2 linearly independent eigenvectors.
They span the kernel of A.
Ah true, I had made an arithmetic error. If there were not 2 linearly independent eigenvectors for lambda=0, then the matrix A wouldn't be diagonalizable.

The basis I found is [itex]P=B=\begin{pmatrix} 1&0&0&-i&0 \\ -i&0&0&i&0 \\ 0&0&0&3&-1 \\ 0&-i&-i&4-i&1 \\ 0&1&5&1+i& \frac{3}{5} \end{pmatrix}[/itex].
What they ask for is [itex]A'=BAB^{-1}[/itex] which is (should be if I didn't make any error) just a diagonal matrix whose entries are the eigenvalues I found. And x'=Bx.
Am I right?
 
Didn't check for mistakes, but yes, you're right.
 
Also not required, but a couple of extra tips:

- As A is block diagonal, you can shortcut and find the eigenvalues & eigenvectors of the 3 small blocks that make up A. These will be eigenvectors of A with zeroes in the other entries and will be simpler than the ones you have.

- It is always good to check by multiplying the matrix by each eigenvector you found, and in fact I think you need to do that here. The eigenvectors you have don't look right at a glance.

- If all your eigenvectors turn out to be orthogonal and you normalise them, then finding the inverse of P is simple: it is the conjugate transpose (P is unitary).

- I think A is a normal matrix (A*A=AA*); if so, it guarantees that all eigenspaces corresponding to distinct eigenvalues are orthogonal.
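The block shortcut in the first tip can be checked numerically. A minimal sketch (assuming NumPy is available; the thread itself contains no code): the spectrum of A is the union of the spectra of its three diagonal blocks, and each block eigenvector, padded with zeroes, is an eigenvector of A.

```python
# Sketch, assuming NumPy: eigenvalues of a block-diagonal matrix are
# the union of the eigenvalues of its blocks.
import numpy as np

top    = np.array([[0, -1j], [1j, 0]])   # rows/cols 1-2 of A
middle = np.array([[3.0 + 0j]])          # row/col 3 of A
bottom = np.array([[1, -1j], [1j, -1]])  # rows/cols 4-5 of A

A = np.zeros((5, 5), dtype=complex)
A[:2, :2] = top
A[2:3, 2:3] = middle
A[3:, 3:] = bottom

# Each block is Hermitian, so eigvalsh applies and returns real eigenvalues.
block_eigs = np.concatenate([np.linalg.eigvalsh(top),
                             np.linalg.eigvalsh(middle),
                             np.linalg.eigvalsh(bottom)])
full_eigs = np.linalg.eigvalsh(A)

print(np.allclose(np.sort(block_eigs), np.sort(full_eigs)))  # same multiset
```

Working blockwise also keeps the eigenvector arithmetic to 2x2 systems instead of a 5x5 one.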
 
Note that A=A* (A is equal to its conjugate transpose).
The so-called spectral theorem tells us that this means that A is diagonalizable.
 
I like Serena said:
Note that A=A* (A is equal to its conjugate transpose).
The so called spectral theorem tells us that this means that A is diagonalizable.
So A is actually a Hermitian matrix, which guarantees real eigenvalues and diagonalisability.

Hermitian matrices are also a subset of normal matrices, which guarantees diagonalisability by a unitary matrix.
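The Hermitian and normality claims above are mechanical to verify. A minimal numerical sketch (assuming NumPy; not part of the original thread):

```python
# Sketch, assuming NumPy: confirm A = A* (Hermitian), hence normal,
# so the spectral theorem gives real eigenvalues and a unitary diagonalizer.
import numpy as np

A = np.array([[0, -1j, 0, 0,  0],
              [1j, 0,  0, 0,  0],
              [0,  0,  3, 0,  0],
              [0,  0,  0, 1, -1j],
              [0,  0,  0, 1j, -1]])

Astar = A.conj().T
print(np.allclose(A, Astar))                      # Hermitian: A equals A*
print(np.allclose(A @ Astar, Astar @ A))          # normal: A*A = AA*
print(np.allclose(np.linalg.eigvals(A).imag, 0))  # spectrum is real
```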
 
I'll second lanedance's recommendation to check your eigenvectors. You should be able to see by inspection that (0, 0, 1, 0, 0) is an eigenvector of A.

I also found different eigenvalues for A, namely ±1, ±√2, and 3, so you should recheck that calculation first.
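Both of Mark44's checks can be done numerically. A minimal sketch (assuming NumPy; hypothetical helper code, not from the thread): verify that (0, 0, 1, 0, 0) is an eigenvector and recompute the spectrum.

```python
# Sketch, assuming NumPy: recheck that e3 = (0,0,1,0,0) is an eigenvector
# and that the spectrum is ±1, ±√2, 3 rather than {-1, 0, 0, 1, 3}.
import numpy as np

A = np.array([[0, -1j, 0, 0,  0],
              [1j, 0,  0, 0,  0],
              [0,  0,  3, 0,  0],
              [0,  0,  0, 1, -1j],
              [0,  0,  0, 1j, -1]])

e3 = np.array([0, 0, 1, 0, 0])
print(np.allclose(A @ e3, 3 * e3))     # e3 is an eigenvector, eigenvalue 3

vals = np.sort(np.linalg.eigvalsh(A))  # A is Hermitian, so eigvalsh applies
print(vals)                            # approximately -√2, -1, 1, √2, 3
```

Since 0 does not appear in this spectrum, the earlier hunt for eigenvectors with lambda=0 was chasing an arithmetic slip, exactly as suggested.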
 