MHB Shankar - Simultaneous Diagonalisation of Hermitian Matrices

Summary
The discussion revolves around the simultaneous diagonalization of two Hermitian matrices, Omega and Lambda, focusing on their eigenvalues and eigenvectors. It highlights the importance of choosing the non-degenerate matrix for determining eigenvectors, as the degenerate matrix may not provide a complete basis. The conversation also clarifies that while eigenvectors of Hermitian matrices are orthogonal, they must be normalized to form an orthonormal basis for unitary transformations. Users are encouraged to apply the Gram-Schmidt process to ensure the eigenvectors are orthonormal. Overall, the thread emphasizes the necessity of careful selection and normalization of eigenvectors in the diagonalization process.
bugatti79
Asked to determine the eigenvalues and eigenvectors common to both of these matrices:

\Omega=\begin{bmatrix}1 &0 &1 \\ 0& 0 &0 \\ 1& 0 & 1\end{bmatrix} and \Lambda=\begin{bmatrix}2 &1 &1 \\ 1& 0 &-1 \\ 1& -1 & 2\end{bmatrix}

and then to verify under a unitary transformation that both can be simultaneously diagonalised. Since Omega is degenerate and Lambda is not, you must be prudent in deciding which matrix dictates the choice of basis.

1) What does he mean by being prudent in the choice of matrix?

2) There is only one common eigenvalue, \lambda=2. Should I expect the same eigenvector for both matrices for this value of \lambda=2? Wolfram Alpha shows different eigenvectors for the same \lambda value.
eigenvector {1,0,1},{0,0,0},{1,0,1} - Wolfram|Alpha

eigenvector {2,1,1},{1,0,-1},{1,-1,2} - Wolfram|Alpha

3) To show simultaneous diagonalisation I applied the unitary transformation as

U^{\dagger} \Omega U and U^{\dagger} \Lambda U, which should diagonalise the matrices with the eigenvalues as the diagonal entries, where the columns of U are the corresponding eigenvectors.

However, Wolfram shows

{{1, 0, 1}, {-1, 0, 1}, {0, 1, 0}} * {{1,0,1},{0,0,0},{1,0,1}} * {{1,-1,0},{0,0,1},{1,1,0}} - Wolfram|Alpha

{{1, 0, 1}, {-1, -1, 1}, {-1, 2, 1}} * {{2,1,1},{1,0,-1},{1,-1,2}} * {{1, -1, -1}, {0, -1, 2}, {1, 1, 1}} - Wolfram|Alpha

Any ideas?
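For reference, a quick numerical check of the two spectra (a minimal sketch assuming Python with NumPy; the variable names are only for this snippet):

```python
import numpy as np

# The two Hermitian (real symmetric) matrices from the problem statement.
Omega = np.array([[1, 0, 1],
                  [0, 0, 0],
                  [1, 0, 1]], dtype=float)
Lam = np.array([[2, 1, 1],
                [1, 0, -1],
                [1, -1, 2]], dtype=float)

# eigvalsh is NumPy's eigenvalue routine for Hermitian/symmetric matrices.
print(np.linalg.eigvalsh(Omega))  # [0. 0. 2.]  -- degenerate (0 is repeated)
print(np.linalg.eigvalsh(Lam))    # [-1. 2. 3.] -- three distinct eigenvalues
```

Both spectra contain 2, but Omega is degenerate while Lambda is not, which is exactly the distinction the problem statement is pointing at.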
 
Couple of comments:

1. Pick the non-degenerate matrix to get your eigenvectors. You're not guaranteed that the degenerate matrix's eigenvectors will span the space and be a basis.

2. You need the transformation to be unitary, which means the eigenvectors need to be orthonormal. Use Gram-Schmidt to orthonormalize the eigenbasis.
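For concreteness, a minimal Gram-Schmidt sketch (assuming Python with NumPy; the helper name gram_schmidt is only for illustration), applied to the eigenvectors of the non-degenerate matrix:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent real vectors (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        w = np.asarray(v, dtype=float)
        # Remove the components along the vectors already accepted into the basis.
        for b in basis:
            w = w - (b @ w) * b
        basis.append(w / np.linalg.norm(w))
    return np.column_stack(basis)  # columns form an orthonormal set

# Eigenvectors of Lambda, as reported by Wolfram Alpha.
U = gram_schmidt([[1, 0, 1], [-1, -1, 1], [-1, 2, 1]])
print(np.round(U.T @ U, 10))  # identity, so U is orthogonal (unitary, since everything is real)
```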
 
Ackbach said:
Couple of comments:

1. Pick the non-degenerate matrix to get your eigenvectors. You're not guaranteed that the degenerate matrix's eigenvectors will span the space and be a basis.

2. You need the transformation to be unitary, which means the eigenvectors need to be orthonormal. Use Gram-Schmidt to orthonormalize the eigenbasis.

2) I thought that the eigenvectors of Hermitian matrices were orthogonal, as per the 4th point of the properties in the wiki link https://en.wikipedia.org/wiki/Hermitian_matrix
That's why I didn't orthogonalise them...

Eigenvectors of Hermitian Matrix
eigenvectors {{2,1,1},{1,0,-1},{1,-1,2}} - Wolfram|Alpha

Check whether the eigenvectors are orthogonal (I put the eigenvectors into a matrix):
{{1, 0, 1}, {-1, -1, 1}, {-1, 2, 1}} orthogonal - Wolfram|Alpha

Is the wiki wrong?
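As a quick check on that question, here are the pairwise dot products and lengths of those eigenvectors (a sketch assuming Python with NumPy):

```python
import numpy as np

# Eigenvectors of Lambda as reported by Wolfram Alpha.
v1, v2, v3 = (np.array(v) for v in ([1, 0, 1], [-1, -1, 1], [-1, 2, 1]))

# All pairwise dot products vanish, so the eigenvectors are mutually orthogonal.
print(v1 @ v2, v1 @ v3, v2 @ v3)  # 0 0 0

# But their lengths are sqrt(2), sqrt(3), sqrt(6), so they are not orthonormal.
print([float(np.linalg.norm(v)) for v in (v1, v2, v3)])
```

So the wiki is right that the eigenvectors are orthogonal; they just are not unit length.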
 
bugatti79 said:
2) I thought that the eigenvectors of Hermitian matrices were orthogonal, as per the 4th point of the properties in the wiki link https://en.wikipedia.org/wiki/Hermitian_matrix
That's why I didn't orthogonalise them...

Eigenvectors of Hermitian Matrix
eigenvectors {{2,1,1},{1,0,-1},{1,-1,2}} - Wolfram|Alpha

Check whether the eigenvectors are orthogonal (I put the eigenvectors into a matrix):
{{1, 0, 1}, {-1, -1, 1}, {-1, 2, 1}} orthogonal - Wolfram|Alpha

Is the wiki wrong?

Ok, fair point about orthogonality. However, orthogonal does not imply orthonormal. An orthonormal basis is orthogonal AND every vector has length 1. You need an orthonormal set of eigenvectors to form an orthogonal matrix. That's not a typo: orthogonal matrix implies the columns are orthonormal.

So your process is simpler if your original matrices are Hermitian: once you get the eigenvectors, just normalize them.
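As a concrete check of that last point, a sketch assuming Python with NumPy (for a Hermitian matrix, eigh already returns a normalized, mutually orthogonal set of eigenvectors, so only the verification remains):

```python
import numpy as np

Omega = np.array([[1, 0, 1], [0, 0, 0], [1, 0, 1]], dtype=float)
Lam = np.array([[2, 1, 1], [1, 0, -1], [1, -1, 2]], dtype=float)

# Diagonalize the non-degenerate matrix; the columns of U are orthonormal eigenvectors.
eigvals, U = np.linalg.eigh(Lam)

# The same U diagonalizes both matrices (U^dagger is just U.T here, since the entries are real).
print(np.round(U.T @ Lam @ U, 10))    # diag(-1, 2, 3)
print(np.round(U.T @ Omega @ U, 10))  # diag(0, 0, 2)
```

Both products come out diagonal, which is the simultaneous diagonalisation the problem asks you to verify.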
 
