Shankar - Simultaneous Diagonalisation of Hermitian Matrices


Discussion Overview

The discussion revolves around the simultaneous diagonalization of two Hermitian matrices, specifically the matrices \(\Omega\) and \(\Lambda\). Participants explore the implications of degeneracy in eigenvalues, the choice of basis for diagonalization, and the conditions necessary for a unitary transformation to be valid.

Discussion Character

  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant questions the meaning of being prudent in choosing which matrix dictates the basis for diagonalization, given that \(\Omega\) is degenerate while \(\Lambda\) is not.
  • Another participant suggests that the non-degenerate matrix should be used to obtain eigenvectors, as the degenerate matrix's eigenvectors may not span the space adequately.
  • One participant, believing that the eigenvectors of a Hermitian matrix are automatically orthogonal, questions why an orthonormalization step is needed at all.
  • There is a clarification that orthogonal does not imply orthonormal, and that an orthonormal basis is required for forming an orthogonal matrix.
  • Participants reference Wolfram Alpha outputs for eigenvalues and eigenvectors, noting discrepancies in expected results and prompting further inquiry into the diagonalization process.

Areas of Agreement / Disagreement

Participants initially express differing views on the choice of matrix for obtaining the eigenvectors and on the necessity of orthonormalization. The orthogonality question is resolved once it is clarified that orthogonal does not imply orthonormal; the discrepancies in the Wolfram Alpha outputs are not explicitly settled in the thread.

Contextual Notes

Limitations include the potential misunderstanding of the properties of eigenvectors of Hermitian matrices, particularly regarding orthogonality and normalization. The discussion also highlights the complexity introduced by degeneracy in eigenvalues.

bugatti79
Asked to determine the eigenvalues and eigenvectors common to both of these matrices

\[
\Omega=\begin{bmatrix}1 & 0 & 1 \\ 0 & 0 & 0 \\ 1 & 0 & 1\end{bmatrix}
\quad\text{and}\quad
\Lambda=\begin{bmatrix}2 & 1 & 1 \\ 1 & 0 & -1 \\ 1 & -1 & 2\end{bmatrix},
\]

and then to verify under a unitary transformation that both can be simultaneously diagonalised. Since \(\Omega\) is degenerate and \(\Lambda\) is not, you must be prudent in deciding which matrix dictates the choice of basis.

1) What does he mean by being prudent in the choice of matrix?

2) There is only one common eigenvalue, which is \(\lambda=2\). Should I expect the same eigenvector for both matrices for this value of \(\lambda=2\)? Wolfram Alpha shows different eigenvectors for the same \(\lambda\) value.
eigenvector {{1,0,1},{0,0,0},{1,0,1}} - Wolfram|Alpha

eigenvector {{2,1,1},{1,0,-1},{1,-1,2}} - Wolfram|Alpha

3) To show simultaneous diagonalisation I applied the unitary transformations

\(U^{\dagger} \Omega U\) and \(U^{\dagger} \Lambda U\) to diagonalise the matrices, with the diagonal entries being the eigenvalues, where the columns of \(U\) are the corresponding eigenvectors.

However, Wolfram shows

{{1, 0, 1}, {-1, 0, 1}, {0, 1, 0}} * {{1,0,1},{0,0,0},{1,0,1}} * {{1,-1,0},{0,0,1},{1,1,0}} - Wolfram|Alpha

{{1, 0, 1}, {-1, -1, 1}, {-1, 2, 1}} * {{2,1,1},{1,0,-1},{1,-1,2}} * {{1, -1, -1}, {0, -1, 2}, {1, 1, 1}} - Wolfram|Alpha

Any ideas?
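
For reference, here is a minimal NumPy sketch (not from the original post) of the setup described above: it enters the two matrices, confirms that they commute, which is the precondition for simultaneous diagonalisation, and lists the eigenvalues of each.

```python
import numpy as np

# The two matrices from the problem statement (real symmetric, hence Hermitian).
Omega = np.array([[1, 0, 1],
                  [0, 0, 0],
                  [1, 0, 1]], dtype=float)
Lam = np.array([[2, 1, 1],
                [1, 0, -1],
                [1, -1, 2]], dtype=float)

# Simultaneous diagonalisation requires the matrices to commute.
print(np.allclose(Omega @ Lam, Lam @ Omega))   # expect True

# Eigenvalues of each matrix (eigvalsh returns them in ascending order).
print(np.linalg.eigvalsh(Omega))  # degenerate spectrum: 0 appears twice
print(np.linalg.eigvalsh(Lam))    # non-degenerate spectrum: three distinct values
```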
 
Ackbach
Couple of comments:

1. Pick the non-degenerate matrix to get your eigenvectors. You're not guaranteed that the degenerate matrix's eigenvectors will span the space and be a basis.

2. You need the transformation to be unitary, which means the eigenvectors need to be orthonormal. Use Gram-Schmidt to orthonormalize the eigenbasis.
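
A minimal NumPy sketch of the procedure suggested in this reply, under the assumption that the eigenvectors are taken from the non-degenerate matrix \(\Lambda\); np.linalg.qr is used here as a stand-in for explicit Gram-Schmidt:

```python
import numpy as np

Omega = np.array([[1, 0, 1], [0, 0, 0], [1, 0, 1]], dtype=float)
Lam = np.array([[2, 1, 1], [1, 0, -1], [1, -1, 2]], dtype=float)

# 1. Take the eigenvectors of the non-degenerate matrix (columns of V).
_, V = np.linalg.eig(Lam)

# 2. Orthonormalise the columns; QR does the same job as Gram-Schmidt here.
#    (For a Hermitian matrix with distinct eigenvalues the columns are already
#    orthogonal, so this step mostly just normalises them.)
U, _ = np.linalg.qr(V)

# 3. The same U should now bring both matrices to diagonal form.
print(np.round(U.conj().T @ Omega @ U, 10))
print(np.round(U.conj().T @ Lam @ U, 10))
```

Using np.linalg.eigh instead of eig would return an orthonormal eigenbasis directly; the QR step is kept only to make the Gram-Schmidt role explicit.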
 
Ackbach said:
Couple of comments:

1. Pick the non-degenerate matrix to get your eigenvectors. You're not guaranteed that the degenerate matrix's eigenvectors will span the space and be a basis.

2. You need the transformation to be unitary, which means the eigenvectors need to be orthonormal. Use Gram-Schmidt to orthonormalize the eigenbasis.

2) I thought that the eigenvectors of Hermitian matrices were orthogonal, as per the 4th point of the properties in the wiki link https://en.wikipedia.org/wiki/Hermitian_matrix
That's why I didn't orthogonalise them...

Eigenvectors of Hermitian Matrix
eigenvectors {{2,1,1},{1,0,-1},{1,-1,2}} - Wolfram|Alpha

Check whether the eigenvectors are orthogonal (I put the eigenvectors into a matrix):
{{1, 0, 1}, {-1, -1, 1}, {-1, 2, 1}} orthogonal - Wolfram|Alpha

Is the wiki wrong?
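
As a quick numerical cross-check (my own sketch, using the vectors from the Wolfram query above): the pairwise dot products vanish, so the eigenvectors are indeed orthogonal, but their squared lengths are not 1, which is where the orthonormality issue comes in.

```python
import numpy as np

# Eigenvectors of Lambda from the Wolfram query above, one per row.
v = np.array([[ 1,  0,  1],
              [-1, -1,  1],
              [-1,  2,  1]], dtype=float)

# Gram matrix of the rows: off-diagonal entries are the pairwise dot products
# (all zero -> orthogonal), diagonal entries are the squared lengths (not 1).
print(v @ v.T)
```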
 
bugatti79 said:
2) I thought that the eigenvectors of Hermitian matrices were orthogonal, as per the 4th point of the properties in the wiki link https://en.wikipedia.org/wiki/Hermitian_matrix
That's why I didn't orthogonalise them...

Eigenvectors of Hermitian Matrix
eigenvectors {{2,1,1},{1,0,-1},{1,-1,2}} - Wolfram|Alpha

Check whether the eigenvectors are orthogonal (I put the eigenvectors into a matrix):
{{1, 0, 1}, {-1, -1, 1}, {-1, 2, 1}} orthogonal - Wolfram|Alpha

Is the wiki wrong?

Ok, fair point about orthogonality. However, orthogonal does not imply orthonormal. An orthonormal basis is orthogonal AND every vector has length 1. You need an orthonormal set of eigenvectors to form an orthogonal matrix. That's not a typo: orthogonal matrix implies the columns are orthonormal.

So your process is simpler if your original matrices are Hermitian: once you get the eigenvectors, just normalize them.
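
A short sketch of that last step, assuming the \(\Lambda\) eigenvectors quoted from Wolfram earlier in the thread: normalising each column is enough to make U orthogonal, and the same U then diagonalises both matrices.

```python
import numpy as np

Omega = np.array([[1, 0, 1], [0, 0, 0], [1, 0, 1]], dtype=float)
Lam = np.array([[2, 1, 1], [1, 0, -1], [1, -1, 2]], dtype=float)

# Eigenvectors of Lambda (as columns), orthogonal but not yet of unit length.
V = np.array([[ 1, -1, -1],
              [ 0, -1,  2],
              [ 1,  1,  1]], dtype=float)

# Divide each column by its length to get an orthonormal set.
U = V / np.linalg.norm(V, axis=0)

# U is now orthogonal, and U^T M U is diagonal for both matrices.
print(np.allclose(U.T @ U, np.eye(3)))
print(np.round(U.T @ Omega @ U, 10))
print(np.round(U.T @ Lam @ U, 10))
```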
 
