Is this matrix Hermitian, and are its eigenvectors orthogonal?

In summary, the conversation examines a specific 3×3 matrix and its properties, in particular whether it is Hermitian and whether its eigenvectors are orthogonal. The participants also work through finding eigenvectors for specific eigenvalues and give guidance on solving the resulting systems of linear equations.
  • #1
bugatti79
I calculate

1) ##\Omega=
\begin{bmatrix}
1 & 3 &1 \\
0 & 2 &0 \\
0& 1 & 4
\end{bmatrix}## as not Hermitian since ##\Omega\ne\Omega^{\dagger}##, where ##\Omega^{\dagger}=(\Omega^T)^*##

2) ##\Omega\Omega^{T}\ne I## implies eigenvectors are not orthogonal.

Is this correct?
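A quick numerical sanity check of both claims (a sketch, assuming NumPy is available; the variable name `Omega` is just for illustration):

```python
import numpy as np

# The matrix from the problem statement
Omega = np.array([[1, 3, 1],
                  [0, 2, 0],
                  [0, 1, 4]])

# 1) Hermitian check: Omega should equal its conjugate transpose
print(np.allclose(Omega, Omega.conj().T))        # False, so Omega is not Hermitian

# 2) The unitarity-style check: does Omega @ Omega.T give the identity?
print(np.allclose(Omega @ Omega.T, np.eye(3)))   # False
```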
 
  • #2
The matrix ##M = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 3 \end{bmatrix}## does not satisfy ##M M^\dagger = I##, but its eigenvectors are orthogonal.
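A small numerical check of this counterexample (a sketch, assuming NumPy):

```python
import numpy as np

M = np.diag([1.0, 2.0, 3.0])

# M is not unitary: M @ M.conj().T is not the identity
print(np.allclose(M @ M.conj().T, np.eye(3)))    # False

# Yet its eigenvectors (the standard basis vectors) are orthonormal
vals, vecs = np.linalg.eigh(M)                   # eigh is valid here: M is real diagonal, hence Hermitian
print(np.allclose(vecs.T @ vecs, np.eye(3)))     # True
```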
 
  • #3
OK, then my assumption in 2) is not correct, so I proceeded to calculate the eigenvectors for the eigenvalues ##\lambda=1,2,4## and then check them for orthogonality.

##\lambda=1##

##\begin{bmatrix}1& 3 & 1\\ 0& 2&0 \\ 0& 1 &4 \end{bmatrix}\begin{bmatrix}x_1\\ x_2\\ x_3\end{bmatrix}=\lambda \begin{bmatrix}x_1\\ x_2\\ x_3\end{bmatrix}##

In component form I get

##(1-1)x_1+3x_2+x_3=0##
##0+(2-1)x_2+0=0##
##0+x_2+(4-1)x_3=0##
I am confused here for two reasons:

1) how to deal with the ##(1-1)x_1## term, and
2) the 2nd line suggests ##x_2=0##, but the 3rd line gives ##x_2=-3x_3##.

Not sure how to proceed from here...
Any help will be appreciated.
 
  • #4
I'm not sure why ##x_1##, ##x_2##, and ##x_3## are coming into the process. To compute the eigenvalues you solve for ##\lambda## by expanding ##\det(M - \lambda I) = 0## and finding its roots. You seem to be skipping ahead to the part where you already know a specific ##\lambda## and want to find the associated eigenvector.

The Khan Academy video on eigenvectors might help.
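For example, the characteristic-equation route can be cross-checked numerically (a sketch, assuming NumPy; ##\Omega## is the matrix from post #1):

```python
import numpy as np

Omega = np.array([[1, 3, 1],
                  [0, 2, 0],
                  [0, 1, 4]])

# Roots of det(Omega - lambda*I) = 0
print(np.linalg.eigvals(Omega))   # 1, 2 and 4 (possibly in a different order)
```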
 
  • #5
Thanks for the video, very useful

I have expanded ##\det(\Omega - \lambda I) = 0## to get

##\begin{bmatrix}1-\lambda& 3 & 1\\ 0& 2-\lambda&0 \\ 0& 1 &4-\lambda \end{bmatrix}##

For ##\lambda=4##, say, we get the following

##\begin{bmatrix}1-4& 3 & 1\\ 0& 2-4&0 \\ 0& 1 &4-4 \end{bmatrix}\begin{bmatrix}v_1\\ v_2\\ v_3\end{bmatrix}=0##

##-3v_1+3v_2+v_3=0##
##-2v_2=0##
##v_2=0##
With ##v_2=0##, the first equation reduces to ##-3v_1+v_3=0##, so if we let ##v_1=1## then ##v_3=3##. Our eigenvector for ##\lambda=4## is ##(1,0,3)##.
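A quick way to verify this particular vector (a sketch, assuming NumPy):

```python
import numpy as np

Omega = np.array([[1, 3, 1],
                  [0, 2, 0],
                  [0, 1, 4]])
v = np.array([1, 0, 3])

print(np.allclose(Omega @ v, 4 * v))   # True: (1, 0, 3) is an eigenvector for lambda = 4
```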

For ##\lambda=1## I have difficulty because

##\begin{bmatrix}1-1& 3 & 1\\ 0& 2-1&0 \\ 0& 1 &4-1 \end{bmatrix}\begin{bmatrix}v_1\\ v_2\\ v_3\end{bmatrix}=0##

##3v_2+v_3=0##
##v_2=0##
##v_2+3v_3=0## but I cannot determine what ##v_1## is...?
 
  • #6
You can check that for ##\lambda = 1## every vector with ##v_2=v_3=0## and nonzero ##v_1\in \mathbb{C}## is an eigenvector.
Do that.

I would advise brushing up on solving systems of linear equations, since that's where your main issue seems to be.
If you have a solution, always check that the vectors you found actually give the correct eigenvalues.
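One way to carry out both suggestions numerically (a minimal sketch, assuming NumPy and SciPy are available; `scipy.linalg.null_space` returns a basis for the solutions of ##(\Omega-\lambda I)v=0##):

```python
import numpy as np
from scipy.linalg import null_space

Omega = np.array([[1, 3, 1],
                  [0, 2, 0],
                  [0, 1, 4]])

for lam in (1, 2, 4):
    # Solve (Omega - lam*I) v = 0 for a nonzero v
    v = null_space(Omega - lam * np.eye(3))[:, 0]
    # Check that v really is an eigenvector for this eigenvalue
    assert np.allclose(Omega @ v, lam * v)
    print(lam, v)
```

Taking pairwise dot products of the three resulting vectors shows they are not mutually orthogonal, which is consistent with ##\Omega## not being Hermitian.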
 
  • #7
bugatti79 said:
For ##\lambda=1## I have difficulty because

##\begin{bmatrix}1-1& 3 & 1\\ 0& 2-1&0 \\ 0& 1 &4-1 \end{bmatrix}\begin{bmatrix}v_1\\ v_2\\ v_3\end{bmatrix}=0##

##3v_2+v_3=0##
##v_2=0##
##v_2+3v_3=0## but I cannot determine what ##v_1## is...?

##v_1## can be anything nonzero. You can choose it to be 1, for example. Remember that any nonzero multiple of an eigenvector is also an eigenvector with the same eigenvalue.
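In symbols: if ##\Omega v = \lambda v## and ##c\ne 0##, then ##\Omega(cv) = c\,\Omega v = c\lambda v = \lambda(cv)##, so ##cv## is again an eigenvector with eigenvalue ##\lambda##.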
 
  • #8
OK guys, thank you.
I can see how that holds for ##\lambda=1##.

Onto my next problem :-)
 

1. What is a Hermitian matrix?

A Hermitian matrix is a square matrix that is equal to its own conjugate transpose: the entry in row ##i##, column ##j## equals the complex conjugate of the entry in row ##j##, column ##i##. In particular, the entries on the main diagonal are real, and each entry above the diagonal is the complex conjugate of the corresponding entry below it.

2. How can I determine if a matrix is Hermitian?

To determine if a matrix is Hermitian, check whether it is equal to its own conjugate transpose. Having only real eigenvalues is a necessary but not a sufficient condition: every Hermitian matrix has real eigenvalues, but a non-Hermitian matrix (such as ##\Omega## above) can have real eigenvalues as well.
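The matrix discussed in this thread shows why looking at eigenvalues alone is not conclusive (a sketch, assuming NumPy):

```python
import numpy as np

Omega = np.array([[1, 3, 1],
                  [0, 2, 0],
                  [0, 1, 4]])

print(np.linalg.eigvals(Omega))             # all real (1, 2 and 4), yet...
print(np.allclose(Omega, Omega.conj().T))   # ...False: Omega is not Hermitian
```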

3. What are the properties of a Hermitian matrix?

Some properties of a Hermitian matrix include having real eigenvalues, admitting an orthonormal basis of eigenvectors (and hence being diagonalizable by a unitary matrix), and having only real entries on the main diagonal.

4. What are orthogonal eigenvectors?

Orthogonal eigenvectors are eigenvectors of a matrix that are perpendicular to one another, meaning their inner product is zero. For a Hermitian matrix, eigenvectors belonging to distinct eigenvalues are automatically orthogonal, and they can be scaled to length 1 to form an orthonormal set.

5. How are Hermitian matrices and orthogonal eigenvectors related?

Hermitian matrices always admit orthogonal eigenvectors: eigenvectors belonging to distinct eigenvalues are orthogonal, and by the spectral theorem an orthonormal basis of eigenvectors can be chosen even when eigenvalues repeat. After normalization these eigenvectors are orthonormal, which is why a Hermitian matrix can be diagonalized by a unitary transformation.
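As a concrete illustration of these last two questions (a sketch, assuming NumPy; the matrix `H` below is just an example):

```python
import numpy as np

# A Hermitian matrix with complex off-diagonal entries
H = np.array([[2.0, 1 + 1j],
              [1 - 1j, 3.0]])

vals, vecs = np.linalg.eigh(H)   # eigh expects a Hermitian input
print(vals)                                            # real eigenvalues
print(np.allclose(vecs.conj().T @ vecs, np.eye(2)))    # True: orthonormal eigenvectors
```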
