Proving Hermitian if it has real eigenvalues

In summary: real eigenvalues alone do not make an operator Hermitian; one would also need to check whether its eigenvectors form an orthonormal basis, which is a more involved task.
  • #1
maddogtheman
If you had an operator A-hat whose eigenvectors form a complete basis for the Hilbert space and which has only real eigenvalues, how would you prove that it was Hermitian?
 
  • #2
You wouldn't. That's not even true when restricted to 2x2 matrices whose entries are all real!

(And please don't post the same thing multiple times. We don't tolerate it on this forum)
 
  • #3
Hurkyl said:
You wouldn't. That's not even true when restricted to 2x2 matrices whose entries are all real!

(And please don't post the same thing multiple times. We don't tolerate it on this forum)

Could it be that you confused the problem with some claim about the spectrum? If a 2x2 matrix is diagonalizable with real eigenvalues, isn't it Hermitian then? And isn't the eigenvectors forming a complete basis the same thing as diagonalizability?
 
  • #4
jostpuur said:
If a 2x2 matrix is diagonalizable with real eigenvalues, isn't it Hermitian then?
A diagonal matrix with real eigenvalues is Hermitian. But not necessarily if the matrix is merely diagonalizable with real eigenvalues. Why should [itex](PDP^{-1})^* = PDP^{-1}[/itex]?
 
  • #5
I see... so symmetry of a matrix isn't always preserved under coordinate transformations. Would it have fixed the problem if maddogtheman had specified that the basis is orthogonal?
 
  • #6
I assumed the basis to be orthogonal but I think it makes for a rigorous proof.
 
  • #7
doesn't make*
 
  • #8
The problem is that [tex]T\mapsto T^{\dagger}[/tex] does not commute with all linear coordinate transformations, so strictly speaking we should never talk about a linear mapping being Hermitian in isolation. A simple fact, which I had never thought about before. Instead, we should speak of a linear mapping being Hermitian with respect to some basis (or with respect to a certain collection of bases).

Examine

[tex]
(\psi_i | (T-T^{\dagger})\psi_j)
[/tex]

with orthogonal eigenvectors [tex]\psi_k[/tex], which satisfy [tex]T\psi_k = \lambda_k \psi_k[/tex].
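Carrying out that computation (a sketch, assuming the eigenvectors [tex]\psi_k[/tex] are orthonormal):

[tex]
(\psi_i | (T-T^{\dagger})\psi_j) = (\psi_i|T\psi_j) - (T\psi_i|\psi_j) = \lambda_j(\psi_i|\psi_j) - \lambda_i^*(\psi_i|\psi_j) = (\lambda_j - \lambda_i^*)\delta_{ij}
[/tex]

For [tex]i \neq j[/tex] this vanishes by orthonormality; for [tex]i = j[/tex] it equals [tex]\lambda_i - \lambda_i^*[/tex], which vanishes precisely when the eigenvalues are real. Since the [tex]\psi_k[/tex] span the space, it follows that [tex]T = T^{\dagger}[/tex].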

(edit: I first wrote complicated instructions, and then edited them simpler)
 
  • #9
For example, if

[tex]
T = \left(\begin{array}{cc} 1 & 2 \\ 0 & 3 \\ \end{array}\right)
[/tex]

then the vectors

[tex]
u_1 = \left(\begin{array}{c} 1 \\ 0 \\ \end{array}\right),\quad u_2 = \left(\begin{array}{c} 1 \\ 1 \\ \end{array}\right)
[/tex]

are the eigenvectors with eigenvalues [tex]\lambda_1 = 1[/tex] and [tex]\lambda_2 = 3[/tex]. So the matrix is diagonalizable with real eigenvalues, but it is not Hermitian since [tex]T \neq T^{\dagger}[/tex].

Lols. :blushing:

But if we define a new inner product [tex](u_i|u_j)=\delta_{ij}[/tex], and a new conjugate mapping [tex]T\mapsto T^{\dagger}[/tex] by [tex](\psi|T^{\dagger}\phi)=(T\psi|\phi)[/tex] with the new inner product, then [tex]T[/tex] becomes Hermitian.
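A quick numerical check of this example (a sketch using NumPy; the variable names are mine). It verifies that [tex]T[/tex] is diagonalizable with real eigenvalues yet not Hermitian in the standard inner product, and that it becomes self-adjoint with respect to the new inner product [tex](x|y) = x^{\dagger} G y[/tex] in which the eigenvectors are orthonormal.

```python
import numpy as np

# the matrix from the example above
T = np.array([[1.0, 2.0],
              [0.0, 3.0]])

eigvals, B = np.linalg.eig(T)  # columns of B are (normalized) eigenvectors
assert np.allclose(np.sort(eigvals), [1.0, 3.0])  # real eigenvalues

# diagonalizable: T = B D B^{-1}
D = np.diag(eigvals)
assert np.allclose(B @ D @ np.linalg.inv(B), T)

# ... but not Hermitian in the standard inner product
assert not np.allclose(T, T.conj().T)

# New inner product (x|y) = x^dagger G y making the eigenvectors
# orthonormal: B^dagger G B = I, so G = (B B^dagger)^{-1}.
G = np.linalg.inv(B @ B.conj().T)
# T is self-adjoint with respect to G exactly when G T = T^dagger G:
assert np.allclose(G @ T, T.conj().T @ G)
```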
 
  • #10
jostpuur said:
The problem is that [tex]T\mapsto T^{\dagger}[/tex] does not commute with all linear coordinate transformations, so actually we never should speak about a linear mapping being Hermitian. A simple fact, which I had never thought about before. Instead, we should speak about linear mapping being Hermitian with respect to some basis (or with respect to a certain kind of collection of different basis).

This is because the adjoint of an operator is only defined with respect to an inner product. Specifically, it is the operator [itex]A^\dagger[/itex] such that [itex](u,Av) = (A^\dagger u,v)[/itex] for all vectors u and v. In terms of matrices, as long as we write the coefficients in an orthonormal basis, this just amounts to taking the conjugate transpose. And this will also be true in any other orthonormal basis, where the other orthonormal bases are precisely the ones which can be reached by a unitary transformation (with which the adjoint operation does commute).

As for the OP, it depends on whether the eigenvectors are orthonormal. If so, then the answer should be easy based on what I just said.
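A numerical illustration of the point about unitary transformations (a sketch using NumPy; the random matrices are examples of mine). Under a change of basis [tex]A \mapsto PAP^{-1}[/tex], taking the conjugate transpose commutes with the transformation when [tex]P[/tex] is unitary, but not for a generic invertible [tex]P[/tex].

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

# A unitary U, obtained from the QR decomposition of a random matrix.
U, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))

# Unitary change of basis: adjoint of the transformed matrix
# equals the transform of the adjoint (U^{-1} = U^dagger).
assert np.allclose((U @ A @ U.conj().T).conj().T,
                   U @ A.conj().T @ U.conj().T)

# Generic invertible change of basis: the two disagree.
P = rng.normal(size=(3, 3))
Pinv = np.linalg.inv(P)
assert not np.allclose((P @ A @ Pinv).conj().T,
                       P @ A.conj().T @ Pinv)
```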
 

1. What does it mean for a matrix to be Hermitian?

A matrix is Hermitian if it is equal to its own conjugate transpose: each entry [tex]a_{ij}[/tex] is the complex conjugate of [tex]a_{ji}[/tex]. In particular, the diagonal elements are real, and a real Hermitian matrix is simply a symmetric matrix.

2. How can I determine if a matrix is Hermitian?

To determine if a matrix is Hermitian, you can check if it is equal to its own conjugate transpose. This can be done by taking the transpose of the matrix and then taking the complex conjugate of each element. If the resulting matrix is equal to the original matrix, then it is Hermitian.
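That check can be sketched in NumPy as follows (the helper name is my own):

```python
import numpy as np

def is_hermitian(M, tol=1e-12):
    """Check whether M is square and equal to its conjugate transpose."""
    M = np.asarray(M)
    return M.shape[0] == M.shape[1] and np.allclose(M, M.conj().T, atol=tol)

H = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert is_hermitian(H)
assert not is_hermitian(np.array([[1.0, 2.0],
                                  [0.0, 3.0]]))
```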

3. What are real eigenvalues?

The eigenvalues of a matrix A are the roots of its characteristic polynomial [tex]\det(A - \lambda I)[/tex]; they are real when they have zero imaginary part. Every Hermitian matrix has only real eigenvalues.
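A small sketch connecting the two descriptions (NumPy; the matrix is an example of mine): for a 2x2 matrix the characteristic polynomial is [tex]\lambda^2 - \mathrm{tr}(A)\,\lambda + \det(A)[/tex], and its roots match the eigenvalues.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# coefficients of det(A - lambda I) = lambda^2 - tr(A) lambda + det(A)
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.roots(coeffs)

# the polynomial roots agree with the eigenvalues
assert np.allclose(np.sort(roots), np.sort(np.linalg.eigvals(A)))
# real symmetric matrix: all eigenvalues are real
assert np.allclose(np.imag(roots), 0.0)
```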

4. Why is it important for a Hermitian matrix to have real eigenvalues?

In quantum mechanics, Hermitian matrices represent observable physical quantities. Real eigenvalues of a Hermitian matrix correspond to real measurements, which are essential for accurate and meaningful predictions in quantum mechanics.

5. Can a matrix have real eigenvalues but not be Hermitian?

Yes, a matrix can have real eigenvalues but not be Hermitian. For a matrix to be Hermitian, it must not only have real eigenvalues but also be equal to its own conjugate transpose. A non-Hermitian matrix may have real eigenvalues but will not satisfy the condition of being equal to its own conjugate transpose.
