Hermitian operators, matrices and basis

Trixie Mattel
Hello, I would just like some help clearing up some pretty basic things about Hermitian operators and matrices.

I am aware that operators can be represented by matrices, and I think I am right in saying that depending on the basis used the matrices will look different, but all are valid representations of the operator.

As I understand it, there exists a basis in which a Hermitian operator can be represented by a diagonal matrix.

Is the only basis in which this can occur the basis of the eigenvectors of the Hermitian operator? And are the diagonal matrix elements the eigenvalues of the operator?

In summary, I am asking: is the only basis in which a Hermitian operator is represented by a diagonal matrix the basis of its eigenvectors? And for the diagonal matrix, are the elements the eigenvalues of the operator? Thank you
 
Trixie Mattel said:
In summary, I am asking: is the only basis in which a Hermitian operator is represented by a diagonal matrix the basis of its eigenvectors? And for the diagonal matrix, are the elements the eigenvalues of the operator? Thank you

Given any basis |\psi_j\rangle, you can use that basis to represent arbitrary states as column matrices. Letting \mathcal{R}(|\psi\rangle) mean the matrix representation of |\psi\rangle, we can choose \mathcal{R} so that:

\mathcal{R}(|\psi_1\rangle) = \begin{pmatrix} 1 \\ 0 \\ 0 \\ \vdots \end{pmatrix}

\mathcal{R}(|\psi_2\rangle) = \begin{pmatrix} 0 \\ 1 \\ 0 \\ \vdots \end{pmatrix}

\mathcal{R}(|\psi_3\rangle) = \begin{pmatrix} 0 \\ 0 \\ 1 \\ \vdots \end{pmatrix}

To say that an operator \hat{O} is diagonal in this basis is to say that its representation is given by:

\mathcal{R}(\hat{O}) = \begin{pmatrix} \lambda_1 & 0 & 0 & \cdots \\ 0 & \lambda_2 & 0 & \cdots \\ 0 & 0 & \lambda_3 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}

Yes, \lambda_1, \lambda_2, ... are the eigenvalues of \hat{O} and |\psi_j\rangle are the eigenvectors.
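As a quick numerical check, here is a small sketch using NumPy's `eigh`, which handles Hermitian matrices; the matrix `H` below is just an arbitrary example, not anything from the thread. Changing basis to the columns of the eigenvector matrix `U` turns `H` into a diagonal matrix whose entries are the eigenvalues:

```python
import numpy as np

# An arbitrary 3x3 Hermitian matrix (equal to its conjugate transpose).
H = np.array([[2.0,        1.0 - 1.0j, 0.0],
              [1.0 + 1.0j, 3.0,        1.0j],
              [0.0,        -1.0j,      1.0]])
assert np.allclose(H, H.conj().T)

# eigh returns real eigenvalues and an orthonormal set of eigenvectors
# as the columns of U.
eigvals, U = np.linalg.eigh(H)

# In the eigenvector basis, the representation U† H U is diagonal,
# with the eigenvalues on the diagonal.
D = U.conj().T @ H @ U
assert np.allclose(D, np.diag(eigvals))
```

The final assertion is exactly the statement above: in the eigenbasis, the matrix of the operator is diag(λ₁, λ₂, λ₃).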
 
Trixie Mattel said:
In summary, I am asking: is the only basis in which a Hermitian operator is represented by a diagonal matrix the basis of its eigenvectors? And for the diagonal matrix, are the elements the eigenvalues of the operator?

Yes, but note that you can put the eigenvalues in many different orders on the diagonal, in which case different unit vectors of \mathbb{R}^n correspond to the eigenvalues. If an eigenvalue is degenerate, any orthonormal basis of its eigenspace also works, so in that case the diagonalizing basis is not unique either. Also, an operator can be defined on an infinite-dimensional vector space, in which case you can't write it as a matrix.
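The point about ordering can be seen numerically; this is a sketch using NumPy with an arbitrary real symmetric (hence Hermitian) example matrix. Permuting the eigenvector columns gives another valid eigenbasis, and the eigenvalues simply appear on the diagonal in the permuted order:

```python
import numpy as np

# A real symmetric (hence Hermitian) example matrix.
H = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, U = np.linalg.eigh(H)  # eigh sorts eigenvalues ascending: [1., 3.]

# Swap the eigenvector columns: still an eigenbasis, but the eigenvalues
# now appear on the diagonal in the swapped order.
perm = [1, 0]
U2 = U[:, perm]
D2 = U2.T @ H @ U2
assert np.allclose(D2, np.diag(eigvals[perm]))  # diag(3, 1) instead of diag(1, 3)
```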
 
hilbert2 said:
Also, an operator can be defined on an infinite-dimensional vector space, in which case you can't write it as a matrix.

Why not?
 
PeroK said:
Why not?

Well, you can do it if the basis vectors can be ordered in such a way that there's some obvious pattern in the matrix elements, but think about a vector space that has a basis with the same cardinality as the set of real numbers.
 
hilbert2 said:
Well, you can do it if the basis vectors can be ordered in such a way that there's some obvious pattern in the matrix elements, but think about a vector space that has a basis with the same cardinality as the set of real numbers.

Got it, thank you!
 