Proving simultaneous eigenvectors for commuting operators

Homework Help Overview

The discussion revolves around proving the existence of simultaneous eigenvectors for two commuting operators in quantum mechanics, particularly in the context of degenerate eigenvalues. The original poster expresses difficulty in extending their proof from non-degenerate to degenerate cases.

Discussion Character

  • Exploratory; conceptual clarification; mathematical reasoning

Approaches and Questions Raised

  • The original poster attempts to prove that if two operators commute, then any eigenvector of one operator can be expressed as a linear combination of eigenvectors of the other operator, particularly focusing on the degenerate case.
  • Some participants suggest expanding eigenvectors in terms of a complete set of eigenvectors and acting on these with the commuting operator to explore relationships.
  • Others question the validity of assuming that any linear combination of eigenvectors of one operator is also an eigenvector of the other, providing a counterexample to illustrate the limitations of this assumption.

Discussion Status

The discussion is ongoing, with participants exploring various interpretations and approaches to the problem. Some guidance has been offered regarding the use of complete sets of eigenvectors and the implications of linear combinations, but no consensus has been reached on the proof for the degenerate case.

Contextual Notes

Participants note that the assumption of a complete set of eigenvectors for both operators is crucial for the discussion, and there is an acknowledgment of the limitations of the proof when operators are not linear. The original poster is working within the constraints of a homework assignment, seeking hints rather than complete solutions.

chrisd

Homework Statement


In my quantum class we learned that if two operators commute, we can always find a set of simultaneous eigenvectors for both operators. I'm having trouble proving this for the case of degenerate eigenvalues.

Homework Equations


Commutator: ##[A,B] = AB - BA##
Eigenvalue equation: ##A|v\rangle = a|v\rangle##

The Attempt at a Solution


Start off by assuming operators A and B commute so AB=BA.
I think I have the proof for non-degenerate eigenvalues correct:
##A|v\rangle = a|v\rangle##
##BA|v\rangle = Ba|v\rangle##
##A(B|v\rangle) = a(B|v\rangle)##

So ##B|v\rangle## is also an eigenvector of ##A## associated with eigenvalue ##a##.
If ##a## is non-degenerate, ##B|v\rangle## must be the same eigenvector as ##|v\rangle##, only multiplied by a scalar:

##B|v\rangle = b|v\rangle##, which is just the eigenvalue equation for ##B##.
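The non-degenerate argument can be sanity-checked numerically. This is a minimal sketch with made-up matrices (not part of the assignment): two commuting matrices built from a shared orthonormal eigenbasis, where A's eigenvalues are all distinct.

```python
import numpy as np

# Made-up example: A and B share an orthonormal eigenbasis U, so they commute.
# A's eigenvalues (1, 2, 3) are all distinct, i.e. non-degenerate.
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # random orthonormal basis
A = U @ np.diag([1.0, 2.0, 3.0]) @ U.T
B = U @ np.diag([5.0, 7.0, 9.0]) @ U.T
assert np.allclose(A @ B, B @ A)                   # [A, B] = 0

v = U[:, 1]                                        # eigenvector of A, eigenvalue 2
assert np.allclose(A @ v, 2.0 * v)
# Because eigenvalue 2 is non-degenerate, B|v> is forced to be a multiple of |v>:
assert np.allclose(B @ v, 7.0 * v)
```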

For the degenerate case I'm stuck. I can prove that if ##|v_1\rangle, |v_2\rangle, \ldots, |v_n\rangle## are eigenvectors of ##A## associated with eigenvalue ##a##, any linear combination of these eigenvectors is also an eigenvector of ##A## with eigenvalue ##a##:
##A(c_1|v_1\rangle + c_2|v_2\rangle)##
##= c_1 A|v_1\rangle + c_2 A|v_2\rangle##
##= c_1 a|v_1\rangle + c_2 a|v_2\rangle##
##= a(c_1|v_1\rangle + c_2|v_2\rangle)##

Since ##B|v_i\rangle## is an eigenvector of ##A## with eigenvalue ##a##, it must be a linear combination of the eigenvectors of ##A## associated with that eigenvalue:

##B|v_i\rangle = c_1|v_1\rangle + c_2|v_2\rangle + \cdots + c_n|v_n\rangle##

So I know ##B|v_i\rangle## lies within the eigenspace of ##a##, but I'm not sure how to use this to prove that ##|v_i\rangle## is an eigenvector of ##B##.

Any hints would be appreciated.
 
Let A and B each have a complete set of eigenvectors. Expand any eigenvector of A in terms of the eigenvectors of B, act on this expansion with B, and use ##AB = BA## to see what that forces on the terms of the expansion. The eigenvalues can be degenerate in this argument without any loss of generality.
Alternatively, read page 24 of Quantum Mechanics by Ballentine.
 
chrisd said:
so I know ##B|v_i\rangle## lies within the eigenspace of ##a##, but I'm not sure how to use this to prove that ##|v_i\rangle## is an eigenvector of ##B##.

Any hints would be appreciated.
You can't prove that because it's generally not true for an arbitrary eigenvector of A with eigenvalue a. For example, suppose in some basis, A is represented by the matrix ##\begin{pmatrix} a & 0 \\ 0 & a \end{pmatrix}## and B is represented by the matrix ##\begin{pmatrix} 1 & 1 \\ -1 & 1\end{pmatrix}##. Clearly, A and B commute. Now any linear combination of basis states is still an eigenvector of A, but that's not true for B. There are only certain linear combinations that are eigenvectors of B. Those are the simultaneous eigenstates of A and B.
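The counterexample above can be verified with a few lines of NumPy (a quick sketch using the same matrices quoted in the reply; the value of ##a## is arbitrary):

```python
import numpy as np

a = 2.0                                   # any value for the degenerate eigenvalue
A = a * np.eye(2)                         # every vector is an eigenvector of A
B = np.array([[1.0, 1.0], [-1.0, 1.0]])
assert np.allclose(A @ B, B @ A)          # they commute (A is a multiple of I)

v = np.array([1.0, 0.0])                  # an eigenvector of A ...
Bv = B @ v                                # ... but Bv = (1, -1) is not parallel to v
assert abs(v[0] * Bv[1] - v[1] * Bv[0]) > 0.5   # nonzero 2D cross product

# Only particular combinations -- the eigenvectors of B, here the complex
# vectors proportional to (1, i) and (1, -i) -- are simultaneous eigenstates.
evals, evecs = np.linalg.eig(B)
for k in range(2):
    w = evecs[:, k]
    assert np.allclose(B @ w, evals[k] * w)   # eigenvector of B ...
    assert np.allclose(A @ w, a * w)          # ... and of A
```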

What you have so far is that B maps an eigenvector of A back into the subspace spanned by eigenvectors with the same eigenvalue, so if you wrote down the matrix representing B with respect to the basis consisting of eigenvectors of A, what will the matrix look like?
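One way to see what that matrix looks like is a small numerical experiment (matrices invented for illustration): build commuting symmetric matrices where A has a two-fold degenerate eigenvalue, then express B in A's eigenbasis.

```python
import numpy as np

rng = np.random.default_rng(1)
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # shared orthonormal basis

# A has a two-fold degenerate eigenvalue 1; B mixes that degenerate pair
# but still commutes with A.
A = U @ np.diag([1.0, 1.0, 2.0, 3.0]) @ U.T
Bmat = np.diag([0.0, 0.0, 4.0, 5.0])
Bmat[0, 1] = Bmat[1, 0] = 1.0
B = U @ Bmat @ U.T
assert np.allclose(A @ B, B @ A)

# Write B in the eigenbasis of A: blocks connecting different eigenvalues of A
# vanish, so B is block diagonal, with one block per eigenvalue of A.
evals_A, V = np.linalg.eigh(A)                     # eigenvalues sorted: 1, 1, 2, 3
B_in_A_basis = V.T @ B @ V
assert np.allclose(B_in_A_basis[:2, 2:], 0.0)      # no mixing between eigenspaces
assert np.allclose(B_in_A_basis[2:, :2], 0.0)
assert np.allclose(B_in_A_basis[2, 3], 0.0)
```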
 
I took a look at pg. 24 of Ballentine, ardie's suggestion. Hope you don't mind, I've summarized:

Let ##V## be a vector space and ##A,B## linear operators on ##V##. Assume that ##A, B## each have a complete set of eigenvectors -- that is, the eigenvectors of ##A## form a basis of ##V## (and similar for ##B##).

Claim: ##A,B## have a common set of eigenvectors.

Note: In quantum mechanics, this assumption is OK because ##V## is a set of physical states and linear ##A,B## are observables -- assuming ##A,B## each have a complete set of eigenstates is saying that physical states can be written as a linear combo of eigenstates of ##A## (or of eigenstates of ##B##).

Also note: this claim does not hold if ##A,B## are not linear. For example, if the potential in the Hamiltonian is real, ##V(x)=V^*(x)##, then time reversal ##T## (which is antilinear) commutes with the Hamiltonian ##H## (linear), but this is not enough to guarantee simultaneous eigenstates.

Proof:
Let ##v## be such that
\begin{equation}(A-aI)v=0.\end{equation}
Since ##B## has a complete set, write
\begin{equation}v = w_1 + \cdots + w_n \end{equation}
where each ##w_i## is an eigenvector of ##B## with eigenvalue ##b_i##. Since a multiple of an eigenvector is again an eigenvector, we have for convenience absorbed the constant coefficients into the ##w_i##. Also, WLOG we assume the eigenvalues ##b_i## are distinct (if they are not, clump together any same-eigenvalue ##w##'s into one big ##w##). Note that ##n## may be smaller than ##\dim(V)##.

Combining our two equations gives
\begin{equation}(A-aI)w_1 + \cdots + (A-aI)w_n = 0.\end{equation}
The question now is whether the terms of this sum are linearly independent.

Applying ##B## to individual terms in the sum gives
\begin{eqnarray}
B(A-aI)w_i&=&(A-aI)Bw_i\mbox{ using that BA=AB and B linear}\\
&=&(A-aI)b_iw_i\\
&=&b_i(A-aI)w_i\mbox{ using A linear}
\end{eqnarray}
This shows that each nonzero ##u_i=(A-aI)w_i## is an eigenvector of ##B##, and their eigenvalues ##b_i## are distinct, so the nonzero ##u_i## would be linearly independent. But linearly independent vectors cannot sum to zero, so in fact every term must vanish:
\begin{equation}(A-aI)w_i=0,\end{equation}
showing that each ##w_i## is also an eigenvector of ##A## with eigenvalue ##a##.

Since ##n## may be smaller than ##\dim(V)##, this does not yet produce all the eigenvectors of ##B##; just repeat the process for each eigenvector ##v## of ##A##.
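The constructive content of this proof — diagonalize ##B## restricted to each eigenspace of ##A## — can be sketched numerically for commuting symmetric matrices (the matrices below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # shared orthonormal eigenbasis

A = U @ np.diag([1.0, 1.0, 2.0, 3.0]) @ U.T        # eigenvalue 1 is two-fold degenerate
Bmat = np.diag([0.0, 0.0, 4.0, 5.0])
Bmat[0, 1] = Bmat[1, 0] = 1.0                      # B mixes the degenerate pair
B = U @ Bmat @ U.T
assert np.allclose(A @ B, B @ A)

# For each eigenvalue of A, restrict B to that eigenspace and diagonalize there.
evals_A, V = np.linalg.eigh(A)
cols = []
for a in np.unique(np.round(evals_A, 8)):
    P = V[:, np.isclose(evals_A, a)]               # orthonormal basis of the eigenspace
    _, W = np.linalg.eigh(P.T @ B @ P)             # rotate within the eigenspace
    cols.append(P @ W)
S = np.hstack(cols)                                # columns: simultaneous eigenvectors

# Every column of S satisfies both eigenvalue equations (the Rayleigh
# quotient s.M.s recovers the eigenvalue since each column is unit norm).
for k in range(S.shape[1]):
    s = S[:, k]
    for M in (A, B):
        assert np.allclose(M @ s, (s @ M @ s) * s)
```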
 
