Proving simultaneous eigenvectors for commuting operators

1. Dec 20, 2011

chrisd

1. The problem statement, all variables and given/known data
In my quantum class we learned that if two operators commute, we can always find a set of simultaneous eigenvectors for both operators. I'm having trouble proving this for the case of degenerate eigenvalues.

2. Relevant equations
Commutator: $[A,B]=AB-BA$
Eigenvalue equation: $A \mid v \rangle = a \mid v \rangle$

3. The attempt at a solution
Start off by assuming operators A and B commute so AB=BA.
I think I have the proof for non-degenerate eigenvalues correct:
$A \mid v \rangle = a \mid v \rangle$
$BA \mid v \rangle = Ba \mid v \rangle$
$A(B \mid v \rangle) = a(B \mid v \rangle)$

So $B \mid v \rangle$ is also an eigenvector of A associated with eigenvalue a.
If a is non-degenerate, $B \mid v \rangle$ must be the same eigenvector as $\mid v \rangle$, only multiplied by a scalar.

$B \mid v\rangle=b\mid v \rangle$ which is just the eigenvalue equation for $B$.
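As a quick numerical sanity check of the non-degenerate case (the matrices here are illustrative choices, not from the problem), one can build a $B$ that commutes with $A$ by taking a polynomial in $A$ and verify that every eigenvector of $A$ is also an eigenvector of $B$:

```python
import numpy as np

rng = np.random.default_rng(0)

# A: real symmetric matrix; its eigenvalues are generically non-degenerate
A = rng.standard_normal((3, 3))
A = A + A.T

# B: any polynomial in A commutes with A
B = A @ A + 2 * A + np.eye(3)

assert np.allclose(A @ B, B @ A)  # [A, B] = 0

evals, V = np.linalg.eigh(A)      # columns of V are eigenvectors of A

# Each eigenvector of A is also an eigenvector of B:
for i in range(3):
    v = V[:, i]
    Bv = B @ v
    b = v @ Bv                    # Rayleigh quotient recovers B's eigenvalue
    assert np.allclose(Bv, b * v)
```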

For the degenerate case I'm stuck. I can prove that if $\mid v_1 \rangle,\mid v_2 \rangle,...,\mid v_n \rangle$ are eigenvectors of A associated with eigenvalue a, any linear combination of these eigenvectors is also an eigenvector of A with eigenvalue a.
$A (c_1\mid v_1 \rangle + c_2\mid v_2 \rangle)$
$=c_1A\mid v_1 \rangle + c_2A\mid v_2 \rangle$
$=c_1a\mid v_1 \rangle + c_2a\mid v_2 \rangle$
$=a(c_1\mid v_1 \rangle + c_2\mid v_2 \rangle)$

and so from the fact that $B \mid v_i \rangle$ is an eigenvector of A, $B \mid v_i \rangle$ is a linear combination of the eigenvectors of A associated with eigenvalue a...

$B \mid v_i \rangle= c_1\mid v_1 \rangle + c_2\mid v_2 \rangle + ... + c_n\mid v_n \rangle$

so I know $B \mid v_i \rangle$ lies within the eigenspace of a, but I'm not sure how to use this to prove that $\mid v_i \rangle$ is an eigenvector of B.

Any hints would be appreciated.

2. Dec 20, 2011

ardie

Let A and B each have a complete set of eigenvectors. Expand any eigenvector of A in terms of the eigenvectors of B, then act on this expansion with B; you will find that each term in the expansion must itself be an eigenvector of A, which gives the simultaneous eigenvectors. The eigenvalues can be degenerate in this argument without any loss of generality.
Alternatively, read page 24 of Quantum Mechanics by Ballentine.

3. Dec 21, 2011

vela

You can't prove that because it's generally not true for an arbitrary eigenvector of A with eigenvalue a. For example, suppose in some basis, A is represented by the matrix $\begin{pmatrix} a & 0 \\ 0 & a \end{pmatrix}$ and B is represented by the matrix $\begin{pmatrix} 1 & 1 \\ -1 & 1\end{pmatrix}$. Clearly, A and B commute. Now any linear combination of basis states is still an eigenvector of A, but that's not true for B. There are only certain linear combinations that are eigenvectors of B. Those are the simultaneous eigenstates of A and B.
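This counterexample can be checked directly with numpy (the value a = 5 is an arbitrary choice for illustration):

```python
import numpy as np

a = 5.0
A = a * np.eye(2)                    # degenerate: every vector is an eigenvector of A
B = np.array([[1., 1.], [-1., 1.]])

assert np.allclose(A @ B, B @ A)     # A and B commute

# e1 is an eigenvector of A, but not of B:
e1 = np.array([1., 0.])
print(B @ e1)                        # [1, -1] -- not proportional to e1

# The simultaneous eigenvectors are B's eigenvectors (complex here):
evals, V = np.linalg.eig(B)
print(evals)                         # eigenvalues 1 +/- i (order may vary)
```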

What you have so far is that B maps an eigenvector of A back into the subspace spanned by eigenvectors with the same eigenvalue, so if you wrote down the matrix representing B with respect to a basis consisting of eigenvectors of A, what would the matrix look like?
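In case a concrete picture helps: in a basis of eigenvectors of A, the commutation relation forces B to be block diagonal, with one block per distinct eigenvalue of A, and diagonalizing each block produces the simultaneous eigenvectors. A numpy sketch with illustrative matrices (not from the thread):

```python
import numpy as np

# A with a doubly degenerate eigenvalue 1 and a simple eigenvalue 3
A = np.diag([1., 1., 3.])
# B commutes with A: it can mix the two degenerate directions,
# but cannot connect the a=1 and a=3 eigenspaces
B = np.array([[2., 1., 0.],
              [1., 2., 0.],
              [0., 0., 5.]])
assert np.allclose(A @ B, B @ A)

# In the eigenbasis of A (here, the standard basis), B is block diagonal.
# Diagonalize the 2x2 block acting inside the degenerate eigenspace:
block = B[:2, :2]
b_vals, W = np.linalg.eigh(block)

# Lift a block eigenvector into a full simultaneous eigenvector:
v1 = np.array([W[0, 0], W[1, 0], 0.])
assert np.allclose(A @ v1, 1. * v1)          # eigenvector of A...
assert np.allclose(B @ v1, b_vals[0] * v1)   # ...and of B
```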

4. May 31, 2013

jujutsuka

summarizing ardie's suggestion

I took a look at pg. 24 of Ballentine, per ardie's suggestion. Hope you don't mind, I've summarized it here:

Let $V$ be a vector space and $A,B$ linear operators on $V$. Assume that $A, B$ each have a complete set of eigenvectors -- that is, the eigenvectors of $A$ form a basis of $V$ (and similar for $B$).

Claim: $A,B$ have a common set of eigenvectors.

Note: In quantum mechanics, this assumption is OK because $V$ is a set of physical states and linear $A,B$ are observables -- assuming $A,B$ each have a complete set of eigenstates is saying that physical states can be written as a linear combo of eigenstates of $A$ (or of eigenstates of $B$).

Also Note: this claim does not hold if $A,B$ are not linear. For example, if the potential is real, $V(x)=V^*(x)$, then time reversal $T$ (which is anti-linear) commutes with the Hamiltonian $H$ (linear), but this is not enough to guarantee simultaneous eigenstates.

Proof:
Let $v$ be such that
$$(A-aI)v=0.$$
Since $B$ has a complete set, write
$$v = w_1 + \cdots + w_n$$
where each $w_i$ is an eigenvector of $B$ with eigenvalue $b_i$. Since a multiple of an eigenvector is an eigenvector, we have for convenience absorbed the constant coefficients into the $w_i$. Also, WLOG we assume that the eigenvalues $b_i$ are distinct (if they are not, clump together any same-eigenvalue $w$'s into one big $w$). Note: $n$ need not equal $\dim V$.

Combining our two equations gives
$$(A-aI)w_1 + \cdots + (A-aI)w_n = 0.$$
Now the question is whether the terms of this sum are linearly independent.

Applying $B$ to individual terms in the sum gives
\begin{eqnarray}
B(A-aI)w_i&=&(A-aI)Bw_i\mbox{ using that BA=AB and B linear}\\
&=&(A-aI)b_iw_i\\
&=&b_i(A-aI)w_i\mbox{ using A linear}
\end{eqnarray}
This shows that each nonzero $u_i=(A-aI)w_i$ is an eigenvector of $B$, and since the $b_i$ are distinct, the nonzero $u_i$ are linearly independent. But the $u_i$ sum to zero, so in fact none of them can be nonzero.

Each term in the sum must therefore vanish,
$$(A-aI)w_i=0,$$
showing that each $w_i$ is also an eigenvector of $A$, with eigenvalue $a$.

Since $n$ may be smaller than $\dim V$, this does not yet yield all the eigenvectors of $B$; just repeat the process for each eigenvector $v$ of $A$.
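The steps of this proof can be mirrored numerically: expand an eigenvector $v$ of a degenerate $A$ into eigenvectors of $B$, and check that each nonzero component is itself an eigenvector of $A$ with the same eigenvalue. A sketch with illustrative commuting matrices:

```python
import numpy as np

# Commuting symmetric A and B, with A degenerate (illustrative choices)
A = np.diag([1., 1., 3.])
B = np.array([[2., 1., 0.],
              [1., 2., 0.],
              [0., 0., 5.]])
assert np.allclose(A @ B, B @ A)

a = 1.0
v = np.array([1., 0., 0.])           # eigenvector of A with eigenvalue a

# Expand v in the eigenbasis of B: v = w_1 + ... + w_n
b_vals, U = np.linalg.eigh(B)
coeffs = U.T @ v
ws = [coeffs[i] * U[:, i] for i in range(3)]
assert np.allclose(sum(ws), v)

# Each nonzero component w_i is itself an eigenvector of A with eigenvalue a
for w in ws:
    if np.linalg.norm(w) > 1e-12:
        assert np.allclose(A @ w, a * w)
```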