# Finding an orthonormal basis of a Hilbert space relative to a lattice of subspaces

1. Jul 30, 2010

I have a Hilbert space H; given a closed subspace U of H, let P_U denote the orthogonal projection onto U. I also have a lattice L of closed subspaces of H such that for all U and U' in L, P_U and P_U' commute. The problem is to find an orthonormal basis B of H such that for every element b of B and every element U of L, b is an eigenvector of P_U (equivalently, b is in U or in U^⊥).
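(For intuition: in finite dimensions the corresponding statement is the standard fact that a commuting family of orthogonal projections can be simultaneously diagonalized. A quick numerical sketch, with matrices of my own choosing rather than anything from the problem:)

```python
import numpy as np

# Two commuting orthogonal projections on R^3, written in a random
# orthonormal basis so the example is not trivially diagonal.
rng = np.random.default_rng(0)
Qmat, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random orthogonal matrix
P = Qmat @ np.diag([1.0, 1.0, 0.0]) @ Qmat.T       # projects onto a 2-dim subspace
Q = Qmat @ np.diag([0.0, 1.0, 1.0]) @ Qmat.T       # projects onto another one

assert np.allclose(P @ Q, Q @ P)                   # the projections commute

# Diagonalize P + 2Q; its eigenvalues (1, 2, 3) are distinct, so its
# eigenvectors are simultaneously eigenvectors of both P and Q.
_, B = np.linalg.eigh(P + 2 * Q)
for b in B.T:
    for R in (P, Q):
        Rb = R @ b
        # R b is a scalar multiple of b, i.e. b is an eigenvector of R
        assert np.allclose(Rb, (b @ Rb) * b)
```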

The obvious thing to do is to apply Zorn's lemma to obtain a maximal orthonormal subset B of H satisfying the above condition, and this part works. For some reason or other, though, I'm having trouble showing that the closed span of B is all of H. If not, then letting W denote the closed span of B, I need to find a normalized vector v in W^⊥ such that for every U in L, v is an eigenvector of P_U; then B ∪ {v} contradicts the maximality of B. (The following may or may not be helpful: it suffices to consider the case where U contains W.)

The idea I have right now is this: suppose I could find a one-dimensional subspace V of W^⊥ such that P_V commutes with P_U for all U in L, and let v be a normalized vector in V. Then for every U in L, P_V P_U(v) = P_U P_V(v) = P_U(v), so P_U(v) is in V. Since V is one-dimensional, v is an eigenvector of P_U, as desired.
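Written out in one line, the key computation is just

$$P_U v = P_U P_V v = P_V P_U v \in \operatorname{ran} P_V = V = \operatorname{span}\{v\},$$

so $$P_U v = \lambda v$$ for some scalar $$\lambda$$ (necessarily $$\lambda \in \{0,1\}$$, since $$P_U$$ is a projection).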

The problem is that I have no idea how to choose V. I feel like this should be really easy, but for some reason I'm not seeing it.

2. Aug 3, 2010

### Hawkeye18


It is impossible; here is a counterexample:

Let your Hilbert space be $$L^2([0,1])$$, and let your lattice of subspaces be parametrized by $$a\in[0,1]$$, namely

$$U_a = \{ f\in L^2([0,1]) : f(x) =0\ \forall x>a\}$$

It is easy to see that the projections $$P_{U_a}$$ have no common eigenvector: such an eigenvector $$f$$ would have to lie in $$U_a$$ or $$U_a^{\perp}$$ for every $$a$$, i.e. vanish a.e. on $$(a,1]$$ or a.e. on $$[0,a]$$, and this forces $$f=0$$ a.e. (note in particular that $$\displaystyle\cap_{a\in (0,1)}U_a =\{0\}$$).
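(A quick discretized sanity check, with step functions on a grid standing in for elements of $$L^2([0,1])$$; the grid size and the test function $$f\equiv 1$$ are my own choices: $$P_{U_a}$$ acts as multiplication by the indicator of $$[0,a]$$, and applying it to $$f\equiv 1$$ never returns a scalar multiple of $$f$$.)

```python
import numpy as np

# Discretize [0,1] into n cells; vectors stand in for step functions in L^2.
n = 100
f = np.ones(n)  # the constant function f(x) = 1

def proj_Ua(g, a):
    """P_{U_a}: orthogonal projection onto U_a = {g : g(x) = 0 for x > a},
    which acts as multiplication by the indicator function of [0, a]."""
    cut = int(round(a * n))
    out = g.copy()
    out[cut:] = 0.0
    return out

for a in (0.25, 0.5, 0.75):
    Pf = proj_Ua(f, a)
    lam = (Pf @ f) / (f @ f)       # best scalar multiple of f (least squares)
    # Pf is the indicator of [0,a]: nonzero, but not a multiple of f, so f is
    # not an eigenvector of P_{U_a} for any a strictly between 0 and 1.
    assert np.any(Pf != 0) and not np.allclose(Pf, lam * f)
```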

3. Aug 4, 2010