Finding an orthonormal basis of a Hilbert space relative to a lattice of subspaces

  1. Jul 30, 2010 #1
    I have a Hilbert space H; given a closed subspace U of H, let P_U denote the orthogonal projection onto U. I also have a lattice L of closed subspaces of H such that for all U and U' in L, P_U and P_U' commute. The problem is to find an orthonormal basis B of H such that for every element b of B and every element U of L, b is an eigenvector of P_U (equivalently, b is in U or in U⊥).
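
    To spell out the parenthetical equivalence: an orthogonal projection has no eigenvalues other than 0 and 1, since

    [tex] P_U b = \lambda b \;\Longrightarrow\; \lambda b = P_U b = P_U^2 b = \lambda^2 b \;\Longrightarrow\; \lambda \in \{0,1\}, [/tex]

    and P_U b = b means b is in U, while P_U b = 0 means b is in U⊥.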

    The obvious thing to do is to apply Zorn's lemma to obtain a maximal orthonormal subset B of H satisfying the above condition, and this part works. For some reason or other, though, I'm having trouble showing that the closed span of B is all of H. If not, then letting W be the closed span of B, I need to find a normalized vector v in W⊥ such that for every U in L, v is an eigenvector of P_U; then B ∪ {v} contradicts the maximality of B. (The following may or may not be helpful: it suffices to consider the case where U contains W.)

    The idea I have right now is this: suppose I could find a one-dimensional subspace V of W⊥ such that P_V commutes with P_U for all U in L, and let v be a normalized vector in V. Then for every U in L, P_V P_U(v) = P_U P_V(v) = P_U(v), so P_U(v) lies in V = span{v}, i.e. P_U(v) is a scalar multiple of v; hence v is an eigenvector of P_U, as desired.

    The problem is that I have no idea how to choose V. I feel like this should be really easy, but for some reason I'm not seeing it.
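
    For intuition, here is a toy finite-dimensional version of what I'm after (a minimal numpy sketch; the particular projections are only an illustration, not the ones from my application). Two commuting orthogonal projections P1 and P2 are simultaneously diagonalized by the eigenvectors of P1 + 2 P2, since that combination takes the four distinct values 0, 1, 2, 3 on the joint eigenspaces:

[code]
# Finite-dimensional sketch: two commuting orthogonal projections admit a
# common orthonormal eigenbasis.  (Illustrative only; assumes numpy.)
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Commuting projections: project onto spans of subsets of one orthonormal
# basis Q, so both are diagonal in that basis.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
P1 = Q[:, :2] @ Q[:, :2].T    # projection onto span{q0, q1}
P2 = Q[:, 1:3] @ Q[:, 1:3].T  # projection onto span{q1, q2}
assert np.allclose(P1 @ P2, P2 @ P1)

# P1 + 2*P2 equals l1 + 2*l2 on the joint eigenspace with eigenvalues
# (l1, l2) in {0,1}^2; these four values are distinct, so its eigenvectors
# are common eigenvectors of P1 and P2.
_, B = np.linalg.eigh(P1 + 2 * P2)

for b in B.T:  # the columns of B form the candidate orthonormal basis
    for P in (P1, P2):
        Pb = P @ b
        # b is an eigenvector of P: either P b = b (b in U) or P b = 0 (b in U-perp)
        assert np.allclose(Pb, b) or np.allclose(Pb, 0)
print("found a common orthonormal eigenbasis")
[/code]

    Of course, the difficulty is that the trick of diagonalizing a single combination doesn't obviously carry over to an infinite lattice of projections on an infinite-dimensional space.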
     
  2. Aug 3, 2010 #2
    It is impossible; here is a counterexample:

    Let your Hilbert space be [tex]L^2([0,1])[/tex], and let your lattice of subspaces be parametrized by [tex]a\in[0,1][/tex], namely

    [tex] U_a = \{ f\in L^2([0,1]) : f(x) =0\ \forall x>a\}[/tex]

    It is easy to see that the projections [tex]P_{U_a}[/tex] have no common eigenvector (because [tex]\displaystyle\bigcap_{a\in (0,1)}U_a =\{0\}[/tex]).
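
    In more detail: if [tex]f[/tex] were a common eigenvector of all the [tex]P_{U_a}[/tex], then for each [tex]a[/tex] either [tex]P_{U_a}f = f[/tex] or [tex]P_{U_a}f = 0[/tex], i.e. [tex]f = 0[/tex] a.e. on [tex](a,1][/tex] or [tex]f = 0[/tex] a.e. on [tex][0,a][/tex]. Setting

    [tex] a^* = \inf\{a\in(0,1) : f\in U_a\} [/tex]

    (with [tex]a^* = 1[/tex] if the set is empty), [tex]f[/tex] vanishes a.e. on [tex][0,a^*)[/tex] and a.e. on [tex](a^*,1][/tex], hence [tex]f = 0[/tex] in [tex]L^2([0,1])[/tex].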
     
  3. Aug 4, 2010 #3
    Interesting, so it's impossible even if H is separable. Thanks.

    I was more generally considering normal operators, rather than projections (although the case of normal operators can be reduced to the above problem). Wikipedia says it works if the operators are compact, or if one of them is compact and injective. I'll have to check if this is the case for the application I had in mind.
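
    For reference, one way to make that reduction precise (just a sketch, using the spectral theorem for compact normal operators): a compact normal operator T decomposes as

    [tex] T = \sum_n \lambda_n P_n, \qquad P_n = \text{the orthogonal projection onto } \ker(T - \lambda_n), [/tex]

    and commuting normal operators have commuting spectral projections (via Fuglede's theorem), so an orthonormal basis consisting of common eigenvectors of all the projections [tex]P_n[/tex] is precisely an orthonormal basis of common eigenvectors of the original operators.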
     