
Need help for solving an operator equation

  1. Jul 14, 2011 #1
I am working on something and have run into a problem. I need your help!
    Let A and B be self-adjoint operators acting on a finite dimensional Hilbert space. Then, the equation

    A + [ B , X ] = 0,
has at least one solution X if and only if Tr(A) = 0.
    ([ B , X ] = BX - XB)

    I have proved it by taking the matrix form of the operators.
But my question is about countably infinite dimensional Hilbert spaces. There the trace condition is no longer available (the trace need not even be defined). But will there always be a solution?
    I would appreciate your guidance.
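As a numerical sanity check of the finite-dimensional statement (my own sketch in NumPy, with randomly generated matrices; not part of a proof): the trace of any commutator vanishes, and the equation is linear in X, so it can be vectorized with Kronecker products and handed to a least-squares solver. Here A is built as a commutator of a Hermitian B with an anti-Hermitian X, so a solution is guaranteed to exist:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random Hermitian B and anti-Hermitian X_true
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = (M + M.conj().T) / 2
N = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
X_true = (N - N.conj().T) / 2

# A = [X_true, B] is Hermitian with Tr(A) = 0, so A + [B, X] = 0 is solvable
A = X_true @ B - B @ X_true
assert np.allclose(A, A.conj().T)          # A is indeed self-adjoint
print(abs(np.trace(A)))                    # trace of a commutator vanishes

# Recover a solution of A + [B, X] = 0 by vectorizing the commutator
# (column-stacking): vec(BX) = (I (x) B) vec(X), vec(XB) = (B^T (x) I) vec(X)
I = np.eye(n)
K = np.kron(I, B) - np.kron(B.T, I)
x, *_ = np.linalg.lstsq(K, -A.flatten(order='F'), rcond=None)
X = x.reshape((n, n), order='F')

print(np.linalg.norm(A + B @ X - X @ B))   # residual is at machine precision
```

Note that this only illustrates the "only if" direction plus recovery of a known solution; it does not by itself show that every trace-zero A works for a given B.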

  3. Jul 15, 2011 #2


    Science Advisor

(First: it's easy to show that X can be chosen anti-Hermitian.) Here's a suggestion: use recursion. To begin with, rewrite the solution you already have for finite dimensions so the recursion is explicit. That is, write the NxN matrix in block form, splitting off the first row and column and leaving the other N-1 dimensions in a block. Your unknowns will be similarly split: X00, X0i = - Xi0* and Xij = - Xji*. It helps to suppress the indices and consider these quantities to be a scalar, vector and tensor, respectively.

    Then you will have three equations for these three unknowns. For example, the first (00) equation reads x·b* + x*·b = a, where a = A00, b is the row vector B0i and x = X0i (the X00 and B00 terms cancel in the commutator), and it's easy to show this can always be satisfied as long as b ≠ 0. (You can also assume wlog that A is already diagonal.)

    If you can do the above, the idea is to recurse N times, each time block-splitting the Aij further by peeling off the first dimension, etc. Then, to prove it for infinite dimensions, all you have to do is forget to stop recursing. :smile:
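To illustrate the scalar step concretely, here is a small NumPy sketch (based on my reconstruction of the (00) equation as x·b* + x*·b = a, with a real and b a nonzero complex vector): taking x proportional to b gives one explicit solution.

```python
import numpy as np

rng = np.random.default_rng(1)
a = 0.7                                                    # real scalar A00
b = rng.standard_normal(4) + 1j * rng.standard_normal(4)  # row vector B0i, b != 0

# One explicit solution of x.b* + x*.b = a: choose x parallel to b,
# so x.b* = a/2 is real and the two terms add up to a
x = (a / (2 * np.vdot(b, b).real)) * b

# np.vdot conjugates its first argument: vdot(b, x) = sum_k b_k* x_k = x.b*
lhs = np.vdot(b, x) + np.vdot(x, b)
print(lhs)  # recovers a (imaginary part at machine precision)
```

If b = 0 this construction fails, which is why the caveat b ≠ 0 matters; in that degenerate case the (00) equation forces a = 0.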