
Matrix operators Dirac notation

  1. Jul 13, 2012 #1
    I'm having trouble seeing how an operator can be written in matrix representation.

    In Sakurai, for an operator X, we have:

    X = [itex]\sum_{a''}[/itex] [itex]\sum_{a'}[/itex] |a''> <a''| X |a'> <a'|

    since of course [itex]\sum_{a'}[/itex] |a'> <a'| is equal to the identity operator.

    Somehow, this all gets multiplied out and you get a square matrix with elements of the form
    <a''| X |a'>, where a'' labels the rows and a' the columns.

    The factor <a''| X |a'> in the middle of the expression for X is just an inner product, and hence a number, so we can move it through the rest of the expression. So then, we can just work out

    [itex]\sum_{a''}[/itex] [itex]\sum_{a'}[/itex] |a''> <a'| (an outer product), which will be a square matrix, and weight each term by the corresponding number <a''| X |a'>. But when I work out this matrix, it seems to come out diagonal if we are talking about orthogonal eigenvectors, whereas Sakurai has elements off the diagonal?

    What's more, how do we interpret multiplication of two state functions? Since the bras and kets are nothing more than row and column matrix representations of the state functions, and we are not talking about inner products here (which would be zero for orthogonal states), just ordinary multiplication, I don't know how to interpret simply multiplying two state functions together.

    Anyway, I can't get out what Sakurai has written. Anyone care to show me how to work out the expression at the very top? Thanks!
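To make the double sum concrete, here is a minimal NumPy sketch (the 2-dimensional space and the particular matrix X are just illustrative choices, not anything from Sakurai). It rebuilds X from the sum over a'' and a' of the number <a''|X|a'> times the outer product |a''><a'|, and since X itself has off-diagonal entries, the result is not diagonal.

```python
import numpy as np

# Standard basis vectors of a 2-dimensional space playing the role of
# the orthonormal kets |a'> (an illustrative assumption).
basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

# An arbitrary (hypothetical) operator X as a matrix; note it is NOT diagonal.
X = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Rebuild X from the double sum  sum_{a'',a'} |a''> <a''|X|a'> <a'| :
# each term is the scalar <a''|X|a'> times the outer product |a''><a'|.
X_rebuilt = sum(
    (ket2 @ X @ ket1) * np.outer(ket2, ket1)
    for ket2 in basis
    for ket1 in basis
)

print(np.allclose(X, X_rebuilt))  # True: the expansion reproduces X exactly
```

The point is that the coefficients <a''|X|a'> are attached term by term to the outer products before the sum is taken, so off-diagonal coefficients survive; nothing forces the result to be diagonal unless X happens to be diagonal in this basis.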
  3. Jul 13, 2012 #2
    You have it all there! Since the [itex]|a \rangle[/itex] form a complete orthonormal system, you get
    [tex]X_{aa'}=\langle a |\hat{X} a' \rangle.[/tex]
    Applied to an arbitrary ket you find
    [tex]\hat{X} |\psi \rangle=\sum_{aa'} |a \rangle \langle a|\hat{X} a' \rangle \langle a'|\psi \rangle =\sum_{a a'} |a \rangle X_{a a'} \psi_{a'}.[/tex]
    Thus, in the representation with respect to this orthonormal system, you get
    [tex](\hat{X} \psi)_{a}=\sum_{a'} X_{aa'} \psi_{a'},[/tex]
    which is the usual rule for multiplying a square matrix by a column vector in a complex vector space. The only difference is that the QM Hilbert space is usually infinite-dimensional.
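The component rule above can be checked numerically. A minimal sketch, assuming a 3-dimensional space with the standard basis and an illustrative matrix X: the explicit sum [itex](\hat{X} \psi)_{a}=\sum_{a'} X_{aa'} \psi_{a'}[/itex] agrees with the built-in matrix-vector product.

```python
import numpy as np

# Illustrative operator matrix X_{aa'} and component vector psi_{a'}
# (both are arbitrary choices for the demonstration).
X = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
psi = np.array([1.0, 2.0, 3.0])

# Explicit component-by-component sum (X psi)_a = sum_{a'} X_{aa'} psi_{a'} ...
components = np.array([sum(X[a, ap] * psi[ap] for ap in range(3))
                       for a in range(3)])

# ... agrees with ordinary matrix-vector multiplication.
print(np.allclose(components, X @ psi))  # True
```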
  4. Jul 13, 2012 #3

    Consider the equality |u>=X|v> where |u> and |v> are arbitrary kets and X is an arbitrary linear operator. You know that this equality implies that ##\langle a'|u\rangle=\sum_{a''}\langle a'|X|a''\rangle\langle a''|v\rangle##. If you just compare this to the definition of matrix multiplication, ##(AB)_{ij}=\sum_k A_{ik}B_{kj}##, you will see that it can be interpreted as a component of a matrix equality. This is the reason why the correspondence between matrices and linear operators is given by the first equality in Vanhees71's post.

    Actually, it's easier to understand this if we don't use bra-ket notation, so I suggest that you also read this old post. (Ignore the quote and the stuff below it).
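The comparison with ##(AB)_{ij}=\sum_k A_{ik}B_{kj}## can also be seen with two operators: inserting the completeness relation between them turns operator composition into exactly the matrix-multiplication rule. A minimal sketch, with two arbitrary illustrative matrices in a 2-dimensional space:

```python
import numpy as np

# Two illustrative operator matrices (arbitrary choices).
A = np.array([[1.0, 2.0],
              [0.0, 1.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Component-wise sum over the intermediate basis index k,
# i.e. (AB)_ij = sum_k A_ik B_kj ...
AB = np.array([[sum(A[i, k] * B[k, j] for k in range(2))
                for j in range(2)]
               for i in range(2)])

# ... agrees with the matrix product of A and B.
print(np.allclose(AB, A @ B))  # True
```

The intermediate index k here plays the same role as the summed-over basis label a'' in the component equation for |u> = X|v>.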