
Quantum Mechanics: Matrices

  1. Feb 17, 2013 #1
    1. The problem statement, all variables and given/known data

    The problem is that I am unable to understand the proof. I follow the individual steps, but I do not see how they relate to the theorem. It is probably because I do not understand the matrix notation, the one involving k.
    It is given that
    $$= \begin{pmatrix}1 & 0 & 0 & \cdots & 0\\ 0 & 1 & 0 & \cdots & 0\end{pmatrix}$$
    So now, how do I relate k with it?

    So before you explain what's going on, please elaborate on the notation.
    Thank you.

    2. Relevant equations

    3. The attempt at a solution

    Attached Files:

  3. Feb 17, 2013 #2


    Staff Emeritus
    Science Advisor
    Gold Member

    If you mean that you don't understand how to go from the first line to the second, all they've done is to insert the identity operator in the form ##1=\sum_k|k\rangle\langle k|##. If you mean that you don't understand how to go from the second line to the third, then you need to learn about the relationship between linear operators and matrices.
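    As a quick numerical sanity check of the resolution of the identity ##1=\sum_k|k\rangle\langle k|## (a NumPy sketch, not part of the original post; the dimension 4 is an arbitrary illustrative choice):

    ```python
    import numpy as np

    # The vectors |k> below are the columns of a unitary matrix Q,
    # i.e. an arbitrary orthonormal basis of C^4 obtained from a QR
    # factorization of a random complex matrix.
    n = 4
    rng = np.random.default_rng(0)
    Q, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))

    # Sum of the outer products |k><k| over the whole basis.
    identity = sum(np.outer(Q[:, k], Q[:, k].conj()) for k in range(n))

    print(np.allclose(identity, np.eye(n)))  # True
    ```

    Any orthonormal basis works here, which is exactly why the insertion step in the proof is legitimate.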

    Let U and V be vector spaces. Let ##T:U\to V## be linear. Let ##A=(u_1,\dots,u_n)## and ##B=(v_1,\dots,v_m)## be ordered bases for U and V respectively. The matrix [T] of the linear operator T with respect to the pair (A,B) of ordered bases is defined by
    $$[T]_{ij}=(Tu_j)_i.$$ The right-hand side is interpreted as "the ith component of the vector ##Tu_j## in the ordered basis B".

    You should find it very easy to verify that if B is an orthonormal ordered basis, we have ##[T]_{ij}=\langle v_i,Tu_j\rangle##.
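    If you want to see the formula ##[T]_{ij}=\langle v_i,Tu_j\rangle## in action, here is a small NumPy sketch (the map and the orthonormal bases are random choices for illustration, not from the thread):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(2, 3))   # T in standard coordinates
    T = lambda x: A @ x

    # Orthonormal ordered bases for R^3 and R^2 (columns of U and V),
    # obtained from QR factorizations of random matrices.
    U, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    V, _ = np.linalg.qr(rng.normal(size=(2, 2)))

    # [T]_{ij} = <v_i, T u_j>, computed entry by entry.
    M = np.array([[V[:, i] @ T(U[:, j]) for j in range(3)] for i in range(2)])

    # The same matrix via the change-of-basis product V^T A U.
    print(np.allclose(M, V.T @ A @ U))  # True
    ```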

    The reason for the definition of [T] can be seen by doing a simple calculation. Suppose that Tx=y. I won't write any summation sigmas, since we can remember to do a sum over each index that appears twice.
    $$\begin{align}y &=y_i v_i\\
    Tx &= T(x_j u_j)=x_jT(u_j)=x_j(Tu_j)_i v_i.\end{align}$$ Since the v_i are linearly independent, this implies that ##y_i=(Tu_j)_i x_j##. This can be interpreted as a matrix equation [y]=[T][x], if we define [y] and [x] in the obvious ways, and [T] as above. (Recall that the definition of matrix multiplication is ##(AB)_{ij}=A_{ik}B_{kj}##).
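    The component identity ##y_i=(Tu_j)_i x_j## can be checked numerically as well (a sketch using the standard basis, so that ##(Tu_j)_i## is simply the entry A[i, j]; the matrix A is an arbitrary example):

    ```python
    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0],
                  [5.0, 6.0]])
    x = np.array([7.0, 8.0])

    # Explicit double-index sum, exactly as in the derivation above.
    y = np.array([sum(A[i, j] * x[j] for j in range(2)) for i in range(3)])

    print(np.allclose(y, A @ x))  # True
    ```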

    When U=V, it's convenient to choose A=B, and we can talk about the matrix of a linear operator with respect to an ordered basis, instead of with respect to a pair of ordered bases.

    Notations like [T] are typically only used in explanations like this. I think most books would use T both for the linear operator and the corresponding matrix with respect to a pair of ordered bases.

    Edit: It would be a good exercise for you to prove that if ##T:U\to V## and ##S:V\to W## are linear, then ##[S\circ T]=[S][T]##. This result is the main reason why matrix multiplication is defined the way it is.
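    A numerical check of ##[S\circ T]=[S][T]##, with random matrices standing in for S and T in the standard bases (the sizes are illustrative choices):

    ```python
    import numpy as np

    # Linear maps T: R^3 -> R^4 and S: R^4 -> R^2, represented in the
    # standard bases by random matrices.
    rng = np.random.default_rng(1)
    T_mat = rng.normal(size=(4, 3))
    S_mat = rng.normal(size=(2, 4))

    compose = lambda x: S_mat @ (T_mat @ x)  # the map S o T

    # Column j of the matrix of S o T is (S o T) applied to e_j.
    M = np.column_stack([compose(np.eye(3)[:, j]) for j in range(3)])

    print(np.allclose(M, S_mat @ T_mat))  # True
    ```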
    Last edited: Feb 17, 2013