
Pauli spin matrices

  1. May 1, 2013 #1
    Show that the hermitian 2x2 matrices with trace 0 form a three-dimensional vector space over [itex]\mathbb{R}[/itex], whose basis vectors are the Pauli spin matrices.

    Any clues on how to begin? :/
     
  3. May 1, 2013 #2

    micromass


    What does it mean that the Pauli matrices are a basis of the hermitian, traceless 2x2 matrices? Consider the definition of a basis.
     
  4. May 1, 2013 #3
    Basis vectors are linearly independent, and any other vector in the same space is a linear combination of the basis vectors.
    So, according to this definition, I have to show that any traceless 2x2 matrix is a linear combination of the Pauli matrices. So...?
     
  5. May 1, 2013 #4

    micromass


    Indeed. You will have to show two things.

    1) The Pauli spin matrices are linearly independent. So, what does this mean? What's the definition?
    2) The Pauli spin matrices span the space. So, what's the definition here?
     
  6. May 1, 2013 #5
    1) I know how to prove that they are linearly independent. But what does this mean? Can you be more specific? What is your hint here?

    3) Is the fact that the matrices are hermitian not important?
     
  7. May 1, 2013 #6
    Ok, here is what I found out (I have to write something because forum moderators again gave me a warning, this time for not knowing how to start solving a problem... thanks guys.)

    Let's say that Pauli matrices are basis vectors [itex]:=\begin{Bmatrix}
    \begin{bmatrix}
    0 &1 \\
    1& 0
    \end{bmatrix},\begin{bmatrix}
    0 &-i \\
    i &0
    \end{bmatrix},\begin{bmatrix}
    1 & 0\\
    0 & -1
    \end{bmatrix}
    \end{Bmatrix}[/itex] for vector space [itex]V[/itex]. (I checked their linear independence, but there is no need to write the proof here).

    Since Pauli matrices are basis vectors for vector space [itex]V[/itex], any other vector from [itex]V[/itex] is a linear combination of basis vectors:

    [itex]\alpha \begin{bmatrix}
    0 &1 \\
    1& 0
    \end{bmatrix}+\beta \begin{bmatrix}
    0 &-i \\
    i &0
    \end{bmatrix}+\gamma \begin{bmatrix}
    1 & 0\\
    0 & -1
    \end{bmatrix}=\begin{bmatrix}
    x_{1} & x_{2}\\
    x_{3}& x_{4}
    \end{bmatrix}[/itex], where [itex]\begin{bmatrix}
    x_{1} & x_{2}\\
    x_{3}& x_{4}
    \end{bmatrix} = X[/itex] is a matrix in [itex]V[/itex]. If I sum the left side, then I get something like this: [itex] \begin{bmatrix}
    \gamma & \alpha -\beta i\\
    \alpha +\beta i& -\gamma
    \end{bmatrix}=\begin{bmatrix}
    x_{1} & x_{2}\\
    x_{3}& x_{4}
    \end{bmatrix}[/itex].

    Now we can see that for any [itex]\gamma[/itex] the matrix [itex]X[/itex] is traceless, because [itex]trX=\gamma +(-\gamma )=0[/itex], and [itex]X[/itex] is also hermitian (equal to its own conjugate transpose):
    [itex]X=\begin{bmatrix}
    \gamma & \alpha -\beta i\\
    \alpha +\beta i& -\gamma
    \end{bmatrix}, X^{H}=\begin{bmatrix}
    \gamma & \overline{\alpha +\beta i}\\
    \overline{\alpha -\beta i}& -\gamma
    \end{bmatrix}=\begin{bmatrix}
    \gamma & \alpha -\beta i\\
    \alpha +\beta i& -\gamma
    \end{bmatrix}=X[/itex], where [itex]\alpha ,\beta ,\gamma \in \mathbb{R}[/itex], so [itex]V[/itex] is also three-dimensional, or in other words [itex]\dim V=3[/itex].

    Does this sound about right?
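    As a quick side check (just a numerical sketch with made-up coefficients, not part of the proof), NumPy confirms that a real linear combination of the three matrices above comes out hermitian and traceless:

```python
import numpy as np

# The three Pauli matrices, in the order listed in the post above.
sigma1 = np.array([[0, 1], [1, 0]], dtype=complex)
sigma2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
sigma3 = np.array([[1, 0], [0, -1]], dtype=complex)

# Arbitrary real coefficients alpha, beta, gamma, as in the post.
alpha, beta, gamma = 1.5, -0.7, 2.0
X = alpha * sigma1 + beta * sigma2 + gamma * sigma3

is_hermitian = np.allclose(X, X.conj().T)  # X equals its conjugate transpose
is_traceless = np.isclose(np.trace(X), 0)  # trace is gamma + (-gamma) = 0
```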
     
  8. May 1, 2013 #7

    micromass


    But you need to prove this!
     
  9. May 1, 2013 #8
    Hmm... Now I am having some trouble understanding this. We often wrote proofs in two directions:

    - one direction would be: if a 2x2 matrix is hermitian and traceless, then it is a linear combination of the Pauli spin matrices (because they are basis vectors)
    - the other way: any linear combination of the basis vectors (which are the Pauli spin matrices) is a hermitian and traceless 2x2 matrix.

    If I understand you correctly, you want to see this:

    [itex]N:=\begin{Bmatrix}
    \begin{bmatrix}
    0 &1 \\
    1& 0
    \end{bmatrix},\begin{bmatrix}
    0 &-i \\
    i& 0
    \end{bmatrix},\begin{bmatrix}
    1 &0 \\
    0& -1
    \end{bmatrix}
    \end{Bmatrix}[/itex]. Here [itex]N[/itex] is linearly independent, because [itex]\begin{bmatrix}
    0 &1 &1& 0\\
    0 &-i&i& 0\\
    1 &0 &0& -1
    \end{bmatrix}\sim \begin{bmatrix}
    1 &0 &0& -1\\
    0 &1 &1& 0\\
    0 &-i&i& 0
    \end{bmatrix}\sim \begin{bmatrix}
    1 &0 &0& -1\\
    0 &1 &1& 0\\
    0 &0&2i& 0
    \end{bmatrix}[/itex].

    Let the vector space [itex]V[/itex] be the space of 2x2 hermitian and traceless matrices. Now [itex]dimV=3[/itex] (because the traceless condition takes away one dimension), just as [itex]N[/itex] has 3 elements. Since [itex]N[/itex] is linearly independent and has as many elements as [itex]\dim V[/itex], it is a basis for the vector space [itex]V[/itex].

    Right?
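    The row reduction above can also be restated numerically (a sanity check, not a proof): flatten each matrix into a row vector and verify that the resulting 3x4 matrix has full row rank.

```python
import numpy as np

sigma1 = np.array([[0, 1], [1, 0]], dtype=complex)
sigma2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
sigma3 = np.array([[1, 0], [0, -1]], dtype=complex)

# Each matrix becomes one row of a 3x4 complex matrix, as in the post.
rows = np.array([s.flatten() for s in (sigma1, sigma2, sigma3)])

# Rank 3 means the three matrices are linearly independent.
rank = np.linalg.matrix_rank(rows)
```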
     
  10. May 1, 2013 #9

    Mark44


    That was me, and you're welcome.
    The warning was for not including what you've tried, which is a requirement according to the PF rules.

    If the statement you're trying to prove has "if and only if" in it, then the proof needs to go both ways. The one in this thread is NOT one of those.
    Start with an arbitrary 2 x 2 hermitian matrix (maybe call it H) whose trace is 0. Show that the space of such matrices is three-dimensional, and that a basis for this space is the set of Pauli spin matrices.
     
  11. May 1, 2013 #10

    Fredrik


    I want to offer a very general comment here. The statement you want to prove is of the form "For all ##x\in V##,..." where V is some set. When the statement begins that way, the proof should usually begin with the words
    Let ##x\in V## be arbitrary.​
    Obviously you can use another symbol instead of x. The point here is that you should only need to glance at the statement you want to prove to see how the proof should begin. The obvious next step is to use the definition of V. This is what Mark44 is already telling you to do.
     
  12. May 1, 2013 #11

    Fredrik


    I'm not sure what you're doing here, but when you get to the linear independence, you should just work directly with the definition of "linearly independent".

    You should let V be the set of complex 2×2 traceless hermitian matrices, and then prove that it's a vector space over ℝ (not over ℂ). How to prove dim V=3 depends on your definition of dim. So how do you define it?
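    To illustrate the point about the field (a numerical sketch, not a proof): the set is closed under real scalar multiplication but not under complex scalar multiplication, since multiplying a hermitian matrix by i destroys hermiticity.

```python
import numpy as np

sigma1 = np.array([[0, 1], [1, 0]], dtype=complex)  # hermitian and traceless

# Scaling by a real number preserves hermiticity...
real_scaled = 2.5 * sigma1
real_ok = np.allclose(real_scaled, real_scaled.conj().T)

# ...but scaling by i does not: (i*sigma1)^H = -i*sigma1, which differs from i*sigma1.
complex_scaled = 1j * sigma1
complex_ok = np.allclose(complex_scaled, complex_scaled.conj().T)
```

    This is why V is a vector space over ℝ but not over ℂ.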
     
  13. May 1, 2013 #12
    That is something that needs to go on my wall! Thanks!
    Well, that is something I (THINK) I already did, didn't I?

    So let [itex]X=\begin{bmatrix}
    x_{1} & x_{2}\\
    x_{3} & x_{4}
    \end{bmatrix}[/itex] be an arbitrary matrix with [itex]X\in V[/itex], where [itex]V:=\begin{Bmatrix}H=\begin{bmatrix}
    h_{1} &h_{2} \\
    h_{3}& h_{4}
    \end{bmatrix},h_{i}\in \mathbb{R}
    ; tr(H)=0
    \end{Bmatrix}[/itex].

    [itex]dimV=2[/itex] because [itex]tr(H)=h_{1}+h_{4}=0[/itex], so [itex]h_{1}=-h_{4}[/itex], and since [itex]H[/itex] is hermitian, [itex]x_{2}=\overline{x_{3}}[/itex]; any matrix [itex]X[/itex] in [itex]V[/itex] can then be written as [itex]X=\begin{bmatrix}
    x_{1} &x_{2} \\
    x_{3}& x_{4}
    \end{bmatrix}=\begin{bmatrix}
    -x_{4} &\overline{x_{3}} \\
    x_{3}& x_{4}
    \end{bmatrix}[/itex]. So [itex]X=x_{4}\begin{bmatrix}
    -1&0 \\
    0&1
    \end{bmatrix}+ x_{3}\begin{bmatrix}
    0&-i \\
    i& 0
    \end{bmatrix}[/itex]. At this point I think something is wrong. Is the fact that the matrix is hermitian irrelevant for the dimension of [itex]V[/itex]?

    Firstly, thanks for the general comment. Another one going on my wall of things to remember.
    Back to the topic: hmm, everything is a bit mixed up. :) In this post I hope to get the order right.
     
  14. May 1, 2013 #13

    Mark44

    Staff: Mentor

    The matrix elements are complex, so in general [itex]x_{3}\neq \overline{x_{3}}[/itex].
     
  15. May 1, 2013 #14
    Indeed an important fact. So:
    [itex]dimV=3[/itex] because [itex]tr(H)=h_{1}+h_{4}=0[/itex], so [itex]h_{1}=-h_{4}[/itex], and since [itex]H[/itex] is hermitian, [itex]x_{2}=\overline{x_{3}}[/itex]; any matrix [itex]X[/itex] in [itex]V[/itex] can then be written as [itex]X=\begin{bmatrix}
    x_{1} &x_{2} \\
    x_{3}& x_{4}
    \end{bmatrix}=\begin{bmatrix}
    -x_{4} &\overline{x_{3}} \\
    x_{3}& x_{4}
    \end{bmatrix}[/itex]. So [itex]X=x_{4}\begin{bmatrix}
    -1&0 \\
    0&1
    \end{bmatrix}+ Re(x_{3})\begin{bmatrix}
    0&1 \\
    1& 0
    \end{bmatrix}+Im(x_{3})\begin{bmatrix}
    0&-1 \\
    1& 0
    \end{bmatrix}[/itex] (or is it [itex]+Im(x_{3})\begin{bmatrix}
    0&-i \\
    i& 0
    \end{bmatrix}[/itex] ???)

    Anyway, [itex]dimV=3[/itex], and since those three vectors are also linearly independent, they form a basis for V.
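    A quick numerical check of that decomposition (the sample values for x3 and x4 are made up) also answers which third matrix works with real coefficients:

```python
import numpy as np

x4 = 0.8          # real diagonal entry (bottom right), as in the post
x3 = 1.2 - 0.5j   # complex bottom-left entry; the top-right entry is its conjugate

# An arbitrary traceless hermitian matrix built from x3 and x4.
X = np.array([[-x4, np.conj(x3)], [x3, x4]])

M1 = np.array([[-1, 0], [0, 1]], dtype=complex)
M2 = np.array([[0, 1], [1, 0]], dtype=complex)
M3 = np.array([[0, -1j], [1j, 0]], dtype=complex)  # the version with +/- i

# The real coefficients x4, Re(x3), Im(x3) reproduce X exactly.
rebuilt = x4 * M1 + x3.real * M2 + x3.imag * M3
matches = np.allclose(rebuilt, X)
```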
     
  16. May 1, 2013 #15

    Mark44


    The latter of the two. You're trying to find two basis elements to represent
    $$ \begin{bmatrix}
    0&a_3 - b_3i \\
    a_3 + b_3i & 0
    \end{bmatrix}$$

    where ##x_3 = a_3 + b_3 i##.

    The two matrices could also be represented as
    $$ a_3\begin{bmatrix}
    0 & 1\\
    1 & 0
    \end{bmatrix}~\text{and}~ b_3i\begin{bmatrix}
    0& -1 \\
    1& 0
    \end{bmatrix}$$
     
  17. May 1, 2013 #16

    Fredrik


    The result dim V=3 is certainly related to the fact that three real numbers are sufficient to determine all the components of an arbitrary X in V. But to prove that dim V=3, you have to use a definition of "dim".
     
  18. May 2, 2013 #17
    Ok, how do I then prove that dimV=3?

    At my university it was enough to write that any element of V is a linear combination of the basis vectors. The number of basis vectors is then equal to the dimension of the vector space.

    So [itex]X=x_{4}\begin{bmatrix}
    -1&0 \\
    0&1
    \end{bmatrix}+ Re(x_{3})\begin{bmatrix}
    0&1 \\
    1& 0
    \end{bmatrix}+Im(x_{3})\begin{bmatrix}
    0&-i \\
    i& 0
    \end{bmatrix}[/itex]

    The three matrices here are linearly independent, therefore [itex]\begin{bmatrix}
    -1&0 \\
    0&1
    \end{bmatrix}[/itex], [itex]\begin{bmatrix}
    0&1 \\
    1& 0
    \end{bmatrix}[/itex], [itex]\begin{bmatrix}
    0&-i \\
    i& 0
    \end{bmatrix}[/itex], which also happen to be the Pauli spin matrices, form a basis for V (V is the vector space of 2x2 hermitian and traceless matrices).
     
  19. May 2, 2013 #18
    BTW, thanks for all the help to all of you!
     
  20. May 2, 2013 #19

    Fredrik


    You haven't posted a proof of any of the following: 1. V is a vector space over ℝ. 2. ##\{\sigma_1,\sigma_2,\sigma_3\}## is a basis. (Comment below). 3. dim V=3. Maybe you've done more than you've shown us?

    What exactly you need to show for 2 and 3 above depends on your definitions of "basis" and "dimension". That's why I asked for your definition of "dim". I should also have asked for your definition of "basis". Some of the valid definitions of "basis" are: "a maximal linearly independent set", "a minimal spanning set", "a linearly independent spanning set". You have only shown that it's a spanning set. The definition of dim V=3 that I use is that V contains a linearly independent set with cardinality (number of elements) 3 but not a linearly independent set with cardinality 4. Another definition is that dim V=3 if and only if every basis for V has cardinality 3. (If we consider it to be already proved that all bases have the same cardinality, then it's sufficient to verify that the basis we've found has cardinality 3).

    So to complete 2 and 3, you at least have to prove linear independence. You also need to think about 1.

    Since you have already proved that the sigmas span V, I don't mind showing you the prettiest way to do it: Let $$\begin{bmatrix}a & b\\ c & d\end{bmatrix}\in V$$ be arbitrary. The definition of V implies that Im a = Im d = 0, d=-a, and b=c*. So we can define ##x_1,x_2,x_3\in\mathbb R## by
    $$a=x_3,\quad c=x_1+ix_2.$$ We have
    $$\begin{bmatrix}a & b\\ c & d\end{bmatrix} =\begin{bmatrix}x_3 & x_1-ix_2\\ x_1+ix_2 & -x_3\end{bmatrix}=\sum_{k=1}^3 x_k\sigma_k.$$ (I see that you've been writing the matrix I call ##\sigma_3## first when you list the spin matrices. If that means that your book calls it ##\sigma_1##, then you will have to modify the above slightly).
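    That expansion is easy to check numerically as well (a sketch; the sample values of the ##x_k## are arbitrary):

```python
import numpy as np

# sigma_1, sigma_2, sigma_3 in the conventional order used in the post above.
sigmas = [
    np.array([[0, 1], [1, 0]], dtype=complex),
    np.array([[0, -1j], [1j, 0]], dtype=complex),
    np.array([[1, 0], [0, -1]], dtype=complex),
]

x = [0.3, -1.1, 2.5]  # arbitrary real x_1, x_2, x_3

# The arbitrary element of V, written as in the post:
A = np.array([[x[2], x[0] - 1j * x[1]],
              [x[0] + 1j * x[1], -x[2]]])

# A equals sum_k x_k sigma_k.
expansion = sum(xk * s for xk, s in zip(x, sigmas))
equal = np.allclose(expansion, A)
```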
     