# Pauli spin matrices

1. May 1, 2013

### skrat

Show that all hermitian 2x2 matrices with trace 0 form a three-dimensional vector space over $\mathbb{R}$, whose basis vectors are the Pauli spin matrices.

Any clues on how to begin? :/

2. May 1, 2013

### micromass

What does it mean that the Pauli matrices are a basis of the hermitian, traceless 2x2 matrices? Consider the definition of a basis.

3. May 1, 2013

### skrat

Basis vectors are linearly independent, and any other vector from the same space is a linear combination of the basis vectors.
Meaning, according to this definition, I have to prove/show that any traceless 2x2 matrix is a linear combination of the Pauli matrices. So....?

4. May 1, 2013

### micromass

Indeed. You will have to show two things.

1) The Pauli spin matrices are linearly independent. So, what does this mean? What's the definition?
2) The Pauli spin matrices span the space. So, what's the definition here?

5. May 1, 2013

### skrat

1) I know how to prove that they are linearly independent. But, what does this mean? Can you be more specific? What is your hint here?

3) The fact that matrices are hermitian is not important?

6. May 1, 2013

### skrat

Ok, here is what I found out (I have to write something because forum moderators again gave me a warning, this time for not knowing how to start solving a problem... thanks guys.)

Let's say that Pauli matrices are basis vectors $:=\begin{Bmatrix} \begin{bmatrix} 0 &1 \\ 1& 0 \end{bmatrix},\begin{bmatrix} 0 &-i \\ i &0 \end{bmatrix},\begin{bmatrix} 1 & 0\\ 0 & -1 \end{bmatrix} \end{Bmatrix}$ for vector space $V$. (I checked their linear independence, but there is no need to write the proof here).

Since Pauli matrices are basis vectors for vector space $V$, any other vector from $V$ is a linear combination of basis vectors:

$\alpha \begin{bmatrix} 0 &1 \\ 1& 0 \end{bmatrix}+\beta \begin{bmatrix} 0 &-i \\ i &0 \end{bmatrix}+\gamma \begin{bmatrix} 1 & 0\\ 0 & -1 \end{bmatrix}=\begin{bmatrix} x_{1} & x_{2}\\ x_{3}& x_{4} \end{bmatrix}$, where $\begin{bmatrix} x_{1} & x_{2}\\ x_{3}& x_{4} \end{bmatrix} = X$ is a matrix in $V$. If I sum the left side, then I get $\begin{bmatrix} \gamma & \alpha -\beta i\\ \alpha +\beta i& -\gamma \end{bmatrix}=\begin{bmatrix} x_{1} & x_{2}\\ x_{3}& x_{4} \end{bmatrix}$.

Now we can see that for any $\gamma$, $X$ is traceless because $trX=\gamma +(-\gamma )=0$, and $X$ is also hermitian (complex-conjugated and transposed):
$X=\begin{bmatrix} \gamma & \alpha -\beta i\\ \alpha +\beta i& -\gamma \end{bmatrix}, \quad X^{H}=\begin{bmatrix} \gamma & \overline{\alpha +\beta i}\\ \overline{\alpha -\beta i}& -\gamma \end{bmatrix}=\begin{bmatrix} \gamma & \alpha -\beta i\\ \alpha +\beta i& -\gamma \end{bmatrix}=X$, where $\alpha ,\beta ,\gamma \in \mathbb{R}$. So $V$ is three-dimensional, or in other words, $dimV=3$.

Does this sound about right?
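As a numerical sanity check of the algebra above, here is a minimal sketch in plain Python (my own illustration, not part of the proof; the helper names `combo`, `trace`, and `conj_transpose` are mine): any real combination of the three matrices comes out traceless and hermitian.

```python
# The three Pauli matrices as nested lists of complex numbers.
s1 = [[0, 1], [1, 0]]
s2 = [[0, -1j], [1j, 0]]
s3 = [[1, 0], [0, -1]]

def combo(a, b, g):
    """Return a*s1 + b*s2 + g*s3 as a nested list."""
    return [[a*s1[i][j] + b*s2[i][j] + g*s3[i][j] for j in range(2)]
            for i in range(2)]

def trace(m):
    return m[0][0] + m[1][1]

def conj_transpose(m):
    """Conjugate transpose (hermitian adjoint) of a 2x2 matrix."""
    return [[m[j][i].conjugate() for j in range(2)] for i in range(2)]

X = combo(1.5, -2.0, 0.25)      # arbitrary real alpha, beta, gamma
assert trace(X) == 0            # traceless
assert X == conj_transpose(X)   # hermitian
```

The check only illustrates the "span implies traceless hermitian" direction; it says nothing about linear independence.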

7. May 1, 2013

### micromass

But you need to prove this!

8. May 1, 2013

### skrat

Am... Now I am having some trouble understanding this. We often did proofs in two directions:

- one direction would be: if a 2x2 matrix is hermitian and traceless, then it is a linear combination of the Pauli spin matrices (because they are basis vectors)
- the other way: any linear combination of the basis vectors (which are the Pauli spin matrices) is a hermitian and traceless 2x2 matrix.

If I understand you correctly, you want to see this:

$N:=\begin{Bmatrix} \begin{bmatrix} 0 &1 \\ 1& 0 \end{bmatrix},\begin{bmatrix} 0 &-i \\ i& 0 \end{bmatrix},\begin{bmatrix} 1 &0 \\ 0& -1 \end{bmatrix} \end{Bmatrix}$, where $N$ is linearly independent because $\begin{bmatrix} 0 &1 &1& 0\\ 0 &-i&i& 0\\ 1 &0 &0& -1 \end{bmatrix}\sim \begin{bmatrix} 1 &0 &0& -1\\ 0 &1 &1& 0\\ 0 &-i&i& 0 \end{bmatrix}\sim \begin{bmatrix} 1 &0 &0& -1\\ 0 &1 &1& 0\\ 0 &0&2i& 0 \end{bmatrix}$.

Let the vector space $V$ be the space of 2x2 hermitian and traceless matrices. Now $dimV=3$ (because the traceless condition takes away one dimension), just as $dimN=3$. Both have the same dimension, and since $N$ is also linearly independent, it is a basis for the vector space $V$.

right?
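The row reduction above can also be checked mechanically. A minimal sketch in plain Python (my own illustration; the `rank` helper is mine): flatten each Pauli matrix into a row of four entries and compute the rank by Gaussian elimination over the complex numbers. Rank 3 confirms linear independence.

```python
# Flattened Pauli matrices, one per row (the same rows as in the reduction above).
rows = [
    [0, 1, 1, 0],      # sigma_1
    [0, -1j, 1j, 0],   # sigma_2
    [1, 0, 0, -1],     # sigma_3
]

def rank(matrix):
    """Rank of a list of rows, via Gaussian elimination over the complex numbers."""
    m = [row[:] for row in matrix]   # work on a copy
    lead = 0
    for col in range(len(m[0])):
        # find a pivot for this column below the current lead row
        pivot = next((r for r in range(lead, len(m))
                      if abs(m[r][col]) > 1e-12), None)
        if pivot is None:
            continue
        m[lead], m[pivot] = m[pivot], m[lead]
        # eliminate this column from all other rows
        for r in range(len(m)):
            if r != lead and abs(m[r][col]) > 1e-12:
                f = m[r][col] / m[lead][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[lead])]
        lead += 1
        if lead == len(m):
            break
    return lead

assert rank(rows) == 3  # the three matrices are linearly independent
```

Adding a dependent row (e.g. a multiple of the first) leaves the rank at 3, as expected.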

9. May 1, 2013

### Staff: Mentor

That was me, and you're welcome.
The warning was for not including what you've tried, which is a requirement according to the PF rules.

If the statement you're trying to prove has "if and only if" in it, then the proof needs to go both ways. The one in this thread is NOT one of those.
Start with an arbitrary 2 x 2 hermitian matrix (maybe call it H) whose trace is 0. Show that the space of such matrices is three-dimensional, and that the Pauli spin matrices form a basis for this space.

10. May 1, 2013

### Fredrik

Staff Emeritus
I want to offer a very general comment here. The statement you want to prove is of the form "For all $x\in V$,..." where V is some set. When the statement begins that way, the proof should usually begin with the words
Let $x\in V$ be arbitrary.
Obviously you can use another symbol instead of x. The point here is that you should only need to glance at the statement you want to prove to see how the proof should begin. The obvious next step is to use the definition of V. This is what Mark44 is already telling you to do.

Last edited: May 1, 2013
11. May 1, 2013

### Fredrik

Staff Emeritus
I'm not sure what you're doing here, but when you get to the linear independence, you should just work directly with the definition of "linearly independent".

You should let V be the set of complex 2×2 traceless hermitian matrices, and then prove that it's a vector space over ℝ (not over ℂ). How to prove dim V=3 depends on your definition of dim. So how do you define it?

12. May 1, 2013

### skrat

That is something that needs to go on my wall! Thanks!
Well, that is something I (THINK) I already did, didn't I?

So let $X=\begin{bmatrix} x_{1} & x_{2}\\ x_{3} & x_{4} \end{bmatrix}$ be an arbitrary matrix with $X\in V$, where $V:=\begin{Bmatrix}H=\begin{bmatrix} h_{1} &h_{2} \\ h_{3}& h_{4} \end{bmatrix},h_{i}\in \mathbb{R} ; tr(H)=0 \end{Bmatrix}$.

$dimV=2$ because $tr(H)=h_{1}+h_{4}=0$, so $h_{1}=-h_{4}$, and since $H$ is hermitian, $x_{2}=\overline{x_{3}}$; any matrix $X$ from $V$ is then written as $X=\begin{bmatrix} x_{1} &x_{2} \\ x_{3}& x_{4} \end{bmatrix}=\begin{bmatrix} -x_{4} &\overline{x_{3}} \\ x_{3}& x_{4} \end{bmatrix}$. So $X=x_{4}\begin{bmatrix} -1&0 \\ 0&1 \end{bmatrix}+ x_{3}\begin{bmatrix} 0&-i \\ i& 0 \end{bmatrix}$. At this point, I think something is wrong? Is the fact that the matrix is hermitian irrelevant for the dimension of $V$?

Firstly, thanks for the first general comment. Another one going on my wall of things you must remember.
Back to the topic: Ammm, everything is a bit mixed up. :) In this post I hope to get the right order.
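The worry above can be probed numerically. A minimal sketch in plain Python (my own illustration; the helpers `is_hermitian` and `two_term` are mine): when $x_{3}$ is complex, the two-term combination $x_{4}\begin{bmatrix} -1&0 \\ 0&1 \end{bmatrix}+x_{3}\begin{bmatrix} 0&-i \\ i& 0 \end{bmatrix}$ is no longer hermitian, so the real and imaginary parts of $x_{3}$ each need a matrix of their own.

```python
def is_hermitian(m):
    """True if the 2x2 matrix equals its conjugate transpose."""
    return all(m[i][j] == m[j][i].conjugate()
               for i in range(2) for j in range(2))

def two_term(x4, x3):
    """The two-term combination x4*[[-1,0],[0,1]] + x3*[[0,-i],[i,0]]."""
    m1 = [[-1, 0], [0, 1]]
    m2 = [[0, -1j], [1j, 0]]
    return [[x4*m1[i][j] + x3*m2[i][j] for j in range(2)] for i in range(2)]

assert is_hermitian(two_term(2.0, 3.0))         # real x3: hermitian
assert not is_hermitian(two_term(2.0, 3.0+1j))  # complex x3: not hermitian
```

The off-diagonal entries are $-ix_{3}$ and $ix_{3}$, and $\overline{ix_{3}}=-ix_{3}$ holds only when $x_{3}$ is real; this is why one real coefficient per matrix is not enough here.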

13. May 1, 2013

### Staff: Mentor

The matrix elements are complex, so in general $\overline{x_3} \neq -x_3 i$.

14. May 1, 2013

### skrat

Indeed an important fact. So:
$dimV=3$ because $tr(H)=h_{1}+h_{4}=0$, so $h_{1}=-h_{4}$, and since $H$ is hermitian, $x_{2}=\overline{x_{3}}$; any matrix $X$ from $V$ is then written as $X=\begin{bmatrix} x_{1} &x_{2} \\ x_{3}& x_{4} \end{bmatrix}=\begin{bmatrix} -x_{4} &\overline{x_{3}} \\ x_{3}& x_{4} \end{bmatrix}$. So $X=x_{4}\begin{bmatrix} -1&0 \\ 0&1 \end{bmatrix}+ Re(x_{3})\begin{bmatrix} 0&1 \\ 1& 0 \end{bmatrix}+Im(x_{3})\begin{bmatrix} 0&-1 \\ 1& 0 \end{bmatrix}$ (or is it $+Im(x_{3})\begin{bmatrix} 0&-i \\ i& 0 \end{bmatrix}$ ???)

Anyway, $dimV=3$, and since those three vectors are also linearly independent, they form a basis for $V$.

15. May 1, 2013

### Staff: Mentor

The latter of the two. You're trying to find two basis elements to represent
$$\begin{bmatrix} 0&a_3 - b_3i \\ a_3 + b_3i & 0 \end{bmatrix}$$

where $x_3 = a_3 + b_3 i$.

The two matrices could also be represented as
$$a_3\begin{bmatrix} 0 & 1\\ 1 & 0 \end{bmatrix}~\text{and}~ b_3i\begin{bmatrix} 0& -1 \\ 1& 0 \end{bmatrix}$$

16. May 1, 2013

### Fredrik

Staff Emeritus
The result dim V=3 is certainly related to the fact that three real numbers are sufficient to determine all the components of an arbitrary X in V. But to prove that dim V=3, you have to use a definition of "dim".

17. May 2, 2013

### skrat

Ok, how do I then prove that $dimV=3$?

At my university it was enough if we just wrote that any element of $V$ is a linear combination of the basis vectors. The number of basis vectors is then equal to the dimension of the vector space.

So $X=x_{4}\begin{bmatrix} -1&0 \\ 0&1 \end{bmatrix}+ Re(x_{3})\begin{bmatrix} 0&1 \\ 1& 0 \end{bmatrix}+Im(x_{3})\begin{bmatrix} 0&-i \\ i& 0 \end{bmatrix}$

The three matrices here are linearly independent; therefore $\begin{bmatrix} -1&0 \\ 0&1 \end{bmatrix}$, $\begin{bmatrix} 0&1 \\ 1& 0 \end{bmatrix}$, $\begin{bmatrix} 0&-i \\ i& 0 \end{bmatrix}$, which also happen to be the Pauli spin matrices, form a basis for $V$ ($V$ is the vector space of 2x2 hermitian and traceless matrices).

18. May 2, 2013

### skrat

BTW, thanks for all the help to all of you!

19. May 2, 2013

### Fredrik

Staff Emeritus
You haven't posted a proof of any of the following: 1. V is a vector space over ℝ. 2. $\{\sigma_1,\sigma_2,\sigma_3\}$ is a basis. (Comment below). 3. dim V=3. Maybe you've done more than you've shown us?

What exactly you need to show for 2 and 3 above depends on your definitions of "basis" and "dimension". That's why I asked for your definition of "dim". I should also have asked for your definition of "basis". Some of the valid definitions of "basis" are: "a maximal linearly independent set", "a minimal spanning set", "a linearly independent spanning set". You have only shown that it's a spanning set. The definition of dim V=3 that I use is that V contains a linearly independent set with cardinality (number of elements) 3 but not a linearly independent set with cardinality 4. Another definition is that dim V=3 if and only if every basis for V has cardinality 3. (If we consider it to be already proved that all bases have the same cardinality, then it's sufficient to verify that the basis we've found has cardinality 3).

So to complete 2 and 3, you at least have to prove linear independence. You also need to think about 1.

Since you have already proved that the sigmas span V, I don't mind showing you the prettiest way to do it: Let $$\begin{bmatrix}a & b\\ c & d\end{bmatrix}\in V$$ be arbitrary. The definition of V implies that Im a = Im d = 0, d=-a, and b=c*. So we can define $x_1,x_2,x_3\in\mathbb R$ by
$$a=x_3,\quad c=x_1+ix_2.$$ We have
$$\begin{bmatrix}a & b\\ c & d\end{bmatrix} =\begin{bmatrix}x_3 & x_1-ix_2\\ x_1+ix_2 & -x_3\end{bmatrix}=\sum_{k=1}^3 x_k\sigma_k.$$ (I see that you've been writing the matrix I call $\sigma_3$ first when you list the spin matrices. If that means that your book calls it $\sigma_1$, then you will have to modify the above slightly).
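The decomposition above can be verified mechanically. A minimal sketch in plain Python (my own illustration; the names `decompose` and `recompose` are mine): read off $x_3=a$ (the top-left entry) and $x_1+ix_2=c$ (the bottom-left entry), then check that $\sum_{k=1}^3 x_k\sigma_k$ reproduces the matrix.

```python
# Pauli matrices sigma_1, sigma_2, sigma_3 in the conventional order.
s = [
    [[0, 1], [1, 0]],
    [[0, -1j], [1j, 0]],
    [[1, 0], [0, -1]],
]

def decompose(m):
    """Coefficients (x1, x2, x3) of a traceless hermitian 2x2 matrix m."""
    c = m[1][0]                       # c = x1 + i*x2
    return (c.real, c.imag, m[0][0].real)

def recompose(x1, x2, x3):
    """Return x1*sigma_1 + x2*sigma_2 + x3*sigma_3."""
    coeffs = (x1, x2, x3)
    return [[sum(coeffs[k] * s[k][i][j] for k in range(3)) for j in range(2)]
            for i in range(2)]

X = [[2.0, 3.0 - 4.0j], [3.0 + 4.0j, -2.0]]  # traceless and hermitian
x1, x2, x3 = decompose(X)
assert (x1, x2, x3) == (3.0, 4.0, 2.0)
assert recompose(x1, x2, x3) == X
```

This roundtrip shows concretely that the three real numbers $x_1,x_2,x_3$ determine the whole matrix, which is the content of the spanning argument.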
