Pauli Spin Matrices: Traceless Hermitian 2x2 Matrices Form a Three-Dimensional Real Vector Space

SUMMARY

All Hermitian 2x2 matrices with trace 0 are elements of a three-dimensional vector space over the reals, with the basis vectors being the Pauli spin matrices: σ₁ = \(\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}\), σ₂ = \(\begin{bmatrix} 0 & -i \\ i & 0 \end{bmatrix}\), and σ₃ = \(\begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}\). The proof requires demonstrating that these matrices are linearly independent and span the space of traceless Hermitian matrices. The dimension of this vector space is confirmed to be 3, as the trace condition reduces the degrees of freedom by one.
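As a quick numerical illustration of this decomposition (a minimal sketch using NumPy; the coefficient values and variable names below are arbitrary choices, not part of the original thread):

Python:
import numpy as np

# Pauli matrices sigma_1, sigma_2, sigma_3
s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)

# A traceless Hermitian matrix built from three arbitrary real numbers a, b, c
a, b, c = 0.7, -1.2, 2.5
X = np.array([[c, a - 1j * b],
              [a + 1j * b, -c]])

print(np.allclose(X, X.conj().T))                # True: X is Hermitian
print(np.isclose(np.trace(X), 0))                # True: X is traceless
print(np.allclose(X, a * s1 + b * s2 + c * s3))  # True: X = a*sigma_1 + b*sigma_2 + c*sigma_3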

PREREQUISITES
  • Understanding of Hermitian matrices
  • Knowledge of linear independence and spanning sets
  • Familiarity with the Pauli spin matrices
  • Basic concepts of vector spaces and dimensions
NEXT STEPS
  • Study the properties of Hermitian matrices in quantum mechanics
  • Learn about linear transformations and their representations using matrices
  • Explore the concept of vector spaces and their dimensions in linear algebra
  • Investigate the role of Pauli matrices in quantum computing and spin systems
USEFUL FOR

Students and professionals in physics, particularly those focusing on quantum mechanics, as well as mathematicians and anyone interested in linear algebra and matrix theory.

skrat
Show that all Hermitian 2x2 matrices with trace 0 are elements of a three-dimensional vector space over ##\mathbb{R}## whose basis vectors are the Pauli spin matrices.

Any clues on how to begin? :/
 
What does it mean that the Pauli matrices are a basis of the hermitian, traceless 2x2 matrices? Consider the definition of a basis.
 
Basis vectors are linearly independent and any other vector from the same space is a linear combination of basis vectors.
Meaning, according to this definition I have to prove/show that any traceless Hermitian 2x2 matrix is a linear combination of the Pauli matrices. So...?
 
Indeed. You will have to show two things.

1) The Pauli spin matrices are linearly independent. So, what does this mean? What's the definition?
2) The Pauli spin matrices span the space. So, what's the definition here?
 
micromass said:
Indeed. You will have to show two things.

1) The Pauli spin matrices are linearly independent. So, what does this mean? What's the definition?
2) The Pauli spin matrices span the space. So, what's the definition here?

1) I know how to prove that they are linearly independent. But, what does this mean? Can you be more specific? What is your hint here?

3) The fact that matrices are hermitian is not important?
 
Ok, here is what I found out (I have to write something because forum moderators again gave me a warning, this time for not knowing how to start solving a problem... thanks guys.)

Let's say that the Pauli matrices are basis vectors $$\begin{Bmatrix}\begin{bmatrix}0 & 1\\ 1 & 0\end{bmatrix},\begin{bmatrix}0 & -i\\ i & 0\end{bmatrix},\begin{bmatrix}1 & 0\\ 0 & -1\end{bmatrix}\end{Bmatrix}$$ for vector space V. (I checked their linear independence, but there is no need to write the proof here).

Since Pauli matrices are basis vectors for vector space V, any other vector from V is a linear combination of basis vectors:

$$\alpha\begin{bmatrix}0 & 1\\ 1 & 0\end{bmatrix}+\beta\begin{bmatrix}0 & -i\\ i & 0\end{bmatrix}+\gamma\begin{bmatrix}1 & 0\\ 0 & -1\end{bmatrix}=\begin{bmatrix}x_1 & x_2\\ x_3 & x_4\end{bmatrix},$$ where $$\begin{bmatrix}x_1 & x_2\\ x_3 & x_4\end{bmatrix}=X$$ is a matrix in V. If I sum the left side, then I get something like this: $$\begin{bmatrix}\gamma & \alpha-\beta i\\ \alpha+\beta i & -\gamma\end{bmatrix}=\begin{bmatrix}x_1 & x_2\\ x_3 & x_4\end{bmatrix}.$$

Now we can see that for any ##\gamma## the matrix X is traceless, because ##\operatorname{tr}X=\gamma+(-\gamma)=0##, and X is also Hermitian (complex conjugated and transposed):
$$X=\begin{bmatrix}\gamma & \alpha-\beta i\\ \alpha+\beta i & -\gamma\end{bmatrix},\qquad X^{H}=\begin{bmatrix}\gamma & \overline{\alpha+\beta i}\\ \overline{\alpha-\beta i} & -\gamma\end{bmatrix}=\begin{bmatrix}\gamma & \alpha-\beta i\\ \alpha+\beta i & -\gamma\end{bmatrix}=X,$$ where ##\alpha,\beta,\gamma\in\mathbb{R}##, so V is also three-dimensional, or in other words, dim V=3.
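For the direction checked here (every real linear combination of the Pauli matrices is traceless and Hermitian), a symbolic verification along these lines is possible (a sketch using SymPy; the script and symbol names are my own, not from the thread):

Python:
import sympy as sp

alpha, beta, gamma = sp.symbols('alpha beta gamma', real=True)

# Pauli matrices
s1 = sp.Matrix([[0, 1], [1, 0]])
s2 = sp.Matrix([[0, -sp.I], [sp.I, 0]])
s3 = sp.Matrix([[1, 0], [0, -1]])

X = alpha * s1 + beta * s2 + gamma * s3

print(sp.simplify(X.trace()))  # 0: traceless for any real alpha, beta, gamma
print(sp.simplify(X - X.H))    # zero matrix: X equals its conjugate transpose, i.e. Hermitian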

Does this sound about right?
 
skrat said:
Since Pauli matrices are basis vectors for vector space V

But you need to prove this!
 
micromass said:
But you need to prove this!

Am... Now I am having some troubles understanding this. We often made proofs in two directions:

- one direction would be: if a 2x2 matrix is Hermitian and traceless, then it is a combination of the Pauli spin matrices (because they are basis vectors)
- the other way: any linear combination of the basis vectors (which are the Pauli spin matrices) is a Hermitian and traceless 2x2 matrix.

If I understand you correctly, you want to see this:

$$N:=\begin{Bmatrix}\begin{bmatrix}0 & 1\\ 1 & 0\end{bmatrix},\begin{bmatrix}0 & -i\\ i & 0\end{bmatrix},\begin{bmatrix}1 & 0\\ 0 & -1\end{bmatrix}\end{Bmatrix}.$$ Where N is linearly independent because $$\begin{bmatrix}0 & 1 & 1 & 0\\ 0 & -i & i & 0\\ 1 & 0 & 0 & -1\end{bmatrix}\sim\begin{bmatrix}1 & 0 & 0 & -1\\ 0 & 1 & 1 & 0\\ 0 & -i & i & 0\end{bmatrix}\sim\begin{bmatrix}1 & 0 & 0 & -1\\ 0 & 1 & 1 & 0\\ 0 & 0 & 2i & 0\end{bmatrix}.$$
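A small numerical cross-check of this row reduction (a sketch, with my own flattening convention; none of it is from the original post): flatten each Pauli matrix into a complex row vector, and note that linear independence over ##\mathbb{R}## amounts to the stacked real-and-imaginary-part matrix having rank 3.

Python:
import numpy as np

# Each Pauli matrix flattened into a length-4 complex row vector (row-major)
rows = np.array([
    [0, 1, 1, 0],     # sigma_1
    [0, -1j, 1j, 0],  # sigma_2
    [1, 0, 0, -1],    # sigma_3
], dtype=complex)

# Independence over the reals: view each complex vector as a real vector of
# length 8 (real parts followed by imaginary parts) and compute the rank
rows_real = np.hstack([rows.real, rows.imag])

print(np.linalg.matrix_rank(rows_real))  # 3 -> linearly independent over R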

Let vector space V be the space of 2x2 Hermitian and traceless matrices. Now dim V=3 (because the fact that it is traceless takes away one dimension), just as dim N=3. Both have the same dimension, and since N is also linearly independent, it is a basis for the vector space V.

right?
 
skrat said:
Ok, here is what I found out (I have to write something because forum moderators again gave me a warning, this time for not knowing how to start solving a problem... thanks guys.)
That was me, and you're welcome.
The warning was for not including what you've tried, which is a requirement according to the PF rules.

skrat said:
Am... Now I am having some troubles understanding this. We often made proofs in two directions:

- one direction would be: if a 2x2 matrix is Hermitian and traceless, then it is a combination of the Pauli spin matrices (because they are basis vectors)
- the other way: any linear combination of the basis vectors (which are the Pauli spin matrices) is a Hermitian and traceless 2x2 matrix.

If the statement you're trying to prove has "if and only if" in it, then the proof needs to go both ways. The one in this thread is NOT one of those.
skrat said:
Show that all Hermitian 2x2 matrices with trace 0 are elements of a three-dimensional vector space over ##\mathbb{R}## whose basis vectors are the Pauli spin matrices.
Start with an arbitrary 2 x 2 hermitian matrix (maybe call it H) whose trace is 0. Show that the space of such matrices is three-dimensional, and that a basis for this space is the set of Pauli spin matrices.
 
  • #10
I want to offer a very general comment here. The statement you want to prove is of the form "For all ##x\in V##,..." where V is some set. When the statement begins that way, the proof should usually begin with the words
Let ##x\in V## be arbitrary.
Obviously you can use another symbol instead of x. The point here is that you should only need to glance at the statement you want to prove to see how the proof should begin. The obvious next step is to use the definition of V. This is what Mark44 is already telling you to do.
 
  • #11
skrat said:
Where N is linearly independent because $$\begin{bmatrix}0 & 1 & 1 & 0\\ 0 & -i & i & 0\\ 1 & 0 & 0 & -1\end{bmatrix}\sim\begin{bmatrix}1 & 0 & 0 & -1\\ 0 & 1 & 1 & 0\\ 0 & -i & i & 0\end{bmatrix}\sim\begin{bmatrix}1 & 0 & 0 & -1\\ 0 & 1 & 1 & 0\\ 0 & 0 & 2i & 0\end{bmatrix}.$$
I'm not sure what you're doing here, but when you get to the linear independence, you should just work directly with the definition of "linearly independent".

skrat said:
Let vector space V be the space of 2x2 Hermitian and traceless matrices. Now dim V=3 (because the fact that it is traceless takes away one dimension), just as dim N=3. Both have the same dimension, and since N is also linearly independent, it is a basis for the vector space V.
You should let V be the set of complex 2×2 traceless hermitian matrices, and then prove that it's a vector space over ℝ (not over ℂ). How to prove dim V=3 depends on your definition of dim. So how do you define it?
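As a side illustration of why the scalars must be real (a sketch of my own, not part of the post above): multiplying a Hermitian matrix by the complex scalar ##i## destroys Hermiticity, so the set of traceless Hermitian matrices is not closed under multiplication by complex scalars, and therefore is not a vector space over ℂ.

Python:
import numpy as np

s3 = np.array([[1, 0], [0, -1]], dtype=complex)  # traceless and Hermitian

Y = 1j * s3  # multiply by the complex scalar i
print(np.allclose(Y, Y.conj().T))  # False: i*sigma_3 is anti-Hermitian, not Hermitian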
 
  • #12
Mark44 said:
If the statement you're trying to prove has "if and only if" in it, then the proof needs to go both ways. The one in this thread is NOT one of those.
That is something that needs to go on my wall! Thanks!
Mark44 said:
Start with an arbitrary 2 x 2 hermitian matrix (maybe call it H) whose trace is 0. Show that the space of such matrices is three-dimensional, and that a basis for this space is the set of Pauli spin matrices.

Well, that is something I (THINK) I already did, didn't I?

So let $$X=\begin{bmatrix}x_1 & x_2\\ x_3 & x_4\end{bmatrix}$$ be an arbitrary matrix with ##X\in V##, where $$V:=\begin{Bmatrix}H=\begin{bmatrix}h_1 & h_2\\ h_3 & h_4\end{bmatrix},\ h_i\in\mathbb{R};\ \operatorname{tr}(H)=0\end{Bmatrix}.$$

dim V=2 because ##\operatorname{tr}(H)=h_1+h_4=0##, so ##h_1=-h_4##, and since H is Hermitian, ##x_2=\overline{x_3}##. Any matrix X from V is then written as $$X=\begin{bmatrix}x_1 & x_2\\ x_3 & x_4\end{bmatrix}=\begin{bmatrix}-x_4 & \overline{x_3}\\ x_3 & x_4\end{bmatrix}.$$ So $$X=x_4\begin{bmatrix}-1 & 0\\ 0 & 1\end{bmatrix}+x_3\begin{bmatrix}0 & -i\\ i & 0\end{bmatrix}.$$ At this point, I think something is wrong? Is the fact that the matrix is Hermitian irrelevant for the dimension of V?

Fredrik said:
I'm not sure what you're doing here, but when you get to the linear independence, you should just work directly with the definition of "linearly independent".
Firstly, thanks for the first general comment. Another one going on my wall of things you must remember.
Back to the topic: Ammm, everything is a bit mixed up. :) In this post I hope to get the right order.
 
  • #13
The matrix elements are complex, so x3 ≠ x3i.
 
  • #14
Mark44 said:
The matrix elements are complex, so x3 ≠ x3i.

Indeed an important fact. So:
dim V=3 because ##\operatorname{tr}(H)=h_1+h_4=0##, so ##h_1=-h_4##, and since H is Hermitian, ##x_2=\overline{x_3}##. Any matrix X from V is then written as $$X=\begin{bmatrix}x_1 & x_2\\ x_3 & x_4\end{bmatrix}=\begin{bmatrix}-x_4 & \overline{x_3}\\ x_3 & x_4\end{bmatrix}.$$ So $$X=x_4\begin{bmatrix}-1 & 0\\ 0 & 1\end{bmatrix}+\operatorname{Re}(x_3)\begin{bmatrix}0 & 1\\ 1 & 0\end{bmatrix}+\operatorname{Im}(x_3)\begin{bmatrix}0 & -1\\ 1 & 0\end{bmatrix}$$ (or is it ##+\operatorname{Im}(x_3)\begin{bmatrix}0 & -i\\ i & 0\end{bmatrix}##?)

Anyway, dim V=3, and since those three vectors are also linearly independent, they form a basis for V.
 
  • #15
skrat said:
Indeed an important fact. So:
dim V=3 because ##\operatorname{tr}(H)=h_1+h_4=0##, so ##h_1=-h_4##, and since H is Hermitian, ##x_2=\overline{x_3}##. Any matrix X from V is then written as $$X=\begin{bmatrix}x_1 & x_2\\ x_3 & x_4\end{bmatrix}=\begin{bmatrix}-x_4 & \overline{x_3}\\ x_3 & x_4\end{bmatrix}.$$ So $$X=x_4\begin{bmatrix}-1 & 0\\ 0 & 1\end{bmatrix}+\operatorname{Re}(x_3)\begin{bmatrix}0 & 1\\ 1 & 0\end{bmatrix}+\operatorname{Im}(x_3)\begin{bmatrix}0 & -1\\ 1 & 0\end{bmatrix}$$ (or is it ##+\operatorname{Im}(x_3)\begin{bmatrix}0 & -i\\ i & 0\end{bmatrix}##?)
The latter of the two. You're trying to represent two basis elements for
$$ \begin{bmatrix}
0&a_3 - b_3i \\
a_3 + b_3i & 0
\end{bmatrix}$$

where ##x_3 = a_3 + b_3 i##

The two matrices could also be represented as
$$ a_3\begin{bmatrix}
0 & 1\\
1 & 0
\end{bmatrix}~\text{and}~ b_3i\begin{bmatrix}
0& -1 \\
1& 0
\end{bmatrix}$$
skrat said:
Anyway, dim V=3, and since those three vectors are also linearly independent, they form a basis for V.
 
  • #16
The result dim V=3 is certainly related to the fact that three real numbers are sufficient to determine all the components of an arbitrary X in V. But to prove that dim V=3, you have to use a definition of "dim".
 
  • #17
Fredrik said:
The result dim V=3 is certainly related to the fact that three real numbers are sufficient to determine all the components of an arbitrary X in V. But to prove that dim V=3, you have to use a definition of "dim".

Ok, how do I then prove that dim V=3?

At my university it was enough if we just wrote that any element of V is a linear combination of the basis vectors. The number of basis vectors is then equal to the dimension of the vector space.

So $$X=x_4\begin{bmatrix}-1 & 0\\ 0 & 1\end{bmatrix}+\operatorname{Re}(x_3)\begin{bmatrix}0 & 1\\ 1 & 0\end{bmatrix}+\operatorname{Im}(x_3)\begin{bmatrix}0 & -i\\ i & 0\end{bmatrix}.$$

The three matrices here are linearly independent, therefore $$\begin{bmatrix}-1 & 0\\ 0 & 1\end{bmatrix},\quad\begin{bmatrix}0 & 1\\ 1 & 0\end{bmatrix},\quad\begin{bmatrix}0 & -i\\ i & 0\end{bmatrix},$$ which also happen to be (up to a sign on the first) the Pauli spin matrices, form a basis for V (V is the vector space of 2x2 Hermitian and traceless matrices).
 
  • #18
BTW, thanks for all the help to all of you!
 
  • #19
You haven't posted a proof of any of the following: 1. V is a vector space over ℝ. 2. ##\{\sigma_1,\sigma_2,\sigma_3\}## is a basis. (Comment below). 3. dim V=3. Maybe you've done more than you've shown us?

What exactly you need to show for 2 and 3 above depends on your definitions of "basis" and "dimension". That's why I asked for your definition of "dim". I should also have asked for your definition of "basis". Some of the valid definitions of "basis" are: "a maximal linearly independent set", "a minimal spanning set", "a linearly independent spanning set". You have only shown that it's a spanning set. The definition of dim V=3 that I use is that V contains a linearly independent set with cardinality (number of elements) 3 but not a linearly independent set with cardinality 4. Another definition is that dim V=3 if and only if every basis for V has cardinality 3. (If we consider it to be already proved that all bases have the same cardinality, then it's sufficient to verify that the basis we've found has cardinality 3).

So to complete 2 and 3, you at least have to prove linear independence. You also need to think about 1.

Since you have already proved that the sigmas span V, I don't mind showing you the prettiest way to do it: Let $$\begin{bmatrix}a & b\\ c & d\end{bmatrix}\in V$$ be arbitrary. The definition of V implies that ##\operatorname{Im}a=\operatorname{Im}d=0##, ##d=-a##, and ##b=c^*##. So we can define ##x_1,x_2,x_3\in\mathbb R## by
$$a=x_3,\quad c=x_1+ix_2.$$ We have
$$\begin{bmatrix}a & b\\ c & d\end{bmatrix} =\begin{bmatrix}x_3 & x_1-ix_2\\ x_1+ix_2 & -x_3\end{bmatrix}=\sum_{k=1}^3 x_k\sigma_k.$$ (I see that you've been writing the matrix I call ##\sigma_3## first when you list the spin matrices. If that means that your book calls it ##\sigma_1##, then you will have to modify the above slightly).
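A short numerical illustration of this coefficient read-off (a sketch using NumPy; the sample entries are my own arbitrary choices):

Python:
import numpy as np

sigma = [
    np.array([[0, 1], [1, 0]], dtype=complex),     # sigma_1
    np.array([[0, -1j], [1j, 0]], dtype=complex),  # sigma_2
    np.array([[1, 0], [0, -1]], dtype=complex),    # sigma_3
]

# An arbitrary traceless Hermitian matrix: diagonal a, -a with a real, and b = conj(c)
a, c = 1.3, 0.4 - 2.0j
X = np.array([[a, np.conj(c)],
              [c, -a]])

# Read off the real coefficients as in the post: a = x3, c = x1 + i*x2
x1, x2, x3 = c.real, c.imag, a

# Reconstruct X as x1*sigma_1 + x2*sigma_2 + x3*sigma_3
print(np.allclose(X, x1 * sigma[0] + x2 * sigma[1] + x3 * sigma[2]))  # True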
 
