Pauli Spin Matrices: Showing That Trace-0 Hermitian 2x2 Matrices Form a 3D Vector Space

In summary: every hermitian 2x2 matrix with trace 0 can be written as a real linear combination of the three Pauli spin matrices, which are themselves hermitian, traceless, and linearly independent. The set of such matrices therefore forms a three-dimensional vector space over [itex]\mathbb{R}[/itex] with the Pauli spin matrices as a basis.
  • #1
skrat
Show that all hermitian 2x2 matrices with trace 0 are elements of a three-dimensional vector space over [itex]\mathbb{R}[/itex] whose basis vectors are the Pauli spin matrices.
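(For concreteness, here are the three matrices in question, written out in a minimal Python/numpy sketch that just confirms each one is hermitian with trace 0; the sketch is only an illustration.)

```python
import numpy as np

# The three Pauli spin matrices sigma_x, sigma_y, sigma_z.
sigma = [
    np.array([[0, 1], [1, 0]], dtype=complex),
    np.array([[0, -1j], [1j, 0]], dtype=complex),
    np.array([[1, 0], [0, -1]], dtype=complex),
]

for k, s in enumerate(sigma, start=1):
    assert np.allclose(s, s.conj().T)  # hermitian: equal to its conjugate transpose
    assert np.isclose(np.trace(s), 0)  # trace 0
    print(f"sigma_{k}: hermitian and traceless")
```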

Any clues on how to begin? :/
 
  • #2
What does it mean that the Pauli matrices are a basis of the hermitian, traceless 2x2 matrices? Consider the definition of a basis.
 
  • #3
Basis vectors are linearly independent and any other vector from the same space is a linear combination of basis vectors.
Meaning, according to this definition, I have to prove/show that any traceless hermitian 2x2 matrix is a linear combination of the Pauli matrices. So...?
 
  • #4
Indeed. You will have to show two things.

1) The Pauli spin matrices are linearly independent. So, what does this mean? What's the definition?
2) The Pauli spin matrices span the space. So, what's the definition here?
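(Concretely, writing ##\sigma_1,\sigma_2,\sigma_3## for the Pauli spin matrices and taking real scalars, the two conditions to verify are
$$c_1\sigma_1+c_2\sigma_2+c_3\sigma_3=0,\ c_k\in\mathbb R \;\Longrightarrow\; c_1=c_2=c_3=0$$
for linear independence, and
$$X=x_1\sigma_1+x_2\sigma_2+x_3\sigma_3\ \text{for some}\ x_1,x_2,x_3\in\mathbb R$$
for every traceless hermitian 2x2 matrix ##X##, for spanning.)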
 
  • #5
micromass said:
Indeed. You will have to show two things.

1) The Pauli spin matrices are linearly independent. So, what does this mean? What's the definition?
2) The Pauli spin matrices span the space. So, what's the definition here?

1) I know how to prove that they are linearly independent. But what does this mean here? Can you be more specific? What is your hint?

3) Is the fact that the matrices are hermitian not important?
 
  • #6
Ok, here is what I found out (I have to write something because forum moderators again gave me a warning, this time for not knowing how to start solving a problem... thanks guys.)

Let's say that the Pauli matrices are the basis vectors [itex]:=\begin{Bmatrix}
\begin{bmatrix}
0 &1 \\
1& 0
\end{bmatrix},\begin{bmatrix}
0 &-i \\
i &0
\end{bmatrix},\begin{bmatrix}
1 & 0\\
0 & -1
\end{bmatrix}
\end{Bmatrix}[/itex] for vector space [itex]V[/itex]. (I checked their linear independence, but there is no need to write the proof here).

Since the Pauli matrices are basis vectors for the vector space [itex]V[/itex], any other vector in [itex]V[/itex] is a linear combination of the basis vectors:

[itex]\alpha \begin{bmatrix}
0 &1 \\
1& 0
\end{bmatrix}+\beta \begin{bmatrix}
0 &-i \\
i &0
\end{bmatrix}+\gamma \begin{bmatrix}
1 & 0\\
0 & -1
\end{bmatrix}=\begin{bmatrix}
x_{1} & x_{2}\\
x_{3}& x_{4}
\end{bmatrix}[/itex], where [itex]\begin{bmatrix}
x_{1} & x_{2}\\
x_{3}& x_{4}
\end{bmatrix} = X[/itex] is a matrix in [itex]V[/itex]. If I sum the left side, then I get something like this: [itex] \begin{bmatrix}
\gamma & \alpha -\beta i\\
\alpha +\beta i& -\gamma
\end{bmatrix}=\begin{bmatrix}
x_{1} & x_{2}\\
x_{3}& x_{4}
\end{bmatrix}[/itex].

Now we can see that [itex]X[/itex] is traceless for any [itex]\gamma[/itex], because [itex]trX=\gamma +(-\gamma )=0[/itex], and [itex]X[/itex] is also hermitian (equal to its complex conjugate transpose):
[itex]X=\begin{bmatrix}
\gamma & \alpha -\beta i\\
\alpha +\beta i& -\gamma
\end{bmatrix}, X^{H}=\begin{bmatrix}
\gamma & \overline{\alpha +\beta i}\\
\overline{\alpha -\beta i}& -\gamma
\end{bmatrix}=\begin{bmatrix}
\gamma & \alpha -\beta i\\
\alpha +\beta i& -\gamma
\end{bmatrix}=X[/itex], where [itex]\alpha ,\beta ,\gamma \in \mathbb{R}[/itex], so [itex]V[/itex] is three-dimensional; in other words, [itex]dimV=3[/itex].
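(As a quick numerical sanity check of this direction, here is a small sketch, assuming Python with numpy: for random real [itex]\alpha ,\beta ,\gamma[/itex] the combination always comes out hermitian and traceless.)

```python
import numpy as np

sigma1 = np.array([[0, 1], [1, 0]], dtype=complex)
sigma2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
sigma3 = np.array([[1, 0], [0, -1]], dtype=complex)

rng = np.random.default_rng(0)
for _ in range(1000):
    alpha, beta, gamma = rng.normal(size=3)        # random real coefficients
    X = alpha * sigma1 + beta * sigma2 + gamma * sigma3
    assert np.allclose(X, X.conj().T)              # X is hermitian
    assert np.isclose(np.trace(X), 0)              # X is traceless
print("every real combination checked was hermitian and traceless")
```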

Does this sound about right?
 
  • #7
skrat said:
Since the Pauli matrices are basis vectors for the vector space [itex]V[/itex]

But you need to prove this!
 
  • #8
micromass said:
But you need to prove this!

Am... Now I am having some trouble understanding this. We often did proofs in two directions:

- one direction would be: if a 2x2 matrix is hermitian and traceless, then it is a linear combination of the Pauli spin matrices (because they are basis vectors)
- the other way: any linear combination of the basis vectors (the Pauli spin matrices) is a hermitian, traceless 2x2 matrix.

If I understand you correctly, you want to see this:

[itex]N:=\begin{Bmatrix}
\begin{bmatrix}
0 &1 \\
1& 0
\end{bmatrix},\begin{bmatrix}
0 &-i \\
i& 0
\end{bmatrix},\begin{bmatrix}
1 &0 \\
0& -1
\end{bmatrix}
\end{Bmatrix}[/itex], where [itex]N[/itex] is linearly independent because [itex]\begin{bmatrix}
0 &1 &1& 0\\
0 &-i&i& 0\\
1 &0 &0& -1
\end{bmatrix}\sim \begin{bmatrix}
1 &0 &0& -1\\
0 &1 &1& 0\\
0 &-i&i& 0
\end{bmatrix}\sim \begin{bmatrix}
1 &0 &0& -1\\
0 &1 &1& 0\\
0 &0&2i& 0
\end{bmatrix}[/itex].

Let the vector space [itex]V[/itex] be the space of 2x2 hermitian and traceless matrices. Now [itex]dimV=3[/itex] (because the traceless condition removes one dimension), just as [itex]dimN=3[/itex]. Both have the same dimension, and since [itex]N[/itex] is also linearly independent, it is a basis for the vector space [itex]V[/itex].
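(A numerical cross-check of the independence claim, assuming Python with numpy: flatten each matrix into the real and imaginary parts of its entries and confirm that the three resulting real vectors have rank 3, i.e. are linearly independent over [itex]\mathbb{R}[/itex].)

```python
import numpy as np

sigma = [
    np.array([[0, 1], [1, 0]], dtype=complex),
    np.array([[0, -1j], [1j, 0]], dtype=complex),
    np.array([[1, 0], [0, -1]], dtype=complex),
]

# View each 2x2 complex matrix as a vector in R^8 (real parts, then imaginary parts).
rows = np.array([np.concatenate([s.real.ravel(), s.imag.ravel()]) for s in sigma])

print(np.linalg.matrix_rank(rows))  # 3: the three matrices are linearly independent over R
```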

right?
 
  • #9
skrat said:
Ok, here is what I found out (I have to write something because forum moderators again gave me a warning, this time for not knowing how to start solving a problem... thanks guys.)
That was me, and you're welcome.
The warning was for not including what you've tried, which is a requirement according to the PF rules.

skrat said:
Am... Now I am having some trouble understanding this. We often did proofs in two directions:

- one direction would be: if a 2x2 matrix is hermitian and traceless, then it is a linear combination of the Pauli spin matrices (because they are basis vectors)
- the other way: any linear combination of the basis vectors (the Pauli spin matrices) is a hermitian, traceless 2x2 matrix.

If the statement you're trying to prove has "if and only if" in it, then the proof needs to go both ways. The one in this thread is NOT one of those.
skrat said:
Show that all hermitian 2x2 matrices with trace 0 are elements of a three-dimensional vector space over R whose basis vectors are the Pauli spin matrices.
Start with an arbitrary 2 x 2 hermitian matrix (maybe call it H) whose trace is 0. Show that the space of such matrices is three-dimensional, and that the Pauli spin matrices form a basis for this space.
 
  • #10
I want to offer a very general comment here. The statement you want to prove is of the form "For all ##x\in V##,..." where V is some set. When the statement begins that way, the proof should usually begin with the words
Let ##x\in V## be arbitrary.
Obviously you can use another symbol instead of x. The point here is that you should only need to glance at the statement you want to prove to see how the proof should begin. The obvious next step is to use the definition of V. This is what Mark44 is already telling you to do.
 
  • #11
skrat said:
Where [itex]N[/itex] is linearly independent because [itex]\begin{bmatrix}
0 &1 &1& 0\\
0 &-i&i& 0\\
1 &0 &0& -1
\end{bmatrix}\sim \begin{bmatrix}
1 &0 &0& -1\\
0 &1 &1& 0\\
0 &-i&i& 0
\end{bmatrix}\sim \begin{bmatrix}
1 &0 &0& -1\\
0 &1 &1& 0\\
0 &0&2i& 0
\end{bmatrix}[/itex].
I'm not sure what you're doing here, but when you get to the linear independence, you should just work directly with the definition of "linearly independent".

skrat said:
Let the vector space [itex]V[/itex] be the space of 2x2 hermitian and traceless matrices. Now [itex]dimV=3[/itex] (because the traceless condition removes one dimension), just as [itex]dimN=3[/itex]. Both have the same dimension, and since [itex]N[/itex] is also linearly independent, it is a basis for the vector space [itex]V[/itex].
You should let V be the set of complex 2×2 traceless hermitian matrices, and then prove that it's a vector space over ℝ (not over ℂ). How to prove dim V=3 depends on your definition of dim. So how do you define it?
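(One quick way to see why the scalars have to be real, sketched numerically in Python with numpy: multiplying a hermitian matrix by ##i## destroys hermiticity, so the set is not closed under multiplication by complex scalars.)

```python
import numpy as np

sigma1 = np.array([[0, 1], [1, 0]], dtype=complex)  # hermitian and traceless

print(np.allclose(sigma1, sigma1.conj().T))              # True:  sigma1 is hermitian
print(np.allclose(1j * sigma1, (1j * sigma1).conj().T))  # False: i*sigma1 is not hermitian
```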
 
  • #12
Mark44 said:
If the statement you're trying to prove has "if and only if" in it, then the proof needs to go both ways. The one in this thread is NOT one of those.
That is something that needs to go on my wall! Thanks!
Mark44 said:
Start with an arbitrary 2 x 2 hermitian matrix (maybe call it H) whose trace is 0. Show that the space of such matrices is three-dimensional, and that the Pauli spin matrices form a basis for this space.

Well, that is something I (THINK) I already did, didn't I?

So let [itex]X=\begin{bmatrix}
x_{1} & x_{2}\\
x_{3} & x_{4}
\end{bmatrix}[/itex] be an arbitrary matrix with [itex]X\in V[/itex], where [itex]V:=\begin{Bmatrix}H=\begin{bmatrix}
h_{1} &h_{2} \\
h_{3}& h_{4}
\end{bmatrix},h_{i}\in \mathbb{C}, H=H^{H}
; tr(H)=0
\end{Bmatrix}[/itex].

[itex]dimV=2[/itex] because [itex]tr(H)=h_{1}+h_{4}=0[/itex], so [itex]h_{1}=-h_{4}[/itex], and since [itex]H[/itex] is hermitian, [itex]x_{2}=\overline{x_{3}}[/itex]; any matrix [itex]X[/itex] from [itex]V[/itex] is then written as [itex]X=\begin{bmatrix}
x_{1} &x_{2} \\
x_{3}& x_{4}
\end{bmatrix}=\begin{bmatrix}
-x_{4} &\overline{x_{3}} \\
x_{3}& x_{4}
\end{bmatrix}[/itex]. So [itex]X=x_{4}\begin{bmatrix}
-1&0 \\
0&1
\end{bmatrix}+ x_{3}\begin{bmatrix}
0&-i \\
i& 0
\end{bmatrix}[/itex]. At this point I think something is wrong. Is the fact that the matrix is hermitian irrelevant for the dimension of [itex]V[/itex]?

Fredrik said:
I'm not sure what you're doing here, but when you get to the linear independence, you should just work directly with the definition of "linearly independent".
Firstly, thanks for the first general comment. Another one going on my wall of things to remember.
Back to the topic: Ammm, everything is a bit mixed up. :) In this post I hope to get things in the right order.
 
  • #13
The matrix elements are complex, so x3 ≠ x3i; you need to treat the real and imaginary parts of x3 separately.
 
  • #14
Mark44 said:
The matrix elements are complex, so x3 ≠ x3i.

Indeed an important fact. So:
[itex]dimV=3[/itex] because [itex]tr(H)=h_{1}+h_{4}=0[/itex], so [itex]h_{1}=-h_{4}[/itex], and since [itex]H[/itex] is hermitian, [itex]x_{2}=\overline{x_{3}}[/itex]; any matrix [itex]X[/itex] from [itex]V[/itex] is then written as [itex]X=\begin{bmatrix}
x_{1} &x_{2} \\
x_{3}& x_{4}
\end{bmatrix}=\begin{bmatrix}
-x_{4} &\overline{x_{3}} \\
x_{3}& x_{4}
\end{bmatrix}[/itex]. So [itex]X=x_{4}\begin{bmatrix}
-1&0 \\
0&1
\end{bmatrix}+ Re(x_{3})\begin{bmatrix}
0&1 \\
1& 0
\end{bmatrix}+Im(x_{3})\begin{bmatrix}
0&-1 \\
1& 0
\end{bmatrix}[/itex] (or is it [itex]+Im(x_{3})\begin{bmatrix}
0&-i \\
i& 0
\end{bmatrix}[/itex] ?)

Anyway, [itex]dimV=3[/itex], and since those three vectors are also linearly independent, they form a basis for V.
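(A quick numerical check of this decomposition, assuming Python with numpy; here x3 is a random complex number and x4 a random real number, with the same names as above.)

```python
import numpy as np

rng = np.random.default_rng(1)
x3 = complex(rng.normal(), rng.normal())  # arbitrary complex lower-left entry
x4 = rng.normal()                         # arbitrary real lower-right entry

# The arbitrary traceless hermitian matrix written in the post.
X = np.array([[-x4, np.conj(x3)],
              [x3,   x4]])

decomposition = (x4 * np.array([[-1, 0], [0, 1]], dtype=complex)
                 + x3.real * np.array([[0, 1], [1, 0]], dtype=complex)
                 + x3.imag * np.array([[0, -1j], [1j, 0]], dtype=complex))

print(np.allclose(X, decomposition))  # True
```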
 
  • #15
skrat said:
Indeed an important fact. So:
[itex]dimV=3[/itex] because [itex]tr(H)=h_{1}+h_{4}=0[/itex], so [itex]h_{1}=-h_{4}[/itex], and since [itex]H[/itex] is hermitian, [itex]x_{2}=\overline{x_{3}}[/itex]; any matrix [itex]X[/itex] from [itex]V[/itex] is then written as [itex]X=\begin{bmatrix}
x_{1} &x_{2} \\
x_{3}& x_{4}
\end{bmatrix}=\begin{bmatrix}
-x_{4} &\overline{x_{3}} \\
x_{3}& x_{4}
\end{bmatrix}[/itex]. So [itex]X=x_{4}\begin{bmatrix}
-1&0 \\
0&1
\end{bmatrix}+ Re(x_{3})\begin{bmatrix}
0&1 \\
1& 0
\end{bmatrix}+Im(x_{3})\begin{bmatrix}
0&-1 \\
1& 0
\end{bmatrix}[/itex] (or is it [itex]+Im(x_{3})\begin{bmatrix}
0&-i \\
i& 0
\end{bmatrix}[/itex] ?)
The latter of the two. You need two basis elements to represent
$$ \begin{bmatrix}
0&a_3 - b_3i \\
a_3 + b_3i & 0
\end{bmatrix}$$

where x3 = a3 + b3i

The two matrices could also be represented as
$$ a_3\begin{bmatrix}
0 & 1\\
1 & 0
\end{bmatrix}~\text{and}~ b_3i\begin{bmatrix}
0& -1 \\
1& 0
\end{bmatrix}$$
skrat said:
Anyway, [itex]dimV=3[/itex], and since those three vectors are also linearly independent, they form a basis for V.
 
  • #16
The result dim V=3 is certainly related to the fact that three real numbers are sufficient to determine all the components of an arbitrary X in V. But to prove that dim V=3, you have to use a definition of "dim".
 
  • #17
Fredrik said:
The result dim V=3 is certainly related to the fact that three real numbers are sufficient to determine all the components of an arbitrary X in V. But to prove that dim V=3, you have to use a definition of "dim".

Ok, how do I then prove that dimV=3?

At my university it was enough if we just wrote that any element of V is a linear combination of the basis vectors. The number of basis vectors is then equal to the dimension of the vector space.

So [itex]X=x_{4}\begin{bmatrix}
-1&0 \\
0&1
\end{bmatrix}+ Re(x_{3})\begin{bmatrix}
0&1 \\
1& 0
\end{bmatrix}+Im(x_{3})\begin{bmatrix}
0&-i \\
i& 0
\end{bmatrix}[/itex]

The three matrices here are linearly independent, therefore [itex]\begin{bmatrix}
-1&0 \\
0&1
\end{bmatrix}[/itex], [itex]\begin{bmatrix}
0&1 \\
1& 0
\end{bmatrix}[/itex], [itex]\begin{bmatrix}
0&-i \\
i& 0
\end{bmatrix}[/itex], which also happen to be Pauli spin matrices, form a basis for V (V is the vector space of 2x2 hermitian and traceless matrices).
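(One more check, assuming Python with numpy, using the fact that [itex]tr(\sigma _{j}\sigma _{k})=2\delta _{jk}[/itex]: the coefficient of each of the three matrices in a traceless hermitian [itex]X[/itex] can be recovered as [itex]tr(\sigma _{k}X)/2[/itex], and it always comes out real.)

```python
import numpy as np

sigma = [
    np.array([[0, 1], [1, 0]], dtype=complex),
    np.array([[0, -1j], [1j, 0]], dtype=complex),
    np.array([[1, 0], [0, -1]], dtype=complex),
]

# A random traceless hermitian 2x2 matrix: real diagonal entry a, complex off-diagonal entry c.
rng = np.random.default_rng(2)
a = rng.normal()
c = complex(rng.normal(), rng.normal())
X = np.array([[a, np.conj(c)], [c, -a]])

# Recover the coefficients from the trace inner product tr(sigma_k X) / 2.
coeffs = [np.trace(s @ X) / 2 for s in sigma]

print([np.isclose(z.imag, 0) for z in coeffs])                         # all True: coefficients are real
print(np.allclose(sum(z.real * s for z, s in zip(coeffs, sigma)), X))  # True: they rebuild X
```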
 
  • #18
BTW, thanks to all of you for all the help!
 
  • #19
You haven't posted a proof of any of the following: 1. V is a vector space over ℝ. 2. ##\{\sigma_1,\sigma_2,\sigma_3\}## is a basis. (Comment below). 3. dim V=3. Maybe you've done more than you've shown us?

What exactly you need to show for 2 and 3 above depends on your definitions of "basis" and "dimension". That's why I asked for your definition of "dim". I should also have asked for your definition of "basis". Some of the valid definitions of "basis" are: "a maximal linearly independent set", "a minimal spanning set", "a linearly independent spanning set". You have only shown that it's a spanning set. The definition of dim V=3 that I use is that V contains a linearly independent set with cardinality (number of elements) 3 but not a linearly independent set with cardinality 4. Another definition is that dim V=3 if and only if every basis for V has cardinality 3. (If we consider it to be already proved that all bases have the same cardinality, then it's sufficient to verify that the basis we've found has cardinality 3).

So to complete 2 and 3, you at least have to prove linear independence. You also need to think about 1.

Since you have already proved that the sigmas span V, I don't mind showing you the prettiest way to do it: Let $$\begin{bmatrix}a & b\\ c & d\end{bmatrix}\in V$$ be arbitrary. The definition of V implies that ##\operatorname{Im}a=\operatorname{Im}d=0##, d=-a, and b=c*. So we can define ##x_1,x_2,x_3\in\mathbb R## by
$$a=x_3,\quad c=x_1+ix_2.$$ We have
$$\begin{bmatrix}a & b\\ c & d\end{bmatrix} =\begin{bmatrix}x_3 & x_1-ix_2\\ x_1+ix_2 & -x_3\end{bmatrix}=\sum_{k=1}^3 x_k\sigma_k.$$ (I see that you've been writing the matrix I call ##\sigma_3## first when you list the spin matrices. If that means that your book calls it ##\sigma_1##, then you will have to modify the above slightly).
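(The decomposition above, checked numerically in a short Python/numpy sketch; the ordering of the ##\sigma_k## matches the one used here, with ##\sigma_3## the diagonal matrix.)

```python
import numpy as np

sigma1 = np.array([[0, 1], [1, 0]], dtype=complex)
sigma2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
sigma3 = np.array([[1, 0], [0, -1]], dtype=complex)

# An arbitrary element of V: real a on the diagonal, complex c below it, b = conj(c), d = -a.
rng = np.random.default_rng(3)
a = rng.normal()
c = complex(rng.normal(), rng.normal())
A = np.array([[a, np.conj(c)], [c, -a]])

# The choice of coefficients from the post: a = x3, c = x1 + i*x2.
x1, x2, x3 = c.real, c.imag, a

print(np.allclose(A, x1 * sigma1 + x2 * sigma2 + x3 * sigma3))  # True
```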
 

1. What are Pauli Spin Matrices?

Pauli Spin Matrices are a set of three 2x2 matrices used in quantum mechanics to describe the spin of a particle. They are named after physicist Wolfgang Pauli and are denoted by the symbols σx, σy, and σz.

2. How are Pauli Spin Matrices represented in 3D space?

The Pauli spin matrices form a basis of the three-dimensional real vector space of traceless hermitian 2x2 matrices, so every such matrix corresponds to a vector in 3D space whose x, y, and z components are the coefficients of σx, σy, and σz, respectively.

3. What is the significance of showing trace 0 hermitian 2x2 matrices?

The trace of a matrix is defined as the sum of its diagonal elements. For the Pauli spin matrices the trace is always 0, and the traceless hermitian 2x2 matrices are exactly the real linear combinations of them, which is what the thread above establishes. Additionally, the Pauli spin matrices are hermitian, meaning they are equal to their own complex conjugate transpose. This property is important in quantum mechanics because it ensures that the matrices have real eigenvalues.

4. How do Pauli Spin Matrices relate to spin in quantum mechanics?

In quantum mechanics, spin is an intrinsic angular momentum of particles. For a spin-1/2 particle, a measurement of the spin along any axis gives either +1/2 or -1/2 (in units of ħ). The Pauli spin matrices are used to represent this property and can be used to calculate the spin of a particle along different directions.
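For example, the spin operator along a unit direction n = (nx, ny, nz) is (ħ/2)(nx σx + ny σy + nz σz); here is a short Python/numpy sketch (with ħ set to 1) confirming that its eigenvalues are always ±1/2.

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

rng = np.random.default_rng(4)
n = rng.normal(size=3)
n /= np.linalg.norm(n)                           # random unit direction

S_n = 0.5 * (n[0] * sx + n[1] * sy + n[2] * sz)  # spin operator along n, with hbar = 1
print(np.linalg.eigvalsh(S_n))                   # approximately [-0.5, 0.5]
```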

5. What applications do Pauli Spin Matrices have?

Pauli Spin Matrices have various applications in quantum mechanics, including in the study of subatomic particles and in the development of quantum computing. They are also used in the field of nuclear magnetic resonance (NMR) spectroscopy, which is used in medical imaging and chemical analysis.
