MHB Proving the Pauli Matrix Identity with Ordinary Vectors: A Simplified Approach

The discussion revolves around proving the Pauli matrix identity (σ·a)(σ·b) = a·b I₂ + i σ·(a × b) using ordinary vectors. The initial confusion stems from trying to multiply the 2 × 2 Pauli matrices directly by 3-component vectors, whose dimensions do not match. The participants clarify that σ is a vector of matrices, so σ·a = σ_x a_x + σ_y a_y + σ_z a_z is a single 2 × 2 matrix, which makes the calculation straightforward. One participant acknowledges the need to recognize the matrix-vector structure more readily.
ognik
I'm not sure I have the right approach here:

Using the three 2 × 2 Pauli spin matrices, let $ \vec{\sigma} = \hat{x} \sigma_1 + \hat{y} \sigma_2 + \hat{z} \sigma_3 $, where $\vec{a}, \vec{b}$ are ordinary vectors.

Show that $ \left( \vec{\sigma} \cdot \vec{a} \right) \left( \vec{\sigma} \cdot \vec{b} \right) = \vec{a} \cdot \vec{b} \, I_2 + i \, \vec{\sigma} \cdot \left( \vec{a} \times \vec{b}\right)$

I'm not sure how to go about this - the $i$ in the last term suggests to me that I have to laboriously multiply both sides ...

But the Pauli matrices are 2 × 2, e.g. $ \sigma_1 = \begin{bmatrix}0&1\\ 1&0\end{bmatrix}$, while $\vec{\sigma}$ appears to be Cartesian 3-D.

So I tried $ \vec\sigma \cdot \vec{a} = \hat{x} \sigma_1 \cdot \vec{a} + ... = \begin{bmatrix}1\\ 0 \\ 0 \end {bmatrix} \left( \begin{bmatrix}0&1\\1&0\end{bmatrix} \cdot \begin{bmatrix}a_1\\ a_2 \end {bmatrix} \right) + ...$

Not possible to multiply out like this, so keep the unit vectors as $\hat{x}$ etc. ...

I then get $ \vec\sigma \cdot \vec{a} = \hat{x} \begin{bmatrix}a_2 \\ a_1 \end {bmatrix} +\hat{y} \begin{bmatrix} -a_2\\ a_1 \end {bmatrix} +\hat{z} \begin{bmatrix}a_1\\ -a_2 \end {bmatrix} $ and for $\vec{\sigma} \cdot \vec{b}$ a very similar equation by symmetry.

But again I won't be able to multiply out $ \left( \vec\sigma \cdot \vec{a} \right) \left( \vec{\sigma} \cdot \vec{b} \right) $ because of different matrix ranks? Must be a better way to do this (one that also works :-))
 
ognik said:
I'm not sure I have the right approach here ... Must be a better way to do this (one that also works :-))
To start with, you need to figure out what [math]\vec{ \sigma } \cdot \vec{a}[/math] is. [math]\vec{\sigma} = \sigma _x ~ \hat{x} + \sigma _y ~ \hat{y} + \sigma _z ~ \hat{z}[/math] is a "vector" of matrices. You are "dotting" it with the 3-vector [math]< a_x,~a_y,~a_z >[/math]. So:
[math]\vec{ \sigma } \cdot \vec{a} = \sigma _x ~ a_x + \sigma _y ~ a_y + \sigma _z ~ a_z = \left ( \begin{matrix} a_z & a_x - i~a_y \\ a_x + i~a_y & -a_z \end{matrix} \right )[/math]

Can you take it from here? (I'd give you a really cool and elegant method but I don't have any tricks for this one.)

-Dan
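(Editorial note, for reference: the "laborious" route above can be shortened with the standard Pauli product rule, which is easily checked directly from the three matrices:

$ \sigma_i \sigma_j = \delta_{ij} \, I_2 + i \sum_k \epsilon_{ijk} \, \sigma_k $

Expanding both dot products in components and applying this rule gives the identity in one line:

$ \left( \vec{\sigma} \cdot \vec{a} \right) \left( \vec{\sigma} \cdot \vec{b} \right) = \sum_{i,j} a_i b_j \, \sigma_i \sigma_j = \sum_{i,j} a_i b_j \left( \delta_{ij} \, I_2 + i \sum_k \epsilon_{ijk} \, \sigma_k \right) = \vec{a} \cdot \vec{b} \, I_2 + i \, \vec{\sigma} \cdot \left( \vec{a} \times \vec{b} \right) $

Here the first term collapses because $\sum_{i,j} a_i b_j \delta_{ij} = \vec{a} \cdot \vec{b}$, and the second uses the component formula $(\vec{a} \times \vec{b})_k = \sum_{i,j} \epsilon_{ijk} \, a_i b_j$.)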
 
Yup, thanks - once you pointed out that it's a vector of matrices, it clicked. I really need to find a way to notice that sort of thing!

Got a sore hand now :-)
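(Editorial note: the identity is also easy to sanity-check numerically. The sketch below uses plain Python with complex numbers and hand-rolled 2×2 matrix helpers; the helper names and the test vectors `a`, `b` are ad hoc choices, not from the thread.)

```python
# Numerical check of (sigma.a)(sigma.b) = (a.b) I2 + i sigma.(a x b)
# using plain 2x2 complex matrices as nested lists (no external libraries).

def matmul(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matadd(*Ms):
    """Entrywise sum of 2x2 matrices."""
    return [[sum(M[i][j] for M in Ms) for j in range(2)] for i in range(2)]

def scale(c, M):
    """Scalar times a 2x2 matrix."""
    return [[c * M[i][j] for j in range(2)] for i in range(2)]

# The three Pauli matrices and the 2x2 identity
sx = [[0, 1], [1, 0]]
sy = [[0, -1j], [1j, 0]]
sz = [[1, 0], [0, -1]]
I2 = [[1, 0], [0, 1]]
sigma = [sx, sy, sz]

def sigma_dot(v):
    """sigma . v = sx*vx + sy*vy + sz*vz  (a single 2x2 matrix)."""
    return matadd(*(scale(v[k], sigma[k]) for k in range(3)))

def dot(a, b):
    return sum(a[k] * b[k] for k in range(3))

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

# Arbitrary real test vectors (any choice should work)
a = [1.0, 2.0, 3.0]
b = [-0.5, 4.0, 1.5]

lhs = matmul(sigma_dot(a), sigma_dot(b))
rhs = matadd(scale(dot(a, b), I2), scale(1j, sigma_dot(cross(a, b))))

assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12
           for i in range(2) for j in range(2))
print("identity verified")
```

For these vectors $\vec{a} \cdot \vec{b} = 12$ and $(\vec{a} \times \vec{b})_z = 5$, so the top-left entry of both sides comes out to $12 + 5i$, matching the $a_z$-plus-diagonal structure of $\vec{\sigma} \cdot \vec{a}$ given above.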
 
