# Addition of angular momenta, rotation operators

Hey,

I have a question regarding the invariance of a 'mixed' Casimir operator under rotations.

By 'mixed' Casimir operator I refer to:

$\vec{J}_1\cdot \vec{J}_2$

where $\vec{J}_1$ and $\vec{J}_2$ are two independent angular momenta.

I want to show that this 'mixed' Casimir operator is invariant under rotations.

The rotation operator for $\vec{J}_1 + \vec{J}_2$ will be of the form:

$D(R) = D_1 (R)\otimes D_2 (R)$

where $D_1$ and $D_2$ are the rotation operators for $\vec{J}_1$ and $\vec{J}_2$.

So what I am really trying to show is that:

$D^{\dagger}(R) \vec{J}_1\cdot \vec{J}_2 D(R) = \vec{J}_1\cdot \vec{J}_2$

But I am having trouble seeing how to do this.

I have read in Schwinger's paper that the rotation operator for $\vec{J}_1 + \vec{J}_2$ is of the form:

$D_1 D_2$

But does that mean that $D_2 (R) D_1 (R) = D_1 (R)\otimes D_2 (R)$ ?

If that is the case then I now have:

$D_{2}^{\dagger }D_{1}^{\dagger }\,\vec{J}_{1}\cdot \vec{J}_{2}\,{{D}_{1}}{{D}_{2}}$

And expanding the product gives:

$D_{2}^{\dagger }D_{1}^{\dagger }{{J}_{1,x}}{{J}_{2,x}}{{D}_{1}}{{D}_{2}}+\,\,D_{2}^{\dagger }D_{1}^{\dagger }{{J}_{1,y}}{{J}_{2,y}}{{D}_{1}}{{D}_{2}}+D_{2}^{\dagger} D_{1}^{\dagger }{{J}_{1,z}}{{J}_{2,z}}{{D}_{1}}{{D}_{2}}$

And some operators can be switched around, since $D_1$ commutes with $\vec{J}_2$ and so on.

But this gets me to here:

$D_{1}^{\dagger }{{J}_{1,x}}{{D}_{1}}D_{2}^{\dagger }{{J}_{2,x}}{{D}_{2}}+\,\,D_{1}^{\dagger }{{J}_{1,y}}{{D}_{1}}D_{2}^{\dagger }{{J}_{2,y}}{{D}_{2}}+D_{1}^{\dagger }{{J}_{1,z}}{{D}_{1}}D_{2}^{\dagger }{{J}_{2,z}}{{D}_{2}}$

And the $D_1$ and $\vec{J}_1$ operators don't necessarily commute, so I'm not sure how to progress from here.

Does anyone have any ideas?

Jess
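Before diving into the algebra, here is a quick numerical sanity check of the claim (a sketch, not a proof, assuming numpy and scipy are available): build $\vec{J}_1\cdot\vec{J}_2$ for two spin-1/2 systems and conjugate it by a finite rotation about an arbitrary axis.

```python
# Numerical sanity check (not a proof): for two spin-1/2 systems, verify
# D†(R) (J1·J2) D(R) = J1·J2 with D(R) = exp(-i theta n·J), J = J1 + J2, hbar = 1.
import numpy as np
from scipy.linalg import expm

# Pauli matrices; the spin operators are sigma/2
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2)

# J1 acts on the first tensor factor, J2 on the second
J1 = [np.kron(s / 2, I2) for s in (sx, sy, sz)]
J2 = [np.kron(I2, s / 2) for s in (sx, sy, sz)]

J1dotJ2 = sum(a @ b for a, b in zip(J1, J2))

# Rotation by theta about a normalized axis n, generated by the total J
theta = 0.7
n = np.array([1.0, 2.0, -0.5])
n /= np.linalg.norm(n)
Jtot = sum(ni * (a + b) for ni, a, b in zip(n, J1, J2))
D = expm(-1j * theta * Jtot)

rotated = D.conj().T @ J1dotJ2 @ D
print(np.allclose(rotated, J1dotJ2))  # True: J1·J2 is rotation invariant
```

The axis and angle here are arbitrary; any choice should give the same result.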

> But does that mean that $D_2 (R) D_1 (R) = D_1 (R)\otimes D_2 (R)$ ?

That looks right to me because the rotations in the two different spaces must commute.

I think you can get your result immediately by using the fact that any J is a "vector operator",
$$D^{\dagger} J_i D = R_{ik}J_k.$$
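That relation is easy to check numerically for a single spin-1/2 rotated about the z axis (a minimal sketch assuming numpy/scipy, with $\hbar = 1$ and $D = e^{-i\theta J_z}$):

```python
# Check the vector-operator relation D† J_i D = R_ik J_k for spin-1/2
# and a rotation about the z axis (hbar = 1).
import numpy as np
from scipy.linalg import expm

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
J = [s / 2 for s in (sx, sy, sz)]

theta = 0.3
D = expm(-1j * theta * J[2])          # rotation operator about z
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s, 0],             # ordinary 3x3 rotation matrix about z
              [s,  c, 0],
              [0,  0, 1]])

for i in range(3):
    lhs = D.conj().T @ J[i] @ D
    rhs = sum(R[i, k] * J[k] for k in range(3))
    print(np.allclose(lhs, rhs))  # True for each i
```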

Hey Oxvillan,

Sorry what I was trying to ask was if the tensor product between D_1 and D_2 is equal to the multiplication of the two rotation matrices, D_1 D_2. I have worked out why this is true now.

Sorry, I just realised I was inconsistent with my indices and wrote the 1 and 2 back to front.

Would you happen to know where I could find more information regarding

$$D^{\dagger} J_i D = R_{ik}J_k.$$

The i = x, y or z right? But what does the k represent?

> Hey Oxvillan,
>
> Sorry what I was trying to ask was if the tensor product between D_1 and D_2 is equal to the multiplication of the two rotation matrices, D_1 D_2. I have worked out why this is true now.

Ah, I see. I would write something like

$$D_1 D_2 = (D \otimes I)(I \otimes D) = D \otimes D.$$
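This is just the mixed-product property of the Kronecker product, $(A \otimes B)(C \otimes E) = AC \otimes BE$, which can be spot-checked numerically (a sketch assuming numpy):

```python
# Spot-check the mixed-product identity (D1 ⊗ I)(I ⊗ D2) = D1 ⊗ D2,
# and that the two factors commute, using numpy's Kronecker product.
import numpy as np

rng = np.random.default_rng(0)
D1 = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
D2 = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
I = np.eye(2)

lhs = np.kron(D1, I) @ np.kron(I, D2)
print(np.allclose(lhs, np.kron(D1, D2)))  # True
# The two factors act on different spaces, so the order doesn't matter:
print(np.allclose(np.kron(I, D2) @ np.kron(D1, I), np.kron(D1, D2)))  # True
```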

> Would you happen to know where I could find more information regarding
>
> $$D^{\dagger} J_i D = R_{ik}J_k.$$
>
> The i = x, y or z right? But what does the k represent?

I would recommend Shankar chapter 12.

The index k also runs from 1 to 3 (for x,y and z). Rik is just a rotation matrix.

The idea here is that we have a "unitary representation of the rotation group". Ordinary 3-vectors are rotated using rotation matrices, but state vectors in a Hilbert space are rotated using unitary operators. For things to be consistent, a vector operator (e.g. the set of operators {Jx, Jy, Jz}) has to transform according to the relation I gave.

Cool, thanks a lot!

So this reduces

$D_{1}^{\dagger }{{J}_{1,x}}{{D}_{1}}D_{2}^{\dagger }{{J}_{2,x}}{{D}_{2}}+\,\,D_{1}^{\dagger }{{J}_{1,y}}{{D}_{1}}D_{2}^{\dagger }{{J}_{2,y}}{{D}_{2}}+D_{1}^{\dagger }{{J}_{1,z}}{{D}_{1}}D_{2}^{\dagger }{{J}_{2,z}}{{D}_{2}}$

To

${{R}_{1,xk}}{{J}_{1,x}}{{R}_{2,xk}}{{J}_{2,x}}+\,\,{{R}_{1,yk}}{{J}_{1,y}}{{R}_{2,yk}}{{J}_{2,y}}+{{R}_{1,zk}}{{J}_{1,z}}{{R}_{2,zk}}{{J}_{2,z}}$

Then as the R_2 matrices commute with the J_1 operators

${{R}_{1,xk}}{{R}_{2,xk}}{{J}_{1,x}}{{J}_{2,x}}+\,\,{{R}_{1,yk}}{{R}_{2,yk}}{{J}_{1,y}}{{J}_{2,y}}+{{R}_{1,zk}}{{R}_{2,zk}}{{J}_{1,z}}{{J}_{2,z}}$

Does this look right?

So this implies that the two rotation matrices in front of each $J_1 J_2$ component must equal the identity matrix, but how is this possible?

You're almost there! I'd recommend just using the Einstein summation convention - then you only have to write out a third as many terms...

There should not be 1 or 2 subscripts on the R's because R is just your rotation matrix:

$${\bf J}_1 \cdot {\bf J}_2 = J_{1i} J_{2i} \to R_{ip} J_{1p} R_{iq} J_{2q}$$
One more little trick and you're home and dry...

I am having a bit of trouble seeing what the trick is,

Does

$$R_{ip} J_{1p} R_{iq} J_{2q}=J_{1p} J_{2q}$$

Because the whole system is being rotated by the same rotation?

Then p and q both run from 1 to 3, right (x, y, and z)?

But I'm really not sure,

Or is it something like $$R_{iq}=R^T_{ip} ?$$

Thanks again for helping Oxvillian

> I am having a bit of trouble seeing what the trick is,
>
> Does
>
> $$R_{ip} J_{1p} R_{iq} J_{2q}=J_{1p} J_{2q}$$
No, that wouldn't make sense, because p and q are dummy indices that we are summing over:
$$\sum_i \sum_p \sum_q R_{ip} J_{1p} R_{iq} J_{2q}.$$
What you want to do first is sum over i. Take a look at
$$\sum_i R_{ip} R_{iq} = \sum_i R^T_{pi} R_{iq}.$$
How can you simplify that, bearing in mind that R is a rotation matrix?

Do you mean that the R's are orthogonal (since they're rotation matrices) so

$R^T_{pi}R_{iq} = \hat{1}$

Is that correct? The indices make me think it isn't quite right.

Haha, I don't know why, but the indices are what's giving me the most confusion.

How come it is okay to write,

$R^T_{pi} = R_{ip}$

Sorry if I am being a nuisance.

> Do you mean that the R's are orthogonal (since they're rotation matrices) so
>
> $R^T_{pi}R_{iq} = \hat{1}$
>
> Is that correct? The indices make me think it isn't quite right.
>
> Haha, I don't know why, but the indices are what's giving me the most confusion.
You're mixing up the matrix notation and the index notation here. The matrix notation is
$$R^T R = I$$
while the index notation would be
$$\sum_i R^T_{pi} R_{iq} = \delta_{pq}$$
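In other words, $\sum_i R^T_{pi} R_{iq}$ is just the $(p,q)$ entry of the matrix product $R^T R$. A quick numpy check of the index-notation version (a sketch using a rotation about z):

```python
# Index-notation check that sum_i R^T_pi R_iq = delta_pq for a rotation matrix.
import numpy as np

theta = 0.9
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s, 0],
              [s,  c, 0],
              [0,  0, 1]])

# einsum spells out the sum over i explicitly: (R^T R)_pq
delta = np.einsum('ip,iq->pq', R, R)
print(np.allclose(delta, np.eye(3)))  # True
```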

> How come it is okay to write,
>
> $R^T_{pi} = R_{ip}$

That's just the definition of the transpose of a matrix.

Ah, okay, cool!

Thanks!

So

$$\sum_i R_{ip} R_{iq} = \sum_i R^T_{pi} R_{iq} = \delta_{pq}$$

Then

$$\sum_i \sum_p \sum_q R_{ip} J_{1p} R_{iq} J_{2q}= \sum_p \sum_q \delta_{pq} J_{1p} J_{2q} = \sum_p J_{1p} J_{2p} = J_{1x}J_{2x} + J_{1y}J_{2y} + J_{1z}J_{2z}$$

Which is what I needed!

Thanks heaps!
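The whole contraction above can be spot-checked in one shot with `np.einsum` (a sketch assuming numpy, using spin-1/2 operators and a rotation about z):

```python
# Check that sum_{i,p,q} R_ip R_iq J_1p J_2q equals sum_p J_1p J_2p
# for two spin-1/2 operators and a rotation matrix about z.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2)
J1 = np.array([np.kron(s / 2, I2) for s in (sx, sy, sz)])  # shape (3, 4, 4)
J2 = np.array([np.kron(I2, s / 2) for s in (sx, sy, sz)])

theta = 1.1
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Sum over i, p, q; 'ab'/'bc' are the matrix indices of the operators
lhs = np.einsum('ip,iq,pab,qbc->ac', R, R, J1, J2)
rhs = np.einsum('pab,pbc->ac', J1, J2)  # J1·J2 itself
print(np.allclose(lhs, rhs))  # True
```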