How can the dual tensors derivation be achieved using rotation matrices?

  • #1
euphoricrhino
Hello,
I'm reading "Group Theory in a Nutshell for Physicists" by A. Zee. When he introduces dual tensors (p. 192), he makes a claim with only a light hint, and I have had great trouble deriving it. Any help would be appreciated.

Let ##R \in SO(N)## be an ##N##-dimensional rotation, then the following is true
$$
\epsilon^{ijk\cdots n}R^{ip}R^{jq}=\epsilon^{pqr\cdots s}R^{kr}\cdots R^{ns}
$$
(where ##\epsilon## is the antisymmetric symbol, and repeated indices are summed per the usual convention).
The hint was to use the ##N\times N## matrix determinant
$$
\epsilon^{ijk\cdots n}R^{ip}R^{jq}R^{kr}\cdots R^{ns}=\epsilon^{pqr\cdots s} \det R=\epsilon^{pqr\cdots s} \quad(\text{since }R\text{ is special})
$$
and multiply it "by a bunch of ##R^T##s carrying appropriate indices".

I have tried to understand the claim for ##N=3##, which I believe reduces to the cross-product relation, but I couldn't see how it follows by involving ##R^T##, or how the argument extends to ##N## dimensions.
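Before deriving it, the ##N=3## claim can at least be verified numerically. Below is a quick NumPy sanity check (my own illustration, not from Zee's book) of ##\epsilon^{ijk}R^{ip}R^{jq}=\epsilon^{pqr}R^{kr}## for a random ##R\in SO(3)##:

```python
import numpy as np

# Sanity check of the claimed identity for N = 3:
#   eps^{ijk} R^{ip} R^{jq} = eps^{pqr} R^{kr}   (sum over repeated indices)

def levi_civita_3():
    """Rank-3 antisymmetric symbol eps[i, j, k]."""
    eps = np.zeros((3, 3, 3))
    for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
        eps[i, j, k] = 1.0   # even permutations of (0, 1, 2)
        eps[i, k, j] = -1.0  # odd permutations
    return eps

# Random R in SO(3): orthogonalize a random matrix via QR, then fix the
# sign of the determinant by flipping one column (still orthogonal).
rng = np.random.default_rng(0)
R, _ = np.linalg.qr(rng.standard_normal((3, 3)))
if np.linalg.det(R) < 0:
    R[:, 0] *= -1.0

eps = levi_civita_3()
lhs = np.einsum('ijk,ip,jq->pqk', eps, R, R)  # free indices p, q, k
rhs = np.einsum('pqr,kr->pqk', eps, R)        # sum over r
print(np.allclose(lhs, rhs))  # prints True
```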

Thanks for the help!
 
  • #2
I don't have Zee's book, but does he discuss

##\epsilon^{ijk\cdots n}R^{ip}R^{jq}\cdots R^{ns} = \det(R)\epsilon^{pqr\cdots s}##

which I think follows from the definition of the determinant? Now, for ##SO(N)## one has ##\det(R) = 1##. From this, apply the group relation ##R^{ij}R^{ik} = \delta^{jk}## to both sides twice and you have it. Hope this helps.
 
  • #3
Thanks for the reply!

However, I must be missing something really obvious: I don't see how to "apply the group relation ##R^{ij}R^{ik}=\delta^{jk}## to both sides" of the determinant equality.

The LHS of the determinant relation is a sum of ##N!## terms, each a product whose factors share no index. Could you kindly elaborate for the ##N=3## case?

From the determinant equality
$$
\epsilon^{ijk}R^{ip}R^{jq}R^{kr}=\epsilon^{pqr}
$$
where ##(pqr)## is a given permutation, how does one derive (for any given ##p,q,k##)
$$
\epsilon^{ijk}R^{ip}R^{jq}=\epsilon^{pqr}R^{kr}
$$

Thank you very much!
 
  • #4
In matrix form

##R^TR = RR^T = I##

so, one also has the relation,

##R^{ip}R^{kp} = \delta^{ik}##
 
  • #5
I finally figured it out. It's actually quite simple, but all the symbols were distracting me.

The determinant relation
$$
\epsilon^{ijk\cdots n}R^{ip}R^{jq}R^{kr}\cdots R^{ns}=\epsilon^{pqr\cdots s}
$$
can be viewed as an inner product relation
$$
v^nR^{ns}=\epsilon^{pqr\cdots s}
$$
where ##v^n## is defined by
$$
v^n=\epsilon^{ijk\cdots n}R^{ip}R^{jq}R^{kr}\cdots
$$
Since the columns ##R^{\cdot s}## of ##R## form an orthonormal basis of the ##N##-dimensional space, the inner-product relation above gives the decomposition of the vector ##v## in this basis, i.e.
$$
v=\epsilon^{pqr\cdots s}R^{\cdot s}
$$
Taking the ##n##-th component of this yields
$$
\epsilon^{ijk\cdots n}R^{ip}R^{jq}R^{kr}\cdots=v^n=\epsilon^{pqr\cdots s}R^{ns}
$$

Now we just repeat the same argument on the remaining ##R##s on the left until only ##R^{ip}## and ##R^{jq}## are left.
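One round of this peeling can be checked numerically. A sketch of my own for ##N=4##: after removing a single ##R## from the determinant relation, one should have ##\epsilon^{ijkl}R^{ip}R^{jq}R^{kr}=\epsilon^{pqrs}R^{ls}##.

```python
import numpy as np
from itertools import permutations

# Check one step of the peeling argument for N = 4:
#   eps^{ijkl} R^{ip} R^{jq} R^{kr} = eps^{pqrs} R^{ls}

def levi_civita(n):
    """Rank-n antisymmetric symbol built from permutation parities."""
    eps = np.zeros((n,) * n)
    for p in permutations(range(n)):
        inversions = sum(p[a] > p[b] for a in range(n) for b in range(a + 1, n))
        eps[p] = (-1.0) ** inversions
    return eps

rng = np.random.default_rng(2)
R, _ = np.linalg.qr(rng.standard_normal((4, 4)))
if np.linalg.det(R) < 0:
    R[:, 0] *= -1.0  # force det R = +1, keeping R orthogonal

eps = levi_civita(4)
lhs = np.einsum('ijkl,ip,jq,kr->pqrl', eps, R, R, R)
rhs = np.einsum('pqrs,ls->pqrl', eps, R)
print(np.allclose(lhs, rhs))  # prints True
```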
 
  • #6
I have been struggling with this question for days and came across this thread. I don't understand euphoricrhino's final solution, so I worked out my own derivation here. I hope it helps anyone who Googles their way here in the future.

The key obstacle here is that Zee doesn't explicitly specify which indices are summed over. That's Einstein's fault anyway; his summation convention hides exactly that information. :smile:

What we have and want to prove are:

$$\begin{aligned}
\epsilon^{ijk\cdots n}R^{ip}R^{jq}R^{kr}\cdots R^{ns} &= \epsilon^{pqr\cdots s} \\
\epsilon^{ijk\cdots n}R^{ip}R^{jq} &= \epsilon^{pqr\cdots s}R^{kr}\cdots R^{ns}
\end{aligned}$$

Let's make the summation operation explicit:

$$\begin{aligned}
\sum_{ijk\cdots n}\epsilon^{ijk\cdots n}R^{ip}R^{jq}R^{kr}\cdots R^{ns} &= \epsilon^{pqr\cdots s} \\
\sum_{ij}\epsilon^{ijk\cdots n}R^{ip}R^{jq} &= \sum_{r\cdots s}\epsilon^{pqr\cdots s}R^{kr}\cdots R^{ns}
\end{aligned}$$

(You may stop here and continue on your own derivation if you like.)

We introduce fixed indices ##k^{\prime},\cdots,n^{\prime}## and multiply both sides by the same factors ##R^{k^{\prime}r}\cdots R^{n^{\prime}s}##:

$$\sum_{ijk\cdots n}\epsilon^{ijk\cdots n}R^{ip}R^{jq}R^{kr}\cdots R^{ns}\cdot R^{k^{\prime}r}\cdots R^{n^{\prime}s}=\epsilon^{pqr\cdots s}\cdot R^{k^{\prime}r}\cdots R^{n^{\prime}s}$$

Summing both sides over the indices ##r\cdots s##, the left side becomes:

$$\begin{aligned}
& \sum_{r\cdots s}\sum_{ijk\cdots n}\epsilon^{ijk\cdots n}R^{ip}R^{jq}R^{kr}\cdots R^{ns}\cdot R^{k^{\prime}r}\cdots R^{n^{\prime}s} \\
=& \sum_{ij}R^{ip}R^{jq}\sum_{k\cdots}\sum_{r\cdots}R^{kr}\cdots R^{k^{\prime}r}\cdots\sum_{n}\epsilon^{ijk\cdots n}\sum_{s}R^{ns}R^{n^{\prime}s} \\
=& \sum_{ij}R^{ip}R^{jq}\sum_{k\cdots}\sum_{r\cdots}R^{kr}\cdots R^{k^{\prime}r}\cdots\sum_{n}\delta^{nn^{\prime}}\epsilon^{ijk\cdots n} \\
=& \sum_{ij}R^{ip}R^{jq}\sum_{k\cdots}\sum_{r\cdots}\epsilon^{ijk\cdots n^{\prime}}R^{kr}\cdots R^{k^{\prime}r}\cdots \\
=& \cdots \\
=& \sum_{ij}\epsilon^{ijk^{\prime}\cdots n^{\prime}}R^{ip}R^{jq}
\end{aligned}$$

Renaming ##k^{\prime},\cdots,n^{\prime}## back to ##k,\cdots,n## and dropping the explicit summation, we finally obtain the desired equation:

$$\begin{aligned}
\sum_{ij}\epsilon^{ijk^{\prime}\cdots n^{\prime}}R^{ip}R^{jq} &= \sum_{r\cdots s}\epsilon^{pqr\cdots s}R^{k^{\prime}r}\cdots R^{n^{\prime}s} \\
\sum_{ij}\epsilon^{ijk\cdots n}R^{ip}R^{jq} &= \sum_{r\cdots s}\epsilon^{pqr\cdots s}R^{kr}\cdots R^{ns} \\
\epsilon^{ijk\cdots n}R^{ip}R^{jq} &= \epsilon^{pqr\cdots s}R^{kr}\cdots R^{ns}
\end{aligned}$$
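For completeness, the final identity can also be confirmed numerically beyond ##N=3##. A check of my own for ##N=4##, i.e. ##\epsilon^{ijkl}R^{ip}R^{jq}=\epsilon^{pqrs}R^{kr}R^{ls}##:

```python
import numpy as np
from itertools import permutations

# Verify the final identity for N = 4:
#   eps^{ijkl} R^{ip} R^{jq} = eps^{pqrs} R^{kr} R^{ls}

def levi_civita(n):
    """Rank-n antisymmetric symbol built from permutation parities."""
    eps = np.zeros((n,) * n)
    for p in permutations(range(n)):
        inversions = sum(p[a] > p[b] for a in range(n) for b in range(a + 1, n))
        eps[p] = (-1.0) ** inversions
    return eps

rng = np.random.default_rng(3)
R, _ = np.linalg.qr(rng.standard_normal((4, 4)))
if np.linalg.det(R) < 0:
    R[:, 0] *= -1.0  # force det R = +1, keeping R orthogonal

eps = levi_civita(4)
lhs = np.einsum('ijkl,ip,jq->pqkl', eps, R, R)     # sum over i, j
rhs = np.einsum('pqrs,kr,ls->pqkl', eps, R, R)     # sum over r, s
print(np.allclose(lhs, rhs))  # prints True
```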
 
