Levi-Civita tensor and operations with it

  • Thread starter: Yegor
  • Tags: Operations, Tensor
AI Thread Summary
The discussion revolves around simplifying the expression for the square of the cross product of two vectors, specifically (\vec{A} \times \vec{B})^2. Participants clarify that this expression can be represented using the Levi-Civita symbol and the Kronecker delta, leading to the identity \epsilon_{\lambda\mu\nu} \epsilon_{\lambda\rho\sigma} = \delta_{\mu\rho}\delta_{\nu\sigma} - \delta_{\mu\sigma}\delta_{\nu\rho}. The conversation emphasizes the importance of consistent notation in tensor calculations and provides a mnemonic for understanding the epsilon-delta identities. Acknowledgment of self-study in tensor calculus is noted, along with encouragement to explore further vector algebra identities. Overall, the thread highlights foundational concepts in tensor notation and vector operations.
Yegor
I have been reading about symbols that simplify the representation of vector operations.
For example
A_\mu\hat{e_\mu}=\sum_{i=1}^{3} A_i\hat{e_i}=\vec{A}
also
\vec{A}\times\vec{B}=\sum_{i,j,k=1}^{3} \epsilon_{ijk}\hat{e_i}A_j B_k = \epsilon_{\lambda\mu\nu} \hat{e_\lambda} A_\mu B_\nu
As an exercise I have to simplify (\vec{A}\times\vec{B})^2.
Can anybody help me? I don't know what to do with (\epsilon_{\lambda\mu\nu} \hat{e_\lambda} A_\mu B_\nu)^2.
Thank you
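
For readers who want to sanity-check the component formula above, here is a minimal numerical sketch; it assumes NumPy is available, and the example vectors are arbitrary:

```python
import numpy as np

# Build the Levi-Civita symbol eps[i, j, k]: +1 for even permutations of
# (0, 1, 2), -1 for odd permutations, 0 whenever an index repeats.
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

A = np.array([1.0, 2.0, 3.0])   # arbitrary example vectors
B = np.array([-4.0, 0.5, 2.0])

# (A x B)_i = eps_{ijk} A_j B_k, with the repeated indices j, k summed.
cross_via_eps = np.einsum('ijk,j,k->i', eps, A, B)
assert np.allclose(cross_via_eps, np.cross(A, B))
```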
 
Most tensorial calculations I see leave out the basis vector. You can restore it if you wish.
\vec{A}\times\vec{B}= \epsilon_{\lambda\mu\nu} A_\mu B_\nu
is a vector (with, in my notation, one "free" index and two "dummy" indices), as you indicate.

(\vec{A}\times\vec{B})^2 is a scalar.
In vector notation, this is (\vec{A}\times\vec{B})\cdot(\vec{A}\times\vec{B}). In tensorial notation, you introduce new dummy indices for each factor, then contract [via the metric, which I suppress for simplicity]:
(\vec{A}\times\vec{B})^2= \epsilon_{\lambda\mu\nu} A_\mu B_\nu \epsilon_{\lambda\rho\sigma} A_\rho B_\sigma. You can rewrite this as \epsilon_{\lambda\mu\nu} \epsilon_{\lambda\rho\sigma} A_\mu B_\nu A_\rho B_\sigma.

Now, you have to use an identity that expresses \epsilon_{\lambda\mu\nu} \epsilon_{\lambda\rho\sigma} in terms of Kronecker deltas like \delta_{\mu\rho}, which I assume you have been introduced to.

That should get you started.
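
Here is a minimal sketch of that step, assuming NumPy: each factor gets its own pair of dummy indices, and the full contraction is compared against the ordinary dot product of cross products.

```python
import numpy as np

# Levi-Civita symbol, as before.
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

A = np.array([1.0, 2.0, 3.0])   # arbitrary example vectors
B = np.array([-4.0, 0.5, 2.0])

# (A x B)^2 = eps_{lmn} A_m B_n eps_{lrs} A_r B_s: each factor carries
# its own dummy indices, and all repeated indices are summed.
square = np.einsum('lmn,m,n,lrs,r,s->', eps, A, B, eps, A, B)
assert np.isclose(square, np.dot(np.cross(A, B), np.cross(A, B)))
```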
 
robphy said:
Most tensorial calculations I see leave out the basis vector. You can restore it if you wish.
\vec{A}\times\vec{B}= \epsilon_{\lambda\mu\nu} A_\mu B_\nu

This is bad notation: either write a vector equation (as Yegor has), or write a component equation (like \left( \vec{A}\times\vec{B} \right)_\lambda = \epsilon_{\lambda\mu\nu} A_\mu B_\nu, or \vec{C} = \vec{A}\times\vec{B} with C_\lambda = \epsilon_{\lambda\mu\nu} A_\mu B_\nu). Never set a vector equal to a number, and never use both standard vector notation and Penrose's "abstract index notation" in the same equation.

Regards,
George
 
I agree about the missing lambda index. I would have done (and normally do) as you said. In fact, I would have used upper and lower indices... and an explicit metric. However, I didn't want to clutter the discussion by introducing too many of my own symbols. (I think of a single free index like an arrowhead.)

I don't think I ever set a vector equal to a number... unless you are interpreting the greek-indexed quantities as numbers---I am thinking of them as "slots" (in my notation).

Maybe when intended for a beginner, it would have been best to be fully consistent with notation... even if I had to explain all of the notation. I was merely trying to get the following idea across: use a new set of dummy indices and use the epsilon-delta identity.

In any case, point taken. Thanks.
 
Thank you, robphy, for the hint.
This is what I got.
\epsilon_{\lambda\mu\nu} \epsilon_{\lambda\rho\sigma} =\delta_{\mu\rho}\delta_{\nu\sigma}-\delta_{\mu\sigma}\delta_{\nu\rho}
then
(\delta_{\mu\rho}\delta_{\nu\sigma}-\delta_{\mu\sigma}\delta_{\nu\rho}) A_\mu B_\nu A_\rho B_\sigma=\delta_{\mu\rho}\delta_{\nu\sigma}A_\mu B_\nu A_\rho B_\sigma-\delta_{\mu\sigma}\delta_{\nu\rho}A_\mu B_\nu A_\rho B_\sigma=(A_\mu)^2 (B_\nu )^2-(A_\mu B_\mu)(A_\nu B_\nu)
Please tell me, is it OK?
In fact, I was never formally introduced to tensors. I have just been reading a book (Mechanics) on my own, and it has an interesting appendix about them. So I understand the expression \epsilon_{\lambda\mu\nu} \epsilon_{\lambda\rho\sigma} =\delta_{\mu\rho}\delta_{\nu\sigma}-\delta_{\mu\sigma}\delta_{\nu\rho} only intuitively. I can see that it is true, but if someone could show me, or give me an idea of, how it is derived precisely, I'd be very grateful.
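
This is not a derivation, but in three dimensions the identity can at least be confirmed by brute force over all index values; a minimal sketch, assuming NumPy:

```python
import numpy as np

# Levi-Civita symbol and Kronecker delta.
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0
delta = np.eye(3)

# Left side: eps_{lmn} eps_{lrs}, contracted over the first index l.
lhs = np.einsum('lmn,lrs->mnrs', eps, eps)
# Right side: d_{mr} d_{ns} - d_{ms} d_{nr}.
rhs = (np.einsum('mr,ns->mnrs', delta, delta)
       - np.einsum('ms,nr->mnrs', delta, delta))
assert np.allclose(lhs, rhs)  # holds for all 81 index combinations
```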
 
Yes, that's correct... up to possible notational convention preferences.
You might recognize the so-called Lagrange identity:
|\vec A \times \vec B|^2+|\vec A \cdot \vec B|^2 = |\vec A|^2 |\vec B|^2.

I have never found a satisfactory derivation of the epsilon-delta identities.
However, here is a useful mnemonic involving the determinant (which you can verify)
\epsilon_{\lambda\mu\nu} \epsilon_{\lambda\rho\sigma} = \delta_{\mu\rho}\delta_{\nu\sigma} - \delta_{\mu\sigma}\delta_{\nu\rho} = \left|\begin{array}{cc} \delta_{\mu\rho} & \delta_{\mu\sigma} \\ \delta_{\nu\rho} & \delta_{\nu\sigma} \end{array}\right|
This can be generalized: with no contracted index, \epsilon_{\lambda\mu\nu} \epsilon_{\rho\sigma\tau} equals the 3\times 3 determinant of the corresponding Kronecker deltas.

By the way, congratulations on teaching this to yourself. You might try learning to use these methods to derive various identities in vector algebra (like the BAC-CAB identity and the Jacobi identity) and vector calculus (which would be useful in, say, electrodynamics).
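
The mnemonic can also be verified entry by entry; here is a short sketch, assuming NumPy, that compares the contraction with the 2x2 determinant of deltas for every choice of the free indices:

```python
import numpy as np

# Levi-Civita symbol and Kronecker delta.
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0
d = np.eye(3)

for m in range(3):
    for n in range(3):
        for r in range(3):
            for s in range(3):
                # eps_{lmn} eps_{lrs}, summed over l.
                contraction = sum(eps[l, m, n] * eps[l, r, s] for l in range(3))
                # 2x2 determinant from the mnemonic.
                det = d[m, r] * d[n, s] - d[m, s] * d[n, r]
                assert np.isclose(contraction, det)
```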
 
robphy said:
You might recognize the so-called Lagrange identity:
|\vec A \times \vec B|^2+|\vec A \cdot \vec B|^2 = |\vec A|^2 |\vec B|^2
Wheee! It's really very beautiful! I hadn't noticed it before (though I know the identity).
Thank you very much for your advice. It really looks like it's worth getting some practice with these things.
 
Oh, I just realized that (A_\mu)^2 (B_\nu )^2-(A_\mu B_\mu)(A_\nu B_\nu)=(A_\mu)^2 (B_\mu )^2-(A_\mu B_\mu)^2 !
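
That rewriting is exactly the Lagrange identity in component form; a quick randomized spot-check, assuming NumPy (the seed and sample count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed so the check is reproducible
for _ in range(5):
    A, B = rng.normal(size=3), rng.normal(size=3)
    lhs = np.dot(np.cross(A, B), np.cross(A, B))            # (A x B)^2
    rhs = np.dot(A, A) * np.dot(B, B) - np.dot(A, B) ** 2   # A^2 B^2 - (A.B)^2
    assert np.isclose(lhs, rhs)
```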
 
