Tensor notation for vector product proofs

SUMMARY

This discussion focuses on using tensor notation to prove the equality of vector products involving the Levi-Civita symbol and the Kronecker Delta. The user seeks clarification on the correct assignment of indices when applying the Einstein summation convention to expressions like \(\vec{A}\cdot\vec{B}\times\vec{C}\). Key insights include the understanding that the dummy indices of the Levi-Civita symbol can be relabeled freely, allowing flexibility in notation. The conversation also touches on calculating the magnitude of a cross product using tensor notation.

PREREQUISITES
  • Understanding of tensor notation and Einstein summation convention
  • Familiarity with the Levi-Civita symbol and Kronecker Delta
  • Knowledge of vector calculus, specifically cross products and dot products
  • Basic principles of linear algebra related to vector spaces
NEXT STEPS
  • Study the properties and applications of the Levi-Civita symbol in tensor calculus
  • Learn how to manipulate indices in tensor notation for various vector operations
  • Explore the "epsilon killer" identity and its use in simplifying tensor expressions
  • Practice deriving magnitudes of vector products using tensor notation
USEFUL FOR

Students and professionals in physics and engineering, particularly those working with continuum mechanics, tensor calculus, and vector analysis.

skate_nerd
I am new to tensor notation, but I have known how to work with vector calculus for a while now. I understand for the most part how the Levi-Civita and Kronecker Delta symbols work with the Einstein summation convention. However, there are a few things I'm iffy about.
For example, I have a problem where I am to prove
$$\vec{A}\cdot\vec{B}\times\vec{C} = \vec{B}\cdot\vec{C}\times\vec{A} = \vec{C}\cdot\vec{A}\times\vec{B}$$
using tensor notation to avoid having to write out all the terms.
So I know the very left side of this equation would look like
$$\vec{A}\cdot(\vec{B}\times\vec{C}) = A_i (\vec{B}\times\vec{C})_i = A_i \varepsilon_{ijk} B_j C_k$$
But then I get confused when trying to assign the indices for the next two parts of the equation.
Would the second part look like this:
$$\vec{B}\cdot(\vec{C}\times\vec{A}) = B_j (\vec{C}\times\vec{A})_j = B_j \varepsilon_{jkl} C_k A_l$$
Or would the indices of the epsilon be the same as for the first part (\(\varepsilon_{ijk}\))?
The same confusion applies to the first part. The reason for this uncertainty is that I know the vector triple product requires introducing two extra indices. So I guess my incomplete understanding of these symbols is leaving me confused with this problem. Thanks in advance for any guidance.
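As a quick numerical sanity check of the tensor-notation expression above, here is a minimal Python sketch; the sample vectors and the levi_civita helper are illustrative choices, not part of the original problem.

Code:
import numpy as np

def levi_civita():
    """Build the Levi-Civita symbol as a 3x3x3 array."""
    eps = np.zeros((3, 3, 3))
    eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0   # even permutations
    eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0  # odd permutations
    return eps

eps = levi_civita()
A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])
C = np.array([-1.0, 0.0, 2.0])

# A_i eps_{ijk} B_j C_k, with einsum carrying out the implied sums
lhs = np.einsum('i,ijk,j,k->', A, eps, B, C)
# The same scalar triple product in ordinary vector-calculus form
rhs = np.dot(A, np.cross(B, C))
print(np.isclose(lhs, rhs))  # True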
 
Even if you re-arrange and your indices end up in some order, say jik, you can always rename these new indices lmn and then let lmn = ijk; since the indices are arbitrary, you are right back to ijk.

Is that what you were asking? That is what I understood from the question. If not, clarify where I lost the point.
 
I'm not sure I made myself clear enough...
What I am unsure of in constructing this equation in tensor notation is this: if I take the cross product of two arbitrary vectors B and C, those vectors span some arbitrary plane. Would the indices of the epsilon symbol be different than if I took the cross product of vectors C and A? Or would it make sense to call them both \(\varepsilon_{ijk}\)? That somehow doesn't seem like it would make sense to me, but I'm not sure what else would be correct.
 
I will explain the first equality which may help:
\[
\mathbf{A}\cdot(\mathbf{B}\times\mathbf{C}) = \varepsilon_{ijk}a_ib_jc_k
\]
which is just a summation and a, b, c are the components of the vectors.

We can then write \(\varepsilon_{ijk}a_ib_jc_k = \varepsilon_{jki}b_jc_ka_i\).

We can either relabel the dummy indices or recall that \(jki\) is an even (cyclic) permutation of \(ijk\), so no negative sign is introduced. Relabeling \(j \to i\), \(k \to j\), \(i \to k\), we can rewrite the equation as
\[
\varepsilon_{ijk}b_ic_ja_k = \mathbf{B}\cdot(\mathbf{C}\times\mathbf{A})
\]
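To see the cyclic equality numerically, here is a small sketch along the same lines (the Levi-Civita array is built as in the earlier sketch, and the vectors are again arbitrary illustrative choices):

Code:
import numpy as np

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

def triple(X, Y, Z):
    """eps_{ijk} X_i Y_j Z_k, i.e. X . (Y x Z) in tensor notation."""
    return np.einsum('ijk,i,j,k->', eps, X, Y, Z)

A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])
C = np.array([-1.0, 0.0, 2.0])

# All three cyclic orderings give the same scalar
print(np.isclose(triple(A, B, C), triple(B, C, A)))  # True
print(np.isclose(triple(A, B, C), triple(C, A, B)))  # True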
 
Ahhh wow thank you so much. Now I see what you meant by the indices being arbitrary. And seeing now that this proof works helps me conceptualize better what these indices are actually doing.
 
One other quick question, about a slightly different thing, if you don't mind... How would you set up a problem in tensor notation and find the magnitude of a vector product? Like, for instance, the vector A crossed with the vector B? Would it make sense just to dot the product with itself?
 
I don't quite understand your question. Can you give me an example problem or question?
 
I want to do this in tensor notation:
$$|\vec{A}\times\vec{B}|$$
Magnitude of a cross product of two arbitrary vectors. So the way I know to start is:
$$|\varepsilon_{ijk}A_j B_k|$$
Taking this magnitude in tensor notation is what I am not sure I understand how to do. Would it make sense to write
$$(\varepsilon_{ijk}A_j B_k)(\varepsilon_{ijk}A_j B_k)$$
and then use the "epsilon killer" identity to simplify it? Not really sure of any other way to notate it.
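The "epsilon killer" identity referred to here is presumably the standard contraction \(\varepsilon_{ijk}\varepsilon_{ilm} = \delta_{jl}\delta_{km} - \delta_{jm}\delta_{kl}\). A minimal numerical check of that identity (arrays built as in the sketches above):

Code:
import numpy as np

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0
delta = np.eye(3)

# Left side: contract the two epsilons over their shared first index
lhs = np.einsum('ijk,ilm->jklm', eps, eps)
# Right side: delta_jl delta_km - delta_jm delta_kl
rhs = np.einsum('jl,km->jklm', delta, delta) - np.einsum('jm,kl->jklm', delta, delta)
print(np.allclose(lhs, rhs))  # True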
 
I would just interpret it as
\[
\lVert\mathbf{A}\times\mathbf{B}\rVert = \lVert \mathbf{A}\rVert\lVert \mathbf{B}\rVert\sin(\theta)
\]
Is there a specific end goal of this question? Should you come up with a certain expression?
 
Well actually the goal of the question is to prove that expression you just wrote, using tensor notation. I was just having a hard time even getting started with that whole idea of taking a magnitude in tensor notation.
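For reference, here is a sketch of the derivation being aimed at, assuming the contraction identity \(\varepsilon_{ijk}\varepsilon_{ilm} = \delta_{jl}\delta_{km} - \delta_{jm}\delta_{kl}\); note that the second factor of the square must carry fresh dummy indices:
\[
|\vec{A}\times\vec{B}|^2 = (\varepsilon_{ijk}A_jB_k)(\varepsilon_{ilm}A_lB_m) = (\delta_{jl}\delta_{km}-\delta_{jm}\delta_{kl})A_jB_kA_lB_m = (A_jA_j)(B_kB_k)-(A_jB_j)(A_kB_k),
\]
which is \(|\vec{A}|^2|\vec{B}|^2 - (\vec{A}\cdot\vec{B})^2 = |\vec{A}|^2|\vec{B}|^2(1-\cos^2\theta)\), so \(|\vec{A}\times\vec{B}| = |\vec{A}||\vec{B}|\sin\theta\).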
 
Here is a homework assignment of mine from Continuum Mechanics with tensor problems worked out:
http://ubuntuone.com/4qjtmJJmCXpewCPoKNJhXf
 
That really helps a lot. Definitely going to bookmark that for future reference :D
 
