Proving vector calculus identities using the Levi-Civita symbol

SUMMARY

The forum discussion centers on proving the vector calculus identity \(\nabla \bullet (\textbf{A} \times \textbf{B}) = \textbf{B} \bullet (\nabla \times \textbf{A}) - \textbf{A} \bullet (\nabla \times \textbf{B})\) using the Levi-Civita symbol \(\epsilon_{ijk}\) and the Einstein summation convention. Participants clarify the use of partial derivatives and the properties of the Levi-Civita symbol, emphasizing that the symbol vanishes whenever two of its indices are equal, while repeated indices are summed over rather than forced equal. The discussion suggests that while manipulating the identity using the Levi-Civita symbol is valid, a more straightforward proof may exist.

PREREQUISITES
  • Understanding of vector calculus identities
  • Familiarity with the Levi-Civita symbol \(\epsilon_{ijk}\)
  • Knowledge of the Einstein summation convention
  • Proficiency in partial differentiation
NEXT STEPS
  • Study the properties of the Levi-Civita symbol in depth
  • Learn about vector calculus identities and their proofs
  • Explore the Einstein summation convention and its applications
  • Review the product rule for differentiation in vector calculus
USEFUL FOR

Students and professionals in mathematics, physics, and engineering who are working with vector calculus and need to understand the manipulation of vector identities using advanced mathematical tools.

CmdrGuard

Homework Statement


Prove \(\nabla \bullet (\textbf{A} \times \textbf{B}) = \textbf{B} \bullet (\nabla \times \textbf{A}) - \textbf{A} \bullet (\nabla \times \textbf{B})\)

I'd like to prove this using the Levi-Civita symbol \(\epsilon_{ijk}\) and the Einstein summation convention, as practice and because it seems the most elegant way to deal with problems like these.

Homework Equations


\((\textbf{A} \times \textbf{B})_i = \epsilon_{ijk} A_j B_k\)
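The relevant equation above can be checked numerically. What follows is a minimal sketch (not from the thread; it assumes NumPy is available) that builds the Levi-Civita symbol as a 3x3x3 array and confirms that contracting it with two vectors reproduces the ordinary cross product:

```python
import numpy as np

def levi_civita():
    """Return eps[i, j, k]: +1 for even permutations of (0,1,2),
    -1 for odd permutations, 0 when any two indices repeat."""
    eps = np.zeros((3, 3, 3))
    for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
        eps[i, j, k] = 1.0   # even permutations
        eps[i, k, j] = -1.0  # odd permutations
    return eps

eps = levi_civita()
A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])

# (A x B)_i = eps_ijk A_j B_k: einsum sums the repeated indices j, k
cross_index = np.einsum('ijk,j,k->i', eps, A, B)
print(np.allclose(cross_index, np.cross(A, B)))  # True
```

The `einsum` contraction is a literal transcription of the Einstein summation convention: the repeated indices `j` and `k` are summed, and the free index `i` labels the resulting component.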

The Attempt at a Solution



Here is what I have so far:

\[
\nabla \bullet (\textbf{A} \times \textbf{B}) = \frac{\partial}{\partial x_i}(\textbf{A} \times \textbf{B})_i = \frac{\partial}{\partial x_i} \epsilon_{ijk} A_j B_k = \epsilon_{ijk} \left[ A_j \frac{\partial B_k}{\partial x_i} + B_k \frac{\partial A_j}{\partial x_i} \right]
\]

Now don't the partial derivatives with respect to \(x_i\) force \(k = i\) in the first sum and \(j = i\) in the second sum? I must be doing something wrong, because if so, then this expression becomes zero, which is not correct.

What am I doing wrong?
 
tex doesn't seem to be working, so i can't read your post...

first call nabla g, then you want to show
g.(AxB) = B.(gxA) - A.(gxB)

so first express in terms of
g.(AxB) = (d/dx_k)(e_ijk A_i B_j)

then expand using product rule
 
CmdrGuard said:
Now don't the partial derivatives with respect to \(x_i\) force \(k = i\) in the first sum and \(j = i\) in the second sum?
The Levi-Civita symbol is zero whenever two of its indices are equal, so it is the opposite of what you say: when \(k\) or \(j\) does equal \(i\), that term vanishes. I think you might have mistaken the Levi-Civita symbol for the Kronecker delta symbol...

That goes for what you're doing wrong. As for the proof, I tried for a little bit to do it and sort of got somewhere, but it probably wouldn't help much. I'd suggest writing out the right-hand side in terms of the Einstein summation and the Levi-Civita symbol and then manipulating the two sides until you start to see the similarities; that will probably guide you to the correct connecting path between them.

Although I think after looking at it that this way of manipulating the identity is perhaps not the most elegant way when the standard proof is probably much simpler.
 
ok so now i can read the tex
\[
\nabla \bullet (\textbf{A} \times \textbf{B}) = \frac{\partial}{\partial x_i}(\textbf{A} \times \textbf{B})_i = \frac{\partial}{\partial x_i} \epsilon_{ijk} A_j B_k = \epsilon_{ijk} \left[ A_j \frac{\partial B_k}{\partial x_i} + B_k \frac{\partial A_j}{\partial x_i} \right]
\]

now i would continue rearranging as
\[
= \epsilon_{ijk} A_j \frac{\partial B_k}{\partial x_i} + \epsilon_{ijk} B_k \frac{\partial A_j}{\partial x_i}
\]

then change some dummy indices and rearrange for clarity
\[
= \epsilon_{ijk} A_j \frac{\partial B_k}{\partial x_i} + \epsilon_{mnp} B_p \frac{\partial A_n}{\partial x_m}
\]
\[
= A_j \left(\epsilon_{ijk} \frac{\partial}{\partial x_i} B_k\right) + B_p \left(\epsilon_{mnp} \frac{\partial}{\partial x_m} A_n\right)
\]

then have a think about the definition of the cross product in index notation, in particular the ordering of the indices.
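The whole identity the hint above is steering toward can be verified symbolically. Below is a sketch (an assumption of mine, not part of the thread: SymPy is available and generic smooth component functions stand in for arbitrary \(\textbf{A}\), \(\textbf{B}\)) that computes both sides purely through the Levi-Civita contraction and checks their difference is zero:

```python
import itertools
import sympy as sp

x = sp.symbols('x0 x1 x2')

# Arbitrary smooth component functions standing in for general fields
A = [x[0]*x[1], x[1]*x[2]**2, sp.sin(x[0])]
B = [x[2], x[0]**2, x[1]*x[2]]

# div(A x B) = d/dx_i ( eps_ijk A_j B_k ), summing all three indices
lhs = sum(sp.diff(sp.LeviCivita(i, j, k) * A[j] * B[k], x[i])
          for i, j, k in itertools.product(range(3), repeat=3))

def curl(V):
    """curl(V)_i = eps_ijk d V_k / d x_j"""
    return [sum(sp.LeviCivita(i, j, k) * sp.diff(V[k], x[j])
                for j, k in itertools.product(range(3), repeat=2))
            for i in range(3)]

# B . curl(A) - A . curl(B)
rhs = sum(B[i]*curl(A)[i] - A[i]*curl(B)[i] for i in range(3))

print(sp.simplify(lhs - rhs))  # 0
```

This mirrors the index gymnastics in the posts exactly: the free index on each \(\epsilon\)-contraction picks out which vector it is dotted with, and the sign in front of \(\textbf{A} \bullet (\nabla \times \textbf{B})\) comes from reordering the indices of \(\epsilon_{ijk}\).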
 
CmdrGuard said:
Now don't the partial derivatives with respect to \(x_i\) force \(k = i\) in the first sum and \(j = i\) in the second sum? I must be doing something wrong because if so, then this equality becomes zero which is not correct.

What am I doing wrong?

not too sure what you mean here, but my answer is no.

Think of \(A_k = A_k(x_1, x_2, x_3)\) and then differentiate. The fact that indices like \(i\) and \(k\) each appear twice, once on the Levi-Civita symbol and once elsewhere, means you sum over those indices; repetition does not force them to be equal.
 
CmdrGuard said:
Now don't the partial derivatives with respect to \(x_i\) force \(k = i\) in the first sum and \(j = i\) in the second sum? I must be doing something wrong because if so, then this equality becomes zero which is not correct.

What am I doing wrong?
You're confusing
\[
\frac{\partial A_j}{\partial x_i}
\]
with
\[
\frac{\partial x_j}{\partial x_i}
\]
The latter is equal to \(\delta_{ij}\), so only the \(i = j\) terms survive when you sum over one of the indices. But that's not what you have in this problem.
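The distinction above can be made concrete with a quick SymPy sketch (my own illustration, with an arbitrary field chosen as an assumption): \(\partial A_j / \partial x_i\) is a full \(3 \times 3\) matrix of derivatives, while \(\partial x_j / \partial x_i\) collapses to the Kronecker delta, i.e. the identity matrix.

```python
import sympy as sp

x = sp.symbols('x0 x1 x2')
A = [x[0]*x[1], sp.cos(x[2]), x[0] + x[2]**2]  # arbitrary field components

# dA[i, j] = dA_j / dx_i -- generally a full matrix, not diagonal
dA = sp.Matrix(3, 3, lambda i, j: sp.diff(A[j], x[i]))

# dx[i, j] = dx_j / dx_i -- exactly the Kronecker delta
dx = sp.Matrix(3, 3, lambda i, j: sp.diff(x[j], x[i]))

print(dx == sp.eye(3))  # True: only the i = j entries survive
```

So contracting \(\partial x_j / \partial x_i\) against a sum really does pick out the \(i = j\) terms, but contracting \(\partial A_j / \partial x_i\) does not.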
 
