Proving vector calculus identities using the Levi-Civita symbol


Homework Help Overview

The discussion concerns proving the vector calculus identity for the divergence of a cross product of two vector fields, \(\nabla \bullet (\textbf{A} \times \textbf{B})\), using the Levi-Civita symbol and the Einstein summation convention.

Discussion Character

  • Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants discuss how the Levi-Civita symbol applies to the identity, with some questioning what repeated indices imply and how the partial derivatives behave. Suggestions include writing both sides of the identity in terms of the Levi-Civita symbol and manipulating them until the connection between the two sides becomes apparent.

Discussion Status

The discussion is ongoing, with participants providing insights and clarifications regarding the manipulation of indices and the properties of the Levi-Civita symbol. Some participants are attempting to rearrange terms and explore the implications of their manipulations, while others are questioning assumptions and interpretations of the mathematical expressions involved.

Contextual Notes

Participants are navigating potential confusion regarding the differentiation of vector components and the application of the Levi-Civita symbol, indicating a need for clarity on these mathematical concepts. There is also mention of issues with formatting in the discussion, which may affect the readability of mathematical expressions.

CmdrGuard

Homework Statement


Prove \nabla \bullet (\textbf{A} \times \textbf{B}) = \textbf{B} \bullet (\nabla \times \textbf{A}) - \textbf{A} \bullet (\nabla \times \textbf{B})

I'd like to prove this using the Levi-Civita symbol \epsilon_{ijk} and the Einstein summation convention, both as practice and because it seems the most elegant way to deal with problems like these.

Homework Equations


(\textbf{A} \times \textbf{B})_i = \epsilon_{ijk} A_j B_k
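As a quick numerical sanity check (not part of the proof), the index formula above can be compared against numpy's built-in cross product; the helper name `levi_civita` here is my own, not from the thread.

```python
import numpy as np

def levi_civita():
    """Build eps[i, j, k]: +1 for even permutations of (0, 1, 2),
    -1 for odd permutations, 0 whenever any index repeats."""
    eps = np.zeros((3, 3, 3))
    for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
        eps[i, j, k] = 1.0   # even permutation
        eps[i, k, j] = -1.0  # one swap away: odd permutation
    return eps

eps = levi_civita()
A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])

# (A x B)_i = eps_ijk A_j B_k: einsum sums over the repeated j, k
cross_index = np.einsum('ijk,j,k->i', eps, A, B)
assert np.allclose(cross_index, np.cross(A, B))
```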

The Attempt at a Solution



What follows is what I get so far:

\nabla \bullet (\textbf{A} \times \textbf{B}) = \frac{\partial}{\partial x_i}(\textbf{A} \times \textbf{B})_i = \frac{\partial}{\partial x_i} \epsilon_{ijk} A_j B_k = \epsilon_{ijk} \left[ A_j \frac{\partial B_k}{\partial x_i} + B_k \frac{\partial A_j}{\partial x_i} \right]

Now don't the partial derivatives with respect to x_i force k = i in the first sum and j = i in the second sum? I must be doing something wrong because if so, then this equality becomes zero which is not correct.

What am I doing wrong?
 
tex doesn't seem to be working, so i can't read your post...

first call nabla g, then you want to show
g.(AxB) = B.(gxA) - A.(gxB)

so first express in terms of
g.(AxB) = (d/dx_k)(e_ijk A_i B_j)

then expand using product rule
 
CmdrGuard said:
Now don't the partial derivatives with respect to x_i force k = i in the first sum and j = i in the second sum?
The Levi-Civita symbol is zero whenever two of its indices are equal, so it's the opposite of what you say: when k or j does equal i, that term vanishes. I think you might have mistaken the Levi-Civita symbol for the Kronecker delta...

That covers what you're doing wrong. As for the proof itself, I tried it for a little while and sort of got somewhere, but it probably wouldn't help much. I'd suggest writing out the right-hand side in terms of the Einstein summation convention and the Levi-Civita symbol, then manipulating the two sides until you start to see the similarities; that will probably guide you to the correct path connecting them.

Although, after looking at it, I think this way of manipulating the identity is perhaps not the most elegant when the standard proof is probably much simpler.
 
ok so now i can read the tex
\nabla \bullet (\textbf{A} \times \textbf{B}) = \frac{\partial}{\partial x_i}(\textbf{A} \times \textbf{B})_i = \frac{\partial}{\partial x_i} \epsilon_{ijk} A_j B_k = \epsilon_{ijk} \left[ A_j \frac{\partial B_k}{\partial x_i} + B_k \frac{\partial A_j}{\partial x_i} \right]

now i would continue rearranging as
= \epsilon_{ijk} A_j \frac{\partial B_k}{\partial x_i} + \epsilon_{ijk} B_k \frac{\partial A_j}{\partial x_i}

then, for clarity, relabel some dummy indices and rearrange
= \epsilon_{ijk} A_j \frac{\partial B_k}{\partial x_i} + \epsilon_{mnp} B_p \frac{\partial A_n}{\partial x_m}
= A_j \left( \epsilon_{ijk} \frac{\partial}{\partial x_i} B_k \right) + B_p \left( \epsilon_{mnp} \frac{\partial}{\partial x_m} A_n \right)

then have a think about the definition of the cross product in index notation, in particular the ordering of the indices.
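The identity the steps above are heading toward can also be checked symbolically. The sketch below (my own check, not the standard proof) builds generic vector fields whose components are arbitrary functions of (x, y, z) and verifies div(A x B) = B . curl(A) - A . curl(B) with sympy; all helper names are mine.

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
coords = (x, y, z)

# Generic vector fields: each component is an arbitrary function of (x, y, z)
A = [sp.Function(f'A{i}')(*coords) for i in range(3)]
B = [sp.Function(f'B{i}')(*coords) for i in range(3)]

def cross(U, V):
    return [U[1]*V[2] - U[2]*V[1],
            U[2]*V[0] - U[0]*V[2],
            U[0]*V[1] - U[1]*V[0]]

def div(U):
    return sum(sp.diff(U[i], coords[i]) for i in range(3))

def curl(U):
    return [sp.diff(U[2], y) - sp.diff(U[1], z),
            sp.diff(U[0], z) - sp.diff(U[2], x),
            sp.diff(U[1], x) - sp.diff(U[0], y)]

def dot(U, V):
    return sum(U[i]*V[i] for i in range(3))

lhs = div(cross(A, B))                      # nabla . (A x B)
rhs = dot(B, curl(A)) - dot(A, curl(B))     # B . (nabla x A) - A . (nabla x B)
assert sp.simplify(lhs - rhs) == 0
```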
 
CmdrGuard said:
Now don't the partial derivatives with respect to x_i force k = i in the first sum and j = i in the second sum? I must be doing something wrong because if so, then this equality becomes zero which is not correct.

What am I doing wrong?

not too sure what you mean here, but my answer is no.

Think of A_k = A_k(x_1, x_2, x_3) and then differentiate. The fact that i and k each appear twice in the Levi-Civita expression just means you sum over those indices
 
CmdrGuard said:
Now don't the partial derivatives with respect to x_i force k = i in the first sum and j = i in the second sum? I must be doing something wrong because if so, then this equality becomes zero which is not correct.

What am I doing wrong?
You're confusing
\frac{\partial A_j}{\partial x_i}
with
\frac{\partial x_j}{\partial x_i}
The latter is equal to \delta_{ij}, so only the i = j terms survive when you sum over one of the indices. But that's not what you have in this problem.
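A tiny sympy illustration (my own, assuming a component A_j that is an arbitrary function) of the distinction drawn above: dx_j/dx_i collapses to the Kronecker delta, but dA_j/dx_i is just another function and does not.

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')

# d x_j / d x_i is 1 when i == j and 0 otherwise: the Kronecker delta
assert sp.diff(x1, x1) == 1
assert sp.diff(x2, x1) == 0

# d A_j / d x_i does not collapse: for a generic A2(x1, x2) it stays
# an unevaluated derivative, not a delta symbol
A2 = sp.Function('A2')(x1, x2)
assert sp.diff(A2, x1) != 0
```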
 
