Proving vector calculus identities using the Levi-Civita symbol

CmdrGuard

Homework Statement


Prove \nabla \bullet (\textbf{A} \times \textbf{B}) = \textbf{B} \bullet (\nabla \times \textbf{A}) - \textbf{A} \bullet (\nabla \times \textbf{B})

I'd like to prove this using the Levi-Civita symbol \epsilon_{ijk} and the Einstein summation convention, both as practice and because it seems the most elegant way to deal with problems like these.

Homework Equations


(\textbf{A} \times \textbf{B})_i = \epsilon_{ijk} a_j b_k
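For reference (standard definitions, added here rather than taken from the original post):

\epsilon_{ijk} = \begin{cases} +1 & (i,j,k) \text{ an even permutation of } (1,2,3) \\ -1 & (i,j,k) \text{ an odd permutation of } (1,2,3) \\ 0 & \text{any two indices equal} \end{cases}

(\nabla \times \textbf{A})_i = \epsilon_{ijk} \frac{\partial A_k}{\partial x_j}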

The Attempt at a Solution



What follows is what I have so far:

\nabla \bullet (\textbf{A} \times \textbf{B}) = \frac{\partial}{\partial x_i}(\textbf{A} \times \textbf{B})_i = \frac{\partial}{\partial x_i} \epsilon_{ijk} A_j B_k = \epsilon_{ijk} \left[ A_j \frac{\partial B_k}{\partial x_i} + B_k \frac{\partial A_j}{\partial x_i} \right]

Now don't the partial derivatives with respect to x_i force k = i in the first sum and j = i in the second sum? I must be doing something wrong because if so, then this equality becomes zero which is not correct.

What am I doing wrong?
 
tex doesn't seem to be working, so i can't read your post...

first, call nabla g; then you want to show
g.(AxB) = B.(gxA) - A.(gxB)

so first express in terms of
g.(AxB) = (d/dx_k)(e_ijk A_i B_j)

then expand using product rule
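for example, in the same shorthand (an added step, not in the original reply), the product rule gives

g.(AxB) = e_ijk B_j (dA_i/dx_k) + e_ijk A_i (dB_j/dx_k)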
 
CmdrGuard said:
Now don't the partial derivatives with respect to x_i force k = i in the first sum and j = i in the second sum?
The Levi-Civita symbol is zero whenever two of its indices are equal, so it's the opposite of what you say: when k or j equals i, that term vanishes. I think you might have mistaken the Levi-Civita symbol for the Kronecker delta...
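To make the contrast concrete (an added illustration): \epsilon_{112} = 0 while \delta_{11} = 1, and the two symbols are related by the standard contraction identity

\epsilon_{ijk}\epsilon_{ilm} = \delta_{jl}\delta_{km} - \delta_{jm}\delta_{kl}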

That covers what you're doing wrong. As for the proof, I tried it for a little bit and sort of got somewhere, but it probably wouldn't help much. I'd suggest writing out the right-hand side in terms of the Einstein summation convention and the Levi-Civita symbol, then manipulating the two sides until you start to see the similarities; that will probably guide you to the correct path connecting them.

Although, after looking at it, I think this way of manipulating the identity is perhaps not the most elegant approach, since the standard proof is probably much simpler.
 
ok so now i can read the tex
\nabla \bullet (\textbf{A} \times \textbf{B}) = \frac{\partial}{\partial x_i}(\textbf{A} \times \textbf{B})_i = \frac{\partial}{\partial x_i} \epsilon_{ijk} A_j B_k = \epsilon_{ijk} \left[ A_j \frac{\partial B_k}{\partial x_i} + B_k \frac{\partial A_j}{\partial x_i} \right]

now i would continue rearranging as
= \epsilon_{ijk} A_j \frac{\partial B_k}{\partial x_i} + \epsilon_{ijk} B_k \frac{\partial A_j}{\partial x_i}

then change some dummy indices and rearrange for clarity
= \epsilon_{ijk} A_j \frac{\partial B_k}{\partial x_i} + \epsilon_{mnp} B_p \frac{\partial A_n}{\partial x_m}
= A_j \left( \epsilon_{ijk} \frac{\partial}{\partial x_i} B_k \right) + B_p \left( \epsilon_{mnp} \frac{\partial}{\partial x_m} A_n \right)

then have a think about the definition of the cross product in index notation, in particular the ordering of the indices.
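For instance, here is a sketch of where that hint leads (added, not part of the original reply), using only the symmetries \epsilon_{ijk} = \epsilon_{kij} = -\epsilon_{jik}:

\epsilon_{ijk} B_k \frac{\partial A_j}{\partial x_i} = B_k \left( \epsilon_{kij} \frac{\partial A_j}{\partial x_i} \right) = B_k (\nabla \times \textbf{A})_k = \textbf{B} \bullet (\nabla \times \textbf{A})

\epsilon_{ijk} A_j \frac{\partial B_k}{\partial x_i} = -A_j \left( \epsilon_{jik} \frac{\partial B_k}{\partial x_i} \right) = -A_j (\nabla \times \textbf{B})_j = -\textbf{A} \bullet (\nabla \times \textbf{B})

Adding the two lines gives exactly the identity to be proved.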
 
CmdrGuard said:
Now don't the partial derivatives with respect to x_i force k = i in the first sum and j = i in the second sum? I must be doing something wrong because if so, then this equality becomes zero which is not correct.

What am I doing wrong?

not too sure what you mean here, but my answer is no.

Think of A_k = A_k(x_1, x_2, x_3) and then differentiate. The fact that i, j, and k are all repeated indices (each appearing once in the Levi-Civita symbol and once elsewhere) means you sum over each of them.
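To spell that out (an added clarification): with the summation convention fully written out,

\frac{\partial}{\partial x_i} \epsilon_{ijk} A_j B_k = \sum_{i=1}^{3} \sum_{j=1}^{3} \sum_{k=1}^{3} \epsilon_{ijk} \frac{\partial}{\partial x_i} \left( A_j B_k \right)

so every combination of i, j, k appears in the sum; the derivative acts on the component functions A_j and B_k and does not constrain j or k to equal i.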
 
CmdrGuard said:
Now don't the partial derivatives with respect to x_i force k = i in the first sum and j = i in the second sum? I must be doing something wrong because if so, then this equality becomes zero which is not correct.

What am I doing wrong?
You're confusing
\frac{\partial A_j}{\partial x_i}
with
\frac{\partial x_j}{\partial x_i}
The latter is equal to \delta_{ij}, so only the i = j terms survive when you sum over one of the indices. But that's not what you have in this problem.
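As a concrete example (added): \frac{\partial x_2}{\partial x_1} = 0 and \frac{\partial x_1}{\partial x_1} = 1, exactly the pattern of \delta_{ij}, whereas \frac{\partial A_2}{\partial x_1} is an honest derivative of the component function A_2 and in general does not vanish.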
 