# Proof of Gradient Dot Product Identity

Hey guys, this is for my classical E&M class, but it's more of a math problem.

## Homework Statement

Show: $∇(\vec{A} \cdot \vec{B}) = \vec{B} \times (∇ \times \vec{A}) + (\vec{B} \times ∇)\vec{A} + \vec{A} \times (∇ \times \vec{B}) + (\vec{A} \times ∇)\vec{B}$

## Homework Equations

I tried tackling this sucker in Cartesian coordinates, but that's not the way to go. I believe the way to go is to use the Levi-Civita symbol and the Kronecker delta (the professor introduced them on the first day of class, but I don't understand how to manipulate them):

$\vec{A} \cdot \vec{B} = A_{i}B_{j}δ_{ij}$

$(\vec{A} \times \vec{B})_{i} = ε_{ijk}A_{j}B_{k}$

## The Attempt at a Solution

So I've been at this thing for hours (and this is the second time I've tried posting it; my computer got funny after the first time, so this is gonna be fun to LaTeX in again :( )

Here's what I've got:

$LHS = ∇(A_{i}B_{j}δ_{ij})$

$RHS = (B_{i}A_{j}δ_{ij})∇ - (B_{i}∂_{j}δ_{ij})\vec{A} + (ε_{ijk}B_{j}∂_{k})\vec{A} + (A_{i}B_{j}δ_{ij})∇ - (A_{i}∂_{j}δ_{ij})\vec{B} + (ε_{ijk}A_{j}∂_{k})\vec{B}$

=> $RHS = 2∇(A_{i}B_{j}δ_{ij}) = (ε_{ijk}B_{j}∂_{k} - B_{i}∂_{j}δ_{ij})\vec{A} + (ε_{ijk}A_{j}∂_{k} - A_{i}∂_{j}δ_{ij})\vec{B}$

This is the first part of the problem; the second asks to show another identity, but if I could figure out how to solve this one, I'd definitely be able to solve the second part. I just don't really get how to work with these Kronecker deltas and Levi-Civita symbols.

Thanks!

Just got an email from the Prof. Apparently the HW problem I'm stuck on was written wrong: the 2nd and 4th terms should have dot products instead of cross products, i.e. $(\vec{B} \cdot ∇)\vec{A}$ and $(\vec{A} \cdot ∇)\vec{B}$. Hopefully I'll be able to figure it out now. If not I'll let you know.

Thanks

Since the cross product is involved (and you're doing EM), I will assume that everything is three-dimensional. Let us start with the Kronecker delta, since it is easiest. Let $A = (A_1, A_2, A_3), B=(B_1,B_2,B_3)$. By definition, we have that
$$\delta_{ij} = \begin{cases} 1 & i = j \\ 0 & i \neq j \end{cases}$$
and the presence of a double index means that we sum over the indices. Hence when we write $\delta_{ij} A_i B_j$ what we really mean is $\sum_{i=1}^3 \sum_{j=1}^3 \delta_{ij} A_i B_j$. Let us expand this.

\begin{align*} \sum_{j=1}^3 \sum_{i=1}^3 \delta_{ij} A_i B_j &= \sum_{j=1}^3 \left[ \delta_{1j} A_1 B_j + \delta_{2j} A_2 B_j + \delta_{3j} A_3 B_j \right] \\ &= (\delta_{11} A_1 B_1 + \delta_{12} A_1 B_2 + \delta_{13} A_1 B_3) + (\delta_{21} A_2 B_1 + \delta_{22} A_2 B_2 + \delta_{23} A_2 B_3) + (\delta_{31} A_3 B_1 + \delta_{32} A_3 B_2 + \delta_{33} A_3 B_3) \end{align*}
But by definition, the only non-zero deltas are $\delta_{11}, \delta_{22}, \delta_{33}$, telling us that $\delta_{ij} A_i B_j = A_1 B_1 + A_2 B_2 + A_3 B_3$, which we may also write as $A_i B_i$ using the summation convention. Note that this is precisely what we expect the dot product to be!

Hence you can cheat! Any time you see a $\delta_{ij}$ in an expression, just "get rid of the delta" by replacing every occurrence of i with j (or vice versa). For example,
$\delta_{ij} \delta_{k\ell} A_i B_j C_k D_\ell = A_i B_i C_k D_k$.
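If it helps to see the summation convention executed concretely, here's a quick numerical sanity check; `numpy.einsum` implements exactly this repeated-index summation. The vectors are arbitrary made-up examples, not anything from the problem:

```python
# Quick numerical check of the delta-contraction rules above.
import numpy as np

delta = np.eye(3)  # Kronecker delta as the 3x3 identity matrix
A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])

# delta_ij A_i B_j  ->  A_i B_i  =  A . B
lhs = np.einsum('ij,i,j->', delta, A, B)
assert np.isclose(lhs, np.dot(A, B))

# delta_ij delta_kl A_i B_j C_k D_l  ->  (A . B)(C . D)
C = np.array([7.0, 8.0, 9.0])
D = np.array([1.0, 0.0, 2.0])
val = np.einsum('ij,kl,i,j,k,l->', delta, delta, A, B, C, D)
assert np.isclose(val, np.dot(A, B) * np.dot(C, D))
```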

The Levi-Civita symbol is slightly more complicated. Working out the computation by hand would be a good exercise, but let me give you the "cheating" way. $\epsilon_{ijk}$ is zero if any index is repeated, right? Hence you only need to worry about the cases where i, j, and k are all distinct. What possibilities are there?

(1,2,3), (1,3,2), (2,1,3), (2,3,1), (3,1,2), (3,2,1)

so these index combinations are the only non-zero ones. Now we just need to worry about sign. The way to do this is to figure out whether the permutation is even or odd: count how many transpositions (swaps of two entries) it takes to get to (1,2,3). For example, consider (3,1,2). Let's figure out if it is even or odd. One possible way of doing this is

$$(3,1,2) \mapsto (1,3,2) \mapsto (1,2,3)$$
where in the first step I switched 3 and 1, and in the second I switched 3 and 2. It took 2 transpositions, so the permutation is even (since 2 is even). It turns out that no matter which transpositions you use, the count will always come out even: the parity of a permutation is well-defined!
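A small sketch of this sign rule in code, building the Levi-Civita symbol from a parity count (with indices shifted to 0-based, so the 1-based triple (3,1,2) becomes (2,0,1)):

```python
# Building the Levi-Civita symbol from permutation parity.
# Parity is computed by counting inversions: pairs that are out of order.
import itertools
import numpy as np

def parity(p):
    """Return +1 if permutation p is even, -1 if odd."""
    inversions = sum(1 for a in range(len(p)) for b in range(a + 1, len(p))
                     if p[a] > p[b])
    return 1 if inversions % 2 == 0 else -1

eps = np.zeros((3, 3, 3))
for p in itertools.permutations(range(3)):
    eps[p] = parity(p)  # entries with a repeated index stay 0

# (3,1,2) in 1-based indices is (2,0,1) here: even, so +1
assert eps[2, 0, 1] == 1
# (1,3,2) -> (0,2,1): one transposition away from (1,2,3), so odd
assert eps[0, 2, 1] == -1
```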

Now in $\epsilon_{ijk} A_i B_j$, the i and j indices are summed but k is left free, right? That means we will have
$$\epsilon_{ijk} A_i B_j = (A_1 B_2 - A_2 B_1) \epsilon_{12k} + (A_2 B_3 - A_3 B_2) \epsilon_{23k} + (A_3 B_1 - A_1 B_3) \epsilon_{31k}$$
So in this case, the value of the free index k tells us precisely which component of the vector you have. Namely, look at the first term
$$(A_1 B_2 - A_2 B_1) \epsilon_{12k}$$
By definition of $\epsilon_{12k}$, this will only be non-zero if $k = 3$. That means that this is the third component of the vector. Similarly for the other two. You will see that this agrees exactly with the cross product, as claimed.
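You can also confirm this agreement numerically. Here $\epsilon$ is built with a small product-of-differences sign formula (valid when all three indices are distinct), and the test vectors are arbitrary:

```python
# Numerical confirmation that eps_ijk A_i B_j (free index k) is the
# cross product A x B.
import numpy as np
from itertools import permutations

eps = np.zeros((3, 3, 3))
for i, j, k in permutations(range(3)):
    # +1 for even permutations of (0,1,2), -1 for odd
    eps[i, j, k] = np.sign((j - i) * (k - i) * (k - j))

A = np.array([1.0, 2.0, 3.0])
B = np.array([-1.0, 0.0, 4.0])
result = np.einsum('ijk,i,j->k', eps, A, B)
assert np.allclose(result, np.cross(A, B))
```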

Thank you, that was very helpful and very well explained. Here's the issue I'm running up against in the problem: I have the identity written down in my notes, but I don't get it. I'm getting the left hand side down to (same as above):

$∇(A_{i}B_{j}δ_{ij})$ (Keeping the deltas in because I think they become relevant for the RHS).

And the RHS down to:

$ε_{imn}B_{i}(ε_{ijk}∂_{i}A_{j})_{m} + ε_{imn}A_{i}(ε_{ijk}∂_{i}B_{j})_{m} + (B_{i}∂_{j}δ_{ij})\vec{A} + (A_{i}∂_{j}δ_{ij})\vec{B}$

I think that's correct, but I don't know how to evaluate/manipulate two Levi-Civita symbols in a row like above.

Thanks again!

I find your notation a touch confusing. It might help if we work out one of the components together. Let's work on the term $(B\times \nabla)\cdot A$. Write $A = (A_1,A_2,A_3), B=(B_1,B_2,B_3), \nabla = (\partial_1,\partial_2,\partial_3)$. First, let us figure out what $B \times \nabla$ looks like. As per your definition above, we know that
$$(B \times \nabla)_i = \epsilon_{ijk} B_j \partial_k.$$
The important thing to realize here is that while the "j" and "k" indices have been summed over, the "i" index is still free. Namely, $B\times\nabla$ looks like something with only one free index (the "i" index) to sum over, so let us write $C_i = \epsilon_{ijk} B_j \partial_k$. Thus if $C = (C_1,C_2,C_3)$ our objective is now to find
$$(B \times \nabla) \cdot A = C \cdot A$$
but we know how to do this. That is, we have that $C \cdot A = \delta_{i\ell} C_i A_\ell$.

Now I would like to note here that we are about to write $C_i$ in terms of $B_j, \partial_k$, and if we want to keep our notation straight, we should ensure that we do not use either j,k in our index for A.

Hence we have
\begin{align*} (B \times \nabla)\cdot A &= C\cdot A = \delta_{i\ell} C_i A_\ell \\ &= \delta_{i\ell} (\epsilon_{ijk} B_j \partial_k) A_\ell \\ &= \delta_{i\ell} \epsilon_{ijk} B_j \partial_k A_\ell \end{align*}
You could leave it like this, or you could actually evaluate this. We already know that if we see $\delta_{i\ell}$ we can simply replace all occurrences of i with that of $\ell$, yielding
$$\delta_{i\ell} \epsilon_{ijk} B_j \partial_k A_\ell = \epsilon_{ijk} B_j \partial_k A_i .$$
Now it may be best to leave it like this for now, and evaluate other expressions. Likely things will cancel. On the other hand, you could write it out to see that
$$\epsilon_{ijk} B_j \partial_k A_i = B_2 \partial_3 A_1 - B_1 \partial_3 A_2 + B_3 \partial_1 A_2 - B_2 \partial_1 A_3 + B_1 \partial_2 A_3 - B_3 \partial_2 A_1$$
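If you want an independent check of the corrected identity $∇(\vec{A}\cdot\vec{B}) = (\vec{B}\cdot∇)\vec{A} + \vec{B}\times(∇\times\vec{A}) + (\vec{A}\cdot∇)\vec{B} + \vec{A}\times(∇\times\vec{B})$, a symbolic computation with sympy on generic component functions works. This only verifies the identity; it is not the index-notation proof itself:

```python
# Symbolic check of the corrected identity
#   grad(A.B) = (B.grad)A + B x (curl A) + (A.grad)B + A x (curl B)
# using generic component functions A_i(x,y,z), B_i(x,y,z).
import sympy as sp

x, y, z = sp.symbols('x y z')
coords = (x, y, z)
A = sp.Matrix([sp.Function('A%d' % i)(x, y, z) for i in range(1, 4)])
B = sp.Matrix([sp.Function('B%d' % i)(x, y, z) for i in range(1, 4)])

def grad(f):
    """Gradient of a scalar field as a 3x1 column vector."""
    return sp.Matrix([sp.diff(f, c) for c in coords])

def curl(V):
    """Curl of a 3x1 vector field."""
    return sp.Matrix([sp.diff(V[2], y) - sp.diff(V[1], z),
                      sp.diff(V[0], z) - sp.diff(V[2], x),
                      sp.diff(V[1], x) - sp.diff(V[0], y)])

def directional(V, W):
    """(V . grad) W, applied componentwise."""
    return sp.Matrix([sum(V[i] * sp.diff(W[j], coords[i]) for i in range(3))
                      for j in range(3)])

lhs = grad(A.dot(B))
rhs = (directional(B, A) + B.cross(curl(A))
       + directional(A, B) + A.cross(curl(B)))
assert (lhs - rhs).expand() == sp.zeros(3, 1)
```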