# Prove the following tensor identity

1. Sep 14, 2013

### TheFerruccio

I am back again, with more tensor questions. I am getting better at this, but it is still a tough challenge of pattern recognition.
Problem Statement
Prove the following identity is true, using indicial notation:

$\nabla\times(\nabla \vec{v})^T = \nabla(\nabla\times\vec{v})$

Attempt at Solution

Let $U_{kp} = v_{k,p}$

Then LHS:
$\nabla\times(U_{kp})^T = \varepsilon_{ijk}U_{pk,j}$

RHS:

$\nabla(\varepsilon_{ijk}v_{k,j})=\nabla(\varepsilon_{ijk}U_{kj})=\varepsilon_{ijk}U_{kj,p}$

I understand that I can swap dummy indices around however I like, and simply rename free indices, but even then the RHS does not equal the LHS, and I do not know what I am doing wrong. This is the final result I get.

$\varepsilon_{ijk}U_{pk,j}=\varepsilon_{ijk}U_{kj,p}$

I know that I am not supposed to change both sides, but I was doing this for reference, to get an idea of where I was going. I was going to reconstruct to end up with the RHS. What am I doing wrong?

2. Sep 14, 2013

### fzero

I'm not quite familiar with the notation here. I assume that

$$[(\nabla \vec{v})^T]_{kp} = (\partial_k v_{p})^T = (v_{p,k})^T = v_{k,p},$$

but the next question is which index the $\nabla\times$ is supposed to act on. It seems from the result that it acts on the index of $\vec{v}$. With these conventions, the identity follows easily.

I think your indices are in the wrong place. Write

$$U_{kp} = \partial_k v_p,$$

i.e., it is natural to write the derivative on the left, so that should be the leftmost index. After sorting that out, we can use the comma notation for the derivative $\partial_k v_p = v_{p,k}$.

If you can check that my interpretation of the notation is correct, then you can write both expressions in terms of $v_{k,jp} = v_{k,pj}$ and see the equality.

3. Sep 14, 2013

### TheFerruccio

I don't think my indices are mixed up. I defined how the matrix relates to the comma notation that way, so I think I am right on that point. I understand the last term in your post, where you explained that the derivatives can be swapped. That is precisely what I am trying to get to, and what I have spent hours racking my brain over trying to understand. I have spent about 8 hours on this problem so far.

4. Sep 14, 2013

### fzero

I can sympathize. I can't make your version work, but you can check that mine does. I don't know how the notation was introduced in your text/course.

5. Sep 14, 2013

### vela

Staff Emeritus
Are you sure? It looks like with your convention, the LHS is zero because
$$\varepsilon_{ijk} U_{pk,j} = \varepsilon_{ijk} (\partial_k v_p)_{,j} = \varepsilon_{ijk} \partial_j\partial_k v_p = 0.$$

6. Sep 15, 2013

### TheFerruccio

I don't understand why that would be zero whatsoever. Why is that last term zero? It looks like you are simply increasing the order of the tensor with each derivative.

Also, that's another thing. I've been downloading tons of PDFs and reading multiple textbooks, and everyone seems to have a different convention for how to do this. It's driving me up the wall, and each time someone provides an explanation, a new convention is introduced along with it. The whole point of this assignment is for me to understand a new convention, but in my weeks of trying to understand it, I keep being introduced to more. I cannot seem to solve this problem. I have never had so much trouble with a mathematical topic in my life. I'm sure this is a really simple problem, but I cannot convey it to anyone, because no one knows the yet-another convention I am using in my class.

7. Sep 15, 2013

### vela

Because $\partial_j$ and $\partial_k$ commute.

8. Sep 15, 2013

### TheFerruccio

Right. It's that commutation of $\partial_j$ and $\partial_k$ that I wish to arrive at by the end of the problem. However, I do not see how the commutation of the two operators would mean $\varepsilon_{ijk} \partial_j\partial_k v_p = 0$. You're operating on something with a $p$ index with an operator of a $k$ index, which raises the order. All the indices are unique. What am I missing?

Well, it's a summation over j and k, such that j and k are different from each other (in this case)...

The indices for the positive terms would be p,12, p,23, p,31, and the negative terms would be p,21, p,32, p,13. Thus, because the operators commute, the whole term is zero. I think I understand it now. Is that what you mean? Or is there a simpler way to see this that doesn't require writing everything out? I've been doing this for weeks. I can't see why I didn't see that. I apologize.
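For what it's worth, that cancellation can also be checked symbolically. A minimal sketch using SymPy; the vector-field components below are arbitrary choices for illustration:

```python
# Check that eps_{ijk} d_j d_k v_p = 0 for an arbitrary smooth vector field:
# the Levi-Civita symbol is antisymmetric in (j, k) while the mixed partials
# are symmetric, so every term cancels pairwise.
import sympy as sp

x, y, z = sp.symbols('x y z')
X = [x, y, z]

# an arbitrary smooth vector field v_p (made-up components for illustration)
v = [x**2 * sp.sin(y), y * z**3, sp.exp(x) * z]

def eps(i, j, k):
    # Levi-Civita symbol, +1/-1/0
    return sp.LeviCivita(i, j, k)

# eps_{ijk} d_j d_k v_p for each free index pair (i, p)
for i in range(3):
    for p in range(3):
        total = sum(eps(i, j, k) * sp.diff(v[p], X[j], X[k])
                    for j in range(3) for k in range(3))
        assert sp.simplify(total) == 0
```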

9. Sep 15, 2013

### TheFerruccio

Do you guys have any good PDF documentation on indicial notation? I already have some, but none of them are in a notation I understand. Whenever we start dealing with gradients of higher order tensors (2+), the book assumes that I understand covariance and contravariance in the context of tensors, and uses upper/lower indices, which we haven't gotten to whatsoever. All the other decent indicial help I've found solely focuses on vectors, which I understand very well, now. I haven't been taught the superscript/subscript covariance and contravariance in tensors for oblique coordinate systems, nor have I been taught tensor products explicitly. So, with my limited toolset, I feel like I am getting nowhere with this assignment.

10. Sep 15, 2013

### TheFerruccio

Holy crap, finally figured it out. It looks like I had the right technique all along, but for hours I just kept mixing up the indices throughout the operation. Clearly, I need a less confusing way of working through this. This is precisely the method that was taught in class, and it's still causing me crazy amounts of confusion. For now, it feels like memorizing formulas.

Solution:

I start out with $\nabla\times(\nabla\vec{v})^T=\nabla(\nabla\times\vec{v})$

LHS:
$\nabla\vec{v}\rightarrow v_{i,j}=A_{ij}$
$(\nabla\vec{v})^T\rightarrow B_{ij}=A_{ji}=v_{j,i}$
Now, this is the part that I kept stumbling over a thousand times...
$\nabla\times(\nabla\vec{v})^T=\varepsilon_{akj}B_{ij,k}$
$=\varepsilon_{akj}(A_{ji})_{,k}$
$=\varepsilon_{akj}v_{j,ik}$
Since partial derivatives commute, I can swap the $i$ and $k$ indices in the derivative.
$=\varepsilon_{akj}v_{j,ki}$

RHS
$\nabla\times\vec{v}\rightarrow\varepsilon_{ijk}v_{k,j}$
$\nabla(\nabla\times\vec{v})\rightarrow\varepsilon_{ijk}v_{k,jb}$

The two resultant statements are equal. I just need to make the following swaps with RHS to match LHS:
$i\rightarrow a$
$k\rightarrow j$
$j\rightarrow k$
$b\rightarrow i$

Then the two statements are equal. I was being rather exhaustive because, every time I tried to do it in the "two lines" my professor mentioned, I mixed up the indices all too quickly.
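The componentwise equality can also be sanity-checked symbolically. A sketch assuming SymPy; the test field's components are made up for illustration:

```python
# Verify eps_{akj} v_{j,ik} = eps_{ajk} v_{k,ji} componentwise, i.e.
# curl((grad v)^T) = grad(curl v) under the post's conventions.
import sympy as sp

x, y, z = sp.symbols('x y z')
X = [x, y, z]
v = [x * y**2, sp.sin(z) * x, y * sp.exp(z)]  # arbitrary test field

def eps(i, j, k):
    return sp.LeviCivita(i, j, k)

# LHS_{ai} = eps_{akj} d_k d_i v_j  (curl of the transposed gradient)
def lhs(a, i):
    return sum(eps(a, k, j) * sp.diff(v[j], X[i], X[k])
               for j in range(3) for k in range(3))

# RHS_{ai} = eps_{ajk} d_i d_j v_k  (gradient of the curl)
def rhs(a, i):
    return sum(eps(a, j, k) * sp.diff(v[k], X[j], X[i])
               for j in range(3) for k in range(3))

for a in range(3):
    for i in range(3):
        assert sp.simplify(lhs(a, i) - rhs(a, i)) == 0
```

The equality hinges only on relabeling the dummy pair $(j,k)$ and commuting the two partial derivatives, which is exactly the swap made in the derivation above.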

11. Sep 15, 2013

### vela

Yeah, that's the idea. In general, the product of an antisymmetric object and a symmetric object is 0.
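A quick numeric illustration of that general fact, assuming NumPy (the matrix is random, chosen purely for illustration):

```python
# The full contraction A_{ij} S_{ij} of an antisymmetric array with a
# symmetric array vanishes: swapping the dummy indices i <-> j flips the
# sign via A but not via S, so the sum equals its own negative.
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
A = M - M.T   # antisymmetric part: A_{ij} = -A_{ji}
S = M + M.T   # symmetric part:     S_{ij} =  S_{ji}

contraction = np.einsum('ij,ij->', A, S)
assert abs(contraction) < 1e-12  # zero up to floating-point round-off
```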

Last edited: Sep 15, 2013
12. Sep 15, 2013

### TheFerruccio

There is another tensor identity that I am trying to solve, but, before even going into indicial notation, I see a problem.

The identity is as follows, given $\textbf{T}$ is a tensor and $\vec{v}$ is a vector:

$\nabla\cdot(\textbf{T}^\top\vec{v})=\vec{v}\cdot(\nabla\textbf{T})+ \textbf{T} \cdot \nabla \vec{v}$

The LHS is the divergence of (a tensor times a vector), so it's the divergence of a vector, making it a scalar.

The first term on the RHS is the divergence of a tensor, which is a vector, dotted with another vector, giving a scalar. Great, that's consistent.

The second term on the RHS is confusing me. I am not sure what the order of operations should be. If I take the gradient of a vector, I end up with a 2nd order tensor. The divergence of a second order tensor is a vector.

So, LHS is a scalar, and RHS is a scalar + a vector. I am wrong somewhere.

13. Sep 16, 2013

### TheFerruccio

It turns out that the in-class notation treats a single dot product between two tensors as a double contraction, so $\textbf{T}\cdot\textbf{T}=T_{ij}T_{ij}$. That answers my question: the last term $\textbf{T}\cdot\nabla\vec{v}$ fully contracts two second-order tensors, giving a scalar, so both sides are scalars.
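Under that double-contraction reading, one consistent index form of the identity is just the product rule $\partial_j(T_{ij}v_i) = v_i\,T_{ij,j} + T_{ij}\,v_{i,j}$. A sketch of a symbolic check, assuming SymPy; the tensor and vector components are made up for illustration:

```python
# Check d_j(T_{ij} v_i) = v_i T_{ij,j} + T_{ij} v_{i,j}, i.e.
# div(T^T v) = v . (div T) + T : grad v, for arbitrary smooth fields.
import sympy as sp

x, y, z = sp.symbols('x y z')
X = [x, y, z]

# arbitrary smooth tensor and vector fields (illustrative components)
T = [[x * y, z, x**2],
     [sp.sin(z), y * z, x],
     [y**2, x * z, sp.exp(y)]]
v = [x * z, y**2, sp.cos(x)]

# LHS: d_j(T_{ij} v_i), the divergence of T^T v
lhs = sum(sp.diff(sum(T[i][j] * v[i] for i in range(3)), X[j])
          for j in range(3))

# RHS: v_i T_{ij,j} + T_{ij} v_{i,j}
rhs = sum(v[i] * sp.diff(T[i][j], X[j]) for i in range(3) for j in range(3)) \
    + sum(T[i][j] * sp.diff(v[i], X[j]) for i in range(3) for j in range(3))

assert sp.simplify(lhs - rhs) == 0
```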