# The gradient of a tensor

1. Jan 7, 2016

### EsmeeDijk

1. The problem statement, all variables and given/known data
We have the following orthogonal tensor in $\mathbb{R}^3$:
$t_{ij}(x^2) = a(x^2)\, x_i x_j + b(x^2)\, \delta_{ij}\, x^2 + c(x^2)\, \epsilon_{ijk}\, x_k$
Calculate the following quantities and simplify your expression as much as possible:
$\nabla _j t_{ij}(x)$
and
$\epsilon_{ijk} \nabla_i t_{jk}(x) = 0$

2. Relevant equations
The equations given in my book are:
$(\nabla f)_i = \Lambda_{ji} \frac{\partial f}{\partial \tilde{x}_j}$
$\nabla_i = \Lambda_i^{\ j} \tilde{\nabla}_j$

3. The attempt at a solution
My problem is that the equations I have all assume the tensor is given in the form of a matrix, which I don't believe is the case here. Also, in the book leading up to these equations, the vector $x$ is written in terms of its components $x_i$ and basis vectors $e_i$, which is not the case here either. Only the first term (the one with $a$) depends explicitly on $x_i$ and $x_j$, but I can't imagine that the rest of the expression just falls away.
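As a sanity check on the index gymnastics, one can verify the divergence $\nabla_j t_{ij}$ symbolically. This is a sketch using sympy (my own choice of tool, not from the problem), keeping $a$, $b$, $c$ as abstract functions of $x^2$; it confirms that the result collapses to a scalar function of $x^2$ times $x_i$, with the $c$ term dropping out by antisymmetry of $\epsilon_{ijk}$:

```python
import sympy as sp

# Coordinates and the scalar argument x^2 = x_k x_k
x1, x2, x3 = sp.symbols('x1 x2 x3', real=True)
x = [x1, x2, x3]
r2 = x1**2 + x2**2 + x3**2

# a, b, c are left as abstract functions of x^2, as in the problem
a, b, c = sp.Function('a'), sp.Function('b'), sp.Function('c')

def t(i, j):
    """Component t_ij = a(x^2) x_i x_j + b(x^2) x^2 delta_ij + c(x^2) eps_ijk x_k."""
    eps_term = sum(sp.LeviCivita(i, j, k) * x[k] for k in range(3))
    return a(r2)*x[i]*x[j] + b(r2)*r2*sp.KroneckerDelta(i, j) + c(r2)*eps_term

# nabla_j t_ij: differentiate each component with respect to x_j and sum over j
div = [sp.simplify(sum(sp.diff(t(i, j), x[j]) for j in range(3))) for i in range(3)]

# Recover a'(x^2), b'(x^2) via the chain rule: d/dx1 a(x^2) = 2 x1 a'(x^2)
ap = sp.diff(a(r2), x1) / (2*x1)
bp = sp.diff(b(r2), x1) / (2*x1)

# Claim: nabla_j t_ij = (2 a' x^2 + 4 a + 2 b' x^2 + 2 b) x_i
coeff = 2*ap*r2 + 4*a(r2) + 2*bp*r2 + 2*b(r2)
for i in range(3):
    assert sp.simplify(div[i] - coeff*x[i]) == 0
```

The key chain-rule step is $\partial_j\, a(x^2) = 2 x_j\, a'(x^2)$, and the $c$ term vanishes because $\epsilon_{ijk}$ is contracted once with $\delta_{jk}$ and once with the symmetric product $x_j x_k$.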

2. Jan 7, 2016

### andrewkirk

What do $x,x^2$ and $x_i$ represent in your formula?
Does $t_{ij}(x^2)$ represent the $i,j$ component of the tensor, in some assumed (but unstated) basis, calculated in terms of a parameter $x^2$? Or does it represent the application of an order-2 tensor to a vector denoted by the symbol $x^2$?