The gradient of a tensor

  • Thread starter EsmeeDijk
  • #1

Homework Statement


We have the following orthogonal tensor in [itex]\mathbb{R}^3[/itex]:
[itex] t_{ij}(x^2) = a(x^2)\, x_i x_j + b(x^2)\, \delta_{ij}\, x^2 + c(x^2)\, \epsilon_{ijk} x_k [/itex]
Calculate the following quantities and simplify your expression as much as possible:
[itex] \nabla_j t_{ij}(x) [/itex]
and
[itex] \epsilon_{ijk} \nabla_i t_{jk}(x) = 0 [/itex]

Homework Equations


The equations given in my book are:
[itex] (\nabla f)_i = \Lambda_{ji} \frac{\partial f}{\partial \tilde{x}_j} [/itex]
[itex] \nabla_i = \Lambda_i^j \tilde{\nabla}_j [/itex]
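As I read these, [itex]\Lambda[/itex] is the orthogonal matrix relating the original coordinates [itex]x_j[/itex] to the rotated ("tilded") coordinates [itex]\tilde{x}_j[/itex]. As a check of that reading (my own assumption, not spelled out in the book excerpt: [itex]\tilde{x}_j = \Lambda_{ji} x_i[/itex]), the chain rule gives
[tex] \frac{\partial f}{\partial x_i} = \frac{\partial \tilde{x}_j}{\partial x_i}\,\frac{\partial f}{\partial \tilde{x}_j} = \Lambda_{ji}\,\frac{\partial f}{\partial \tilde{x}_j}, [/tex]
which matches the first formula, so in one fixed Cartesian frame [itex]\nabla_j[/itex] should just act as [itex]\partial/\partial x_j[/itex].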

The Attempt at a Solution


My problem is that the equations I have all assume the tensor is given in the form of a matrix, which I don't believe is the case here. Also, in the book the discussion leading up to these equations uses a vector [itex]x[/itex] that depends on the components [itex]x_i[/itex] and on the basis vectors [itex]e_i[/itex], which also doesn't apply here. Only the first term (the one with [itex]a[/itex]) depends on [itex]x_i[/itex] or [itex]x_j[/itex], but I can't imagine that the rest of the expression simply drops out.
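To make my question concrete, here is how I would start, assuming [itex]x^2 = x_k x_k[/itex] and [itex]\nabla_j = \partial/\partial x_j[/itex] in Cartesian coordinates (both are my assumptions; the problem doesn't say so explicitly). Then [itex]\nabla_j(x^2) = 2 x_j[/itex], so the [itex]b[/itex] and [itex]c[/itex] terms would not just fall away; the chain rule acts on them too. For the first term, for example,
[tex] \nabla_j \big( a(x^2)\, x_i x_j \big) = 2 a'(x^2)\, x_j\, x_i x_j + a(x^2)\big( \delta_{ij} x_j + x_i\, \delta_{jj} \big) = \big( 2 a'(x^2)\, x^2 + 4 a(x^2) \big)\, x_i, [/tex]
using [itex]\delta_{jj} = 3[/itex] in [itex]\mathbb{R}^3[/itex]. Is this the right way to read the problem?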
 

Answers and Replies

  • #2
andrewkirk
Your notation is unfamiliar to me.
What do ##x,x^2## and ##x_i## represent in your formula?
Does ##t_{ij}(x^2)## represent the ##i,j## component of the tensor, in some assumed (but unstated) basis, calculated in terms of a parameter ##x^2##? Or does it represent the application of an order-2 tensor to a vector denoted by the symbol ##x^2##?
 
