
Tensor calculus, gradient of skew tensor

  • Thread starter Telemachus
Hi there. I was working through the derivation in continuum mechanics of the conservation of angular momentum. The derivation I was studying uses an arbitrary constant skew tensor ##\Lambda## and denotes its axial vector by ##\lambda##, so that ##\Lambda v=\lambda \times v## for every vector ##v## (written ##\Lambda=\lambda \times## for short).

Then it defines ##w(x)=\lambda \times r=\Lambda r##, with ##r=x-x_0##.
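As a quick sanity check of the axial-vector correspondence (my own sketch, not part of the original derivation; the particular vectors are made up), the skew matrix built from ##\lambda## should reproduce the cross product:

```python
import numpy as np

def skew(lam):
    """Skew-symmetric matrix L with L @ v == np.cross(lam, v) for all v."""
    return np.array([
        [0.0,     -lam[2],  lam[1]],
        [lam[2],   0.0,    -lam[0]],
        [-lam[1],  lam[0],  0.0],
    ])

lam = np.array([1.0, -2.0, 0.5])   # arbitrary axial vector
r = np.array([0.3, 0.7, -1.1])     # arbitrary position vector
Lambda = skew(lam)
assert np.allclose(Lambda @ r, np.cross(lam, r))
```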

So that ##\operatorname{grad}(\Lambda r)=\Lambda##.

And that's the step I have a doubt about.

When I compute ##\operatorname{grad}(\Lambda r)## I get (using ##r=x-x_0##):

##\displaystyle [\operatorname{grad}(\Lambda r)]_{ik}=\frac{\partial}{\partial x_k}(\Lambda_{ij} r_j) = \frac{\partial \Lambda_{ij}}{\partial x_k}\, r_j + \Lambda_{ij}\,\frac{\partial r_j}{\partial x_k} = \frac{\partial \Lambda_{ij}}{\partial x_k}\,(x_j-x_{0j})+\Lambda_{ij}\,\delta_{jk} = [(\operatorname{grad}\Lambda)\, r]_{ik}+\Lambda_{ik}##

That is, ##\operatorname{grad}(\Lambda r)=(\operatorname{grad}\Lambda)\, r+\Lambda##.
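To convince myself numerically (again just a sketch of mine, assuming a central-difference step of 1e-6 is accurate enough), the Jacobian ##[\operatorname{grad} w]_{ik}=\partial w_i/\partial x_k## of ##w(x)=\Lambda(x-x_0)## with constant ##\Lambda## comes out equal to ##\Lambda##:

```python
import numpy as np

# Constant skew tensor (axial vector lambda = (1, -2, 0.5)).
Lambda = np.array([
    [0.0, -0.5, -2.0],
    [0.5,  0.0, -1.0],
    [2.0,  1.0,  0.0],
])
x0 = np.array([1.0, 2.0, 3.0])
w = lambda x: Lambda @ (x - x0)

def grad(f, x, h=1e-6):
    """Central-difference Jacobian J[i, k] = df_i/dx_k at the point x."""
    n = len(x)
    J = np.zeros((n, n))
    for k in range(n):
        e = np.zeros(n)
        e[k] = h
        J[:, k] = (f(x + e) - f(x - e)) / (2 * h)
    return J

x = np.array([0.2, -0.4, 1.5])     # arbitrary evaluation point
assert np.allclose(grad(w, x), Lambda, atol=1e-6)
```

Since ##w## is linear in ##x##, the central difference is exact up to floating-point error, so the tolerance is not doing any real work here.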

Now, does the fact that ##\Lambda## is constant mean that ##\operatorname{grad}\Lambda=0##, so the first term vanishes and ##\operatorname{grad}(\Lambda r)=\Lambda##? That was my doubt; I admit I hadn't noticed that the tensor was constant until I wrote this post :p
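For what it's worth, a symbolic check (my sketch, done in SymPy) confirms this exactly: because the entries of ##\Lambda## do not depend on ##x##, the ##(\operatorname{grad}\Lambda)\,r## term drops out and only ##\Lambda## survives:

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
x01, x02, x03 = sp.symbols('x01 x02 x03')
l1, l2, l3 = sp.symbols('l1 l2 l3')  # constant axial-vector components

# Skew tensor of the axial vector (l1, l2, l3).
Lambda = sp.Matrix([
    [0, -l3, l2],
    [l3, 0, -l1],
    [-l2, l1, 0],
])
x = sp.Matrix([x1, x2, x3])
x0 = sp.Matrix([x01, x02, x03])
w = Lambda * (x - x0)

# d(w_i)/d(x_k): since Lambda's entries are constants, grad(Lambda) = 0
# and the Jacobian reduces to Lambda itself.
assert w.jacobian(x) == Lambda
```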
 