Index Notation of div(a:b) and div(c^transpose d)

  • Context: Graduate
  • Thread starter: chowdhury
  • Tags: Divergence, Tensor
SUMMARY

The discussion focuses on the index notation for the divergence of tensor products, specifically the product of a 4th-rank tensor with a 2nd-rank tensor, and the product of a 3rd-rank tensor with a vector. The equation in question is div(a:b) = div(c^T · d), where a is a 4th-rank tensor, b a 2nd-rank tensor, c a 3rd-rank tensor, and d a vector. The divergence is expressed as F^{\mu\nu}{}_{,\nu} = G^{\mu}, written with both partial and covariant derivatives. The thread clarifies that the ":" symbol denotes double contraction (summation over two repeated indices) and the "." symbol denotes matrix-vector multiplication.

PREREQUISITES
  • Understanding of tensor notation and operations, specifically 4th and 3rd rank tensors.
  • Familiarity with divergence operations in vector calculus.
  • Knowledge of Einstein summation convention and its application in tensor calculus.
  • Basic understanding of covariant and contravariant tensor formulations.
NEXT STEPS
  • Study the properties and applications of 4th rank tensors in continuum mechanics.
  • Learn about the covariant derivative and its significance in general relativity.
  • Explore the implications of Einstein summation convention in higher-dimensional spaces.
  • Investigate the mathematical operations involving transpose of tensors and their interpretations.
USEFUL FOR

Mathematicians, physicists, and engineers working with tensor analysis, particularly in fields such as general relativity and continuum mechanics, will benefit from this discussion.

chowdhury
TL;DR
What is the index notation for the divergence of a tensor?
What is the index notation for the divergence of the product of a 4th-rank tensor and a 2nd-rank tensor?

What is the index notation for the divergence of the product of a 3rd-rank tensor and a vector?

$$\mathrm{div}(a{:}b) = \mathrm{div}(c^{\mathsf{T}}\cdot d)$$
where a is a 4th-rank tensor, b is a 2nd-rank tensor, c is a 3rd-rank tensor, and d is a vector.
 
$$A^{\mu\nu\alpha\beta}B_{\alpha\beta} := F^{\mu\nu}$$
or
$$C^{\mu\nu\alpha}D_{\alpha} := F^{\mu\nu}$$
and its divergence is
$$\frac{\partial F^{\mu\nu}}{\partial x^\nu} = F^{\mu\nu}{}_{,\nu} := G^{\mu}$$
or, in GR, with the covariant derivative,
$$F^{\mu\nu}{}_{;\nu} := G^{\mu}$$
For all these equations you have to specify which indices are to be contracted as dummy indices. The form shown above is one example among many possible choices.
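The contraction pattern can be made concrete numerically. Below is a minimal pure-Python sketch of the double contraction ##F^{\mu\nu} = A^{\mu\nu\alpha\beta}B_{\alpha\beta}## using nested lists; the dimension `DIM` and the sample entries are arbitrary assumptions chosen only to illustrate the index bookkeeping.

```python
# Double contraction F[m][n] = sum over a, b of A[m][n][a][b] * B[a][b].
# DIM and the entry formulas are arbitrary, for illustration only.
DIM = 3

# A: 4th-rank tensor, B: 2nd-rank tensor (nested lists as a stand-in).
A = [[[[m + n + a + b for b in range(DIM)] for a in range(DIM)]
      for n in range(DIM)] for m in range(DIM)]
B = [[a * b for b in range(DIM)] for a in range(DIM)]

# Contract the last two indices of A against both indices of B,
# leaving a 2nd-rank result F.
F = [[sum(A[m][n][a][b] * B[a][b]
          for a in range(DIM) for b in range(DIM))
      for n in range(DIM)] for m in range(DIM)]
```

The same loop structure applies to ##C^{\mu\nu\alpha}D_{\alpha}##, with a single summed index instead of two.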
 
Thanks. I am not familiar with the covariant and contravariant formulations and their manipulations. Can it be written as below? $$ \mathrm{div}(a{:}b) + \frac{\partial^2 G}{\partial t^2} = \mathrm{div}(c^{\mathsf{T}}\cdot d) $$ $$ (a_{ijkl}b_{kl})_{,j} + G_{i,tt} = (c^{\mathsf{T}}_{ijk}\, d_{,k})_{,j} $$
$$ (a_{ijkl}b_{kl})_{,j} + G_{i,tt} = (c_{kij}\, d_{,k})_{,j} $$
 
I am not familiar with the symbols ":" and "." used here. Perhaps someone can confirm their meaning.

Is d a scalar, since you show the gradient ##d_{,k}##? I am not sure how to interpret "transpose" for a three-index entity such as ##c_{ijk}##. The Einstein summation convention is usually applied to the four spacetime coordinates, i.e. i = 0, 1, 2, 3; applying it to i = 1, 2, 3 without t may cause confusion.

I prefer to write ",t,t" rather than ",tt" for applying the time derivative twice, but that is just a matter of taste.
 
1.) ":" means summation over repeated subscripts, as in $$a_{ijkl} b_{kl},$$ where the sum runs over all allowed k and l.
2.) "." is just matrix-vector multiplication, as in $$c_{ijk} d_{k},$$ summed over all allowed k.
3.) Whether the indices i, j, k, l include 0 depends on the problem. In my case they are spatial, taken from {x, y, z} or {1, 2, 3}; 0 does not appear, since I use t exclusively to denote time.
4.) As I mentioned in my original post, d is a vector: a (3×1) vector in 3D, a (2×1) vector in 2D.
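The "." product and one possible reading of the transpose can be sketched in pure Python. Here ##(c\cdot d)_{ij} = c_{ijk}d_k## and the transpose is taken, as in the earlier post, to mean moving the summed index to the front, ##(c^{\mathsf{T}}\cdot d)_{ij} = c_{kij}d_k##; the dimension, entry formulas, and vector values are arbitrary assumptions for illustration.

```python
# "." contraction of a 3rd-rank tensor with a vector, and the
# index-reordered ("transposed") variant discussed in the thread:
#   (c . d)_{ij}   = sum over k of c[i][j][k] * d[k]
#   (c^T . d)_{ij} = sum over k of c[k][i][j] * d[k]
# DIM and all values are arbitrary, for illustration only.
DIM = 3
c = [[[i + 10 * j + 100 * k for k in range(DIM)] for j in range(DIM)]
     for i in range(DIM)]
d = [1.0, 2.0, 3.0]

cd = [[sum(c[i][j][k] * d[k] for k in range(DIM))
       for j in range(DIM)] for i in range(DIM)]
ctd = [[sum(c[k][i][j] * d[k] for k in range(DIM))
        for j in range(DIM)] for i in range(DIM)]
```

Both results are 2nd-rank, consistent with the free indices i, j left after summing over k.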
 
