Divergence of a second-order tensor

  • Thread starter: paccali
  • Tags: Divergence, Tensor

Homework Help Overview

The discussion revolves around calculating the divergence of a second-order tensor defined as \(\sigma_{ij}(x_{i})=\sigma_{0}x_{i}x_{j}\). Participants are exploring the mathematical implications and notation involved in this tensor operation.

Discussion Character

  • Mixed

Approaches and Questions Raised

  • One participant attempts to compute the divergence using matrix form and expresses uncertainty about the correctness of their approach. Another participant points out a potential misunderstanding regarding the nature of the result, suggesting that the divergence should yield a vector rather than a tensor. A further post introduces an alternative formulation of the tensor and questions the validity of the approach taken.

Discussion Status

The discussion is ongoing, with participants raising questions about tensor notation and the implications of summation in the calculations. Some guidance has been offered regarding the nature of the divergence result, but no consensus has been reached on the correct approach or solution.

Contextual Notes

Participants note that tensor notation implies summation symbols, which may lead to confusion in the calculations. There is also an indication that the problem may extend beyond the original poster's expertise, prompting suggestions to seek further clarification.

paccali

Homework Statement


Calculate the divergence of the second-order tensor:

\[\sigma_{ij}(x_{i})=\sigma_{0}x_{i}x_{j}\]

Homework Equations



\[(\nabla\cdot\sigma)_{j}=\sigma_{ij,i}\]

The Attempt at a Solution



\[\sigma_{ij,i}=\frac{\partial }{\partial x_{i}}\left(\sigma_{0}x_{i}x_{j}\right)=\sigma_{0}x_{j}\]

I'm not sure if this is correct. When I put it into a matrix form and calculate the divergence, I seem to get:

\[\sigma_{0}\begin{bmatrix}
x_{1}^{2} & x_{1}x_{2} & x_{1}x_{3}\\
x_{1}x_{2} & x_{2}^{2} & x_{2}x_{3}\\
x_{1}x_{3} & x_{2}x_{3} & x_{3}^{2}
\end{bmatrix}\]

\[\sigma_{ij,i}=\sigma_{0}\begin{bmatrix}
2x_{1} & x_{2} & x_{3}\\
x_{1} & 2x_{2} & x_{3}\\
x_{1} & x_{2} & 2x_{3}
\end{bmatrix}\]

This doesn't match the result from the index-notation calculation. Any help?
 
hi paccali! :wink:

\(\sigma_{ij,i}\) is a vector, not a tensor …

you haven't summed over i :smile:
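
Not part of the thread: the point about summing over \(i\) can be checked with a minimal symbolic sketch (using sympy) that carries out the implied summation explicitly.

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
s0 = sp.symbols('sigma_0')
x = [x1, x2, x3]

# sigma_ij = sigma_0 * x_i * x_j
sigma = [[s0 * x[i] * x[j] for j in range(3)] for i in range(3)]

# (div sigma)_j = sigma_ij,i summed over i -- one free index j, so a vector
div = [sum(sp.diff(sigma[i][j], x[i]) for i in range(3)) for j in range(3)]
print(div)   # [4*sigma_0*x1, 4*sigma_0*x2, 4*sigma_0*x3]
```

Summing each column of the matrix in the attempt over the row index \(i\) gives the same vector, \((\nabla\cdot\sigma)_{j}=4\sigma_{0}x_{j}\), rather than another \(3\times 3\) matrix.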
 
Here is the other part of the problem, and please help me out with this:
\[\sigma(\mathbf{r})=\sigma_{0}\,\mathbf{r}\otimes\mathbf{r}\qquad\text{where}\qquad\mathbf{r}=x_{i}\,\mathbf{i}_{i}\]

So, would this be a correct approach?:

\[\nabla\cdot\sigma=\frac{\partial }{\partial x_{k}}\left(\sigma_{0}x_{i}x_{j}\,\mathbf{i}_{i}\otimes\mathbf{i}_{j}\right)\cdot\mathbf{i}_{k}=(\sigma_{0}x_{i}x_{j})_{,k}\,\mathbf{i}_{i}\,\delta_{jk}=\sigma_{0}(x_{i}x_{j})_{,j}\,\mathbf{i}_{i}=\sigma_{0}x_{i}\,\mathbf{i}_{i}\]
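
Not part of the thread: a numerical sanity check of this index manipulation (a sketch with numpy; the test point and \(\sigma_{0}\) value are arbitrary assumptions) approximates \(\sigma_{ij,i}\) for \(\sigma=\sigma_{0}\,\mathbf{r}\otimes\mathbf{r}\) by central differences.

```python
import numpy as np

sigma0 = 1.3                      # arbitrary value (an assumption)
r0 = np.array([0.4, -0.7, 1.1])   # arbitrary test point (an assumption)
h = 1e-6

def sigma(r):
    """sigma(r) = sigma_0 * (r outer r), i.e. sigma_ij = sigma_0 x_i x_j."""
    return sigma0 * np.outer(r, r)

# (div sigma)_j = sum_i d(sigma_ij)/dx_i, via central differences
div = np.zeros(3)
for i in range(3):
    e = np.zeros(3)
    e[i] = h
    div += (sigma(r0 + e)[i, :] - sigma(r0 - e)[i, :]) / (2 * h)

print(div)                # numerically matches 4 * sigma0 * r0
```

The check returns \(4\sigma_{0}\mathbf{r}\), consistent with \(\sigma_{0}(x_{i}x_{j})_{,j}=\sigma_{0}(\delta_{jj}x_{i}+x_{j}\delta_{ij})=4\sigma_{0}x_{i}\); the final \(\sigma_{0}x_{i}\,\mathbf{i}_{i}\) in the last step above appears to drop that factor of 4.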
 
sorry, not my field … you'd better start a new thread on this one
 
tiny-tim said:
sorry, not my field … you'd better start a new thread on this one

Yeah, sorry. The problem is written in index (tensor) notation, where summation over repeated indices is implied; it's a shorthand.
 
