manicwhite said:
S:T=\operatorname{tr}[S^T T]=\operatorname{tr}[ST]
Einstein notation:
\operatorname{tr}[S_{ij}T_{jk}]
[S_{ij}T_{jk}]_{ii}
This notation doesn't make sense. (That said, I wouldn't be surprised if your book or your professor uses it, or something very similar.)
This makes sense:
S:T=\operatorname{tr}(S^T T)=(S^T T)_{ii}=(S^T)_{ij}T_{ji}=S_{ji}T_{ji}
Note that I'm just using your definition of S:T, the definition of trace, the definition of matrix multiplication, and the definition of the transpose. (And obviously the summation convention too).
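If you want to convince yourself numerically, here's a quick sanity check (a minimal sketch assuming NumPy; the matrices S and T are just arbitrary examples, not anything from your problem):

```python
import numpy as np

rng = np.random.default_rng(0)
S = rng.standard_normal((3, 3))
T = rng.standard_normal((3, 3))

# S:T defined as tr(S^T T)
frobenius = np.trace(S.T @ T)

# The index form S_{ji} T_{ji}: sum over both indices of the
# elementwise product (note S_{ji}T_{ji} = S_{ij}T_{ij}).
index_form = np.einsum('ji,ji->', S, T)

print(np.isclose(frobenius, index_form))  # True
```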
hunt_mat's suggestion is an easy way to solve this problem. Another approach, which may be more useful in the long run, is to prove that
a) if S is symmetric and A is antisymmetric (i.e. A_{ij}=-A_{ji}), then S:A=0. (The terms "skew symmetric", "skew" or "alternating" are often used instead of "antisymmetric").
b) Every tensor (with two indices) can be expressed as a sum of a symmetric and an antisymmetric tensor.
Part a) is an easy exercise that I suggest you do. Part b) is obvious once you've seen the trick, but the trick can take some time to find:
T=\frac{T+T^T}{2}+\frac{T-T^T}{2}
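If you'd like to check both facts numerically before proving them, here's a sketch (again assuming NumPy; T is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3))

# Part b): the symmetric/antisymmetric split of T.
S = (T + T.T) / 2   # symmetric part: S^T == S
A = (T - T.T) / 2   # antisymmetric part: A^T == -A

print(np.allclose(T, S + A))   # True: the two parts sum to T
print(np.allclose(S, S.T))     # True: S is symmetric
print(np.allclose(A, -A.T))    # True: A is antisymmetric

# Part a): the contraction S:A = S_{ij} A_{ij} vanishes.
print(np.isclose(np.einsum('ij,ij->', S, A), 0.0))  # True
```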
As an example of how this can simplify things, consider the following proof of the identity \vec x\cdot(\vec x\times \vec y)=0:
\vec x\cdot(\vec x\times \vec y)=x_i\varepsilon_{ijk}x_jy_k=0
The conclusion is immediate, since \varepsilon is antisymmetric in i and j (i.e. it changes sign under an exchange of i and j), and x_i x_j is symmetric in i and j.
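The same identity is easy to verify numerically. The sketch below also builds the Levi-Civita symbol explicitly so you can see the index contraction spelled out (NumPy assumed; x and y are arbitrary examples):

```python
import numpy as np

# Levi-Civita symbol eps_{ijk}: +1 for even permutations of (0,1,2),
# -1 for odd permutations, 0 otherwise.
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0   # even permutation
    eps[i, k, j] = -1.0  # odd permutation (last two indices swapped)

rng = np.random.default_rng(2)
x = rng.standard_normal(3)
y = rng.standard_normal(3)

# x_i eps_{ijk} x_j y_k, i.e. x . (x × y)
contraction = np.einsum('i,ijk,j,k->', x, eps, x, y)

print(np.isclose(contraction, 0.0))                # True
print(np.isclose(np.dot(x, np.cross(x, y)), 0.0))  # True, same thing
```

The einsum line is exactly the index expression x_i\varepsilon_{ijk}x_jy_k from the proof, which is why it comes out zero.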