Tensor summation and components.

Hello,

I would very much like someone to please clarify the following points concerning tensor summation for me. Suppose the components of a tensor ##A^{ij}## are ##A^{12}=A^{21}=A## (or, in general, ##A^{xy}=A^{yx}=A##), whereas all the other components are 0. Is this a symmetric tensor then? How may ##A^{ij}## be written in the form of a matrix? Furthermore, suppose I then have the following sum:
$$R^i{}_l R^j{}_m A^{lm}$$
Do l and m run from 1 to 3? How may I actually carry out this summation, considering the above-mentioned properties of A?

Thanks!
 
Yes, if ##A^{ij}=A^{ji}## for all i,j, then A is symmetric.
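Concretely, assuming the indices run from 1 to 3 and (as in the question) the only nonzero components are ##A^{12}=A^{21}=A##, the matrix representation would be
$$A = \begin{pmatrix} 0 & A & 0 \\ A & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}.$$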

Recall that the definition of matrix multiplication is ##(AB)_{ij}=A_{ik} B_{kj}## (with a summation over k). So if you want to write the sum you asked about as a product of matrices, it will be ##RAR^T##, where R is the matrix with ##R^i{}_j## on row i, column j, and A is the matrix with ##A^{ij}## on row i, column j.

The fact that A is symmetric doesn't simplify the problem much, unless you have used it to choose R such that ##RAR^T## is diagonal. Then that simplifies the problem by allowing you to ignore all matrix elements with i≠j (because they're all zero).
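As a quick numerical sanity check (a minimal numpy sketch; the particular values of ##R## and ##A## below are made up purely for illustration), the component sum ##R^i{}_l R^j{}_m A^{lm}## can be compared directly against the matrix product ##RAR^T##:

```python
import numpy as np

# Symmetric A with A^{12} = A^{21} = A and all other components zero
a = 5.0
A = np.zeros((3, 3))
A[0, 1] = A[1, 0] = a

# An arbitrary transformation matrix R (values chosen only for illustration)
R = np.arange(1.0, 10.0).reshape(3, 3)

# Component form: sum over l and m of R^i_l R^j_m A^{lm}
componentwise = np.einsum('il,jm,lm->ij', R, R, A)

# Matrix form: R A R^T
matrix_form = R @ A @ R.T

print(np.allclose(componentwise, matrix_form))  # True
```

Note that the result is itself symmetric, since ##(RAR^T)^T = RA^TR^T = RAR^T## when A is symmetric.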
 
Would your answer still be valid if A and R were tensors? (which they are!)
 
Sure, why wouldn't it?
 
OK, your answer then brings me back to a few elementary questions (if I may):
1) Does it matter whether the indices are contravariant or covariant (with regard to the matrix multiplication formula you wrote in your first reply)? Will it, in other words, have any effect on the formula?
2) What was the rationale behind writing the three tensors in your answer in the order they are written?
 
1. It influences the definition of the matrices, but not much else. For example, let's define ##B^{jl}=R^j{}_m A^{lm}##. What you wrote can now be written as ##R^i{}_l B^{jl}##. Suppose that we want to interpret this as row i, column j of a matrix RB, obtained by multiplying a matrix R with a matrix B.

Now look at the definition of matrix multiplication. The indices that are being summed over are the column index of the matrix on the left, and the row index of the matrix on the right. So we must interpret ##l## as a column index of R, and as a row index of B. So we must define R as the matrix with ##R^i{}_j## on row i, column j, and B as the matrix with ##B^{ji}## on row i, column j. That last one looks really weird, since we're used to having the row index first.

So we should do something about it. The obvious solution is to abandon the original plan to interpret what you wrote as ##(RB)^{ij}##, and instead define B as the matrix with ##B^{ij}## on row i, column j, so that we can interpret what you wrote as the row i, column j component of ##RB^T##.

2. It follows from the definition of matrix multiplication, as in the answer to 1 above.
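To make the ##RB^T## point concrete (again a minimal numpy sketch, reusing the same illustrative matrices as above):

```python
import numpy as np

a = 5.0
A = np.zeros((3, 3))
A[0, 1] = A[1, 0] = a
R = np.arange(1.0, 10.0).reshape(3, 3)

# B^{jl} = R^j_m A^{lm}, stored with B^{jl} on row j, column l
B = np.einsum('jm,lm->jl', R, A)

# R^i_l B^{jl} contracts l with the column index of both R and B,
# so its matrix form is R B^T rather than R B
lhs = np.einsum('il,jl->ij', R, B)
print(np.allclose(lhs, R @ B.T))      # True
print(np.allclose(lhs, R @ A @ R.T))  # True: the same quantity as before
```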

I won't have time for follow-up questions for the next 10 hours or so. But maybe someone else does.
 
That all makes perfect sense now :-). Thank you very much for your kindness and insightful assistance, Fredrik!
 
A tensor can always be represented as a matrix in a given coordinate system. The distinction between a "matrix" and a "tensor" is that a tensor changes in a specific way when you change from one coordinate system to another.
 
HallsofIvy said:
A tensor can always be represented as a matrix in a given coordinate system. The distinction between a "matrix" and a "tensor" is that a tensor changes in a specific way when you change from one coordinate system to another.
Well, if ##V## is a finite-dimensional vector space and ##L## is a linear operator on ##V##, then it indeed has a coordinate representation as a matrix. But more generally, if ##T## is a tensor associated with ##V##, then, very loosely put, its "coordinate representation" is what is known as a hypermatrix: http://galton.uchicago.edu/~lekheng/work/hla.pdf
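As a small illustration of the idea (a numpy sketch with placeholder random values; the transformation rule shown is the standard one for a fully contravariant third-order tensor):

```python
import numpy as np

# A third-order tensor T^{ijk} on a 3-dimensional space has 3^3 = 27
# components and is stored as a 3 x 3 x 3 hypermatrix, not a matrix
T = np.random.rand(3, 3, 3)

# Under a transformation R, every index is contracted with R,
# generalizing R^i_l R^j_m A^{lm} from the matrix case
R = np.random.rand(3, 3)
T_new = np.einsum('il,jm,kn,lmn->ijk', R, R, R, T)

print(T_new.shape)  # (3, 3, 3): still a hypermatrix
```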
 
HallsofIvy said:
A tensor can always be represented as a matrix in a given coordinate system. The distinction between a "matrix" and a "tensor" is that a tensor changes in a specific way when you change from one coordinate system to another.

This is true for linear and bilinear maps, but not for n-linear maps when n ≥ 3.
 
HallsofIvy said:
A tensor can always be represented as a matrix in a given coordinate system. The distinction between a "matrix" and a "tensor" is that a tensor changes in a specific way when you change from one coordinate system to another.
I think this response should be made more precise. The components of a tensor change in a specific way when you change from one coordinate system to another. The tensor itself is independent of coordinate system.
 
Chestermiller said:
I think this response should be made more precise. The components of a tensor change in a specific way when you change from one coordinate system to another. The tensor itself is independent of coordinate system.
Yes, thank you.
 