
Homework Help: Tensor Field Commutativity

  1. Jan 22, 2010 #1
    1. The problem statement, all variables and given/known data
    The folks over in SR & GR aren't being very helpful, so maybe I can get this resolved quickly here.

    I want to know if the product of two tensor fields is generally non-commutative. That is, if I have two tensor fields [itex] A_{ij}, B_k^\ell [/itex] do these representations commute?

    3. The attempt at a solution

    I feel generally quite conflicted about this subject, and I think it's because I don't fully understand what the representations mean. On one hand, I want to say that for a fixed i,j these simply represent scalar elements and so certainly commute. However, taken as general tensors (for example matrices), they would not commute. That is, if A and B were matrices, then [itex] AB \neq BA[/itex] in general, but given the representations [itex] (A)_{ij} = a_{ij}, (B)_{ij} = b_{ij} [/itex] then certainly [itex] a_{ij}b_{k\ell} = b_{k\ell} a_{ij} [/itex] - the only "non-commutativity" comes in the ordering of the indices. Can anybody shed some light on this situation?
     
  3. Jan 22, 2010 #2
    If [tex]A[/tex] and [tex]B[/tex] are two [tex](0, 2)[/tex] tensors with components [tex]a_{ij}[/tex] and [tex]b_{ij}[/tex] respectively, then the [tex](0, 4)[/tex] tensor with components [tex]c_{ijkl} = a_{ij} b_{kl}[/tex] is the tensor product [tex]A \otimes B[/tex]. You are correct to observe that this tensor differs from [tex]B \otimes A[/tex], which has components [tex]c'_{ijkl} = b_{ij} a_{kl}[/tex], only in the ordering of the indices -- but since the order of the indices for tensors is very important, you can't think of [tex]\otimes[/tex] as a commutative product.
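
    Here is a minimal numpy sketch of that point (the dimension 3 and the random components are arbitrary, purely for illustration): the two tensor products contain exactly the same numbers, but as arrays they only agree after the index slots are reordered.

    [code]
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))  # components a_ij of a (0, 2) tensor
B = rng.standard_normal((3, 3))  # components b_kl of another (0, 2) tensor

AoB = np.einsum('ij,kl->ijkl', A, B)  # (A tensor B)_{ijkl} = a_ij b_kl
BoA = np.einsum('ij,kl->ijkl', B, A)  # (B tensor A)_{ijkl} = b_ij a_kl

print(np.allclose(AoB, BoA))                        # False: A x B != B x A as (0, 4) tensors
print(np.allclose(AoB, BoA.transpose(2, 3, 0, 1)))  # True: same numbers, index slots reordered
    [/code]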

    When you mention matrices, you are talking about a different product, which is noncommutative in a more fundamental way. In their role as linear transformations, you can think of matrices as [tex](1, 1)[/tex] tensors. If [tex]A[/tex] and [tex]B[/tex] are matrices expressed this way, with components [tex]a^i_j[/tex] and [tex]b^i_j[/tex], then the matrix product [tex]AB[/tex] (composition of linear transformations) has components [tex]c^i_j = \sum_k a^i_k b^k_j[/tex], or simply [tex]a^i_k b^k_j[/tex] in the Einstein summation convention. In tensor language, the matrix product (composition) is actually reflected as a contraction (which is also how the product of two [tex](1, 1)[/tex] tensors can be another [tex](1, 1)[/tex] tensor and not a [tex](2, 2)[/tex] tensor, as the tensor product would be).
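
    As a small numpy check (again with arbitrary 3-by-3 components, just as an illustration), contracting the (2, 2) tensor product over one pair of indices reproduces the ordinary matrix product, and that product is noncommutative:

    [code]
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))  # components a^i_j of a (1, 1) tensor
B = rng.standard_normal((3, 3))  # components b^i_j of another (1, 1) tensor

full = np.einsum('ik,lj->iklj', A, B)  # (2, 2) tensor product with components a^i_k b^l_j
matprod = np.einsum('ikkj->ij', full)  # contract k with l: c^i_j = a^i_k b^k_j

print(np.allclose(matprod, A @ B))     # True: the contraction is just matrix multiplication
print(np.allclose(A @ B, B @ A))       # False in general: composition does not commute
    [/code]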
     
  4. Jan 23, 2010 #3
    Some prefer to identify matrices with (1,1) tensors, while others say it is the (0,2) and (2,0) tensors that should be called second-rank matrices, which also sounds reasonable. The reason is that physicists rarely single out 4-by-4 matrices (on a 4d spacetime) as mixed tensors; in their language you find plenty of statements like "the metric tensor is a second-rank square matrix", and if that is the usage, then treating mixed tensors as matrices of exactly the same kind does seem odd. Besides, if we represent [tex]v^i[/tex] (i=0,...,3) as a 1-by-4 matrix (i.e. a row vector) and a mixed tensor as a 4-by-4 matrix, then from the transformation formula

    [tex]v^i=\frac{\partial x^i}{\partial \bar{x}^j}\bar{v}^j[/tex]

    one would be facing a [tex](4\times 4)(1\times 4)[/tex] product, which is not even defined (the inner dimensions don't match), whereas if the transformation formula were written as

    [tex]v^i=\bar{v}^j\frac{\partial x^i}{\partial \bar{x}^j}[/tex],

    everything would be okay. The same situation arises when one wants to lower an upper index (or raise a lower one) using the metric matrix [tex]g_{ij}[/tex], i.e.

    [tex]v_i=g_{ij}v^j[/tex],

    then, taking the preceding path, one would again be multiplying a 4-by-4 matrix into a 1-by-4 row vector to get the vector [itex]v_i[/itex], which doesn't work either!
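
    A quick shape check in numpy illustrates the bookkeeping (the Jacobian and metric entries below are just placeholder numbers):

    [code]
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))      # M[i, j] standing in for the Jacobian dx^i/dxbar^j
v_bar = rng.standard_normal((1, 4))  # vbar^j stored as a 1-by-4 row vector

try:
    M @ v_bar                        # (4x4)(1x4): the inner dimensions don't match
except ValueError as err:
    print('undefined product:', err)

v = v_bar @ M.T                      # v^i = vbar^j dx^i/dxbar^j, summed over j; a 1-by-4 row
print(v.shape)                       # (1, 4)

g = np.diag([1.0, -1.0, -1.0, -1.0]) # a metric written as a 4-by-4 matrix (example choice)
v_low = v @ g                        # v_i = g_{ij} v^j, row vector written first (g is symmetric)
print(v_low.shape)                   # (1, 4)
    [/code]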

    So, to give a simple answer to the OP's question about why non-commutativity does not show up in the componential representation of matrices/tensors: you can freely swap the positions of two numbers under ordinary multiplication, but you can't do the same with whole arrangements of numbers that carry a different law of multiplication, one which is, deep down, not commutative. So when you deal with rank-2 tensors as matrices, or with a mix of matrices and tensors together, like [tex]Ag_{ab}C[/tex] where A and C are 4-by-4 matrices and [tex]g_{ab}[/tex] is the second-rank metric tensor, then non-commutativity has to be taken into account in the calculation.
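
    A small numerical check of that last point (arbitrary 4-by-4 matrices, with the Minkowski metric chosen purely as an example): the individual components commute as plain numbers, but the matrix expression does not.

    [code]
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
C = rng.standard_normal((4, 4))
g = np.diag([1.0, -1.0, -1.0, -1.0])  # metric written as a 4-by-4 matrix (example choice)

print(A[0, 1] * g[2, 2] == g[2, 2] * A[0, 1])  # True: single components are just numbers
print(np.allclose(A @ g @ C, C @ g @ A))       # False in general: A g C depends on the order
    [/code]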

    And I assume you know this works only because a second-rank tensor has exactly two indices, just like a matrix; the matrix picture breaks down if we are handed something like a (0,3) tensor.

    AB
     
    Last edited: Jan 23, 2010