Commutativity of Tensor Field Multiplication

In summary: simply reversing the written order of the two factors in an expression like [itex] A_{ij} B_k^\ell [/itex] gives an equality, because each factor is just a real number and real multiplication commutes. Genuine non-commutativity shows up only when the arrangement of the indices is changed, as in matrix multiplication ([itex]A_i^j B_{jk} \ne B_i^j A_{jk}[/itex]) or in the tensor product ([itex]\mathbf{A} \otimes \mathbf{B} \ne \mathbf{B} \otimes \mathbf{A}[/itex] in general).
  • #1
Kreizhn
It may seem like a very simple question, but I just want to clarify something:

Is tensor field multiplication non-commutative in general?

For example, if I have two tensors [itex] A_{ij}, B_k^\ell [/itex] then in general, is it true that

[tex] A_{ij} B_k^\ell \neq B_k^\ell A_{ij} [/tex]

I remember them being non-commutative, but I want to make sure.
 
Last edited:
  • #2
The example you wrote down should have an equals sign. Just write out the sum without the Einstein summation convention; all you're doing is reversing the order of two real numbers being multiplied in each term.

On the other hand, [itex]A_i^j B_{jk} \ne B_i^j A_{jk}[/itex], because, e.g., in a metric with +++ signature, this would just be a way of writing ordinary matrix multiplication, which is noncommutative.

Another example would be that the covariant derivative acts like a tensor, in the sense that you can raise and lower indices on it, but covariant derivatives don't commute with each other -- their commutator is the Riemann tensor.
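
To make the distinction concrete, here is a minimal numpy sketch (the 3-by-3 arrays are arbitrary illustrative components): reversing the written order of the factors in the contraction changes nothing, while actually rearranging the indices, i.e. forming BA instead of AB, does.

[code]
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((3, 3))                       # components A_i^j
B = rng.random((3, 3))                       # components B_jk

# (AB)_ik = A_i^j B_jk, with the factors written in either order:
AB = np.einsum('ij,jk->ik', A, B)            # A_i^j B_jk
AB_reversed = np.einsum('jk,ij->ik', B, A)   # B_jk A_i^j: same contraction, factors swapped
print(np.allclose(AB, AB_reversed))          # True -- each term is a product of real numbers

# Moving the indices onto the other tensor gives (BA)_ik instead:
BA = np.einsum('ij,jk->ik', B, A)            # B_i^j A_jk
print(np.allclose(AB, BA))                   # False in general -- matrix multiplication is noncommutative
[/code]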
 
  • #3
Yes, in general tensor multiplication is non-commutative. Matrix multiplication is an example.
 
Last edited:
  • #4
This is what I thought, though I would like to clarify a bit.

So for fixed indices, if the (mathematical) field commutes then the product as I've written it commutes since these are just field elements. However, the moment we introduce a summation we cannot guarantee commutativity? Scalars being the exception.
 
  • #5
bapowell said:
Yes, in general tensor multiplication is non-commutative. Matrix multiplication is an example (take j = l in your equation).

This is incorrect. His example is not an example of reversing the order of multiplication of two matrices. See my #2. If all you do is reverse the order of the two factors, written in Einstein summation convention, that isn't the same as reversing the order of multiplication of two matrices; you have to change the arrangement of the indices with respect to the two tensors, or else you're just writing another expression that's equivalent to the original expression.
 
  • #6
So the noncommutativity really arises from the indices, not the order of the representations.
 
  • #7
Kreizhn said:
This is what I thought, though I would like to clarify a bit.

So for fixed indices, if the (mathematical) field commutes then the product as I've written it commutes since these are just field elements. However, the moment we introduce a summation we cannot guarantee commutativity? Scalars being the exception.

The issue isn't that there's a summation. Your example includes a summation (an implied Einstein summation), and is an equality. My example in #2 includes a summation, and is an inequality. The issue is that you didn't rearrange the indices in the way you'd have to in order to represent something like commutation of matrix multiplication.
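
For instance, writing the sums out in full in two dimensions (dropping the summation convention, as suggested in #2):

[tex]A_i^j B_{jk} = A_i^1 B_{1k} + A_i^2 B_{2k} = B_{1k} A_i^1 + B_{2k} A_i^2 = B_{jk} A_i^j ,[/tex]

so reversing the written order of the factors is harmless: each term is a product of two real numbers. By contrast,

[tex]B_i^j A_{jk} = B_i^1 A_{1k} + B_i^2 A_{2k}[/tex]

is a genuinely different expression, because the indices have been rearranged between the two tensors; that is what reversing the order of matrix multiplication actually means.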
 
  • #8
Kreizhn said:
So the noncommutativity really arises from the indices, not the order of the representations.

Right :-)
 
  • #9
bcrowell said:
This is incorrect. His example is not an example of reversing the order of multiplication of two matrices. See my #2. If all you do is reverse the order of the two factors, written in Einstein summation convention, that isn't the same as reversing the order of multiplication of two matrices; you have to change the arrangement of the indices with respect to the two tensors, or else you're just writing another expression that's equivalent to the original expression.

Agreed. I didn't look closely enough at his example. Tried to edit my post but, alas, too late! Apologies.
 
  • #10
There seems to be some confusion in this thread, so I am going to try to contribute further confusion.
Kreizhn said:
It may seem like a very simple question, but I just want to clarify something:

Is tensor field multiplication non-commutative in general?

For example, if I have two tensors [itex] A_{ij}, B_k^\ell [/itex] then in general, is it true that

[tex] A_{ij} B_k^\ell \neq B_k^\ell A_{ij} [/tex]

I remember them being non-commutative, but I want to make sure.
bapowell said:
Yes, in general tensor multiplication is non-commutative. Matrix multiplication is an example.

Suitably interpreted, the answer to the question "Is tensor multiplication commutative?" is "No.", and this agrees with everything that bcrowell wrote.

I think (but I could be wrong, and apologies if so) that Kreizhn and bapowell mean "tensor product" when they write "tensor multiplication," and the tensor product of two tensors is non-commutative, that is, if [itex]\mathbf{A}[/itex] and [itex]\mathbf{B}[/itex] are two tensors, then it is not generally true that [itex]\mathbf{A} \otimes \mathbf{B} = \mathbf{B} \otimes \mathbf{A}[/itex].

Consider a simpler example. Let [itex]V[/itex] be a finite-dimensional vector space, and let [itex]\mathbf{u}[/itex] and [itex]\mathbf{v}[/itex] both be non-zero vectors in [itex]V[/itex]. Form the tensor product space [itex]V \otimes V[/itex]. To see when

[tex]0 = \mathbf{u} \otimes \mathbf{v} - \mathbf{v} \otimes \mathbf{u},[/tex]

introduce a basis [itex]\left\{ \mathbf{e}_i \right\}[/itex] for [itex]V[/itex] so that [itex]\left\{ \mathbf{e}_i \otimes \mathbf{e}_j \right\}[/itex] is a basis for [itex]V \otimes V[/itex]. Then,

[tex]
\begin{equation*}
\begin{split}
0 &= \mathbf{u} \otimes \mathbf{v} - \mathbf{v} \otimes \mathbf{u} \\
&= \left(u^i v^j - u^j v^i \right) \mathbf{e}_i \otimes \mathbf{e}_j .
\end{split}
\end{equation*}
[/tex]

Because the basis elements are linearly independent,

[tex]u^i v^j = u^j v^i[/tex]

for all possible [itex]i[/itex] and [itex]j[/itex]. WLOG, assume that all the components of [itex]\mathbf{u}[/itex] are non-zero. Consequently,

[tex]\frac{v^j}{u^j} = \frac{v^i}{u^i}[/tex]

(no sum) for all possible [itex]i[/itex] and [itex]j[/itex], i.e., [itex]\mathbf{u}[/itex] and [itex]\mathbf{v}[/itex] are parallel.

Thus, if non-zero [itex]\mathbf{u}[/itex] and [itex]\mathbf{v}[/itex] are not parallel,

[tex]\mathbf{u} \otimes \mathbf{v} \ne \mathbf{v} \otimes \mathbf{u}.[/tex]

In component form, this reads

[tex]u^i v^j \ne u^j v^i[/tex]

for some [itex]i[/itex] and [itex]j[/itex]. As bcrowell emphasized, placement of indices is crucial.
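
A quick numerical check of this with numpy (the particular vectors are arbitrary illustrative choices):

[code]
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])                        # not parallel to u

uv = np.outer(u, v)                                  # components u^i v^j of u (x) v
vu = np.outer(v, u)                                  # components v^i u^j of v (x) u
print(np.allclose(uv, vu))                           # False: the two tensor products differ
print(np.allclose(uv, vu.T))                         # True: they differ only by swapping the index slots

w = 2.5 * u                                          # parallel to u
print(np.allclose(np.outer(u, w), np.outer(w, u)))   # True: parallel vectors do commute
[/code]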

In the original post, I think (again, I could be wrong) that Kreizhn was trying to formulate the property of non-commutativity of tensor products in the abstract-index approach advocated by, for example, Penrose and Wald. In this approach, indices do *not* refer to components with respect to a basis (no basis is chosen) and indices do *not* take on numerical values (like 0, 1, 2, 3), indices pick out copies of the vector space [itex]V[/itex]. The index [itex]i[/itex] on [itex]v^i[/itex] indicates the copy of [itex]V[/itex] in which [itex]v^i[/itex] resides. Vectors [itex]v^i[/itex] and [itex]v^j[/itex] live in different copies of [itex]V[/itex]. Vectors [itex]v^i[/itex] and [itex]u^i[/itex] live in the same copy of [itex]V[/itex].

In the component approach, [itex]v^i u^j = u^j v^i[/itex] because multiplication of real numbers is commutative. In the abstract-index approach, [itex]v^i u^j = u^j v^i[/itex] because on each side [itex]v^i[/itex] lives in the same copy of [itex]V[/itex], and on each side [itex]u^j[/itex] lives in the same (different) copy of [itex]V[/itex].

In the abstract index approach, non-commutativity of tensor products is indicated by, for example, [itex]v^i u^j \ne v^i u^j[/itex].
 
  • #11
George Jones said:
In the original post, I think (again, I could be wrong) that Kreizhn was trying to formulate the property of non-commutativity of tensor products in the abstract-index approach advocated by, for example, Penrose and Wald. In this approach, indices do *not* refer to components with respect to a basis (no basis is chosen) and indices do *not* take on numerical values (like 0, 1, 2, 3), indices pick out copies of the vector space [itex]V[/itex].

I'm not sure that it really matters which way you interpret Kreizhn's original post. Let's say he wrote down the conjecture

[tex] A_{ij} B_k^\ell \neq B_k^\ell A_{ij} [/tex]

with abstract index notation in mind. Then one way to test the conjecture is like this. We know by the definition of manifolds that the manifold is locally compatible with coordinate systems, so since coordinate systems exist, let's arbitrarily fix one. Rewrite the equation with Greek indices to show that they refer to these coordinates, rather than being abstract indices.

[tex] A_{\mu \nu} B_\kappa^\lambda \neq B_\kappa^\lambda A_{\mu \nu} [/tex]

By the axioms of the real numbers we can see that this is actually an equality, not an inequality. Since the equality held regardless of any assumption about which particular coordinates we chose, it follows that the original inequality, in abstract index notation, should also be an equality.

In other words, the rules of tensor gymnastics don't change just because you're using abstract index notation.
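
In coordinate terms the test looks like this (a small numpy sketch; the random 4-by-4 arrays merely stand in for the components [itex]A_{\mu\nu}[/itex] and [itex]B_\kappa^\lambda[/itex] in the arbitrarily chosen coordinates):

[code]
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((4, 4))                   # stands in for A_{mu nu}
B = rng.random((4, 4))                   # stands in for B_kappa^lambda

# Same named index slots, factors written in either order:
lhs = np.einsum('mn,kl->mnkl', A, B)     # A_{mu nu} B_kappa^lambda
rhs = np.einsum('kl,mn->mnkl', B, A)     # B_kappa^lambda A_{mu nu}
print(np.allclose(lhs, rhs))             # True: the conjectured inequality is actually an equality
[/code]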

George Jones said:
In the component approach, [itex]v^i u^j = u^j v^i[/itex] because multiplication of real numbers is commutative. In the abstract-index approach, [itex]v^i u^j = u^j v^i[/itex] because on each side [itex]v^i[/itex] lives in the same copy of [itex]V[/itex], and on each side [itex]u^j[/itex] lives in the same (different) copy of [itex]V[/itex].

This seems fine to me, but I would emphasize that, as in the example I gave above, you don't need to forswear the manipulation of symbols according to the ordinary axioms of the real number system just because you're using abstract index notation. All you have to forswear is invocation of any special properties of a particular set of coordinates.

George Jones said:
In the abstract index approach, non-commutativity of tensor products is indicated by, for example, [itex]v^i u^j \ne v^i u^j[/itex].

Here you've really lost me. I think this must just be a typo or something, because both sides of the inequality are written using identical symbols, so it must be an equality.
 
  • #12
In another thread, Kreizhn brings up the same question again; there I answered the whole thing fairly clearly:

Some prefer to treat (1,1) tensors as matrices, and some say that (0,2) and (2,0) tensors are the ones to be called second-rank matrices, which sounds quite reasonable. The reason is that the 4-by-4 matrices one meets on a 4d spacetime are often mixed tensors, which is not widely appreciated among physicists; in their language you find plenty of statements like "the metric tensor is a second-rank square matrix," and if that is so, then calling mixed tensors matrices of the same kind does seem absurd. Besides, if we represent [tex]v^i[/tex] (i = 0, ..., 3) as a 1-by-4 matrix (i.e. a row vector) and a mixed tensor as a 4-by-4 matrix, then from the transformation formula

[tex]v^i=\frac{\partial x^i}{\partial \bar{x}^j}\bar{v}^j[/tex]

one would expect a [tex](4\times 4)(1\times 4)[/tex] product, which is not even defined, whereas if the transformation formula were written as

[tex]v^i=\bar{v}^j\frac{\partial x^i}{\partial \bar{x}^j}[/tex],

i.e. as a [tex](1\times 4)(4\times 4) = (1\times 4)[/tex] product, everything would be okay. The same situation arises when one wants to lower an upper index (or raise a lower one) using the metric matrix [tex]g_{ij}[/tex], i.e.

[tex]v_i=g_{ij}v^j[/tex],

then, following the same reasoning, a 4-by-4 matrix would have to be assigned to the vector [tex]v_i[/tex]!
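
A shape check with numpy makes the point; here a 4-by-4 identity merely stands in for the Jacobian [itex]\frac{\partial x^i}{\partial \bar{x}^j}[/itex], and only the shapes matter:

[code]
import numpy as np

J = np.eye(4)                            # placeholder for the Jacobian matrix (carries no physics)
v_row = np.arange(4.0).reshape(1, 4)     # vbar^j stored as a 1-by-4 row vector

try:
    J @ v_row                            # (4x4)(1x4): not even defined
except ValueError as err:
    print("undefined product:", err)

print((v_row @ J).shape)                 # (1x4)(4x4) = (1x4): the shapes do work out
[/code]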

So, to answer the OP's question about why non-commutativity does not hold for the component representation of matrices/tensors: you can freely swap two numbers under the usual operation of multiplication, but you cannot do the same with an arrangement of numbers governed by a different law of multiplication which is, deep down, not commutative. So when you deal with tensors (of rank 2) as matrices, or with a mixture of matrices and tensors such as [tex]Ag_{ab}C[/tex], where A and C are 4-by-4 matrices and [tex]g_{ab}[/tex] is the second-rank metric tensor, non-commutativity has to be taken seriously in the calculations.
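
And a one-line check of the [itex]Ag_{ab}C[/itex] remark (random 4-by-4 matrices, with diag(-1, 1, 1, 1) as an illustrative metric matrix):

[code]
import numpy as np

rng = np.random.default_rng(2)
A = rng.random((4, 4))
C = rng.random((4, 4))
G = np.diag([-1.0, 1.0, 1.0, 1.0])          # illustrative metric matrix g_ab

print(np.allclose(A @ G @ C, C @ G @ A))    # False in general: here the order does matter
[/code]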

And I assume you know that this identification of tensors with matrices only works for second-rank tensors; it breaks down if we are given something like a (0,3) tensor.

But now I want to clear up, from my own point of view, some of the confusion that has arisen here.


bapowell says

Yes, in general tensor multiplication is non-commutative. Matrix multiplication is an example.

and George Jones confirms this answer by saying

Suitably interpreted, the answer to the question "Is tensor multiplication commutative?" is "No."

Rest assured that both are correct. I know exactly where the question comes from. I think Kreizhn works in some area of physics that treats tensors from the physicist's standpoint. I say this because physicists usually do not spell out the mathematical framework their manipulations are based on, and perhaps they shy away from symbols like [tex]\otimes[/tex]. What I mean is that two scenarios can come to mind when something like [tex]A_{ij} B_k^\ell[/tex] is seen in a textbook for the first time. In the first, one assumes this is a tensor product, which can happen even to a seasoned expert: in GR we use symbols like [tex]g_{ab}[/tex] for a second-rank 4-by-4 matrix (basically the metric tensor) even though the symbol really stands for the components of an unwritten matrix, so I could well expect [tex]g_{ab}v^a[/tex] to be read as a tensor product by someone, with [tex]v^a[/tex] taken as a 4-by-1 tensor (matrix). On that reading the scenario is internally consistent, and the inequality the OP is worried about, [tex] A_{ij} B_k^\ell \neq B_k^\ell A_{ij} [/tex], looks completely reasonable.

Remember that, to avoid confusion, mathematicians use bold-faced Latin letters to denote a second-rank tensor or matrix, so nothing troubling arises for them from the component representation of matrix multiplication, [tex]g_{ab}v^a[/tex]. That is the second scenario: [tex]g_{ab}v^a[/tex] is the component representation of a matrix product, i.e. each factor is an ordinary number and the multiplication is the usual one, which, as I quoted above from my own post, is commutative. This scenario is not only consistent, it is TRUE, and that is what distinguishes it from the first one.

And the last thing to recall is that in the component approach one CAN regard the multiplication as non-commutative, but only under the heavy assumption that every product is meant to be read as a matrix multiplication.

AB
 
Last edited:

