Question about tensor notation convention as used in SR/GR

In summary, this thread examines a convention, introduced in the poster's text, that ties the placement of upper and lower indices on a rank-2 tensor to the transpose of a matrix. The identification does not always hold, and this can lead to confusion.
  • #1
Coffee_
When writing

##A_{a}\text{ }^{b}## one means "the element in the a-th row and b-th column of the TRANSPOSE of A", right?

Such that ##A_{a}\text{ } ^{b}= A^{b}\text{ } _{a}## ?

I would just like a confirmation so I'm not learning the convention in a wrong manner.
 
  • #2
Coffee_ said:
Such that ##A_{a}\text{ } ^{b}= A^{b}\text{ } _{a}## ?

This is not at all certain and depends on the symmetry properties of ##A##. You seem to be assuming that ##A## is a matrix, but generally it is simply a tensor or a set of numbers with indices attached. Now, this may be represented with a matrix (if it only has two indices), but you generally need to be careful with your definitions when doing so.
 
  • #3
If you understand the summation convention,[tex]
A_{a}{ } ^{b}= g_{ac} \, g^{bd}A^{c}{ } _{d}
[/tex]In matrix notation, the right-hand side is [itex]\textbf{gAg}^{-1}[/itex], where [itex]\textbf{A}[/itex] denotes the matrix whose (a,b) component is [itex]A^{a}{ } _{b}[/itex].
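Here is a quick numerical check of that identity, as a sketch in numpy; the metric diag(1,-1) and the random components are my choices for illustration, not part of the post:
[code]
import numpy as np

g = np.diag([1.0, -1.0])      # a metric g_{ab}, chosen for illustration
g_inv = np.linalg.inv(g)      # inverse metric g^{bd}

A_ud = np.random.rand(2, 2)   # components A^a_b, row index = a, column index = b

# A_a^b = g_{ac} g^{bd} A^c_d, written out with einsum
lhs = np.einsum('ac,bd,cd->ab', g, g_inv, A_ud)

# the matrix form g A g^{-1} from the post
rhs = g @ A_ud @ g_inv

assert np.allclose(lhs, rhs)
[/code]
The two agree because the inverse metric is symmetric, so it makes no difference whether it multiplies from the left or the right of the summation sign.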
 
  • #4
That definition seems to imply that it's not really always equal to the transposed matrix. So in what cases is it alright to treat A as a matrix and just say that ##A^{a} \text{ } _{b}=A_{b}\text{ } ^{a}##?
 
  • #5
You must verify the conventions used in the text or paper.
In some cases, those are abstract indices (labels for slots) -- not components and not something to be "summed over".
Also, note that in tensor notation, unlike in a sequence of compatible matrix multiplications,
there is no specific ordering of factors (without changing the index structure of each factor).
That is to say, ##A^aB_{ab}C^{bc}D^{q}{}_{c}=C^{bc}A^aD^{q}{}_{c}B_{ab}## represent the same vector (call it ##V^q##).
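A small sketch of that point (numpy; the shapes and names are illustrative assumptions): the two factor orderings contract the same index pairs, so they produce the same vector ##V^q##.
[code]
import numpy as np

A = np.random.rand(2)        # A^a
B = np.random.rand(2, 2)     # B_{ab}
C = np.random.rand(2, 2)     # C^{bc}
D = np.random.rand(2, 2)     # D^q_c, row index = q

# A^a B_{ab} C^{bc} D^q_c, written in two different factor orders
V1 = np.einsum('a,ab,bc,qc->q', A, B, C, D)
V2 = np.einsum('bc,a,qc,ab->q', C, A, D, B)

assert np.allclose(V1, V2)   # same vector V^q either way
[/code]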
 
  • #6
Coffee_ said:
That definition seems to imply that it's not really always equal to the transposed matrix. So in what cases is it alright to treat A as a matrix and just say that ##A^{a} \text{ } _{b}=A_{b}\text{ } ^{a}##?

If you lower the index a on both sides you get [itex]A_{ab}=A_{ba}[/itex], which is a statement that A is symmetric. So your statement is true if and only if A is symmetric.
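Spelled out, contracting both sides of ##A^{a}\text{ } _{b}=A_{b}\text{ } ^{a}## with the metric:[tex]
g_{ca}A^{a}{ } _{b}=g_{ca}A_{b}{ } ^{a}
\quad\Rightarrow\quad
A_{cb}=A_{bc} \, .
[/tex]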

You might find it helpful to write down an asymmetric 2x2 matrix ##A^{a} \text{ } _{b}##, let the metric be diag(1,-1), and work out how ##A_{b}\text{ } ^{a}## differs from it.
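Here is that exercise worked numerically, as a numpy sketch; the particular asymmetric matrix is my choice:
[code]
import numpy as np

g = np.diag([1.0, -1.0])            # the suggested metric g_{ab}
g_inv = np.linalg.inv(g)            # g^{ab}

A_ud = np.array([[0.0, 1.0],        # an asymmetric A^a_b (row = a, column = b)
                 [0.0, 0.0]])

# Build the array whose (a,b) entry is A_b^a = g_{bc} g^{ad} A^c_d
A_swapped = np.einsum('bc,ad,cd->ab', g, g_inv, A_ud)

print(A_ud)       # [[0. 1.] [0. 0.]]
print(A_swapped)  # [[0. 0.] [-1. 0.]]  -- A_b^a != A^a_b entrywise
[/code]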
 
  • #7
I see: according to all of your answers, the convention we saw in class is not entirely correct, since we used ##A_{i} \text{ } ^{j}## to represent the i-th row and j-th column of the transposed matrix. What if I instead wrote this as ##(A ^{T}) _{i} \text{ } ^{j}##? Is that correct, or does it have to be ##(A ^{T})^{i} \text{ } _{j}##? That is, to represent the i-th row and j-th column of the transpose of A.
 
  • #9
Index notation is a complete and self-contained alternative to matrix notation, not an alternative representation of it. There is seldom any reason to translate back and forth between the two, and in many cases (e.g., a tensor with more than 2 indices) no such translation is even possible. The most common examples I've seen where relativists do want to revert to matrix notation is to represent either a metric tensor or a stress-energy tensor compactly by writing it in matrix form. Since these are symmetric, it doesn't matter what convention you use for matching up the indices with rows and columns. I think your instructor is just trying to connect tensors to familiar ideas about matrices.

Notations like ##(A ^{T}) _{i} \text{ } ^{j}## are not used in relativity, in my experience. The reason is that the T is not needed, because we can represent the same idea more efficiently just by moving indices around.

If you really want to get fluent at converting back and forth between the two notations, then first off there is a convention that contravariant (upper-index) vectors are column vectors, while covectors are row vectors. The grammar of index notation says that when indices are being summed over, one must be an upper index and one a lower index. This means that:

1. A linear transformation that takes a contravariant vector as an input and gives a contravariant vector as an output must have one lower index and one upper index.
2. A transformation from contravariant vectors to covariant ones must have two lower indices.
3. A transformation from covariant vectors to contravariant ones must have two upper indices.
4. A transformation from covariant vectors to covariant ones must have one upper and one lower index. (A short numerical sketch of these four cases follows the list.)
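A minimal sketch of the four cases (numpy; the metric, the components, and names like T_ud are illustrative assumptions, not from the thread). Numerically each case is just a matrix-vector product; the index placement records which spaces the input and output live in.
[code]
import numpy as np

g = np.diag([1.0, -1.0])       # metric g_{ab}, chosen for illustration
v_up = np.array([2.0, 3.0])    # contravariant vector v^a (a column vector)
w_dn = g @ v_up                # covector w_a = g_{ab} v^b (a row vector)

T_ud = np.random.rand(2, 2)    # 1. T^a_b : contravariant -> contravariant
out1 = np.einsum('ab,b->a', T_ud, v_up)   # (T v)^a

S_dd = np.random.rand(2, 2)    # 2. S_{ab}: contravariant -> covariant
out2 = np.einsum('ab,b->a', S_dd, v_up)   # (S v)_a

U_uu = np.random.rand(2, 2)    # 3. U^{ab}: covariant -> contravariant
out3 = np.einsum('ab,b->a', U_uu, w_dn)   # (U w)^a

R_du = np.random.rand(2, 2)    # 4. R_a^b : covariant -> covariant
out4 = np.einsum('ab,b->a', R_du, w_dn)   # (R w)_a
[/code]
In every contraction above, the summed pair is one upper and one lower index, which is exactly the grammar rule stated in the list.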

Let's consider a case where our matrix isn't square, say [itex]A_{ab}[/itex], where a is an index in a three-dimensional space and b is an index in a two-dimensional space. In a situation like this, we want the translation into matrix language to respect the fact that it doesn't make sense to add A to its own transpose: you can't add a 3x2 matrix and a 2x3 matrix. This all comes out properly if we interpret the transpose operation as flipping the order of the indices to [itex]A_{ba}[/itex]. Then, as expected, we can't form [itex]A_{ab}+A_{ba}[/itex], because the two terms belong to different spaces.
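A concrete version of that shape argument, sketched in numpy with the dimensions from the example:
[code]
import numpy as np

A = np.random.rand(3, 2)   # A_{ab}: index a in a 3-dim space, b in a 2-dim space

try:
    A + A.T                # A_{ab} + A_{ba}: a 3x2 plus a 2x3
except ValueError as e:
    print(e)               # shapes (3,2) and (2,3) don't broadcast -- different spaces
[/code]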

The convention used in your instructor's eqs. 2.19-20, where they have a mixed (upper-lower) rank-2 tensor, also makes sense according to this logic. They're associating transposition with reversing the two indices. This will in general produce a tensor that belongs to a different space. In their example, they have a tensor that belongs to the upper-lower space (first index is upper, second lower), and when they transpose they get one that belongs to the lower-upper space.

Coffee_ said:
I see, according to all of your answers the convention we saw in class is not entirely correct since we used ##A_{i} \text{ } ^{j} ## to represent the i-th row and the j-th column of the transposed matrix.

I'm not perceiving the same problem that you are. As far as I can tell, everything is consistent.

Maybe what's bothering you is that you want there to be a convention that will tell you whether a given NxN matrix should be associated with an upper-lower tensor like ##A^{a} \text{ } _{b} ##, or a lower-upper one like ##A_{b} \text{ } ^{a} ##. There can't be any such rule. When we write something that looks like a square matrix, and then write its transpose, it still looks like a square matrix. Looking at the two matrices, there would be no way to tell which should be ##A^{a} \text{ } _{b} ## and which should be ##A_{b} \text{ } ^{a} ##. This is just an ambiguity in matrix notation, and it can't be resolved without adding some external information. If I just write you an NxN matrix, you can't tell what space it belongs to. It could belong to any of the following spaces: upper-upper, upper-lower, lower-upper, or lower-lower.
 
  • #10
First of all, thanks for the very elaborate answer.

Secondly I'd like to go in on one paragraph.

bcrowell said:
The convention used in your instructor's eqs. 2.19-20, where they have a mixed (upper-lower) rank-2 tensor, also makes sense according to this logic. They're associating transposition with reversing the two indices. This will in general produce a tensor that belongs to a different space. In their example, they have a tensor that belongs to the upper-lower space (first index is upper, second lower), and when they transpose they get one that belongs to the lower-upper space.

If the convention used here is the correct way to represent the transpose, then clearly my intuition was correct in my original post that ##A_{i} \text{ } ^{j}=A^{j} \text{ } _{i}##. Element-wise, those two should be equal: the first term is the element in the i-th row and j-th column of the matrix A, which should equal the element in the j-th row and i-th column of the transposed matrix, which is exactly the second term, no? The tensors/matrices as wholes aren't equal, but for a fixed i and j, for example 2 and 3, this equality must hold, it seems. At least, that is, if I understood the convention implied by the author here.

If the above is correct, the author appears to contradict his own convention in eq. 2.22 on page 13.
 
  • #11
Coffee_ said:
If the convention used here is the correct way to represent the transpose, then clearly my intuition was correct in my original post that ##A_{i} \text{ } ^{j}=A^{j} \text{ } _{i}##.
Not true, as proved in #6.
Coffee_ said:
Element wise, those two should be equal. Because the first term is the element of the i-th row and j-th column of the A matrix, which should be equal to the element of the j-th row and i-th column of the transposed matrix which is exactly the second term no?

No. As explained in #9, there cannot be any fixed rule for translating back and forth between matrix notation and tensor notation. The fact that you've reached a contradiction here is an indication that you were incorrect in assuming the existence of such a rule.
 
  • #12
Coffee_ said:
If the convention used here is the correct way to represent the transpose, then clearly my intuition was correct in my original post that ##A_{i} \text{ } ^{j}=A^{j} \text{ } _{i}##.
Using your author's notation, (2.20) should be interpreted to mean$$(A^T)_{i} {}^{j}=A^{j} {}_{i} \, .$$What you wrote would be true only if A were symmetric.
 
  • #13
DrGreg said:
Using your author's notation, (2.20) should be interpreted to mean$$(A^T)_{i} {}^{j}=A^{j} {}_{i} \, .$$What you wrote would be true only if A were symmetric.

Oh thanks, this really clears up a lot. I can't believe how long it took me to get this simple fact. I think I understand it now.
 

1. What is tensor notation convention?

Tensor notation convention is a specialized mathematical notation used to represent tensors, which are mathematical objects that describe the relationships between different quantities. It is commonly used in the fields of special and general relativity to represent the geometric properties of spacetime.

2. How is tensor notation convention used in special relativity (SR)?

In special relativity, tensor notation is used to represent the Minkowski metric, which describes the geometry of flat spacetime. It is also used to represent other physical quantities, such as energy, momentum, and force, in a form that transforms covariantly under Lorentz transformations.

3. What is the difference between tensor notation convention in SR and GR?

The main difference is the geometry being described. Special relativity deals with flat spacetime, where global inertial (Minkowski) coordinates exist; general relativity deals with curved spacetime, where no global inertial coordinates exist and one must work in general curvilinear coordinates (such as spherical coordinates).

4. Why is tensor notation convention important in the study of relativity?

Tensor notation convention is important in the study of relativity because it allows for a concise and consistent way of representing the complex geometric relationships between different physical quantities in spacetime. It also ensures that these relationships take the same form under coordinate transformations, making it a powerful tool for understanding the principles of special and general relativity.

5. Are there different types of tensor notation convention?

Yes, there are different types of tensor notation convention, such as index notation (also known as Einstein notation) and abstract index notation. These different notations have their own rules and conventions, but they all serve the same purpose of representing tensors in a consistent and concise way.
