How Do Tensor and Vector Notations Differ in Physics?


Discussion Overview

The discussion revolves around the notation and conceptual understanding of tensors and vectors in physics, particularly focusing on the differences in index placement (upper and lower) and the implications of these distinctions. Participants explore the definitions, types, and applications of tensors, including specific examples like the electromagnetic tensor.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants express confusion about the notation for tensors and vectors, particularly regarding the use of upper and lower indices.
  • It is noted that "tensor" is a general term encompassing various types, with a type-##\binom n m## tensor having ##n## upper indices and ##m## lower indices; a vector is identified as a type-##\binom 1 0## tensor.
  • Participants discuss whether a (2,0) tensor is simply a matrix or if there are distinctions between a (2,0) tensor and a matrix.
  • Some contributions clarify that the rank of a tensor (number of indices) is separate from the position of the indices (upper or lower), and that both can be manipulated using a metric tensor to raise or lower indices.
  • There is mention of the specific nature of vectors and covectors: vectors are sets of numbers that change in a particular way under coordinate transformations, while covectors are linear maps that take a vector and return a scalar.
  • One participant questions whether all tensors of type ##(n,m)## are linear combinations of the form ##v_1 \otimes v_2 \otimes \dots \otimes v_n \otimes v^*_1 \otimes v^*_2 \otimes \dots \otimes v^*_m##.
  • Another participant emphasizes the importance of understanding tensor components with respect to a given basis and the role of the pseudo-metric in raising and lowering indices.

Areas of Agreement / Disagreement

Participants express varying levels of understanding and confusion regarding the notation and implications of tensor indices. There is no consensus on whether a (2,0) tensor is equivalent to a matrix, and the discussion remains unresolved on several points regarding the definitions and applications of tensors.

Contextual Notes

Some statements rely on specific assumptions about the metric and coordinate systems, and the discussion highlights the complexity of tensor notation without resolving these nuances.

Silviu
Hello. I am confused about the notation for tensors and vectors. From what I have seen, a 4-vector is written with an upper index, but a second-rank tensor (the electromagnetic tensor, for example) is also written with upper indices. I attached a screenshot of this. Initially I thought that vectors take an upper index while tensors take a lower index, but now I am really confused (see the second image). What is the actual notation? Thank you!
 

Attachments

  • Screen Shot 2016-08-10 at 10.24.46 PM.png
  • Screen Shot 2016-08-10 at 10.40.30 PM.png
"Tensor" is a general term which can come in many types, with upper indices, lower indices, or both, they're all called "tensors". A type-##\binom n m## tensor has ##n## upper indices and ##m## lower indices. A vector is a type-##\binom 1 0## tensor. A scalar is a type-##\binom 0 0## tensor.

The electromagnetic tensor in your example is a type-##\binom 2 0## tensor. There are also type-##\binom 1 1## and type-##\binom 0 2## tensors.
 
DrGreg said:
"Tensor" is a general term which can come in many types, with upper indices, lower indices, or both, they're all called "tensors". A type-##\binom n m## tensor has ##n## upper indices and ##m## lower indices. A vector is a type-##\binom 1 0## tensor. A scalar is a type-##\binom 0 0## tensor.

The electromagnetic tensor in your example is a type-##\binom 2 0## tensor. There are also type-##\binom 1 1## and type-##\binom 0 2## tensors.
So, if a (1,0) tensor is a vector, does this mean that a (2,0) tensor is just a matrix, or is there a difference between a (2,0) tensor and a matrix? (Sorry for the horizontal notation.)
 
Whether the index is upper or lower is a separate question from how many indexes there are (i.e., the rank of the tensor). As DrGreg says, you can have "vectors" (objects with one index) with either an upper or a lower index, and you can have tensors (objects with two--or more--indexes) with two upper indexes, two lower indexes, or one upper and one lower index.

If you know the metric tensor, you can use it to raise or lower indexes, so in many applications in physics the indexes are simply placed wherever is most convenient. The key thing is to make sure that free indexes on both sides of an equation match, and that contractions are done using one upper and one lower index.
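As a minimal numerical sketch of raising and lowering (assuming the flat Minkowski metric ##\mathrm{diag}(1,-1,-1,-1)## and made-up vector components):

```python
import numpy as np

# Minkowski metric with signature (+, -, -, -) -- an assumption for flat spacetime
eta = np.diag([1.0, -1.0, -1.0, -1.0])

# Hypothetical components of a vector V^a
V_upper = np.array([2.0, 1.0, 0.0, 3.0])

# Lower the index: V_a = eta_{ab} V^b (spatial components flip sign)
V_lower = eta @ V_upper

# Raise it back with the inverse metric (eta is its own inverse here)
V_back = np.linalg.inv(eta) @ V_lower

assert np.allclose(V_back, V_upper)
```

Lowering and then raising recovers the original components, which is why, once a metric is fixed, index placement can be chosen for convenience.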

Strictly speaking, however, upper and lower indexes mean different things. For example, a vector (one upper index) is a set of 4 numbers (in 4-d spacetime) that change in a particular way when you change coordinates. A "covector" (one lower index) is a linear map from vectors to numbers, i.e., it is an object which, when given a vector, gives back a number (a scalar). The standard notation for this is a contraction: the number ##s## obtained when you take a vector ##V^a## and apply to it a covector (linear map) ##C_a## is ##s = V^a C_a = V^0 C_0 + V^1 C_1 + V^2 C_2 + V^3 C_3##.
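The contraction in the last formula is easy to check numerically; here is a sketch with arbitrary made-up components for ##V^a## and ##C_a##:

```python
import numpy as np

# Hypothetical components of a vector V^a and a covector C_a
V = np.array([1.0, 2.0, 3.0, 4.0])
C = np.array([0.5, -1.0, 0.0, 2.0])

# s = V^a C_a = V^0 C_0 + V^1 C_1 + V^2 C_2 + V^3 C_3
s = np.einsum('a,a->', V, C)   # 0.5 - 2.0 + 0.0 + 8.0 = 6.5
```

Note that this contraction involves no metric: a covector acts on a vector directly, which is exactly the "linear map" point above.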
 
PeterDonis said:
Whether the index is upper or lower is a separate question from how many indexes there are (i.e., the rank of the tensor). As DrGreg says, you can have "vectors" (objects with one index) with either an upper or a lower index, and you can have tensors (objects with two--or more--indexes) with two upper indexes, two lower indexes, or one upper and one lower index.

If you know the metric tensor, you can use it to raise or lower indexes, so in many applications in physics the indexes are simply placed wherever is most convenient. The key thing is to make sure that free indexes on both sides of an equation match, and that contractions are done using one upper and one lower index.

Strictly speaking, however, upper and lower indexes mean different things. For example, a vector (one upper index) is a set of 4 numbers (in 4-d spacetime) that change in a particular way when you change coordinates. A "covector" (one lower index) is a linear map from vectors to numbers, i.e., it is an object which, when given a vector, gives back a number (a scalar). The standard notation for this is a contraction: the number ##s## obtained when you take a vector ##V^a## and apply to it a covector (linear map) ##C_a## is ##s = V^a C_a = V^0 C_0 + V^1 C_1 + V^2 C_2 + V^3 C_3##.
Thank you! So, a (2,0) tensor is a normal matrix. And if we have the metric, we can turn it into a (0,2) tensor, which might not be a matrix anymore. Is this right?
 
Are ##\binom n m## tensors usually only those of the type ##v_1 \otimes v_2 \otimes \dots \otimes v_n \otimes v^*_1 \otimes v^*_2 \otimes \dots \otimes v^*_m## or linear combinations of them as well, i.e. all tensors of ##(n,m)## rank?
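One way to see that the linear combinations are genuinely needed: already for rank 2, a sum of simple products need not be a single product. A quick sketch (using numpy outer products to stand in for ##\otimes##, with made-up vectors):

```python
import numpy as np

v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])

# A single product v1 (x) v2 has matrix rank 1 ...
simple = np.outer(v1, v2)
assert np.linalg.matrix_rank(simple) == 1

# ... but the sum v1 (x) v1 + v2 (x) v2 has matrix rank 2,
# so it cannot be written as any single product v (x) w
combo = np.outer(v1, v1) + np.outer(v2, v2)
assert np.linalg.matrix_rank(combo) == 2
```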
 
Silviu said:
So, a (2,0) tensor is a normal matrix. And if we have the metric, we can turn it into a (0,2) tensor, which might not be a matrix anymore. Is this right?

Neither a (2,0) tensor nor a (0,2) tensor is a matrix, but the components of both in a given coordinate system (##A^{ij}## and ##A_{ij}##) can be arranged in a two-dimensional matrix when it is convenient - and it often is, so you see this done all the time.

If you want to get into GR as quickly as possible, https://preposterousuniverse.com/wp-content/uploads/2015/08/grtinypdf.pdf is a good practical summary. Just be warned that in this context "practical" means "ignores all mathematical niceties not required to get results out of the machinery".
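The point above, that type-##\binom 2 0## components can be arranged in a matrix when convenient, can be sketched with the electromagnetic tensor (made-up field values; sign conventions for ##F^{\mu\nu}## vary between textbooks):

```python
import numpy as np

# Illustrative E and B field components (arbitrary numbers)
Ex, Ey, Ez = 1.0, 0.0, 2.0
Bx, By, Bz = 0.0, 3.0, 0.0

# Components F^{mu nu} arranged as a 4x4 matrix (one common sign convention)
F_upper = np.array([
    [0.0, -Ex, -Ey, -Ez],
    [Ex,  0.0, -Bz,  By],
    [Ey,  Bz,  0.0, -Bx],
    [Ez, -By,  Bx,  0.0],
])

# Antisymmetry F^{mu nu} = -F^{nu mu} is easy to check in this layout
assert np.allclose(F_upper, -F_upper.T)
```

The matrix is just a convenient container for the 16 components in a given basis; the tensor itself is the basis-independent object.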
 
DrGreg said:
"Tensor" is a general term which can come in many types, with upper indices, lower indices, or both, they're all called "tensors". A type-##\binom n m## tensor has ##n## upper indices and ##m## lower indices. A vector is a type-##\binom 1 0## tensor. A scalar is a type-##\binom 0 0## tensor.

The electromagnetic tensor in your example is a type-##\binom 2 0## tensor. There are also type-##\binom 1 1## and type-##\binom 0 2## tensors.
To be even more careful, one should stress that here we deal with tensor components with respect to a given Minkowski-orthonormal basis, i.e., you have ##\binom{n}{m}## tensor components, written as a symbol with ##n## upper and ##m## lower indices. You can raise or lower indices with the pseudo-metric components ##\eta^{\mu \nu}## and ##\eta_{\mu \nu}##, where ##(\eta_{\mu \nu})=(\eta^{\mu \nu})=\mathrm{diag}(1,-1,-1,-1)## (in the usual matrix notation for 2nd-rank tensor components).
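A small numerical sketch of lowering both indices with ##\eta## (hypothetical antisymmetric components, Minkowski metric as above):

```python
import numpy as np

# (eta_{mu nu}) = diag(1, -1, -1, -1)
eta = np.diag([1.0, -1.0, -1.0, -1.0])

# Hypothetical antisymmetric type-(2,0) components F^{alpha beta}
F_upper = np.zeros((4, 4))
F_upper[0, 1], F_upper[1, 0] = 1.0, -1.0   # a "time-space" component
F_upper[2, 3], F_upper[3, 2] = 2.0, -2.0   # a "space-space" component

# Lower both indices: F_{mu nu} = eta_{mu alpha} eta_{nu beta} F^{alpha beta}
F_lower = np.einsum('ma,nb,ab->mn', eta, eta, F_upper)

# Time-space components flip sign; space-space components do not
assert F_lower[0, 1] == -1.0
assert F_lower[2, 3] == 2.0
```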
 
