How Do Tensor and Vector Notations Differ in Physics?

Summary
The discussion clarifies the notation differences between tensors and vectors in physics, emphasizing that both can have upper and lower indices. A vector is defined as a type-(1,0) tensor, while the electromagnetic tensor is a type-(2,0) tensor. The distinction between upper and lower indices is crucial, as they represent different mathematical objects: vectors and covectors. It is noted that tensors can be represented as matrices in certain coordinate systems, but they are not inherently matrices. Understanding these notations is essential for proper application in physics, particularly in contexts like general relativity.
Silviu
Hello. I am confused about the notation for tensors and vectors. From what I have seen, the notation for a 4-vector uses an upper index. But for a second-rank tensor (the electromagnetic tensor, for example) the notation also uses upper indices. I attached a screenshot of this. Initially I thought that vectors take an upper index while tensors take a lower index, but now I am really confused (see the second image). What is the actual notation? Thank you!
 

Attachments

  • Screen Shot 2016-08-10 at 10.24.46 PM.png
  • Screen Shot 2016-08-10 at 10.40.30 PM.png
"Tensor" is a general term which can come in many types, with upper indices, lower indices, or both, they're all called "tensors". A type-##\binom n m## tensor has ##n## upper indices and ##m## lower indices. A vector is a type-##\binom 1 0## tensor. A scalar is a type-##\binom 0 0## tensor.

The electromagnetic tensor in your example is a type-##\binom 2 0## tensor. There are also type-##\binom 1 1## and type-##\binom 0 2## tensors.
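A minimal sketch of this bookkeeping (added here for illustration, not part of the original posts): in a fixed basis, the components of a type-##\binom 1 0## tensor can be stored as a one-index array and those of a type-##\binom 2 0## tensor, such as the electromagnetic tensor ##F^{\mu\nu}##, as a two-index array. All numerical values below are placeholders.

```python
import numpy as np

# Components of a type-(1,0) tensor (a 4-vector) in a chosen basis:
# one upper index, four components. Values are placeholders.
V_upper = np.array([1.0, 0.2, 0.0, 0.0])        # V^mu

# Components of a type-(2,0) tensor, e.g. F^{mu nu}: two upper indices,
# a 4x4 array of components. Only one antisymmetric pair of entries is set.
F_upper = np.zeros((4, 4))
F_upper[0, 1], F_upper[1, 0] = 0.5, -0.5        # F^{01} = -F^{10}

print(V_upper.shape)   # (4,)   -> one index
print(F_upper.shape)   # (4, 4) -> two indices
```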
 
DrGreg said:
"Tensor" is a general term which can come in many types, with upper indices, lower indices, or both, they're all called "tensors". A type-##\binom n m## tensor has ##n## upper indices and ##m## lower indices. A vector is a type-##\binom 1 0## tensor. A scalar is a type-##\binom 0 0## tensor.

The electromagnetic tensor in your example is a type-##\binom 2 0## tensor. There are also type-##\binom 1 1## and type-##\binom 0 2## tensors.
So, if a (1,0) tensor is a vector, does this mean that a (2,0) tensor is just a matrix, or is there a difference between a (2,0) tensor and a matrix? (Sorry for the horizontal notation.)
 
Whether the index is upper or lower is a separate question from how many indexes there are (i.e., the rank of the tensor). As DrGreg says, you can have "vectors" (objects with one index) with either an upper or a lower index, and you can have tensors (objects with two--or more--indexes) with two upper indexes, two lower indexes, or one upper and one lower index.

If you know the metric tensor, you can use it to raise or lower indexes, so in many applications in physics the indexes are simply placed wherever is most convenient. The key thing is to make sure that free indexes on both sides of an equation match, and that contractions are done using one upper and one lower index.

Strictly speaking, however, upper and lower indexes mean different things. For example, a vector (one upper index) is a set of 4 numbers (in 4-d spacetime) that change in a particular way when you change coordinates. A "covector" (one lower index) is a linear map from vectors to numbers, i.e., it is an object which, when given a vector, gives back a number (a scalar). The standard notation for this is a contraction: the number ##s## obtained when you take a vector ##V^a## and apply to it a covector (linear map) ##C_a## is ##s = V^a C_a = V^0 C_0 + V^1 C_1 + V^2 C_2 + V^3 C_3##.
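As a concrete check of that contraction formula (an illustrative sketch with made-up component values, not part of the original posts), one can carry out ##s = V^a C_a## and the index-lowering ##V_a = \eta_{ab} V^b## numerically with the Minkowski metric:

```python
import numpy as np

# Minkowski metric components eta_{ab} = diag(1, -1, -1, -1); the vector V^a
# and covector C_a below are arbitrary placeholders.
eta = np.diag([1.0, -1.0, -1.0, -1.0])
V_upper = np.array([2.0, 1.0, 0.0, 0.0])    # V^a
C_lower = np.array([1.0, 3.0, 0.0, 0.0])    # C_a

# Contraction of one upper index with one lower index: s = V^a C_a
s = np.einsum('a,a->', V_upper, C_lower)
print(s)  # 2*1 + 1*3 = 5.0

# Lowering an index with the metric: V_a = eta_{ab} V^b
V_lower = np.einsum('ab,b->a', eta, V_upper)
print(V_lower)  # [ 2. -1. -0. -0.] : the spatial components flip sign
```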
 
PeterDonis said:
Whether the index is upper or lower is a separate question from how many indexes there are (i.e., the rank of the tensor). As DrGreg says, you can have "vectors" (objects with one index) with either an upper or a lower index, and you can have tensors (objects with two--or more--indexes) with two upper indexes, two lower indexes, or one upper and one lower index.

If you know the metric tensor, you can use it to raise or lower indexes, so in many applications in physics the indexes are simply placed wherever is most convenient. The key thing is to make sure that free indexes on both sides of an equation match, and that contractions are done using one upper and one lower index.

Strictly speaking, however, upper and lower indexes mean different things. For example, a vector (one upper index) is a set of 4 numbers (in 4-d spacetime) that change in a particular way when you change coordinates. A "covector" (one lower index) is a linear map from vectors to numbers, i.e., it is an object which, when given a vector, gives back a number (a scalar). The standard notation for this is a contraction: the number ##s## obtained when you take a vector ##V^a## and apply to it a covector (linear map) ##C_a## is ##s = V^a C_a = V^0 C_0 + V^1 C_1 + V^2 C_2 + V^3 C_3##.
Thank you! So, a (2,0) tensor is a normal matrix. And if we have the metric, we can turn it into a (0,2) tensor, which might not be a matrix anymore. Is this right?
 
Are ##\binom n m## tensors usually only those of the type ##v_1 \otimes v_2 \otimes \dots \otimes v_n \otimes v^*_1 \otimes v^*_2 \otimes \dots \otimes v^*_m## or linear combinations of them as well, i.e. all tensors of ##(n,m)## rank?
 
Silviu said:
So, a (2,0) tensor is a normal matrix. And if we have the metric, we can turn it into a (0,2) tensor, which might not be a matrix anymore. Is this right?

Neither a (2,0) tensor nor a (0,2) tensor is a matrix, but the components of both in a given coordinate system (##A^{ij}## and ##A_{ij}##) can be arranged in a two-dimensional matrix when it is convenient - and it often is, so you see this done all the time.

If you want to get into GR as quickly as possible, https://preposterousuniverse.com/wp-content/uploads/2015/08/grtinypdf.pdf is a good practical summary. Just be warned that in this context "practical" means "ignores all mathematical niceties not required to get results out of the machinery".
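To make that concrete (a sketch added here with placeholder numbers, not drawn from the thread), the components ##A^{ij}## can be laid out as a 4x4 array, and ##A_{ij} = \eta_{ik}\,\eta_{jl}\,A^{kl}## gives the corresponding type-##\binom 0 2## components in the same coordinate system:

```python
import numpy as np

eta = np.diag([1.0, -1.0, -1.0, -1.0])     # Minkowski metric components eta_{ij}

# Components A^{ij} of an antisymmetric (2,0) tensor, arranged as a 4x4 array.
# The entries are placeholders chosen only to show the sign pattern.
A_upper = np.zeros((4, 4))
A_upper[0, 1], A_upper[1, 0] = 1.0, -1.0   # one time-space pair
A_upper[2, 3], A_upper[3, 2] = 2.0, -2.0   # one space-space pair

# Lower both indices: A_{ij} = eta_{ik} eta_{jl} A^{kl}
A_lower = np.einsum('ik,jl,kl->ij', eta, eta, A_upper)

print(A_lower[0, 1], A_lower[2, 3])  # -1.0  2.0 : time-space entries flip sign,
                                     # purely spatial entries do not
```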
 
DrGreg said:
"Tensor" is a general term which can come in many types, with upper indices, lower indices, or both, they're all called "tensors". A type-##\binom n m## tensor has ##n## upper indices and ##m## lower indices. A vector is a type-##\binom 1 0## tensor. A scalar is a type-##\binom 0 0## tensor.

The electromagnetic tensor in your example is a type-##\binom 2 0## tensor. There are also type-##\binom 1 1## and type-##\binom 0 2## tensors.
To be even more careful, one should stress that here we deal with tensor components with respect to a given Minkowski-orthonormal basis, i.e., you have type-##\binom{n}{m}## tensor components, written as a symbol with ##n## upper and ##m## lower indices. You can raise or lower indices with the pseudo-metric components ##\eta^{\mu \nu}## and ##\eta_{\mu \nu}##, with ##(\eta_{\mu \nu})=(\eta^{\mu \nu})=\mathrm{diag}(1,-1,-1,-1)## (in the usual matrix notation for 2nd-rank tensor components).
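As a quick worked line in that notation (added here for illustration), lowering the index of a 4-vector with these ##\eta_{\mu\nu}## just flips the sign of the spatial components,
$$V_\mu = \eta_{\mu\nu} V^\nu = (V^0, -V^1, -V^2, -V^3),$$
and contracting with ##\eta^{\mu\nu}## raises the index again, since ##\eta^{\mu\rho}\eta_{\rho\nu} = \delta^\mu_\nu##.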
 
