The Order and Valence of Tensors

In summary, a tensor is a multilinear map taking some number of vector and dual-vector arguments to the real numbers. A (0,1) tensor takes one vector argument and a (1,0) tensor takes one dual-vector argument. The total order of a tensor is the number of its arguments, e.g. a (1,1) tensor has order 2. The metric tensor, which has valence (0,2), takes two vector arguments and returns their scalar product.
  • #1
WelshieTheWhite
I'm having a bit of trouble understanding the nature of tensors (which is pretty central to the gen rel course I'm currently taking).

I understand that the order (or rank) of a tensor is the dimensionality of the array required to describe its components, i.e. a rank 0 tensor is a scalar, a rank 1 tensor is a vector (so just a one-dimensional array or "line" of components), a rank 2 tensor is a matrix (a 2-dimensional array), etc.

Additionally, I understand the valence of a tensor depends on how many covariant and contravariant vector arguments it has (old terminology, I believe new terminology is vector and dual-vector), i.e. a (0,1) tensor has one vector argument and a (1,0) tensor has one dual vector argument. Also, the total order of the tensor is the sum of the number of the arguments, i.e. a (1,1) tensor has order 2.

Now take the metric tensor, which is a tensor with valence (0,2). It takes two vector arguments and provides the scalar product of the two of them.

[tex]g(\vec{A},\vec{B}) = \vec{A}\cdot\vec{B} = A^αB^βη_{αβ}[/tex]

Where the summation convention is implied. This is where I start getting confused. Based on the previous logic, the metric tensor should be a tensor of order 2, and hence be a matrix, but it returns a scalar result which is order 0. I understand that η represents the components of the metric tensor in a matrix form (in spacetime):

[tex]η_{αβ} = \begin{pmatrix}-1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1\end{pmatrix}[/tex]

Which IS of order 2. Does this mean that an order 2 tensor doesn't necessarily have to be a matrix itself, but it's components do? And does that mean that all tensors are scalars with appropriate components, or can an order 2 tensor potentially be a matrix as well as its components?

Sorry if this is a bit all over the place. I think one source of my confusion comes from this passage in the textbook "A First Course in General Relativity (2nd Edition)" by Schutz:

"A tensor of type (0,N) is a function of N vectors into the real numbers, which is linear in each of its N arguments"

So shedding a bit of light on that may help a bit.

Thank you for any help you can give me!
 
  • #2
Careful with what you call matrices and tensors. A tensor (of rank 2 or lower) can be represented by listing its components in a given basis as a matrix, but that does not make a matrix a tensor. Many other things that are not tensors can also be represented by matrices; one such example is the array of transformation coefficients under coordinate transformations.
WelshieTheWhite said:
Based on the previous logic, the metric tensor should be a tensor of order 2, and hence be a matrix, but it returns a scalar result which is order 0.
As it should. It is a linear map taking two vectors to the real numbers, which is one possible definition of a (0,2) tensor.

WelshieTheWhite said:
And does that mean that all tensors are scalars with appropriate components, or can an order 2 tensor potentially be a matrix as well as its components?
This does not make much sense to me. A tensor is not a matrix, nor is it a scalar unless it has rank zero.
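
For concreteness, a minimal worked example (my own, using the ##\eta_{\alpha\beta}## quoted in post #1): the matrix of components is only bookkeeping, while the tensor itself is the map

[tex]g(\vec{A},\vec{B}) = A^\alpha B^\beta \eta_{\alpha\beta} = -A^0 B^0 + A^1 B^1 + A^2 B^2 + A^3 B^3,[/tex]

which for ##\vec{A} = (1,0,0,0)## and ##\vec{B} = (2,0,0,0)## returns the single number ##-2##.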
 
  • #3
Ok, thanks for your reply, it made me go back to basics and rethink it.

I think I confused the capacity for a second order tensor to have its components represented as a matrix with the tensor ITSELF actually being a matrix. So it makes sense that the metric tensor can have its components represented as a matrix, but as you said it's just a linear map taking two vectors to a real number.

In that sense, is it possible to have a (0,2) tensor that maps from a vector space to a vector space and the tensor to be a vector quantity?
 
  • #4
WelshieTheWhite said:
In that sense, is it possible to have a (0,2) tensor that maps from a vector space to a vector space and the tensor to be a vector quantity?
It is not very clear what you mean, but I am going to try to decipher it. A tensor is always a tensor of its given type; it is never a scalar (unless rank 0) or a vector (unless rank 1). A tensor ##T## of type (0,2) defines a bilinear map taking two tangent vectors to a scalar. For any (0,2) tensor ##T## and tangent vector ##X##, you can define the map ##\omega(Y) = T(Y,X)##, where ##Y## is a tangent vector. This ##\omega## is a dual vector (it is a linear map from tangent vectors to scalars), so ##T## also defines a linear map from the tangent vector space to the dual vector space, ##X \mapsto T(\cdot, X)##.

The tensor itself is however not a vector, nor is it a scalar, it is a rank 2 tensor.
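
To make that concrete, a small worked example (my own, using the Minkowski components ##\eta_{\alpha\beta}## from post #1): feeding a single vector ##X## into one slot of the metric produces the dual vector with components ##\omega_\alpha = \eta_{\alpha\beta} X^\beta##, e.g.

[tex]X^\alpha = (2,1,0,0) \quad \Rightarrow \quad \omega_\alpha = (-2,1,0,0),[/tex]

so that ##\omega(Y) = -2Y^0 + Y^1## for any tangent vector ##Y##.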
 
  • #5
WelshieTheWhite said:
I understand that the order (or rank) of a tensor is the dimensionality of the array required to describe its components

No, that's not correct. The correct statement is that the order or rank of a tensor is the number of arguments you have to give it in order to get back a number. For example, a rank (0, 2) tensor has to be given two vector arguments in order to give back a number.

It's very important not to confuse tensors themselves with their representations (a matrix of tensor components is a representation). The important facts about tensors can be defined and stated without ever having to choose a representation, i.e., without ever having to work with the components of the tensor or even know that a tensor can have components in a particular representation.
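
For concreteness, here is a minimal numerical sketch of that "tensor as a function" viewpoint (my own, in Python with numpy, using the ##\eta_{\alpha\beta}## from post #1):

[code]
import numpy as np

# Components of the Minkowski metric in an inertial basis, signature (-,+,+,+).
eta = np.diag([-1.0, 1.0, 1.0, 1.0])

def g(A, B):
    """The metric as a rank (0,2) tensor: two vector arguments in, one number out."""
    return np.einsum('a,b,ab->', A, B, eta)

A = np.array([1.0, 0.0, 0.0, 0.0])
B = np.array([2.0, 1.0, 0.0, 0.0])
print(g(A, B))  # -2.0
[/code]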
 
  • #6
One should start with the definition of a tensor of rank ##r##. You have a ##d##-dimensional vector space ##V## over the real numbers as the scalars. A multilinear form ##T:V^r \rightarrow \mathbb{R}## is then by definition a tensor of rank ##r##.

Now take a basis ##\vec{b}_k## (##k \in \{1,\ldots,d \}##). Then any vector can be uniquely decomposed in terms of this basis
$$\vec{x}=x^k \vec{b}_k,$$
where summation over repeated indices is implied (Einstein summation convention).

Now, since the tensor is a multilinear mapping, to know its value on any given set of ##r## vectors it is sufficient to know ##d^r## numbers, the components of the tensor with respect to the given basis,
$$T_{k_1\ldots k_r}=T(\vec{b}_{k_1},\ldots,\vec{b}_{k_r}),$$
because then obviously
$$T(\vec{x}_1,\ldots,\vec{x}_r)=T_{k_1 \ldots k_r} x_1^{k_1} \cdots x_r^{k_r}.$$
That's it.
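
A sketch of that component formula in action (my own illustration, in Python with numpy, taking ##d = 4## and ##r = 3## with randomly chosen components):

[code]
import numpy as np

rng = np.random.default_rng(0)
d, r = 4, 3

# The d**r components T_{k1 k2 k3} of a rank-3 tensor in some basis.
T = rng.normal(size=(d,) * r)

# Three arbitrary vectors, given by their components x^k in the same basis.
x1, x2, x3 = (rng.normal(size=d) for _ in range(r))

# T(x1, x2, x3) = T_{k1 k2 k3} x1^{k1} x2^{k2} x3^{k3} (Einstein summation)
value = np.einsum('abc,a,b,c->', T, x1, x2, x3)
print(value)  # a single real number, as the definition promises
[/code]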

As you can see, the tensor as a mapping is independent of the basis, but the tensor components of course depend on the basis. If you introduce another basis ##\vec{b}_k'## with
$$\vec{b}_j=\vec{b}_k' {B^k}_j,$$
then you have
$$T_{j_1\ldots j_r}=T(\vec{b}_{j_1},\ldots,\vec{b}_{j_r})={B^{k_1}}_{j_1} \cdots {B^{k_r}}_{j_r} T'_{k_1\ldots k_r}.$$
Components transforming like this are said to transform covariantly, i.e., like the basis vectors.

For the vector components you have
$$\vec{x}=x^j \vec{b}_j=x^j {B^k}_j \vec{b}_{k}' \; \Rightarrow \; x^{\prime k}={B^k}_j x^j.$$
Such components are said to transform contravariantly.

Now these co- and contravariant transformation behaviors of the tensor components (lower indices!) and the vector components (upper indices) conspire precisely so that the value of the tensor on its arguments does not change, as it should:
$$T(\vec{x}_1,\ldots,\vec{x}_r)=T_{j_1 \cdots j_r} x_1^{j_1}\cdots x_r^{j_r} = T'_{k_1 \cdots k_r} {B^{k_1}}_{j_1} \cdots {B^{k_r}}_{j_r} x_1^{j_1}\cdots x_r^{j_r} = T'_{k_1 \cdots k_r} x_1^{\prime k_1} \cdots x_r^{\prime k_r}.$$
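
A small numerical check of this invariance (my own sketch, in Python with numpy; the change-of-basis matrix ##B## is a random matrix here, assumed invertible):

[code]
import numpy as np

rng = np.random.default_rng(1)
d = 3

T = rng.normal(size=(d, d))      # components T_{j1 j2} in the old basis
x1, x2 = rng.normal(size=d), rng.normal(size=d)
B = rng.normal(size=(d, d))      # change of basis: b_j = b'_k B^k_j
Binv = np.linalg.inv(B)

# Covariant rule T_{j1 j2} = B^{k1}_{j1} B^{k2}_{j2} T'_{k1 k2}, solved for T':
Tp = Binv.T @ T @ Binv
# Contravariant rule for the vector components: x'^k = B^k_j x^j
x1p, x2p = B @ x1, B @ x2

print(x1 @ T @ x2)     # value in the old basis
print(x1p @ Tp @ x2p)  # same value in the new basis, up to rounding
[/code]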

That's how far you can get without introducing more ideas. The next logical step is to introduce covectors as the tensors of rank 1, which also form a ##d##-dimensional vector space. Then you can define the dual basis of the covector space with respect to a given basis of the vector space, and contravariant tensor components.

Then there is a special class of vector spaces on which a non-degenerate bilinear form is defined. Examples are Euclidean space, where the bilinear form is positive definite, and Minkowski space, where it is indefinite with signature (1,3). There you can define a canonical (i.e., basis-independent) mapping between vectors and covectors, and so on.
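
Spelling out that canonical mapping in components (a small addition of my own, using the Minkowski ##\eta_{\alpha\beta}## from post #1): the bilinear form maps a vector ##X## to the covector with components ##X_\alpha = \eta_{\alpha\beta} X^\beta##, e.g. ##X^\alpha = (1,2,0,0) \mapsto X_\alpha = (-1,2,0,0)##. Because the map is built from the bilinear form alone, it is the same map in every basis, and non-degeneracy guarantees it is invertible.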
 
  • #7
WelshieTheWhite said:
[tex]g(\vec{A},\vec{B}) = \vec{A}\cdot\vec{B} = A^αB^βη_{αβ}[/tex]

Where the summation convention is implied. This is where I start getting confused. Based on the previous logic, the metric tensor should be a tensor of order 2, and hence be a matrix, but it returns a scalar result which is order 0. I understand that η represents the components of the metric tensor in a matrix form (in spacetime):

Summing over an index pair removes one free index, so summing ##\eta_{\alpha\beta}## against both ##A^\alpha## and ##B^\beta## removes both of its indices and gives a rank-0 object, i.e. a scalar.
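
Step by step (a small illustration of my own): each contraction removes one free index,

[tex]A^\alpha \eta_{\alpha\beta} = A_\beta \;(\text{rank 1}), \qquad A_\beta B^\beta = \vec{A}\cdot\vec{B} \;(\text{rank 0}).[/tex]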
 

1. What is the definition of a tensor?

A tensor is a multilinear map that takes some number of vector and dual-vector arguments to the real numbers. It is described by its order, the total number of indices needed to represent its components, and its valence, the split of those indices into contravariant and covariant ones.

2. How is the order of a tensor determined?

The order of a tensor is the number of indices needed to label its components, which equals the number of arguments it takes. For example, a scalar (a single number) is a tensor of order 0, the components of a vector form a list (order 1), and the components of an order-2 tensor can be arranged in a matrix (a 2-dimensional array), although, as discussed above, the tensor itself is not that matrix.

3. What is the difference between contravariant and covariant indices?

A covariant index labels components that transform in the same way as the basis vectors under a change of basis, while a contravariant index labels components that transform with the inverse transformation. Both kinds of components change under a change of basis; it is the opposite transformation behaviors that make contractions such as ##A^\alpha B^\beta \eta_{\alpha\beta}## basis independent.
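
A simple example (not from the thread): if each basis vector is doubled, ##\vec{b}_k' = 2\vec{b}_k##, then the components of a fixed vector are halved, ##x'^k = x^k/2## (contravariant), while the components of a fixed covector are doubled, ##\omega'_k = \omega(\vec{b}_k') = 2\omega_k## (covariant, i.e. like the basis).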

4. How is the valence of a tensor determined?

The valence of a tensor is the pair of numbers of contravariant and covariant indices. For example, a vector has one contravariant index and no covariant indices, so its valence is (1,0). A tensor with two contravariant indices and one covariant index has valence (2,1), and its order is 3.

5. What is the importance of tensors in physics?

Tensors are an essential tool in physics for describing physical quantities such as force, momentum, and stress. They allow for the representation of these quantities in multiple dimensions and can be used to describe how they change in different directions. Tensors are used extensively in fields such as mechanics, electromagnetism, and relativity.
