## Main Question or Discussion Point

I'm having a bit of trouble understanding the nature of tensors (which is pretty central to the general relativity course I'm currently taking).

I understand that the order (or rank) of a tensor is the dimensionality of the array required to describe its components, i.e. a rank-0 tensor is a scalar, a rank-1 tensor is a vector (just a one-dimensional array or "line" of components), a rank-2 tensor is a matrix (a two-dimensional array), etc.

Additionally, I understand that the valence of a tensor depends on how many covariant and contravariant vector arguments it has (older terminology; I believe the newer terms are, respectively, dual vector (or one-form) and vector), i.e. a (0,1) tensor has one vector argument and a (1,0) tensor has one dual-vector argument. Also, the total order of the tensor is the sum of the numbers of arguments, i.e. a (1,1) tensor has order 2.
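To convince myself of this counting, I tried a quick numerical sketch in numpy (the tensor and vector components below are made up, just for illustration): a (1,1) tensor takes one dual vector and one vector, and fully contracting over both arguments leaves a single number.

```python
import numpy as np

# A hypothetical (1,1) tensor T^alpha_beta, stored as a 4x4 component array
T = np.arange(16.0).reshape(4, 4)

# One dual-vector argument (components w_alpha) and one vector argument (components v^beta)
w = np.array([1.0, 0.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 0.0, 1.0])

# Full contraction w_alpha T^alpha_beta v^beta: two arguments in, one real number out
result = np.einsum('a,ab,b->', w, T, v)
print(result)  # -> 44.0
```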

Now take the metric tensor, which has valence (0,2). It takes two vector arguments and returns their scalar product:

[tex]g(\vec{A},\vec{B}) = \vec{A}\cdot\vec{B} = A^\alpha B^\beta \eta_{\alpha\beta}[/tex]

where the summation convention is implied. This is where I start getting confused: based on the previous logic, the metric tensor should be a tensor of order 2, and hence a matrix, but it returns a scalar result, which is order 0. I understand that η represents the components of the metric tensor in matrix form (in flat spacetime):

[tex]\eta_{\alpha\beta} = \begin{pmatrix}-1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1\end{pmatrix}[/tex]
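To make the "matrix of components in, scalar out" behaviour concrete, here is a small numerical sketch (the two 4-vectors are arbitrary made-up examples) of the contraction [itex]A^\alpha B^\beta \eta_{\alpha\beta}[/itex]:

```python
import numpy as np

# Minkowski metric components eta_{alpha beta}, signature (-,+,+,+)
eta = np.diag([-1.0, 1.0, 1.0, 1.0])

# Two hypothetical 4-vectors (components A^alpha and B^beta)
A = np.array([2.0, 1.0, 0.0, 3.0])
B = np.array([1.0, 4.0, 2.0, 0.0])

# Contract over both indices: g(A, B) = A^alpha B^beta eta_{alpha beta}
scalar = np.einsum('a,b,ab->', A, B, eta)
print(scalar)  # -> 2.0  (a single number, even though eta is a 2D array)
```

The metric's components form a two-dimensional array, yet the output of feeding it two vectors is a plain real number.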

which IS of order 2. Does this mean that an order-2 tensor doesn't necessarily have to be a matrix itself, but its components do? And does that mean that all tensors are scalars with appropriate components, or can an order-2 tensor potentially be a matrix as well as its components?

Sorry if this is a bit all over the place. I think one source of my confusion comes from this passage in the textbook "A First Course in General Relativity (2nd Edition)" by Schutz:

"A tensor of type (0,N) is a function of N vectors into the real numbers, which is linear in each of its N arguments"

So shedding a bit of light on that may help a bit.
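As a sanity check on that "linear in each of its N arguments" wording, I also verified linearity of g numerically (again with made-up vectors and coefficients):

```python
import numpy as np

# Minkowski metric components, signature (-,+,+,+)
eta = np.diag([-1.0, 1.0, 1.0, 1.0])

def g(A, B):
    """g as a function of two vectors into the real numbers."""
    return np.einsum('a,b,ab->', A, B, eta)

# Arbitrary example vectors and scalars
A = np.array([2.0, 1.0, 0.0, 3.0])
B = np.array([1.0, 4.0, 2.0, 0.0])
C = np.array([0.0, 1.0, 1.0, 1.0])
a, b = 3.0, -2.0

# Linearity in the first argument: g(aA + bC, B) == a*g(A,B) + b*g(C,B)
lhs = g(a * A + b * C, B)
rhs = a * g(A, B) + b * g(C, B)
print(lhs == rhs)  # -> True
```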

Thank you for any help you can give me!
