
I The Order and Valence of Tensors

  1. Apr 18, 2016 #1
    I'm having a bit of trouble understanding the nature of tensors (which is pretty central to the gen rel course I'm currently taking).

    I understand that the order (or rank) of a tensor is the dimensionality of the array required to describe its components, i.e. a rank 0 tensor is a scalar, a rank 1 tensor is a vector (so just a one dimensional array or "line" of components), a rank 2 tensor is a matrix (2 dimensional array), etc.

    Additionally, I understand the valence of a tensor depends on how many covariant and contravariant vector arguments it has (old terminology; I believe the new terminology is dual vector and vector, respectively), i.e. a (0,1) tensor has one vector argument and a (1,0) tensor has one dual vector argument. Also, the total order of the tensor is the sum of the two numbers, i.e. a (1,1) tensor has order 2.

    Now take the metric tensor, which is a tensor with valence (0,2). It takes two vector arguments and provides the scalar product of the two of them.

    [tex]g(\vec{A},\vec{B}) = \vec{A}\cdot\vec{B} = A^\alpha B^\beta \eta_{\alpha\beta}[/tex]

    Where the summation convention is implied. This is where I start getting confused: based on the previous logic, the metric tensor should be a tensor of order 2, and hence be a matrix, but it returns a scalar result, which is order 0. I understand that η represents the components of the metric tensor in matrix form (in flat spacetime):

    [tex]\eta_{\alpha\beta} = \begin{pmatrix}-1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1\end{pmatrix}[/tex]

    Which IS of order 2. Does this mean that an order 2 tensor doesn't necessarily have to be a matrix itself, but its components do? And does that mean that all tensors are scalars with appropriate components, or can an order 2 tensor potentially be a matrix as well as its components?
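    Writing out the implied sum explicitly with the components of η above (just to show what I mean):

    [tex]A^\alpha B^\beta \eta_{\alpha\beta} = -A^0 B^0 + A^1 B^1 + A^2 B^2 + A^3 B^3[/tex]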

    Sorry if this is a bit all over the place. I think one source of my confusion comes from this passage in the textbook "A First Course in General Relativity (2nd Edition)" by Schutz:

    "A tensor of type (0,N) is a function of N vectors into the real numbers, which is linear in each of its N arguments"

    So shedding a bit of light on that may help a bit.

    Thank you for any help you can give me!
     
  3. Apr 18, 2016 #2

    Orodruin (Staff Emeritus, Science Advisor, Homework Helper, Gold Member)

    Be careful with what you call matrices and tensors. A tensor (of rank 2 or lower) can be represented by listing its components in a given basis in a matrix, but that does not make a matrix a tensor. Many other things that are not tensors can also be represented by matrices; one example is the set of transformation coefficients under coordinate transformations.
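    To sketch the distinction in formulas: under a change of coordinates the components of a (0,2) tensor pick up two transformation factors,
    $$g'_{\mu\nu} = \frac{\partial x^\alpha}{\partial x'^\mu}\frac{\partial x^\beta}{\partial x'^\nu} g_{\alpha\beta},$$
    whereas the array of coefficients ##\partial x^\alpha/\partial x'^\mu## is just an array of numbers attached to the coordinate change; it is not itself required to transform in this way.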
    As for the metric returning a scalar: that is exactly as it should be. The metric is a linear map taking two vectors to the real numbers, and this is (one possible) definition of a (0,2) tensor.

    As for whether all tensors are "scalars with components": this does not make much sense to me. A tensor is not a matrix, nor is it a scalar unless it has rank zero.
     
  4. Apr 18, 2016 #3
    Ok, thanks for your reply, it made me go back to basics and rethink it.

    I think I confused the capacity for a second order tensor to have its components represented as a matrix with the tensor ITSELF actually being a matrix. So it makes sense that the metric tensor can have its components represented as a matrix, but as you said it's just a linear map taking two vectors to a real number.

    In that sense, is it possible to have a (0,2) tensor that maps from a vector space to a vector space and the tensor to be a vector quantity?
     
  5. Apr 18, 2016 #4

    Orodruin (Staff Emeritus, Science Advisor, Homework Helper, Gold Member)

    It is not very clear what you mean, but I am going to try to decipher it. A tensor is always a tensor of its given type; it is never a scalar (unless rank 0) or a vector (unless rank 1). A tensor of type (0,2) defines a bilinear map taking two tangent vectors to a scalar. For any (0,2) tensor ##T## and tangent vector ##X##, you can define the map ##\omega(Y) = T(Y,X)##, where ##Y## is a tangent vector. Thus ##\omega## is a dual vector (it is a linear map from tangent vectors to scalars), and ##T## also defines a linear map from the tangent vector space to the dual vector space, ##X \to T(\cdot, X)##.
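    As a concrete sketch, taking ##T## to be the metric from the first post: the dual vector associated with ##X## then has components ##\omega_\alpha = \eta_{\alpha\beta}X^\beta##, which is the familiar lowering of an index.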

    The tensor itself is, however, not a vector, nor is it a scalar; it is a rank 2 tensor.
     
  6. Apr 18, 2016 #5

    PeterDonis (Mentor, 2016 Award)

    No, that's not correct: the order (or rank) of a tensor is not the dimensionality of an array of its components. The correct statement is that the order or rank of a tensor is the number of arguments you have to give it in order to get back a number. For example, a rank (0, 2) tensor has to be given two vector arguments in order to give back a number.
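    As a sketch: ##g(\vec{A},\vec{B})## is a number, while ##g(\vec{A},\,\cdot\,)## still waits for one more vector argument and is therefore a rank 1 object (a dual vector).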

    It's very important not to confuse tensors themselves with their representations (a matrix of tensor components is a representation). The important facts about tensors can be defined and stated without ever having to choose a representation, i.e., without ever having to work with the components of the tensor or even know that a tensor can have components in a particular representation.
     
  7. Apr 18, 2016 #6

    vanhees71 (Science Advisor, 2016 Award)

    One should start with the definition of a tensor of rank ##r##. You have a ##d##-dimensional vector space ##V## over the real numbers as the scalars. Then a multilinear form ##T:V^r \rightarrow \mathbb{R}## is by definition a tensor of rank ##r##.

    Now take a basis ##\vec{b}_k## (##k \in \{1,\ldots,d \}##). Then any vector can be uniquely decomposed in terms of this basis
    $$\vec{x}=x^k \vec{b}_k,$$
    where summation over repeated indices is implied (Einstein summation convention).

    Now, since the tensor is a multilinear mapping, to know its value for any given set of ##r## vectors it is sufficient to know the ##d^r## numbers, the components of the tensor with respect to the given basis,
    $$T_{k_1\ldots k_r}=T(\vec{b}_{k_1},\ldots,\vec{b}_{k_r}),$$
    because then obviously
    $$T(\vec{x}_1,\ldots,\vec{x}_r)=T_{k_1 \ldots k_r} x_1^{k_1} \cdots x_r^{k_r}.$$
    That's it.
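    A minimal numerical sketch of that component formula for ##r=2##, with the Minkowski components from the first post as the tensor and two made-up vectors (the numbers are purely illustrative):

    Code (Python):
    import numpy as np

    # Components eta_{alpha beta} of the Minkowski metric in an orthonormal basis (signature -+++)
    eta = np.diag([-1.0, 1.0, 1.0, 1.0])

    # Contravariant components A^alpha and B^beta of two made-up vectors
    A = np.array([2.0, 1.0, 0.0, 0.0])
    B = np.array([3.0, 0.0, 1.0, 0.0])

    # T(A, B) = T_{alpha beta} A^alpha B^beta: the double sum over the components
    print(np.einsum('ab,a,b->', eta, A, B))  # -6.0 = -2*3 + 1*0 + 0*1 + 0*0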

    As you see, the tensor as a mapping is independent of the basis, but the tensor components of course depend on that basis. If you introduce another basis ##\vec{b}_k'## with
    $$\vec{b}_j=\vec{b}_k' {B^k}_j.$$
    Then you have
    $$T_{j_1\ldots j_r}=T(\vec{b}_{j_1},\ldots,\vec{b}_{j_r})={B^{k_1}}_{j_1} \cdots {B^{k_r}}_{j_r} T_{k_1\ldots k_r}'.$$
    Components that transform like this are said to transform in the covariant way, i.e., like the basis vectors.

    For the vector components you have
    $$\vec{x}=x^j \vec{b}_j=x^j {B^k}_j \vec{b}_k' \; \Rightarrow \; x^{\prime k}={B^k}_j x^j.$$
    Such components are said to transform in the contravariant way.

    Now these co- and contravariant transformation behaviors of the tensor components (lower indices!) and the vector components (upper indices) conspire precisely so that the value of the tensor, given its arguments, doesn't change, as it should:
    $$T(\vec{x}_1,\ldots,\vec{x}_r)=T_{j_1 \cdots j_r} x_1^{j_1}\cdots x_r^{j_r} = T_{k_1 \cdots k_r} '{B^{k_1}}_{j_1} \cdots {B^{k_r}}_{j_r} x_1^{j_1}\cdots x_r^{j_r} = T_{k_1 \cdots k_r}' x_1^{\prime k_1} \cdots x_r^{\prime k_r}.$$

    That's how far you can get without introducing more ideas. The next logical step is to introduce covectors as the tensors of rank 1, which also form a ##d##-dimensional vector space. Then you can define the dual basis of the covector space corresponding to a given basis of the vector space, and with it contravariant tensor components.

    Then there's a special class of vector spaces on which you have defined some non-degenerate bilinear form. Examples are Euclidean space, where you have a positive definite bilinear form, or Minkowski space, where you have an indefinite bilinear form of signature (1,3). Then you can define a canonical (i.e., basis independent) mapping between vectors and covectors, and so on.
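    To sketch what that canonical map looks like in components, with the Minkowski metric from the first post: the covector associated with a vector ##X## has components
    $$X_\mu = \eta_{\mu\nu}X^\nu, \qquad \text{i.e.} \qquad X_0=-X^0,\quad X_1=X^1,\quad X_2=X^2,\quad X_3=X^3.$$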
     
  8. Apr 18, 2016 #7
    Summing over an index reduces the rank of the resulting object, so summing over two indices gives a rank 0 object, or scalar.
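    As a sketch with the metric from the first post: ##\eta_{\alpha\beta}A^\alpha## still has one free index, so it is a rank 1 object (a dual vector); contracting with ##B^\beta## as well removes the last free index, leaving the scalar ##A^\alpha B^\beta\eta_{\alpha\beta}##.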
     