# Mathematical Definition of Tensor?

Gold Member
One encounters tensors in physics writings about SR, GR and relativistic QM. But physicists do not seem to describe tensors in a precise, purely mathematical way. Is there a precise, purely mathematical definition of the same tensors that physicists refer to? Thanks in advance.


Hurkyl
Behold, the tensor product.

Differential geometry starts with three basic objects:
(1) The line bundle -- the thing in which scalar fields live
(2) The tangent bundle -- the thing in which vector fields live
(3) The cotangent bundle -- the thing in which covector fields live

And from there uses the tensor product to construct other interesting bundles. The sections of those bundles are what you would call "tensor fields".

One of the easiest ways to think about a tensor is simply as a multilinear, real-valued function defined on the Cartesian product of some number of copies of a vector space and its dual space. For example, a rank (0,2) tensor assigns a real number to a pair of vectors and is linear in each argument. An example is an $n \times n$ matrix $A$, which acts on two (column) vectors $v$ and $w$ via $v^T A w$.
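As a concrete sketch of the matrix realization above (all numbers made up), one can check numerically that $v^T A w$ really is linear in each slot separately:

```python
import numpy as np

# A hypothetical 3x3 matrix A realizes a rank (0,2) tensor:
# it eats two vectors and returns a number, linearly in each slot.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
v, w, u = rng.standard_normal(3), rng.standard_normal(3), rng.standard_normal(3)

def T(x, y):
    return x @ A @ y   # v^T A w as a bilinear function

# Multilinearity: linear in each argument separately.
a, b = 2.0, -3.0
assert np.isclose(T(a*v + b*u, w), a*T(v, w) + b*T(u, w))
assert np.isclose(T(v, a*w + b*u), a*T(v, w) + b*T(v, u))
```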

> One encounters tensors in physics writings about SR, GR and relativistic QM. But physicists do not seem to describe tensors in a precise, purely mathematical way. Is there a precise, purely mathematical definition of the same tensors that physicists refer to? Thanks in advance.

In physics, tensors are defined in terms of local coordinates. This definition is rigorous and purely mathematical, but it is a pain to work with until you get used to it.

Tensors are multilinear maps from products of copies of a vector space and its dual into the base field, usually the real or complex numbers. On a manifold, these maps exist on each tangent space and are generally required to vary smoothly from point to point. When one changes coordinates, the local expressions for these maps change according to the transformation rules described in physics books.

It is worth working out how the physics definition really is just a local-coordinate description of multilinear maps.

The way I learned it was to take a basis for the vector space, adjoin its dual basis, and just check that the tensor transformation rules are exactly the change-of-coordinates formulas.
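A small numerical sketch of that check (all numbers hypothetical): for a rank (0,2) tensor, the transformation rule is just the change-of-basis formula $T' = P^T T P$, and the number the tensor assigns to a pair of vectors comes out the same in either basis:

```python
import numpy as np

# Components of a rank (0,2) tensor in the old basis, and a change-of-basis
# matrix P whose columns express the new basis vectors in the old basis.
rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3))
P = rng.standard_normal((3, 3))      # invertible with probability 1

T_new = P.T @ T @ P                  # the "tensor transformation rule"

# The same geometric vectors, expressed in each basis:
v_old, w_old = rng.standard_normal(3), rng.standard_normal(3)
v_new = np.linalg.solve(P, v_old)    # v_new = P^{-1} v_old
w_new = np.linalg.solve(P, w_old)

# The value the tensor assigns is basis-independent:
assert np.isclose(v_old @ T @ w_old, v_new @ T_new @ w_new)
```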

Much of modern mathematics now uses the physics way of describing tensors. For instance, if you look at Richard Hamilton's papers on Ricci flow, you will see the local-coordinate notation. In fact, much of the work on PDEs on manifolds is expressed in the physics style. There is no escaping it.

I think physicists came up with the local-coordinate method because they asked when measurements are independent of observers. If you think of observers as defining different frames of reference, then you want a physical measurement to be the same in every frame. This means that the formulas for these measurements must transform in such a way that, in the new coordinates, they yield the same value. This is precisely what tensors do. Hermann Weyl's classic book *Space, Time, Matter* discusses this interesting line of thought.

Hurkyl
It's dangerous to fix one realization of tensors in your mind -- one of the great strengths of multilinear algebra is the wealth of different ways objects interact.

Consider plain boring old linear algebra: if you have a linear transformation $T : V \to W$, what can you do with it? The most obvious thing is that you can "multiply" it with a vector $v \in V$ and get a new vector $Tv \in W$. But you can also multiply it against a covector $\omega \in W^*$ and compute a new covector $\omega T \in V^*$. And what if you had a linear transformation $S : W \to X$? You can multiply to get a new linear transformation $ST : V \to X$. What about a linear transformation $R : U \to V$? You can multiply to get a linear transformation $TR : U \to W$. What if you had yet another linear transformation $Q : U \to X$? You can take the direct sum to get a linear transformation $T \oplus Q : V \oplus U \to W \oplus X$...
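A quick NumPy sketch of a few of these interactions (the dimensions are chosen arbitrarily), with the linear maps realized as matrices:

```python
import numpy as np

# Hypothetical dimensions for U -> V -> W -> X, maps acting on the left.
rng = np.random.default_rng(2)
dim_U, dim_V, dim_W, dim_X = 2, 3, 4, 5
T = rng.standard_normal((dim_W, dim_V))   # T : V -> W
S = rng.standard_normal((dim_X, dim_W))   # S : W -> X
R = rng.standard_normal((dim_V, dim_U))   # R : U -> V

v = rng.standard_normal(dim_V)            # vector in V
omega = rng.standard_normal(dim_W)        # covector in W*

Tv = T @ v            # T eats a vector of V, giving a vector of W
omegaT = omega @ T    # ...or a covector of W*, giving a covector on V
ST = S @ T            # composition V -> X
TR = T @ R            # composition U -> W

# The covector omega T acts on v the same way omega acts on Tv:
assert np.isclose(omegaT @ v, omega @ Tv)
```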

In linear algebra, it's important not to get yourself locked into the "T is a function that turns vectors in V into vectors in W" mindset, because it is very important to be able to mentally use T in all these different ways without so much as blinking.

Multilinear algebra gives you even more options to play with.

Gold Member
A tensor seems to be a generalization of the old vector dot product: general in the sense that the dimension of the vector space can be greater than 3, and general in the sense that the "operation" can involve more than 2 vectors at a time. Does that make any sense?

That makes some sense, in that tensor analysis is a generalization of vector analysis, but you must not forget that a tensor is an object which is independent of the choice of coordinates. The number of components of a vector differs from that of a higher-rank tensor. A vector is a tensor of rank 1, so in a three-dimensional space it has 3 components, while a rank-2 tensor in a three-dimensional space has 9 components. Two tensors can live in a space of the same dimension yet have different numbers of components.
N.B. It is the transformation law that is the essence of tensor analysis.
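A tiny illustration of that component count (the names and numbers here are just placeholders): a rank-$k$ tensor on an $n$-dimensional space has $n^k$ components.

```python
import numpy as np

n = 3                            # dimension of the space
vector = np.zeros(n)             # rank 1: n components
rank2 = np.zeros((n, n))         # rank 2: n**2 components

assert vector.size == 3          # 3 components in 3 dimensions
assert rank2.size == 9           # 9 components in 3 dimensions
assert rank2.size == n**2        # general pattern: n**k for rank k
```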

For me, a tensor is a measure of a physical entity that remains invariant under coordinate transformations (the physical entity exists independently of any coordinate system); its components in a given coordinate system (basis) must therefore obey certain transformation laws under a change of coordinates so that the tensor itself remains invariant. As de_brook said, a vector is simply a tensor of rank 1, and a scalar is a tensor of rank 0. The stress in a continuous medium, for example, is described at each point by a tensor of rank 2.
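A minimal sketch of that stress example, with made-up numbers: the stress tensor sends a unit normal $n$ to the traction vector $t = \sigma n$. Under a rotation $Q$ of the axes, its components transform as $\sigma' = Q \sigma Q^T$, and the transformed components reproduce the same physical traction, just expressed in the new frame:

```python
import numpy as np

# A rotation of the coordinate axes by an arbitrary angle about z.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# Made-up symmetric stress components and a unit normal.
sigma = np.array([[10.0, 2.0, 0.0],
                  [ 2.0, 5.0, 1.0],
                  [ 0.0, 1.0, 3.0]])
n = np.array([0.0, 0.0, 1.0])

t = sigma @ n                      # traction in the old frame
sigma_new = Q @ sigma @ Q.T        # tensor transformation law
t_new = sigma_new @ (Q @ n)        # traction computed in the new frame

# Same physical vector, new components:
assert np.allclose(t_new, Q @ t)
```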

What determines whether a given set of quantities forms the components of a tensor is precisely whether or not they obey the transformation laws.