# Matrices and tensors

could someone please explain the difference or non-difference of matrices and tensors? i come across the two plenty in various fields of physics and am curious. i have a feeling this question has been asked and answered before, but i could not find a previous thread, so pointing me to another post would also be appreciated. thanks.

-wc

lurflurf
Homework Helper
A matrix that satisfies certain transformation rules can be thought of as representing a tensor of rank 2. Tensors can have rank 0, 1, 2, 3, 4, ...
Thus scalars and vectors are tensors of ranks 0 and 1 respectively. Tensors are in one sense more general than matrices, as they are not mere representations and can have any rank. They are also less general in another sense, as they are restricted by transformation laws.
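the "transformation rules" point can be illustrated with a small numerical sketch (pure Python; the matrices A and T are made-up examples, and the (1,1) transformation law T' = A T A^(-1) is one standard case of the tensor transformation rules):

```python
# Sketch: a matrix represents a rank-2 tensor only if its components follow
# a transformation law under a change of basis. For a (1,1) tensor (a linear
# map) the components transform as T' = A T A^(-1), where A is the
# change-of-basis matrix. All names and values here are illustrative.

def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inverse_2x2(X):
    """Invert a 2x2 matrix."""
    det = X[0][0] * X[1][1] - X[0][1] * X[1][0]
    return [[ X[1][1] / det, -X[0][1] / det],
            [-X[1][0] / det,  X[0][0] / det]]

A = [[2.0, 1.0], [1.0, 1.0]]   # change-of-basis matrix
T = [[1.0, 2.0], [3.0, 4.0]]   # components of a (1,1) tensor, old basis

# Components in the new basis:
T_new = matmul(matmul(A, T), inverse_2x2(A))

# The trace is basis-independent -- a quick check that T is behaving like
# a (1,1) tensor rather than a bare array of numbers.
trace_old = T[0][0] + T[1][1]
trace_new = T_new[0][0] + T_new[1][1]
```

the individual entries of T_new differ from those of T, but basis-independent quantities such as the trace (and determinant) come out the same, which is exactly what the transformation law guarantees.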

mathwonk
Homework Helper
2020 Award
the word tensor has two meanings, it is both a verb and a noun. it is a sort of product that can be performed between two vector spaces, and then it is an element of such a product.

when you tensor multiply a vector space V by the dual W* of another vector space W, the result maps naturally to the space of linear transformations from W to V.

when the spaces are finite dimensional, this map is an isomorphism. one also knows that this space of linear transformations is isomorphic, non naturally, to the space of matrices of size dim(V) by dim(W).

i myself am not too up on this rank language, but my impression from reading other people's posts is that when the space V is always the same, say R^n, then one multiplies together exclusively copies of V and V*, and the rank refers to the number of copies of each one.

thus i would have thought such users would have called a tensor with one copy of each, i.e. a matrix, a "rank (1,1) tensor", whereas rank 2 would have meant a product of two copies of V.

of course for people who routinely choose bases, the distinction between V and V* is much less clear, hence one cannot distinguish between rank (1,1) and rank 2.

but that is not my area.

Hurkyl
Staff Emeritus
Gold Member
I'm sure I've heard a "rank (m, n) tensor" also called a "rank m+n tensor" before.

mathwonk
Homework Helper
2020 Award
yes no doubt that explains it.

jcsd
Gold Member
A tensor of rank (q,r) on a vector space of dimension n over a field Q forms a vector space of dimension n^(q+r), so the sets of tensors of rank (1,1), (2,0) and (0,2) all form vector spaces of dimension n^2. Similarly, the set of nxn matrices over a field Q forms a vector space of dimension n^2, so in terms of structure as vector spaces (i.e. where addition, subtraction and scalar multiplication are concerned) there is no difference between a rank 2 tensor and an nxn matrix.

But both tensors and matrices have more properties than just those of the vector spaces they form (that much should be obvious from the fact that we choose to distinguish rank (1,1), rank (2,0), etc. tensors), so a rank 2 tensor is not the same as an nxn matrix (though they do share other properties, which extends the usefulness of representing rank 2 tensors as matrices beyond addition, subtraction and scalar multiplication).

the difference is clear mathematically, but subtle if you are using them in an applied way.

consider the metric tensor for example:

$$G = g_{ij} dx^i \otimes dx^j$$

some people call [itex]g_{ij}[/itex] (the elements of a matrix) the metric tensor, but this is not mathematically correct: they form the components of the metric tensor. this is analogous to calling the components of a vector the vector itself.

the reason the matrix representation works is that since

$$dx^i(\vec{v}) = v^i$$

then

$$G(\vec{u}, \vec{v}) = u^i g_{ij} v^j$$

which is nothing more than

$$\begin{pmatrix}u^1 & u^2 & \cdots\end{pmatrix}\begin{pmatrix}g_{11} & g_{12} & \cdots \\ g_{21} & g_{22} & \cdots \\ \vdots & \vdots & \ddots\end{pmatrix}\begin{pmatrix}v^1 \\ v^2 \\ \vdots\end{pmatrix}$$
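a quick numerical sketch of the contraction above (pure Python; the metric g and the vectors u, v are made-up examples, not anything from the thread):

```python
# Sketch: G(u, v) = u^i g_ij v^j computed both as an explicit double sum
# over the repeated indices and as the row-matrix-column product shown
# above. The two must agree.

g = [[1.0, 0.0],
     [0.0, 2.0]]          # components g_ij of a metric on R^2
u = [1.0, 2.0]            # components u^i
v = [3.0, 4.0]            # components v^j

# Double sum over the repeated indices i and j:
G_uv = sum(u[i] * g[i][j] * v[j] for i in range(2) for j in range(2))

# Same thing as (row vector) * (matrix) * (column vector):
gv = [sum(g[i][j] * v[j] for j in range(2)) for i in range(2)]
G_uv_matrix = sum(u[i] * gv[i] for i in range(2))
```

either way the result is the single number G(u, v), which is the sense in which the matrix of components stands in for the rank 2 tensor.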

mathwonk
Homework Helper
2020 Award
now through the magic of television, mr science will demonstrate how to use the dot product to change a 2 tensor into a (1,1) tensor! look children, [itex]v \otimes w[/itex] appears to be a 2 tensor, but behold: if t is a vector, then we can wave our dot product wand and say, presto, you are a (1,1) tensor! and now when [itex]v \otimes w[/itex] sees the vector t,

it pounces and yields the vector [itex]\langle v,t \rangle w[/itex]!

observant boys and girls can no doubt guess how to change it as well into a 2 tensor of the opposite variance, yielding a number when confronting a pair of vectors (s,t)?

mirabile dictu.

the moral is, one recognizes a wolf only after it tries to eat the sheep.

i.e. one often does not know what object [itex]v \otimes w[/itex] is until the great oz says so: i.e. if you say it is a (0,2) tensor then it is, and if you say it is a (1,1) tensor then it is that as well. enjoy the game! (and always be a student of behavior.)
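the trick above can be sketched in components (pure Python; the function names and the sample vectors v, w, s, t are illustrative, not from the thread). with the standard dot product in hand, the same pair of vectors acts as tensors of different types:

```python
# Sketch: with a dot product available, v (tensor) w can act either as a
# (1,1) tensor (eat one vector t, return the vector <v,t> w) or, by
# applying the dot product once more, as a tensor eating a pair (s,t)
# and returning the number <v,t><w,s>.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

v = [1.0, 2.0]
w = [3.0, 5.0]

def as_1_1(t):
    """v (tensor) w as a (1,1) tensor: t -> <v,t> w."""
    return [dot(v, t) * wi for wi in w]

def as_0_2(s, t):
    """v (tensor) w eating a pair: (s,t) -> <s, <v,t> w> = <v,t><w,s>."""
    return dot(s, as_1_1(t))

t = [1.0, 0.0]
s = [0.0, 1.0]
vector_out = as_1_1(t)      # a vector
number_out = as_0_2(s, t)   # a single number
```

the same underlying array of components v_i w_j is doing the work in both cases; only the declared type of the object changes, which is exactly the point.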

quasar987
Homework Helper
Gold Member
But always wear a helmet while playing this game!

mathwonk
some of us apparently forgot to. as lyndon johnson used to say. 