What is a Tensor? Explanations & Understanding

  • Thread starter: VivaLaFisica
  • Tags: Tensor
VivaLaFisica
I'm a bit confused, not just about what exactly a tensor is, but also about how I should be thinking about a tensor. I figure it's just a vector's bigger brother, but I'm just trying to wrap my head around it.

Any explanations are appreciated.

Thanks in advance.
 
VivaLaFisica said:
I figure it's just a vector's bigger brother

A vector's bigger brother is actually a matrix. A tensor is like the bad uncle who used to bully you and try to make you think you were stupid because you couldn't tell the difference between covariance and contravariance.
 
DiracPool said:
A vector's bigger brother is actually a matrix. A tensor is like the bad uncle who used to bully you and try to make you think you were stupid because you couldn't tell the difference between covariance and contravariance.

I'll try harder, uncle Paul -- I promise!

But really, this tensor business is boggling me.
 
If your teacher has told you something like "it's something that transforms under rotations according to the tensor transformation law", it's understandable that you're confused, because what does that statement even mean? It's been almost 20 years since someone tried to explain tensors to me that way, and it still really irritates me when I think about it.

I will explain one way to interpret that statement for the case of "covariant vectors" and "contravariant vectors". (I really hate those terms).

Consider a function T that takes coordinate systems to ordered triples of real numbers. We can choose to denote the members of the triple corresponding to a coordinate system S by ##T(S)_i##, or by ##T(S)^i##. If S' is another coordinate system that can be obtained from S by a rotation, then it might be useful to have formulas for the ##T(S')_i## in terms of the ##T(S)_i##, and for the ##T(S')^i## in terms of the ##T(S)^i##. Those formulas can be thought of as describing how ##(T(S)_1, T(S)_2, T(S)_3)## and ##(T(S)^1, T(S)^2, T(S)^3)## "transform" when the coordinate system is changed from S to S'. It's possible that neither of those formulas looks anything like the tensor transformation law, but it's also possible that one of them looks exactly like the tensor transformation law.

If that's the case with the formula for the ##T(S')^i##, then some people call the triple ##(T(S)^1, T(S)^2, T(S)^3)## a "contravariant vector". And if it's the case with the other formula, then some people call the triple ##(T(S)_1, T(S)_2, T(S)_3)## a "covariant vector". I think that terminology is absolutely horrendous. The information about how the triple will "transform" isn't present in the triple itself. It's in the function T and the choice about where to put the indices. So if anything here should be called a "contravariant vector", it should be something like the pair (T, "upstairs"). I also don't like that this defines "vector" in a way that has nothing to do with the standard definition in mathematics: a vector is a member of a vector space.
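To spell out what that law actually says in this simple case (it isn't stated explicitly above, so take this as the standard convention): if R is the rotation matrix taking S to S', the two candidate formulas are ##T(S')^i=\sum_j R^i{}_j\,T(S)^j## for the upstairs triple, and ##T(S')_i=\sum_j (R^{-1})^j{}_i\,T(S)_j## for the downstairs one. Since a rotation matrix satisfies ##R^{-1}=R^T##, the two laws happen to agree for rotations, which is part of why the distinction is so easy to miss in this setting.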

If you understand this, then it's not too hard to understand how this generalizes to other tensors. It's just hard to explain it.

I consider this entire approach to tensors obsolete. A much better approach is to define tensors as multilinear functions that take tuples of tangent vectors and cotangent vectors to real numbers. This post and the ones linked to at the end explain some of the basics. (If you read it, skip the first two paragraphs. You can also ignore everything that mentions the metric if you want to. What you need is an understanding of the terms "manifold", "tangent space", "cotangent space" and "dual basis").

Fredrik said:
Let V be a finite-dimensional vector space, and V* its dual space. A tensor of type (n,m) is a multilinear map ##T:\underbrace{V^*\times\cdots\times V^*}_{n\text{ factors}}\times\underbrace{V\times\cdots\times V}_{m\text{ factors}}\rightarrow\mathbb R##. The components of the tensor in a basis ##\{e_i\}## for V are the numbers ##T^{i_1\dots i_n}{}_{j_1\dots j_m}=T(e^{i_1},\dots,e^{i_n},e_{j_1},\dots,e_{j_m})##, where the ##e^i## are members of the dual basis of ##\{e_i\}##. The multilinearity of T ensures that the components will change in a certain way when you change the basis. The rule that describes that change is called "the tensor transformation law".
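To see how that law drops out of multilinearity (a standard computation, filled in here for concreteness), take a type (1,1) tensor and a change of basis ##e'_j=\sum_k M^k{}_j e_k##, so that the dual basis changes as ##e'^i=\sum_k (M^{-1})^i{}_k e^k##. Then multilinearity gives ##T'^i{}_j=T(e'^i,e'_j)=\sum_{k,l}(M^{-1})^i{}_k M^l{}_j\,T(e^k,e_l)=\sum_{k,l}(M^{-1})^i{}_k M^l{}_j\,T^k{}_l##, which is exactly the type (1,1) transformation law: one factor of ##M^{-1}## per upstairs index, one factor of ##M## per downstairs index.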
 
My explanation is that a tensor is an n-dimensional grid of numbers. Since a scalar is a 0d grid, a vector is a 1d grid, and a matrix is a 2d grid, we can say that a tensor is any of these, plus 3d grids, 4d grids, etc.

But it isn't just a grid of numbers; it also comes with a few operations, such as addition and multiplication, which work like matrix algebra in the 2d case, real-number algebra in the 0d case, etc.
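As a concrete (if incomplete, per the replies below) illustration of that grid picture, here is a minimal NumPy sketch; the arrays and values are made up for the example:

```python
import numpy as np

# "Grids" of increasing rank: same idea, more indices.
scalar = np.array(4.2)               # 0d grid: a single number
vector = np.array([1.0, 2.0, 3.0])   # 1d grid: 3 numbers
matrix = np.eye(3)                   # 2d grid: 3x3 numbers
rank3 = np.zeros((3, 3, 3))          # 3d grid: 27 numbers

# Addition works rank by rank, like matrix/real algebra:
print(matrix + matrix)

# Contraction (summing over a shared index) generalizes matrix
# multiplication; einsum spells out the index sums explicitly:
print(np.einsum('ij,j->i', matrix, vector))   # same as matrix @ vector
print(np.einsum('ijk,k->ij', rank3, vector))  # a 3d grid "times" a vector
```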
 
Also, you might want to consider the analogue of the Cartesian product for vector spaces (or linear spaces), which is how you build "multi"-linear objects (hence the term multilinear algebra, which is another name for the algebra of tensors).
 
DiracPool said:
A vector's bigger brother is actually a matrix. A tensor is like the bad uncle who used to bully you and try to make you think you were stupid because you couldn't tell the difference between covariance and contravariance.

TGlad said:
My explanation is that a tensor is an n-dimensional grid of numbers. Since a scalar is a 0d grid, a vector is a 1d grid, and a matrix is a 2d grid, we can say that a tensor is any of these, plus 3d grids, 4d grids, etc.

But it isn't just a grid of numbers; it also comes with a few operations, such as addition and multiplication, which work like matrix algebra in the 2d case, real-number algebra in the 0d case, etc.
These are both quite a bit misleading. In a given coordinate system, we can represent a vector as a "list" of numbers, and we can represent a tensor by a rectangular grid of numbers, or matrix. But the crucial point about both vectors and tensors (technically, a vector is a tensor) is that they have an existence independent of any given coordinate system. And the change of components as we go from one coordinate system to another is "linear homogeneous". One reason that is important is that if a tensor's components are all 0 in one coordinate system, they are all 0 in every coordinate system. So if we have an equation, say A = B where A and B are tensors, in one coordinate system, we can write it as A - B = 0 in that system, so that A - B = 0, i.e. A = B, in any coordinate system: tensor equations are independent of the coordinate system.
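Here is a quick numerical sanity check of that coordinate-independence point; the rotation angle and the components are arbitrary choices for the illustration:

```python
import numpy as np

# If A = B in one coordinate system, then A - B = 0 there, and the
# linear homogeneous change of components keeps it 0 in any other.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation taking S to S'

A = np.array([[1.0, 2.0], [3.0, 4.0]])  # components of A in system S
B = A.copy()                            # A = B, so A - B = 0 in S

# Type (1,1) transformation law: A' = R A R^{-1}
A_prime = R @ A @ np.linalg.inv(R)
B_prime = R @ B @ np.linalg.inv(R)

print(np.allclose(A_prime - B_prime, 0.0))  # True: still zero in S'
```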
 
TGlad said:
My explanation is that a tensor is an n-dimensional grid of numbers. Since a scalar is a 0d grid, a vector is a 1d grid, and a matrix is a 2d grid, we can say that a tensor is any of these, plus 3d grids, 4d grids, etc.

But it isn't just a grid of numbers; it also comes with a few operations, such as addition and multiplication, which work like matrix algebra in the 2d case, real-number algebra in the 0d case, etc.
You left out the most important thing, which is that one of these "grids" isn't enough. There must be one for each coordinate system. And the relationship between the "grids" associated with two different coordinate systems must be given by the tensor transformation law.
 
But the crucial point about both vectors and tensors is that they have an existence independent of any given coordinate system.
So is calling a real "a number" misleading too, since reals have an existence independent of the coordinate system (which in 1d is just a choice of scale)?

There must be one for each coordinate system. And the relationship between the "grids" associated with two different coordinate systems must be given by the tensor transformation law
I did mention that tensors come with a set of operators that have specific rules, which are basically like real, vector and matrix algebra in the 0d, 1d and 2d cases. I didn't suggest that they are just data structures.
Are you suggesting that you need something more for tensors than you need for, say, just matrices? Matrices can be transformed between coordinate systems.
 
TGlad said:
Are you suggesting that you need something more for tensors that you don't need for say just matrices?
Yes. A matrix is not a tensor. You can however define a tensor by specifying an n×n matrix, a coordinate system and the tensor type (i.e. whether you want the tensor to be a map from V×V into ℝ, from V*×V into ℝ, from V×V* into ℝ, or from V*×V* into ℝ), because there's exactly one tensor of each type whose components in the chosen coordinate system are equal to the components of the matrix.

TGlad said:
Matrices can be transformed between coordinate systems.
Right, ##A'=RAR^{-1}##. You can define a type (1,1) tensor by saying that we associate each rotation matrix with a coordinate system, and that for each rotation matrix R, we associate the matrix ##RAR^{-1}## with the coordinate system associated with R. But you can also define a type (2,0) tensor by instead associating the matrix ##RAR^T## with the coordinate system associated with R.
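To make the two laws visibly different, here is a short NumPy sketch that steps outside rotations to a general invertible basis change (my own illustration; with a rotation, ##R^T=R^{-1}## and the two grids would coincide numerically, even though the tensors are still maps of different types):

```python
import numpy as np

# The (1,1) and (2,0) transformation laws applied to the same grid A.
M = np.array([[2.0, 1.0],
              [0.0, 1.0]])            # invertible, but M.T != inv(M)
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])            # components in the original system

type_1_1 = M @ A @ np.linalg.inv(M)   # (1,1) law: A' = M A M^{-1}
type_2_0 = M @ A @ M.T                # (2,0) law: A' = M A M^T

print(np.allclose(type_1_1, type_2_0))  # False: different component grids
```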
 