What is a Tensor? Explanations & Understanding

  • Thread starter: VivaLaFisica
  • Tags: Tensor
In summary, a tensor can be thought of as an n-dimensional grid of numbers with operations such as addition and multiplication that work similarly to matrix algebra. The crucial point, however, is that tensors have an existence independent of any given coordinate system, and their components change in a "linear homogeneous" way when the coordinate system changes.
  • #1
VivaLaFisica
I'm a bit confused not just on what exactly a tensor is but also how I should be thinking about a tensor. I figure it's just a vector's bigger brother, but I'm just trying to wrap my head around it.

Any explanations are appreciated.

Thanks in advance.
 
  • #2
VivaLaFisica said:
I figure it's just a vector's bigger brother

A vector's bigger brother is actually a matrix. A tensor is like the bad uncle that used to bully you and try to make you think that you were stupid because you couldn't tell the difference between co-variance and contra-variance.
 
  • #3
DiracPool said:
A vector's bigger brother is actually a matrix. A tensor is like the bad uncle that used to bully you and try to make you think that you were stupid because you couldn't tell the difference between co-variance and contra-variance.

I'll try harder, uncle Paul -- I promise!

But really, this tensor business is boggling me.
 
  • #4
If your teacher has told you something like "it's something that transforms under rotations according to the tensor transformation law", it's understandable that you're confused, because what does that statement even mean? It's been almost 20 years since someone tried to explain tensors to me that way, and it still really irritates me when I think about it.

I will explain one way to interpret that statement for the case of "covariant vectors" and "contravariant vectors". (I really hate those terms).

Consider a function T that takes coordinate systems to ordered triples of real numbers. We can choose to denote the members of the triple corresponding to a coordinate system S by ##T(S)^i##, or by ##T(S)_i##. If S' is another coordinate system that can be obtained from S by a rotation, then it might be useful to have formulas for ##T(S')^i## in terms of the ##T(S)^i##, and for ##T(S')_i## in terms of the ##T(S)_i##. Those formulas can be thought of as describing how ##(T(S)^1, T(S)^2, T(S)^3)## and ##(T(S)_1, T(S)_2, T(S)_3)## "transform" when the coordinate system is changed from S to S'. It's possible that neither of those formulas looks anything like the tensor transformation law, but it's also possible that one of them looks exactly like the tensor transformation law.

If that's the case with the formula for ##T(S')^i##, then some people call the triple ##(T(S)^1, T(S)^2, T(S)^3)## a "contravariant vector". And if it's the case with the other formula, then some people call the triple ##(T(S)_1, T(S)_2, T(S)_3)## a "covariant vector". I think that terminology is absolutely horrendous. The information about how the triple will "transform" isn't present in the triple itself. It's in the function T and the choice about where to put the indices. So if anything here should be called a "contravariant vector", it should be something like the pair (T, "upstairs"). I also don't like that this defines "vector" in a way that has nothing to do with the standard definition in mathematics: a vector is a member of a vector space.
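For reference, this is what the tensor transformation law looks like in this simple case, when S' is obtained from S by a rotation matrix R (with the sums written out explicitly):
[tex]T(S')^i=\sum_j R^i{}_j\,T(S)^j \quad\text{(contravariant)},\qquad T(S')_i=\sum_j \left(R^{-1}\right)^j{}_i\,T(S)_j \quad\text{(covariant)}.[/tex]
For rotations ##R^{-1}=R^T##, so the two patterns give numerically identical components, which is part of why the distinction is easy to miss as long as only rotations are involved.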

If you understand this, then it's not too hard to understand how this generalizes to other tensors. It's just hard to explain it.

I consider this entire approach to tensors obsolete. A much better approach is to define tensors as multilinear functions that take tuples of tangent vectors and cotangent vectors to real numbers. This post and the ones linked to at the end explain some of the basics. (If you read it, skip the first two paragraphs. You can also ignore everything that mentions the metric if you want to. What you need is an understanding of the terms "manifold", "tangent space", "cotangent space" and "dual basis").

Fredrik said:
Let V be a finite-dimensional vector space, and V* its dual space. A tensor of type (n,m) is a multilinear map [tex]T:\underbrace{V^*\times\cdots\times V^*}_{\text{$n$ factors}}\times\underbrace{V\times\cdots\times V}_{\text{$m$ factors}}\rightarrow\mathbb R.[/tex] The components of the tensor in a basis [itex]\{e_i\}[/itex] for V are the numbers [tex]T^{i_1\dots i_n}{}_{j_1\dots j_m}=T(e^{i_1},\dots,e^{i_n},e_{j_1},\dots,e_{j_m}),[/tex] where the [itex]e^i[/itex] are members of the dual basis of [itex]\{e_i\}[/itex]. The multilinearity of T ensures that the components will change in a certain way when you change the basis. The rule that describes that change is called "the tensor transformation law".
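To see where that transformation law comes from, consider the simplest case, a type (1,0) tensor ##T:V^*\rightarrow\mathbb R##, and a change of basis ##e'_j=\sum_i M^i{}_j e_i## with M invertible. The dual basis then changes with the inverse matrix, ##e'^i=\sum_k (M^{-1})^i{}_k e^k##, so linearity of T gives
[tex]T'^i=T(e'^i)=\sum_k (M^{-1})^i{}_k\,T(e^k)=\sum_k (M^{-1})^i{}_k\,T^k.[/tex]
Applying the same argument to each slot of a type (n,m) tensor produces the general transformation law.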
 
  • #5
My explanation is that a tensor is an n-dimensional grid of numbers. Since a scalar is a 0d grid, a vector is a 1d grid, and a matrix is a 2d grid, we can say that a tensor is any of these, plus 3d grids, 4d grids, etc.

But it isn't just a grid of numbers; it also has a few operations on it, such as addition and multiplication, which work like matrix algebra in the 2d case, real number algebra in the 0d case, etc.
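A rough sketch of this "grid of numbers" picture in NumPy (the array names here are just for illustration):
[code]
import numpy as np

scalar = np.array(3.0)              # 0d grid
vector = np.array([1.0, 2.0, 3.0])  # 1d grid
matrix = np.eye(3)                  # 2d grid
grid3d = np.zeros((3, 3, 3))        # 3d grid

# Addition is entrywise for grids of the same shape.
print(vector + vector)              # [2. 4. 6.]

# In the 2d case, multiplication is ordinary matrix algebra.
print(matrix @ matrix)              # the 3x3 identity again

# Higher-dimensional grids can be multiplied by contracting shared axes.
print(np.tensordot(grid3d, vector, axes=([2], [0])).shape)  # (3, 3)
[/code]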
 
  • #6
Also, you might want to consider the analog of the Cartesian product of vector spaces (or linear spaces) to get a "multi"-linear object (hence the term multilinear algebra, which is another name for the study of tensors).
 
  • #7
DiracPool said:
A vector's bigger brother is actually a matrix. A tensor is like the bad uncle that used to bully you and try to make you think that you were stupid because you couldn't tell the difference between co-variance and contra-variance.

TGlad said:
My explanation is that a tensor is an n-dimensional grid of numbers. Since a scalar is a 0d grid, a vector is a 1d grid, and a matrix is a 2d grid, we can say that a tensor is any of these, plus 3d grids, 4d grids, etc.

But it isn't just a grid of numbers; it also has a few operations on it, such as addition and multiplication, which work like matrix algebra in the 2d case, real number algebra in the 0d case, etc.
These are both somewhat misleading. In a given coordinate system, we can represent a vector as a "list" of numbers and we can represent a tensor by a rectangular grid of numbers or a matrix. But the crucial point about both vectors and tensors (technically, a vector is a tensor) is that they have an existence independent of any given coordinate system, and the change as we go from one coordinate system to another is "linear homogeneous". One reason that is important is that if a tensor happens to be all 0s in one coordinate system, it is all 0s in every coordinate system. So if we have an equation, say A = B where A and B are tensors, in one coordinate system, we can write it as A - B = 0, and then A - B = 0, i.e. A = B, in any coordinate system: tensor equations are independent of the coordinate system.
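Spelled out: "linear homogeneous" means the new components are sums of multiples of the old ones with no constant term, e.g.
[tex]T'^i=\sum_j \Lambda^i{}_j\,T^j,[/tex]
so if every ##T^j## is 0, then every ##T'^i## is 0 as well.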
 
  • #8
TGlad said:
My explanation is that a tensor is an n-dimensional grid of numbers. Since a scalar is a 0d grid, a vector is a 1d grid, and a matrix is a 2d grid, we can say that a tensor is any of these, plus 3d grids, 4d grids, etc.

But it isn't just a grid of numbers; it also has a few operations on it, such as addition and multiplication, which work like matrix algebra in the 2d case, real number algebra in the 0d case, etc.
You left out the most important thing, which is that one of these "grids" isn't enough. There must be one for each coordinate system. And the relationship between the "grids" associated with two different coordinate systems must be given by the tensor transformation law.
 
  • #9
But the crucial point of both vectors and tensors is that they have an existence independent of any given coordinate system.
So is calling a real number just "a number" misleading, since reals have an existence independent of the coordinate system (which in 1d is just a scale)?

There must be one for each coordinate system. And the relationship between the "grids" associated with two different coordinate systems must be given by the tensor transformation law.
I did mention that tensors come with a set of operators that have specific rules, which are basically like real, vector, and matrix algebra in the 0d, 1d, and 2d cases. I didn't suggest that they are just data structures.
Are you suggesting that you need something more for tensors that you don't need for, say, just matrices? Matrices can be transformed between coordinate systems.
 
Last edited:
  • #10
TGlad said:
Are you suggesting that you need something more for tensors that you don't need for, say, just matrices?
Yes. A matrix is not a tensor. You can however define a tensor by specifying an n×n matrix, a coordinate system, and the tensor type (i.e. whether you want the tensor to be a map from V×V into ℝ, from V*×V into ℝ, from V×V* into ℝ, or from V*×V* into ℝ), because there's exactly one tensor of each type whose components in the chosen coordinate system are equal to the entries of the matrix.

TGlad said:
Matrices can be transformed between coordinate systems.
Right, ##A'=RAR^{-1}##. You can define a type (1,1) tensor by saying that we associate each rotation matrix with a coordinate system, and that for each rotation matrix R, we associate the matrix ##RAR^{-1}## with the coordinate system associated with R. But you can also define a type (2,0) tensor by instead associating the matrix ##RAR^T## with the coordinate system associated with R.
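Written out in components, those two assignments correspond to the transformation laws
[tex]A'^i{}_j=\sum_{k,l}R^i{}_k\,(R^{-1})^l{}_j\,A^k{}_l \qquad\text{and}\qquad A'^{ij}=\sum_{k,l}R^i{}_k\,R^j{}_l\,A^{kl}.[/tex]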
 
Last edited:

What is a Tensor?

A tensor is a mathematical object that describes multilinear relationships between vectors and scalars. In a given coordinate system it can be represented as a multidimensional array of numbers or functions, and it is used to model a wide range of physical phenomena and complex systems in fields such as physics, engineering, and computer science.

What are the different types of Tensors?

Tensors are usually classified by rank (or order). A scalar is a rank-0 tensor with a single value, a vector is a rank-1 tensor with both magnitude and direction, and a rank-2 tensor can be represented by a matrix with multiple rows and columns; higher-rank tensors have three or more indices.

How are Tensors used in Machine Learning?

In machine learning, tensors are used to represent data and perform computations on that data. They are the primary data structure used in neural networks, a type of machine learning algorithm that can learn and make predictions from the data it is given.
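As a minimal sketch (using NumPy as a stand-in for an ML framework; the sizes and names are made up), a batch of color images is typically stored as a single 4d tensor and fed through layers by contracting axes:
[code]
import numpy as np

# Hypothetical batch of 32 RGB images, each 28x28 pixels:
# axes are (batch, height, width, color channel).
images = np.random.rand(32, 28, 28, 3)
print(images.ndim, images.shape)        # 4 (32, 28, 28, 3)

# A fully connected layer is a tensor contraction: flatten each
# image and multiply by a (hypothetical) weight matrix.
flat = images.reshape(32, -1)           # shape (32, 2352)
weights = np.random.rand(2352, 10)      # 10 output classes, for example
print((flat @ weights).shape)           # (32, 10)
[/code]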

What is the difference between Tensors and Matrices?

Tensors are a generalization of matrices, which are limited to two dimensions. Tensors can have any number of dimensions, making them more versatile and applicable to a wider range of problems. Additionally, while a matrix holds a flat grid of numbers, higher-dimensional tensors are commonly used to hold numerical representations of richer data, such as images or sequences of text.

Why are Tensors important in Physics?

Tensors are essential in physics because they provide a way to express and understand the complex relationships between physical quantities. They are used to describe different physical properties, such as forces, velocities, and energy, and are crucial in formulating and solving equations in fields such as relativity and quantum mechanics.
