Difference between tensors and matrices

Summary
Tensors and matrices are related but distinct mathematical concepts, with tensors having specific transformation properties that matrices do not possess. While rank 2 tensors can be represented by square matrices, not all tensors are rank 2, and many cannot be represented as matrices at all. Matrices are essentially rectangular arrays of numbers that do not transform under coordinate changes, while tensors must adapt to such transformations. The discussion highlights that tensors are associated with linear vector spaces and their duals, making them more structurally complex than matrices. For practical applications, especially in fields like physics, understanding these differences is crucial for proper representation and manipulation of mathematical entities.
Superposed_Cat
Tensors look a lot like matrices and seem to work exactly like them. What is the difference between them? I have only worked with matrices, not tensors, because I can't find a tutorial online, but every time I have seen a tensor it looks identical to a matrix.
 
Rank 2 tensors can be represented by square matrices, but this does not make a tensor a matrix or vice versa. Tensors have very specific transformation properties when changing coordinates (in the case of Cartesian tensors, rotations).

However, not all tensors are rank 2, and those that are not cannot be represented as a matrix (you would have to use an array with more than two indices). Also, not all matrices are tensors. There are non-square matrices, matrices that do not transform in the proper way (a matrix is a priori only a rectangular array of numbers) to represent a tensor, etc. For many applications you will only encounter tensors of rank 2 or lower, and then representation by matrices is very convenient.
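To make the Cartesian case concrete, here is a minimal numpy sketch (the rotation angle and component values are illustrative assumptions, not taken from the thread): a rank-2 tensor's components transform as ##T'_{ij} = R_{ia}R_{jb}T_{ab}## under a rotation ##R##, while a rank-3 tensor needs a three-index array and picks up a third factor of ##R##.

```python
import numpy as np

# An illustrative 2D rotation matrix (angle chosen arbitrarily).
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Components of a Cartesian rank-2 tensor in the old coordinates.
T2 = np.array([[1.0, 2.0],
               [0.5, 3.0]])

# Rank-2 transformation rule T'_{ij} = R_{ia} R_{jb} T_{ab},
# i.e. T' = R T R^T when the components are arranged as a matrix.
T2_new = np.einsum('ia,jb,ab->ij', R, R, T2)
assert np.allclose(T2_new, R @ T2 @ R.T)

# A rank-3 tensor needs a three-index array; it cannot be stored as a
# single matrix.  Its rule has a third factor of R:
# T'_{ijk} = R_{ia} R_{jb} R_{kc} T_{abc}.
T3 = np.random.default_rng(0).random((2, 2, 2))
T3_new = np.einsum('ia,jb,kc,abc->ijk', R, R, R, T3)
```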
 
A matrix is a mathematical concept that does not have to transform under a change of coordinates the way a physical entity would. A tensor is a concept that must transform to the new coordinates the way a physical entity would.
Example: The identity matrix is a diagonal matrix of 1's. Whether the coordinate system is in feet or in inches, the diagonal entries are still 1's. So the identity matrix is a mathematical concept that does not transform correctly (from coordinates in feet to coordinates in inches) to represent a physical entity. For the same matrix to represent a tensor, it would have to be defined so that its diagonal 1's in coordinates of feet transform to diagonal elements of either 12 or 1/12 in coordinates of inches (there are covariant and contravariant tensors).
 
FactChecker said:
Example: The identity matrix is a diagonal matrix of 1's. Whether the coordinate system is in feet or in inches, the diagonal entries are still 1's. So the identity matrix is a mathematical concept that does not transform correctly (from coordinates in feet to coordinates in inches) to represent a physical entity. For the same matrix to represent a tensor, it would have to be defined so that its diagonal 1's in coordinates of feet transform to diagonal elements of either 12 or 1/12 in coordinates of inches (there are covariant and contravariant tensors).

While I agree that transformation properties of tensors are important, I think the unit matrix is not a very illuminating (and somewhat misleading) example. In particular, consider the (1,1)-tensor ##\delta^\alpha_\beta## such that ##\delta^\alpha_\beta V^\beta = V^\alpha##, where ##V## is a vector. This tensor is represented by the unit matrix in all frames (the unit matrix is a transformation from the vector space of column matrices to itself and therefore naturally represents a (1,1)-tensor; you can fiddle around to make a square matrix represent an arbitrary rank-2 tensor, but I would say that is slightly less natural). The tensor transformation properties follow trivially from the chain rule.
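A small numpy check of this point (the change-of-basis matrix ##A## below is an arbitrary illustrative choice, not from the post): the component matrix of a (1,1)-tensor transforms as ##A T A^{-1}##, so the identity matrix survives every change of basis, whereas a (0,2)-tensor transforms as ##A^{-T} g A^{-1}## and its identity components do not.

```python
import numpy as np

# An arbitrary invertible change-of-basis matrix (illustrative values).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
A_inv = np.linalg.inv(A)

# (1,1)-tensor: components transform as T' = A T A^{-1}.
# The Kronecker delta (unit matrix) is unchanged in every frame.
I = np.eye(2)
assert np.allclose(A @ I @ A_inv, I)

# (0,2)-tensor (e.g. a metric): components transform as
# g' = A^{-T} g A^{-1}, so diagonal 1's do NOT survive in general.
g = np.eye(2)
g_new = A_inv.T @ g @ A_inv
print(g_new)   # no longer the unit matrix
```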
 
Orodruin said:
While I agree that transformation properties of tensors are important, I think the unit matrix is not a very illuminating (and somewhat misleading) example. In particular, consider the (1,1)-tensor ##\delta^\alpha_\beta## such that ##\delta^\alpha_\beta V^\beta = V^\alpha##, where ##V## is a vector. This tensor is represented by the unit matrix in all frames (the unit matrix is a transformation from the vector space of column matrices to itself and therefore naturally represents a (1,1)-tensor; you can fiddle around to make a square matrix represent an arbitrary rank-2 tensor, but I would say that is slightly less natural). The tensor transformation properties follow trivially from the chain rule.
Ok. I retract my statement and will stay out of this discussion.
 
Usually tensors are associated with a linear vector space ##V## and its dual space ##V^*##. A tensor of rank ##(p,q)## is then a multilinear function from ##p## copies of ##V## and ##q## copies of ##V^*## to some scalar field (usually ##\mathbb{R}## or ##\mathbb{C}##). In this sense, a tensor is an element of ##(V^{*})^{\otimes p}\otimes (V^{**})^{\otimes q}##, where ##V^{**}## is the space of all linear functionals on ##V^{*}##.

When ##V## is finite dimensional, ##V^{**}\cong V## naturally, and a rank ##(p,q)## tensor lives in ##(V^{*})^{\otimes p}\otimes V^{\otimes q}##. A linear transformation from ##V## to itself can be represented by an element ##\omega \in V\otimes V^*##. If we pick bases ##\epsilon_j## for ##V## and ##\epsilon_k^*## for ##V^*## (with ##\epsilon_k^*(\epsilon_j)=\delta_{kj}##), then we can expand ##\omega## as ##\omega = \sum_{j,k} \omega_{jk}\,\epsilon_j\otimes \epsilon_k^*##, and the components ##\omega_{jk}## can be interpreted as the entries of a matrix.
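As a sketch of how those components arise in practice (the map ##M## and the basis matrix ##E## below are hypothetical examples, not from the post): with the ##\epsilon_j## as the columns of ##E## and the dual basis as the rows of ##E^{-1}##, the components are ##\omega_{jk}=\epsilon_j^*(\omega(\epsilon_k))##, i.e. ##E^{-1}ME## in matrix form.

```python
import numpy as np

# Hypothetical linear map omega: V -> V, given in the standard basis.
M = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# An arbitrary basis eps_j for V (columns of E); the dual basis eps*_k
# is then given by the rows of E^{-1}, so that eps*_k(eps_j) = delta_{kj}.
E = np.array([[1.0, 1.0],
              [0.0, 2.0]])
E_dual = np.linalg.inv(E)          # row k is eps*_k

# Components in the expansion omega = sum_{j,k} w_{jk} eps_j (x) eps*_k:
# w_{jk} = eps*_j(omega(eps_k)), which in matrix form is E^{-1} M E.
w = np.array([[E_dual[j] @ (M @ E[:, k]) for k in range(2)]
              for j in range(2)])
assert np.allclose(w, E_dual @ M @ E)
```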

Matrices have a different kind of structure from tensors. While matrices can be used to represent tensors in a wide range of settings, matrix multiplication (say between square matrices) is only meaningful in the tensor context when the tensors have the form ##V\otimes V^{*}## for some vector space ##V##, or when there is a linear map between ##V## and ##V^*## (e.g. one given by an inner product or metric) and ##p+q## is even, or if you consider multiplication between special families of tensors. Tensors have more structure than matrices, but questions about matrices have a very different flavor from questions about tensors.
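To illustrate the first case (a sketch with randomly generated components, not from the post): for two tensors in ##V\otimes V^*##, composing the corresponding linear maps means contracting the ##V^*## slot of one with the ##V## slot of the other, which in components is exactly matrix multiplication.

```python
import numpy as np

rng = np.random.default_rng(1)
S = rng.random((3, 3))   # components of a (1,1)-tensor, an element of V (x) V*
T = rng.random((3, 3))   # components of another (1,1)-tensor

# Contracting the V* index of S with the V index of T composes the two
# linear maps; written in components this is ordinary matrix multiplication.
composed = np.einsum('ij,jk->ik', S, T)
assert np.allclose(composed, S @ T)
```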
 
Can anyone link me a tutorial on tensors?
 
What do you want to use tensors for? (e.g. general relativity, quantum mechanics, engineering/materials science, information theory/statistics)
 
