Difference between Tensors and matrices

In summary, tensors and matrices may look similar and share some properties, but they are fundamentally different mathematical objects. Unlike matrices, tensors have specific transformation properties under changes of coordinates and can represent a wider range of mathematical concepts. While matrices can represent tensors in certain cases, they do not carry the same structure and are more limited in their applications. Which tutorial on tensors is appropriate depends on the intended use, such as general relativity, quantum mechanics, engineering, or information theory.
Superposed_Cat
They look a lot like matrices and seem to work exactly like matrices. What is the difference between them? I have only worked with matrices, not tensors, because I can't find a tutorial online, but every time I have seen a tensor it looks identical to a matrix.

Rank 2 tensors can be represented by square matrices, but this does not make a tensor a matrix or vice versa. Tensors have very specific transformation properties when changing coordinates (in the case of Cartesian tensors, rotations).

However, not all tensors are rank 2, and those that are not cannot be represented as a matrix (you would need an array with more than two indices). Also, not all matrices represent tensors. There are non-square matrices, and there are matrices that do not transform in the proper way to represent a tensor (a matrix is a priori only a rectangular array of numbers). For many applications you will only encounter tensors of rank 2 or lower, and then representation by matrices is very convenient.
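The rank distinction above can be illustrated with NumPy arrays (a sketch: the Levi-Civita symbol is chosen here just as a familiar three-index object; the thread itself names no specific example):

```python
import numpy as np

# Components of a rank-2 tensor fit in an ordinary 2-D array (a matrix)...
rank2 = np.zeros((3, 3))          # e.g. a stress tensor in 3-D space

# ...but a rank-3 tensor needs a 3-D array of components, which is not a matrix.
# Example: the Levi-Civita (permutation) symbol eps[i, j, k].
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0   # even permutations of (0, 1, 2)
    eps[i, k, j] = -1.0  # swapping the last two indices makes them odd

print(rank2.ndim)  # 2 -> two indices, representable as a matrix
print(eps.ndim)    # 3 -> three indices, needs a higher-dimensional array
```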

FactChecker
A matrix is a mathematical object that does not have to transform, when the coordinates change, the way a physical entity would. A tensor is an object that must transform to new coordinates the way a physical entity would.
Example: The identity matrix is a diagonal matrix of 1's. Whether the coordinate system is in feet or inches, the diagonal entries are still 1's. So the identity matrix is a mathematical object that does not transform correctly (from coordinates of feet to coordinates of inches) to represent a physical entity. For the same matrix to represent a tensor, it would have to be defined so that its diagonal 1's in coordinates of feet transform to diagonal elements of either 12 or 1/12 in coordinates of inches (there are covariant and contravariant tensors).
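The covariant/contravariant distinction mentioned here can be demonstrated for vector components under the feet-to-inches change of coordinates (a sketch, not part of the original post; the specific component values are made up for illustration):

```python
import numpy as np

# Coordinate change: x' (inches) = 12 * x (feet), so dx'/dx = 12.
jac = 12.0

v_feet = np.array([2.0, 0.5, 1.0])   # contravariant components, in feet
w_feet = np.array([3.0, 1.0, 4.0])   # covariant components, in feet

v_inch = jac * v_feet                # contravariant: scale by dx'/dx = 12
w_inch = w_feet / jac                # covariant: scale by dx/dx' = 1/12

# The pairing w_a v^a is coordinate-independent, as it should be:
print(np.isclose(w_inch @ v_inch, w_feet @ v_feet))  # True
```

Each additional index of a tensor contributes another factor of 12 or 1/12, which is what distinguishes a tensor's components from a bare array of numbers.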

Chris LaFave
FactChecker said:
Example: The identity matrix is a diagonal matrix of 1's. Whether the coordinate system is in feet or inches, the diagonal entries are still 1's. So the identity matrix is a mathematical object that does not transform correctly (from coordinates of feet to coordinates of inches) to represent a physical entity. For the same matrix to represent a tensor, it would have to be defined so that its diagonal 1's in coordinates of feet transform to diagonal elements of either 12 or 1/12 in coordinates of inches (there are covariant and contravariant tensors).

While I agree that transformation properties of tensors are important, I think the unit matrix is not a very illuminating (and somewhat misleading) example. In particular, consider the (1,1)-tensor ##\delta^\alpha_\beta## such that ##\delta^\alpha_\beta V^\beta = V^\alpha##, where ##V## is a vector. This tensor will be represented by the unit matrix in all frames (the unit matrix is a transformation from the vector space of column matrices to itself and therefore naturally represents a (1,1)-tensor; you can fiddle around to make a square matrix represent an arbitrary rank-2 tensor, but I would say it is slightly less natural). The tensor transformation properties follow trivially from the chain rule.
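The frame-independence of the ##\delta^\alpha_\beta## tensor can be checked numerically. A (1,1)-tensor with component matrix ##A## transforms as ##A' = J^{-1} A J## under a change of basis with matrix ##J##; the sketch below (the random matrices are illustrative, not from the thread) shows the identity is unchanged while a generic matrix is not:

```python
import numpy as np

rng = np.random.default_rng(0)
J = rng.normal(size=(3, 3))          # an arbitrary (almost surely invertible) change of basis
Jinv = np.linalg.inv(J)

identity = np.eye(3)                 # components of the (1,1)-tensor delta
generic = rng.normal(size=(3, 3))    # components of some other (1,1)-tensor

identity_new = Jinv @ identity @ J   # stays the identity in every frame
generic_new = Jinv @ generic @ J     # components change from frame to frame

print(np.allclose(identity_new, identity))  # True
print(np.allclose(generic_new, generic))    # False (generically)
```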

Orodruin said:
While I agree that transformation properties of tensors are important, I think the unit matrix is not a very illuminating (and somewhat misleading) example. In particular, consider the (1,1)-tensor ##\delta^\alpha_\beta## such that ##\delta^\alpha_\beta V^\beta = V^\alpha##, where ##V## is a vector. This tensor will be represented by the unit matrix in all frames (the unit matrix is a transformation from the vector space of column matrices to itself and therefore naturally represents a (1,1)-tensor; you can fiddle around to make a square matrix represent an arbitrary rank-2 tensor, but I would say it is slightly less natural). The tensor transformation properties follow trivially from the chain rule.
Ok. I retract my statement and will stay out of this discussion.

Usually tensors are associated with a linear vector space ##V## and its dual space ##V^*##. A tensor of rank ##(p,q)## is then a multilinear function from ##p## copies of ##V## and ##q## copies of ##V^*## to some scalar field (usually ##\mathbb{R}## or ##\mathbb{C}##). In this sense, a tensor is an element of ##V^{*p}\otimes V^{**q}##, where ##V^{**}## is the space of all linear functionals on ##V^{*}##.

When ##V## is finite dimensional, ##V^{**}=V##, and a rank ##(p,q)## tensor is in ##V^{*p}\otimes V^{q}##. A linear transformation from ##V## to itself can be represented by an element ##\omega \in V\otimes V^*##. If we pick bases ##\epsilon_j## for ##V## and ##\epsilon_k^*## for ##V^*## (with ##\epsilon_k^*(\epsilon_j)=\delta_{kj}##), then we can expand ##\omega## as ##\omega = \sum_{j,k} \omega_{jk}\epsilon_j\otimes \epsilon_k^*##, and the components ##\omega_{jk}## can be interpreted as elements of a matrix.
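The component extraction ##\omega_{jk}## can be made concrete: ##\omega(\epsilon_k) = \sum_j \omega_{jk}\epsilon_j##, so applying ##\omega## to each basis vector and reading off coordinates recovers the matrix. A minimal sketch (the particular linear map is an invented example, not from the post):

```python
import numpy as np

def omega(v):
    # a concrete linear map R^3 -> R^3, standing in for the abstract omega
    return np.array([2 * v[0] + v[2], v[1], -v[0]])

# With the standard basis eps_j of R^3, the dual functional eps*_k just
# reads off the k-th coordinate, so the j-th column of the component
# matrix is omega applied to the j-th basis vector.
basis = np.eye(3)
w = np.column_stack([omega(e) for e in basis])

# Sanity check: the component matrix reproduces the map on any vector.
v = np.array([1.0, 2.0, 3.0])
print(np.allclose(w @ v, omega(v)))  # True
```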

Matrices and tensors carry different kinds of structure. While matrices can be used to represent tensors in a wide range of settings, matrix multiplication (say, between square matrices) is only meaningful in the tensor context when the tensors live in ##V\otimes V^{*}## for some vector space ##V##, or when there is a linear map between ##V## and ##V^*## (i.e. an inner product or metric) and ##p+q## is even (or if you consider multiplication between special families of tensors). Tensors carry more structure than matrices, and questions about matrices have a very different flavor from questions about tensors.
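In components, multiplying two such ##V\otimes V^*## tensors is the contraction of an upper index with a lower one, which is exactly matrix multiplication. A sketch with `numpy.einsum` (random components, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3))   # components of a (1,1)-tensor in V (x) V*
B = rng.normal(size=(3, 3))   # components of another (1,1)-tensor

# Composing the two maps V -> V contracts the V* slot of A with the
# V slot of B; in index notation, (AB)^i_k = A^i_j B^j_k.
composed = np.einsum('ij,jk->ik', A, B)

print(np.allclose(composed, A @ B))  # contraction == matrix multiplication
```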

Can anyone link me a tutorial on tensors?

What do you want to use tensors for? (e.g. general relativity, quantum mechanics, engineering/materials science, information theory/statistics)

1. What is the difference between tensors and matrices?

Tensors and matrices are both mathematical objects used to represent and manipulate data. However, tensors are more general: their component arrays can have any number of indices, whereas a matrix always has exactly two.

2. How are tensors and matrices used in scientific research?

Tensors and matrices are essential tools in various fields of science, such as physics, engineering, and computer science. They are used to model and analyze complex systems and data, perform calculations, and solve problems.

3. Can tensors and matrices be interchanged in calculations?

Not in general. A rank-2 tensor can be represented by a matrix once a basis is chosen, but tensors carry additional structure (their transformation behavior and arbitrary rank) and require specific operations and techniques for manipulation, so the two cannot simply be used interchangeably.

4. Are there any real-world applications of tensors and matrices?

Yes, tensors and matrices have numerous real-world applications, including image and signal processing, machine learning, and quantum mechanics. They are used to analyze and interpret data, make predictions, and solve optimization problems.

5. What is the significance of tensors and matrices in machine learning?

Tensors and matrices are fundamental concepts in machine learning algorithms. They are used to represent and process data, such as images, text, and audio, and to train models to make predictions and classifications.
