What are the differences between matrices and tensors?

In summary, a tensor is a geometric entity with more structure than a matrix: a matrix only records a tensor's components with respect to a chosen basis. As discussed below, building a tensor product of spaces also requires that the spaces share nothing but the zero vector.
  • #1
Sorcerer
I have not really finished studying linear algebra, I have to admit. The furthest I have gotten is manipulating matrices a little bit (although I have used this in differential equations to calculate a Wronskian to see if two equations are linearly independent, but again, a determinant is pretty basic). But I tried looking at tensors, and I am having a hard time distinguishing between a matrix and a tensor. What are the differences other than the Einstein summation convention? I know I just have to hit the grindstone and finish learning the basics of linear algebra, but hopefully someone can enlighten me a bit about the differences.
 
  • #3
The components of a 2-index tensor with respect to a basis can be described by a matrix. But a tensor is a geometric object, independent of any chosen basis. A matrix is just a way to describe it with respect to some basis.

Likewise, a vector is not just a list of numbers, e.g. (1, 2, 3). That list, at most, describes the components of a vector with respect to some basis.

So the lesson is that a 2-index tensor has more structure than the matrix one uses to describe its components.

For tensors with more indices, you cannot use matrices at all to depict their components.
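As a minimal sketch of this point (using NumPy; the spaces, bases, and helper function here are my own illustration, not from the thread): the same 2-index tensor, here the ordinary dot product, gets different matrices of components in different bases, and a 3-index tensor needs a 3-dimensional array rather than a matrix.

```python
import numpy as np

# The dot product g(u, v) is a 2-index tensor. Its components with
# respect to a basis {b_i} are the numbers g_ij = g(b_i, b_j).
def components(basis):
    return np.array([[bi @ bj for bj in basis] for bi in basis])

standard = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
skewed   = [np.array([1.0, 0.0]), np.array([1.0, 1.0])]

print(components(standard))   # the identity matrix
print(components(skewed))     # a different matrix; same tensor

# A 3-index tensor's components fill a 2x2x2 array, not a matrix:
T = np.zeros((2, 2, 2))
```

The tensor itself never changed; only the table of numbers describing it did.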
 
  • #4
In addition to what has been said above: before you worry about tensors, you need to fully grasp

a) the difference between a vector and the representation of a vector as three numbers in a given basis

b) the difference between a linear operator and the representation of a linear operator as a matrix in a particular basis

For physics, you also need to grasp the difference between these as physical things and mathematical objects.
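Point (b) above can be sketched in a few lines (a NumPy illustration of my own, not from the post): one linear operator, here a 90-degree rotation of the plane, has different matrices in different bases, while basis-independent data such as the determinant and trace agree.

```python
import numpy as np

# Matrix of the rotation operator in the standard basis.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# Columns of P are the new basis vectors written in the old basis.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Matrix of the SAME operator in the new basis (similarity transform).
A_new = np.linalg.inv(P) @ A @ P

print(A_new)  # a different matrix, yet the same operator:
assert np.isclose(np.linalg.det(A_new), np.linalg.det(A))
assert np.isclose(np.trace(A_new), np.trace(A))
```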
 
  • #5
If you define a tensor and represent it in two different bases by its components, then the two representations are related by the tensor transformation rules. That is what makes it a physical or geometric entity rather than just a mathematical one.

A mathematical entity that is not a tensor: define the ordered pair (1, 2), regardless of what it means in any basis, so that it corresponds to (1, 2) no matter which basis is in use. That entity does not transform correctly to be a tensor; in fact, it does not transform at all. People often define the standard basis pairs (1, 0) and (0, 1) in any basis. That concept is not intrinsically a tensor and has no fixed physical or geometric meaning, but it can be used to represent an associated vector (a 1-index tensor) that does follow the tensor transformation rules.
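The contrast can be made concrete (a NumPy sketch of my own, with an arbitrarily chosen change-of-basis matrix): a vector's components obey the transformation rule, while the fixed pair (1, 2) simply stays (1, 2) in every basis.

```python
import numpy as np

# Columns of B are the new basis vectors written in the old basis.
B = np.array([[2.0, 0.0],
              [0.0, 1.0]])

v_old = np.array([1.0, 2.0])       # components of a vector, old basis
v_new = np.linalg.inv(B) @ v_old   # components of the SAME vector, new basis

print(v_new)  # the components changed, as a tensor's must

# The "entity" defined to be (1, 2) in every basis never transforms,
# so it does not describe a single geometric vector.
always_12 = np.array([1.0, 2.0])
```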
 
  • #6
Hey Sorcerer,

Building on what haushofer mentioned about being independent of basis: a tensor is constructed from spaces that are "dual" to one another [i.e. independent], so that they share nothing in common.

The "overlap" part of a tensor is literally nothing [i.e. the zero vector in the spaces] and it is like a Cartesian product of two sets except that you need to preserve the geometry of the space [which means you need to preserve the axioms of the vector space including a metric and inner product space if necessary] and synthesize a new space from the building blocks of the original ones.

When the spaces do have some overlap, you first need to make them "dual" by removing what is common to both, and then take the tensor product.

It's a bit like P(A or B) = P(A) + P(B) - P(A and B) in probability: in this analogy, the overlap term is P(A and B), and provided it is zero, you get a similar result with tensors, where the tensor product yields the new space.

If the overlap is non-zero [i.e. non-zero vectors exist in both spaces], then you have to "transform" one space to remove the overlap before taking the tensor product.

The reason tensors are used is that the theory lets you build more complicated spaces, and the mathematical results are guaranteed to hold when the spaces are independent of one another [in terms of the information they contain]. This means physicists, engineers, and applied mathematicians can just use these results, knowing they work for independent spaces with geometry, much as they use results from calculus or linear algebra.
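The "synthesize a new space from building blocks" idea can be sketched concretely (my own NumPy illustration with made-up vectors, not part of the post): on components, the tensor product of a 2-dimensional and a 3-dimensional vector is a 2x3 array, and the construction is bilinear, one of the structural properties being preserved.

```python
import numpy as np

# Components of u (dim 2) tensored with v (dim 3) form a 2x3 array,
# living in a 2*3 = 6-dimensional product space.
u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0, 5.0])

T = np.outer(u, v)   # components of the simple tensor u (x) v
print(T.shape)       # (2, 3)

# Bilinearity, one of the axioms the construction preserves:
assert np.allclose(np.outer(2 * u, v), 2 * T)
assert np.allclose(np.outer(u, v + v), T + T)
```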
 

1. What is the difference between tensors and linear algebra?

While both tensors and linear algebra deal with multi-dimensional objects, the main difference is that tensors are more general and can represent higher-order objects (with any number of indices), while elementary linear algebra is limited to vectors and matrices (one or two indices). Tensors also have additional properties and operations that make them useful for specific applications in physics and engineering.

2. How are tensors and linear algebra related?

Linear algebra is a fundamental tool used to study tensors, as tensors are represented as multi-dimensional arrays or matrices. Tensors also follow many of the same principles and operations as linear algebra, such as addition, multiplication, and inversion. However, tensors have additional properties and operations that make them more complex than linear algebra.

3. What are some applications of tensors in real life?

Tensors have a wide range of applications in physics, engineering, and computer science. They are used in fields such as mechanics, electromagnetism, fluid dynamics, and computer vision. Some specific examples include using tensors to represent stress and strain in materials, describing the deformation of objects, and analyzing images and videos.

4. Is it necessary to have a strong background in linear algebra to understand tensors?

While a basic understanding of linear algebra is helpful, it is not necessary to have a strong background in order to understand tensors. Tensors have their own set of rules and properties that can be learned independently. However, having a strong foundation in linear algebra can make it easier to understand and apply tensor operations.

5. Are there any software tools available for working with tensors?

Yes, there are many software tools available for working with tensors, such as MATLAB, Python libraries like TensorFlow and PyTorch, and Mathematica. These tools provide efficient and easy-to-use methods for performing tensor operations on arrays of any order.
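As a small taste of such tools (shown here with NumPy's `einsum`; TensorFlow and PyTorch expose very similar `einsum` functions), index contractions can be written directly in Einstein-summation notation, the very convention mentioned in the original question:

```python
import numpy as np

A = np.arange(6.0).reshape(2, 3)
B = np.arange(12.0).reshape(3, 4)

# Contract over the repeated index j: the matrix product A_ij B_jk.
C = np.einsum('ij,jk->ik', A, B)
assert np.allclose(C, A @ B)

# Contract i with i: the trace of the identity matrix.
t = np.einsum('ii->', np.eye(3))
print(t)
```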
