Can Tensors of Any Rank Be Approximated by Matrices?

At its core, a tensor is a mathematical object that describes a relationship between geometric vectors, scalars, and other tensors. It is a generalization of vectors and matrices to higher dimensions and allows for the representation of complex and multidimensional data. Tensors have a rich and diverse set of applications in fields such as physics, engineering, and data analysis. They are an essential tool in understanding and solving complex problems, and their versatility makes them a cornerstone of modern mathematics.
  • #1
trees and plants
Hello. Could we approximate a tensor of rank (p,q) with matrices of its elements? I am also talking about the general case of a tensor, not only special cases. For example, a (2,0) tensor with indices i, j is a matrix of i×j elements. A (3,0) tensor with indices i, j, k is, I think, k matrices of i×j elements each. So is a tensor of rank (p,q) with upper indices i1, i2, ..., ip and lower indices j1, j2, ..., jq a set of (i1×i2×...×ip)×(j1×...×j(q−2)) matrices, each with j(q−1)×jq elements? If I am incorrect, what did I do wrong? Thank you.
 
Last edited by a moderator:
  • Like
Likes Delta2
  • #2
What do you mean by approximating a tensor?

A tensor of rank 0 is a scalar.

A tensor of rank 1 is a vector that could be viewed as an ordered set of scalars.

A tensor of rank 2 is a matrix that could be viewed as an ordered set of column vectors.

A tensor of rank 3 is a tensor (no particular name) that could be viewed as an ordered set of matrices.

However, thinking this way diminishes the full import of tensors. Tensors have subscript and superscript notation representing covariant and contravariant attributes, which are useful in differential geometry among other things.

https://en.wikipedia.org/wiki/Tensor
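The "ordered set" hierarchy above can be sketched in a few lines (a NumPy illustration; NumPy is just a convenient choice, not something the post assumes — its `ndim` matches this naive notion of rank):

```python
import numpy as np

# Rank 0: a scalar
s = np.float64(3.0)

# Rank 1: a vector = an ordered set of scalars
v = np.array([1.0, 2.0, 3.0])

# Rank 2: a matrix = an ordered set of column vectors
M = np.stack([v, 2 * v, 3 * v], axis=1)  # columns are v, 2v, 3v

# Rank 3: an ordered set of matrices
T = np.stack([M, M.T])  # shape (2, 3, 3): two 3x3 matrices

print(s.ndim, v.ndim, M.ndim, T.ndim)  # number of indices at each level
```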
 
  • Like
Likes Mayhem, Delta2 and FactChecker
  • #3
jedishrfu said:
A tensor of rank 3 is a tensor (no particular name) that could be viewed as an ordered set of matrices.
You can consider such a tensor as bilinear multiplication. See Strassen's algorithm for example:
https://www.physicsforums.com/insights/what-is-a-tensor/
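To make the "bilinear multiplication" view concrete, here is a hedged NumPy sketch of the 2×2 matrix-multiplication tensor — the object whose tensor rank Strassen's algorithm reduces from 8 to 7. The construction of `T` below is my own illustration, not taken from the linked article:

```python
import numpy as np

n = 2
# The matrix-multiplication tensor: contracting T against A and B yields A @ B.
# Nonzero entries pick out the products A[i,k] * B[k,p] that sum into C[i,p].
T = np.zeros((n, n, n, n, n, n))
for i in range(n):
    for k in range(n):
        for p in range(n):
            T[i, p, i, k, k, p] = 1.0

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

# Matrix multiplication as a single bilinear tensor contraction:
C = np.einsum('ipjklm,jk,lm->ip', T, A, B)
print(np.allclose(C, A @ B))  # the tensor reproduces A @ B
```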
 
  • Like
Likes jedishrfu
  • #4
jedishrfu said:
A tensor of rank 1 is a vector that could be viewed as an ordered set of scalars.

Tiny quibble... a scalar is, like you point out, a rank-0 tensor, and is by definition coordinate independent. So the components of a vector w.r.t. some basis are not scalars, instead they are just numbers [which of course transform under change of coordinates].
 
  • Like
  • Haha
Likes Stephen Tashi and jedishrfu
  • #5
As I said, it is not necessary to approximate a tensor, but it may be a goal one has. There are other goals too: as I read about differential geometry and the use of tensors there, they are studied differently from what I describe. What I mean by "approximation" is that the tensor may be represented by the number of matrices I described before. Having the components of those matrices would help answer my question in the general case. I think the components and matrices of a tensor of rank (p,q) are so many that we would need more paper or space to write them down. Perhaps my question is wrong.
 
  • #6
etotheipi said:
Tiny quibble... a scalar is, like you point out, a rank-0 tensor, and is by definition coordinate independent. So the components of a vector w.r.t. some basis are not scalars, instead they are just numbers [which of course transform under change of coordinates].
Can they not be functions? Can the vector with respect to a basis not be written as f = (f1, ..., fn), where each fi, i = 1, ..., n, is a function of m variables xj ∈ ℝ, j = 1, ..., m? Is it that they do not obey the transformation law? Is that why?
 
  • #7
universe function said:
Can they not be functions? Can the vector with respect to a basis not be written as f = (f1, ..., fn), where each fi, i = 1, ..., n, is a function of m variables xj ∈ ℝ, j = 1, ..., m? Is it that they do not obey the transformation law? Is that why?
They can and they are. Take a derivative as an example: ##f(x)=3x^2## and ##\left. \dfrac{d}{dx}\right|_{x=5}f(x)=30##. This is a number, a scalar, and a linear function, or even a covariant derivative. The latter two are the usual interpretations in differential geometry and physics. While the number ##30## is the slope of the tangent at ##x=5## at school, it becomes the linear function ##30=D_5(f)\, : \,v\longmapsto 30\cdot v## in differential geometry. This might be a bit confusing, but it is the one-dimensional version of what we call e.g. a curvature tensor. So whether the value ##30## is considered a slope, a number, a scalar, or a linear function depends on whom you ask, that is: on the context.
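The two readings of ##30## in the post above can be sketched numerically (a toy illustration; the names `D5` and `slope` are mine):

```python
# f(x) = 3x^2; its derivative at x = 5 viewed two ways.
def f(x):
    return 3 * x ** 2

slope = 30.0  # "the number": d/dx f evaluated at x = 5

def D5(v):
    # "the linear function": D_5(f) maps a displacement v to 30 * v
    return slope * v

# Central-difference check that 30 really is the slope at x = 5:
h = 1e-6
numeric = (f(5 + h) - f(5 - h)) / (2 * h)
print(round(numeric, 3), D5(2.0))  # -> 30.0 60.0
```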
 
  • Like
Likes jedishrfu
  • #8
universe function said:
They can not be functions? The vector with respect to a basis can not be like f=(f1,...,fn) where fi, ∀ i=1,..,n are functions of m variables where xj are those ∀ j=1,...,m with xj∈ℝ? They do not obey the transformation law? That is why?
If one has vector fields ##X_{i}## in some neighborhood that form a basis in each tangent space, then any vector field in this neighborhood can be expressed as a linear combination of these basis fields. The coefficients of this linear combination will be functions on the neighborhood. So if ##V=∑_{i}a_{i}X_{i}##, then the ##a_{i}## are functions (scalars). If one changes basis, ##V## will be written using different functions. The same holds for any tensor.

If one has a vector field ##V## and a 1-form ##ω##, then the value ##ω(V)## of ##ω## on ##V## is a function (scalar). Given the basis ##X_{i}##, one has an n-tuple of 1-forms ##ω_{i}## that give the ##i##-th coefficient of any vector field in its linear expansion with respect to the basis fields. So for an arbitrary vector field ##V##, ##V=∑_{i}ω_{i}(V)X_{i}##. The ##ω_{i}(V)## are functions for each vector field ##V##.
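At a single point this is plain linear algebra: if the basis vectors are the columns of a matrix, the dual 1-forms ##ω_{i}## are the rows of its inverse (a NumPy sketch with an arbitrary non-orthonormal basis of my choosing):

```python
import numpy as np

# Columns are the basis vectors X_1, X_2 of the tangent space at a point:
X = np.array([[1.0, 1.0],
              [0.0, 2.0]])

# The dual 1-forms omega_i are the rows of X^{-1}: omega_i(X_j) = delta_ij
omega = np.linalg.inv(X)

V = np.array([3.0, 4.0])   # an arbitrary vector
coeffs = omega @ V         # omega_i(V): components of V in the basis
V_rebuilt = X @ coeffs     # V = sum_i omega_i(V) X_i
print(np.allclose(V, V_rebuilt), np.allclose(omega @ X, np.eye(2)))
```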
 
Last edited:
  • #9
jedishrfu said:
What do you mean by approximating a tensor?

A tensor of rank 0 is a scalar.

A tensor of rank 1 is a vector that could be viewed as an ordered set of scalars.

A tensor of rank 2 is a matrix that could be viewed as an ordered set of column vectors.

A tensor of rank 3 is a tensor (no particular name) that could be viewed as an ordered set of matrices.

However, thinking this way diminishes the full import of tensors. Tensors have subscript and superscript notation representing covariant and contravariant attributes, which are useful in differential geometry among other things.

https://en.wikipedia.org/wiki/Tensor

"A tensor is a thing that transforms like a tensor" is definitely my favorite definition.
 
  • Like
Likes PhDeezNutz and jedishrfu
  • #10
Yeah, recursive definitions often leave one in the dust.
 
  • #11
etotheipi said:
So the components of a vector w.r.t. some basis are not scalars, instead they are just numbers [which of course transform under change of coordinates].

Yes, but human endeavors permit contradictions, so an elementary physics text may speak of "resolving a vector into components" and draw the components as vectors. For example, the gravitational force on a mass resting on an inclined plane can be represented by a "component" perpendicular to the surface of the plane and a "component" parallel to it.
 
  • Like
Likes jedishrfu
  • #12
etotheipi said:
Tiny quibble... a scalar is, like you point out, a rank-0 tensor, and is by definition coordinate independent. So the components of a vector w.r.t. some basis are not scalars, instead they are just numbers [which of course transform under change of coordinates].
Actually, the components are scalars. When you transform the tensor to a new coordinate system, the new components are different scalars. Scalars are just functions defined on the manifold. Components of a vector field in a given coordinate system are functions.

For instance, the components of a vector field (contravariant tensor of rank 1) in a given coordinate system form an n-tuple of functions on the coordinate domain. The most primitive examples are the coordinate functions ##x_{i}## themselves, which are the projections of the coordinate chart onto the standard axes in Euclidean space.
 
Last edited:
  • Like
Likes etotheipi
  • #13
I thought that ##\phi## is a scalar if, for any two coordinate systems ##C## and ##C'##, $$\phi'(x') = \phi(x),$$ that is, a scalar is invariant under a change of coordinate system. That definition is not satisfied by the components of a vector with respect to some basis, though.
 
  • #14
etotheipi said:
I thought that ##\phi## is a scalar if, for any two coordinate systems ##C## and ##C'##, $$\phi'(x') = \phi(x),$$ that is, a scalar is invariant under a change of coordinate system. That definition is not satisfied by the components of a vector with respect to some basis, though.
The components in any given coordinate system are scalars. When one changes coordinates one gets a new set of scalars. If one considers all components in all coordinate systems together as a single object then one has the underlying tensor. But that is different.
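A concrete sketch of "different coordinates, different set of component scalars" for one fixed vector (NumPy, with an assumed 45° rotation of the axes):

```python
import numpy as np

v = np.array([1.0, 0.0])   # components of a fixed vector in coordinates C

theta = np.pi / 4          # rotate the axes by 45 degrees to get C'
R = np.array([[np.cos(theta),  np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])  # change-of-basis matrix

v_prime = R @ v            # same geometric object, new component scalars in C'
print(v_prime)             # different numbers than [1, 0]

# A genuine invariant (a true scalar) agrees in both systems: the length
print(np.isclose(np.linalg.norm(v), np.linalg.norm(v_prime)))
```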
 
  • Like
Likes etotheipi
  • #15
I think I see what you mean, thanks.
 
  • #16
etotheipi said:
I think I see what you mean, thanks. I delete!
No need to delete. Your comment leads to clarification of ideas. The index definition of tensors is not easy to master.
 
  • Like
Likes fresh_42
  • #17
universe function said:
Hello. Could we approximate a tensor of rank (p,q) with matrices of its elements? I am also talking about the general case of a tensor, not only special cases. For example, a (2,0) tensor with indices i, j is a matrix of i×j elements. A (3,0) tensor with indices i, j, k is, I think, k matrice
Yes, you can consider a tensor by means of matrices, but that is completely useless by itself for understanding what a tensor is.
 
  • Like
Likes etotheipi and LCSphysicist
  • #18
wrobel said:
Yes, you can consider a tensor by means of matrices, but that is completely useless by itself for understanding what a tensor is.
Could someone answer the following? Why do we use tensors without referring to matrices and vectors and their theories? Could a tensor be generalised to another object, and what would that generalisation allow in mathematics and in physics?
 
  • #19
universe function said:
Could someone answer the following? Why do we use tensors without referring to matrices and vectors and their theories?
Because they are something different. Matrices and vectors are certain tensors, but not vice versa. A tensor can be seen as a multilinear function, in the same sense as a vector can be seen as a linear function and a matrix as a bilinear function.
Could a tensor be generalised to another object ...
What does that mean? A tensor is already a generalization in the sense that it has the so called universal property.
... and what would that generalisation allow in mathematics and in physics?
It allows one to consider multilinear structures, such as Graßmann algebras or Lie algebras, as quotient spaces of tensor algebras.
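The "matrix as a bilinear function" view from the post above can be checked directly (a NumPy sketch with an arbitrary matrix of my choosing):

```python
import numpy as np

# A matrix A defines a bilinear function b(x, y) = x^T A y,
# just as a covector defines a linear function f(x) = v . x.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

def b(x, y):
    return x @ A @ y

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])

# Bilinearity: b is linear in each argument separately
print(np.isclose(b(2 * x + y, y), 2 * b(x, y) + b(y, y)))
```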
 
  • #20
universe function said:
Could someone answer the following? Why do we use tensors without referring to matrices and vectors and their theories? Could a tensor be generalised to another object, and what would that generalisation allow in mathematics and in physics?

Maybe I misunderstand the question but the matrix representations of a tensor I believe just help to do actual calculations given a choice of coordinates. For example, you can construct a homomorphism from the group of rotations in 3D space to the group of 3x3 orthogonal matrices w/ unit determinant, e.g. something like ##R^{\mathbf{n}}_{\theta} \mapsto \rho(R^{\mathbf{n}}_{\theta})## requiring that ##\rho(R^{\mathbf{n}}_{\theta_1}) \rho(R^{\mathbf{n}}_{\theta_2}) = \rho(R^{\mathbf{n}}_{\theta_1 + \theta_2})##, so now you can instead work with the coordinate representations of the vectors in ##V##.

Nonetheless the original abstract tensor ##R^{\mathbf{n}}_{\theta}## is still just a rank 2 tensor which can be viewed naturally as a map from ##V \rightarrow V## (or instead ##V \times V \rightarrow \mathbb{R}##), which has nothing to do with matrices a priori.
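The homomorphism property ##\rho(R^{\mathbf{n}}_{\theta_1}) \rho(R^{\mathbf{n}}_{\theta_2}) = \rho(R^{\mathbf{n}}_{\theta_1 + \theta_2})## can be verified numerically for one fixed axis (NumPy sketch; the z-axis is my arbitrary choice of ##\mathbf{n}##):

```python
import numpy as np

def rho(theta):
    """3x3 orthogonal matrix with unit determinant: rotation by theta about z."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

t1, t2 = 0.3, 1.1
lhs = rho(t1) @ rho(t2)
rhs = rho(t1 + t2)
print(np.allclose(lhs, rhs))                    # rho(t1) rho(t2) == rho(t1 + t2)
print(np.isclose(np.linalg.det(rho(t1)), 1.0))  # unit determinant, as claimed
```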
 
  • #21
etotheipi said:
Nonetheless the original abstract tensor ##R^{\mathbf{n}}_{\theta}## is still just a rank 2 tensor which can be viewed naturally as a map from ##V \rightarrow V## (or instead ##V \times V \rightarrow \mathbb{R}##), which has nothing to do with matrices a priori.
So if we have a rank-4 tensor ##R_{ijkl}## (like the Riemann curvature tensor), can it be viewed as a map ##V\times V \times V \rightarrow V## or ##V \times V \times V \times V \rightarrow \mathbb{R}##? In this case (sorry for the question), what is ##V##?
 
  • #22
universe function said:
So if we have a rank-4 tensor ##R_{ijkl}## (like the Riemann curvature tensor), can it be viewed as a map ##V\times V \times V \rightarrow V## or ##V \times V \times V \times V \rightarrow \mathbb{R}##? In this case (sorry for the question), what is ##V##?
A vector space. And some of the factors can also be the dual vector space ##V^*## instead. It all depends on the context and the purpose.
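Both views of a rank-4 tensor mentioned above can be sketched with an arbitrary component array (NumPy; `R` below is random, emphatically not the actual Riemann tensor of any manifold):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3                                # dimension of the vector space V
R = rng.normal(size=(n, n, n, n))    # components R_{ijkl} of some rank-4 tensor

u, v, w, x = rng.normal(size=(4, n))

# As a map V x V x V x V -> R: feed in four vectors, get a number
scalar = np.einsum('ijkl,i,j,k,l->', R, u, v, w, x)

# As a map V x V x V -> V: feed in three vectors, get a vector
vec = np.einsum('ijkl,j,k,l->i', R, v, w, x)

print(vec.shape)                     # an n-vector
print(np.isclose(u @ vec, scalar))   # the two views agree: pairing with u
```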
 
  • #23
fresh_42 said:
A vector space. And it can also be seen as with dual vector spaces ##V^*## instead. It all depends on the context and the purpose.
This means that a Riemannian manifold is a vector space and a topological space? Is it also a group? So is it a Lie group?
 
  • #24
universe function said:
This means that a Riemannian manifold is a vector space ...
No.
... and a topological space?...
Yes.
Is it also a group? ...
Normally not, but it can be a group, e.g. ...
... So is it a Lie group?
Yes. Every Lie group is a differentiable manifold, but not vice versa.
 
  • #25
So if we could generalise multilinearity and the axioms of a vector space, would we get a generalisation of Riemannian manifolds or differentiable manifolds? What else would that structure need so that calculus can be applied? Or is that enough?
 
  • #26
Why don't you read a textbook? For example:
Lectures on Differential Geometry
Shlomo Sternberg
 
  • #27
Another question I have: what do tensor relations, such as equations, express, mathematically or physically? Could someone explain?
 

1. What is tensor analysis?

Tensor analysis is a mathematical tool used to study and analyze the properties of objects and systems in multiple dimensions. It involves the use of tensors, which are mathematical objects that describe how different quantities change in relation to each other.

2. What are the applications of tensor analysis?

Tensor analysis has a wide range of applications in fields such as physics, engineering, and computer science. It is used to study the behavior of physical systems, analyze data in machine learning, and solve problems in fluid dynamics and electromagnetism, among others.

3. How is tensor analysis different from vector analysis?

Tensor analysis is an extension of vector analysis to multiple dimensions. While vector analysis deals with quantities that have magnitude and direction, tensor analysis deals with quantities that have multiple components and can vary in different directions.

4. What are the different types of tensors?

There are three main types of tensors: scalars, vectors, and higher-order tensors. Scalars have a single component, while vectors have multiple components and can be represented by arrows. Higher-order tensors have two or more indices and can represent more complex quantities.

5. How is tensor analysis used in general relativity?

Tensor analysis is a crucial tool in the study of general relativity, which is a theory of gravity that describes the behavior of objects in space and time. Tensors are used to represent the curvature of space-time and the distribution of matter and energy, allowing us to understand the effects of gravity on the universe.
