Can a Tensor be Represented by a 3-Dimensional Matrix?

  • Level: I
  • Thread starter: kent davidge
  • Tags: Tensors
In summary: the components of a rank-2 tensor can be arranged as a matrix, but the (2,0), (1,1) and (0,2) tensors are all of rank 2 with different structure, so saying that a tensor of rank 2 is a matrix is not really helpful.
  • #1
kent davidge
Question 1 - I know a tensor is not a matrix. But the values of the components of a tensor of the form ##A_{\mu_1\mu_2}## can be arranged in exactly the same way as in a usual 2-dimensional matrix. I was wondering if it would be possible to represent an ##A_{\mu_1\mu_2\mu_3}## tensor by a 3-dimensional matrix, and likewise (although it cannot be visualized) an ##A_{\mu_1 \dots \mu_\infty}## tensor by an ∞-dimensional matrix.

Question 2 - Now, I never saw in my linear algebra course the entries of a matrix ##A## written as ##A^{\lambda}{}_{\rho}##. So what would it look like if we wished, as in Question 1, to represent the components ##A^{\lambda}{}_{\rho}## of a (1,1) tensor by a matrix?

One more question: is four the maximum number of lower indices that a tensor can have in GR?

(Sorry for my poor English.)
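To make Question 1 concrete, here is a minimal NumPy sketch (the 4-dimensional spacetime and the component values are purely illustrative assumptions): the components of a two-index tensor fit in a 2-dimensional array, the components of a three-index tensor fit in a 3-dimensional array, and the (1,1) components ##A^{\lambda}{}_{\rho}## of Question 2 fill an ordinary ##4\times 4## array just like ##A_{\lambda\rho}## does.

```python
import numpy as np

dim = 4  # spacetime dimension, assumed for illustration

# Components A_{mu1 mu2}: a two-index tensor fits in a 2-dimensional array.
A2 = np.random.rand(dim, dim)

# Components A_{mu1 mu2 mu3}: a three-index tensor fits in a 3-dimensional array.
A3 = np.random.rand(dim, dim, dim)

# Question 2: the (1,1) components A^lambda_rho also fill a dim x dim array;
# the array by itself cannot tell you which index is up and which is down.
A11 = np.random.rand(dim, dim)

print(A2.shape)   # (4, 4)
print(A3.shape)   # (4, 4, 4)
print(A11[0, 2])  # the component with lambda = 0, rho = 2
```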
 
  • #2
kent davidge said:
Question 1 - I know a tensor is not a matrix. But the values of the components of a tensor of the form ##A_{\mu_1\mu_2}## can be arranged in exactly the same way as in a usual 2-dimensional matrix. I was wondering if it would be possible to represent an ##A_{\mu_1\mu_2\mu_3}## tensor as a 3-dimensional matrix, and likewise (although it cannot be visualized) an ##A_{\mu_1 \dots \mu_\infty}## tensor as an ∞-dimensional matrix.

Question 2 - Now, I never saw in my linear algebra courses the entries of a matrix ##A## written as ##A^{\lambda}{}_{\rho}##. So what would it look like if we wished, as in Question 1, to represent the components ##A^{\lambda}{}_{\rho}## of a tensor as a matrix?

(Sorry for my poor English.)
The short answer is yes: a matrix is a tensor; however, as you've said, not the other way around.
An index ##\mu_\infty## is problematic. One probably wouldn't use coordinates to deal with tensors over infinitely generated modules or vector spaces, or at least not with all of them at a time.

Since you started with coordinates and labeled the question as "I", my long answer is as follows:
A tensor of rank ##0## is a scalar: a number in the underlying ring or field ##\mathbb{F}##, e.g. ##\mathbb{R}## or ##\mathbb{C}##.
A tensor of rank ##1## is a vector.
A tensor of rank ##2## is a matrix.
A tensor of rank ##3## is a cube.
etc.

If we consider for instance matrices, then they can be written as ##\sum_{i,j=1}^n A_{ij} \vec{e}_{ij}## where ##A_{ij}## denote the matrix elements and the ##\vec{e}_{ij}## the basis matrices with a ##1## at position ##(i,j)## and ##0## elsewhere.
Now every ##\vec{e}_{ij}## can be written as a tensor ##\vec{e}_{ij}=\vec{e}_{j} \otimes \vec{e}_{i} = \vec{e}_{j} \,\cdot\, \vec{e}_{i}^{\tau}##. Here I wrote ##\vec{e}_j = (0, \dots , 1, \dots , 0)## as a row vector.
Sometimes generic products ##v_1 \otimes \dots \otimes v_k## are referred to as tensors, but rigorously all linear combinations of these are tensors of rank ##k## in this case. As the equation with the matrix shows, a rank ##2## tensor ##v_1 \otimes v_2## is a matrix of matrix rank ##1##. To get all matrices, and therewith all tensors of rank ##2##, one has to allow all linear combinations.
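A minimal NumPy sketch of this decomposition (with ##n=3##, made-up values, and the convention that ##\vec{e}_{ij}## has its ##1## at position ##(i,j)##): a single outer product has matrix rank ##1##, and a generic matrix is only recovered as a linear combination of such products.

```python
import numpy as np

n = 3

def e(j):
    """Standard basis vector e_j of length n."""
    v = np.zeros(n)
    v[j] = 1.0
    return v

# Basis matrix e_{ij}: a 1 at position (i, j), zeros elsewhere.
e_12 = np.outer(e(1), e(2))

# A single product v1 (x) v2 is a matrix of matrix rank 1 ...
v1, v2 = np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0])
print(np.linalg.matrix_rank(np.outer(v1, v2)))  # 1

# ... while a generic matrix A = sum_{ij} A_ij e_{ij} needs the full
# linear combination of basis matrices.
A = np.random.rand(n, n)
B = sum(A[i, j] * np.outer(e(i), e(j)) for i in range(n) for j in range(n))
print(np.allclose(A, B))  # True
```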

This procedure can be carried out for every (finite) rank ##k##.
The tensor space or better tensor algebra itself is then the sum of all these, i.e. on a vector space ##V## it is ##\mathcal{T}(V)=\mathbb{F} \oplus V \oplus (V \otimes V) \oplus (V \otimes V \otimes V) \oplus \dots##

One has to be a little bit careful here, since a tensor product of two vector spaces ##U## and ##V## is indeed only the linear span of all elements ##u \otimes v##, which can be viewed as matrices, i.e. tensors of rank ##2##. The representation as such a linear combination, by the way, isn't unique in general, since e.g. ##r \cdot u \otimes v = u \otimes r \cdot v##.
Other tensors of fixed rank ##k## can also be referred to as elements of a tensor product ##V_1 \otimes \dots \otimes V_k##.

In the context of physics it should also be mentioned that some (or all) of these vector spaces may themselves be vector spaces of linear functions.
 
  • #3
I'd say 2-index tensor components can be represented by a matrix, but a matrix is not necessarily a tensor.
 
  • #4
Hi.

As for Q2, I would say yes, and the matrices corresponding to $$A_{\lambda\rho},\quad A^{\lambda\rho},\quad A^\lambda{}_{\rho},\quad A_{\lambda}{}^{\rho}$$ are all different. Here ##\lambda \in \{0,1,2,3\}##, or ##\{1,2,3,4\}## in some textbooks.
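For instance, here is a small NumPy sketch (the Minkowski signature and the component values are illustrative assumptions only): starting from ##A_{\lambda\rho}## and raising indices with ##\eta = \mathrm{diag}(-1,1,1,1)## produces genuinely different matrices for ##A^{\lambda}{}_{\rho}## and ##A^{\lambda\rho}##.

```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])   # Minkowski metric, (-,+,+,+) signature
eta_inv = np.linalg.inv(eta)           # equals eta for this signature

A_dd = np.random.rand(4, 4)            # components A_{lambda rho} (made up)

# Raise one index:  A^lambda_rho = eta^{lambda sigma} A_{sigma rho}
A_ud = np.einsum('ls,sr->lr', eta_inv, A_dd)

# Raise both indices:  A^{lambda rho} = eta^{lambda s} eta^{rho t} A_{s t}
A_uu = np.einsum('ls,rt,st->lr', eta_inv, eta_inv, A_dd)

print(np.allclose(A_dd, A_ud), np.allclose(A_dd, A_uu))  # False False (generically)
```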

 
  • #5
I think it's very tempting, but rather unhelpful, to think of tensors as some kind of specialised matrix. I know I came away from my first GR lectures with that idea.

Certainly you can write the components of a tensor in some coordinate system in matrix notation, and that's a perfectly sensible way to write out the components of a tensor with two or fewer indices, because the organisation is helpful. But it gets complicated quickly if you want to carry it too far. For example, you asked about two-index tensors - there are three kinds, (2,0), (1,1) and (0,2), and all look the same in matrix notation:
$$A_{\mu\nu}=\left(\begin{array}{cccc}
A_{00}&A_{01}&A_{02}&A_{03}\\
A_{10}&A_{11}&A_{12}&A_{13}\\
A_{20}&A_{21}&A_{22}&A_{23}\\
A_{30}&A_{31}&A_{32}&A_{33}\end{array}\right),
A_{\mu}{}^{\nu}=\left(\begin{array}{cccc}
A_{0}{}^{0}&A_{0}{}^{1}&A_{0}{}^{2}&A_{0}{}^{3}\\
A_{1}{}^{0}&A_{1}{}^{1}&A_{1}{}^{2}&A_{1}{}^{3}\\
A_{2}{}^{0}&A_{2}{}^{1}&A_{2}{}^{2}&A_{2}{}^{3}\\
A_{3}{}^{0}&A_{3}{}^{1}&A_{3}{}^{2}&A_{3}{}^{3}\end{array}\right),
A^{\mu\nu}=\left(\begin{array}{cccc}
A^{00}&A^{01}&A^{02}&A^{03}\\
A^{10}&A^{11}&A^{12}&A^{13}\\
A^{20}&A^{21}&A^{22}&A^{23}\\
A^{30}&A^{31}&A^{32}&A^{33}\end{array}\right)$$Certainly the values in corresponding elements are different (in general, ##A_{\mu\nu}\neq A_\mu{}^\nu\neq A^{\mu\nu}##), but there's no way to tell from the shape of the matrix what kind of tensor it is. You can't even reliably guess it from the units when people suppress factors of c.

Also, there are restrictions on tensors that one doesn't have with matrices. For example, a one-index tensor is representable as a row or column matrix. The rules of matrix multiplication then have nothing against the notion of forming ##\vec{U}\vec{V}=\sum_\mu U^\mu V^\mu##, as long as we write ##U^\mu## as a row vector and ##V^\mu## as a column vector. But that's illegal with tensors. You can try to come up with rules, like "always write vectors as columns and one-forms as rows", in which case ##U_\mu V^\mu## works. But then, why does ##U^\mu g_{\mu\nu}V^\nu## work? U needs to be a row for the multiplication with g to work - so I already have to develop ad hoc rules and I've only got two expressions...
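One way to sidestep the row/column bookkeeping is to contract over named indices rather than rely on matrix layout. A minimal NumPy sketch (the metric and components are invented for illustration): ##U^\mu g_{\mu\nu} V^\nu## is just a double sum, and nothing in the array machinery stops you from typing the "illegal" ##\sum_\mu U^\mu V^\mu## either, which is exactly why the index positions, not the array shapes, have to carry the meaning.

```python
import numpy as np

g = np.diag([-1.0, 1.0, 1.0, 1.0])   # illustrative metric (Minkowski)
U = np.array([1.0, 0.5, 0.0, 2.0])   # components U^mu (made up)
V = np.array([3.0, 1.0, 1.0, 0.0])   # components V^nu (made up)

# U^mu g_{mu nu} V^nu: a double contraction over named indices,
# with no decision about which factor is a row and which a column.
s = np.einsum('m,mn,n->', U, g, V)

# The "illegal" sum_mu U^mu V^mu is just as easy to type, so the
# arrays alone cannot enforce the tensor rules.
bad = np.einsum('m,m->', U, V)

print(s, bad)
```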

So yes, I think you can perfectly well write out the elements of an (n,m) tensor as an (n+m)-dimensional grid. The organisation is natural and symmetries are clearly visible (when n+m=2, anyway). But I don't think there's a way to identify n and m in that notation, just their sum, and they don't behave like matrices in general.

I haven't seen more than a 4-index tensor. I'm not well read enough in the field to say authoritatively that no one has ever used such a thing but, if not, it's more the case that no one has ever needed to rather than it's not allowed. The Riemann tensor has four indices - one for the output vector and one each for the three input vectors (the arbitrary vector and the two defining the loop you move the input vector around). If you find an application (no idea what) that needs four inputs and one output, you'll find a five index tensor. Note: it's an unfortunate coincidence that the Riemann has four indices and spacetime has four dimensions - as I understand it the facts are unrelated.
 
  • #6
Thanks for the responses.

I think I got the answers I was searching for. Thank you for your time :)
 
  • #7
Ibix said:
it's an unfortunate coincidence that the Riemann has four indices and spacetime has four dimensions - as I understand it the facts are unrelated

This is correct. You can form the Riemann tensor in manifolds of dimension other than four, and it will still have four indices. What will change is how many independent components there are--the various symmetries of the Riemann tensor mean that not all of its components are independent, and how many are independent changes with the dimension of the manifold. In a 2-dimensional manifold there is only one independent component; in 3 dimensions there are six; in 4 dimensions (as with spacetime) there are twenty; and more still in higher dimensions.
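As a quick check of these numbers, the standard counting formula for the independent components of the Riemann tensor in ##n## dimensions is ##n^2(n^2-1)/12## (a short Python sketch):

```python
# Independent components of the Riemann tensor in n dimensions: n^2 (n^2 - 1) / 12
for n in (2, 3, 4, 5):
    print(n, n**2 * (n**2 - 1) // 12)
# 2 1
# 3 6
# 4 20
# 5 50
```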
 
  • #8
fresh_42 said:
The short answer is yes: a matrix is a tensor; however, as you've said, not the other way around.

Umm, isn't it the other way around: a tensor is a matrix, but not all matrices are tensors?
 
  • #9
cosmik debris said:
Umm, isn't it the other way around: a tensor is a matrix, but not all matrices are tensors?
No. ##\vec{v}_1 \otimes \vec{v}_2 \otimes \vec{v}_3 \,, \, \vec{v}_1 \otimes \vec{v}_2 \otimes \vec{v}_3 \otimes \vec{v}_4\,,\dots \,## etc. are all tensors but not matrices. At least not in the usual sense of a rectangular array of coordinates, whereas every matrix can be written as ##(c_0)\, , \, (\vec{v}_0) ## or ##\sum_{i,j} c_{ij} \, \vec{v}_i \otimes \vec{v}_j##.
Whether this is meaningful or not depends on the situation and goals. E.g. scalars and vectors can be seen as matrices, too, but cubes and higher-dimensional arrays aren't normally meant by the word matrix. If you define a matrix as any ordered array of numbers in any dimension, then yes, in that case tensors are matrices, too. But I'd recommend stating this explicitly if you use the word that way, for otherwise it might lead to confusion.

And as always: we are talking about coordinate representations here. A tensor is a numerical scheme only in the same sense that a linear function is a matrix. Their definitions don't need such a representation.
 
  • #10
I've always found it unhelpful to try to associate tensors and matrices. It's much more helpful to me to keep the concepts separate. A tensor is a geometrical object, and a matrix is just an arrangement of numbers. You can arrange the components of a tensor in some coordinate system inside of a matrix. That's about all I would like to say about that. You can arrange the components of a 3-index tensor in a 3-d "cube" matrix, or you can simply write the components out in a regular matrix as well (just have separate sub-blocks of numbers inside the big matrix block, arranged in some manner). The issue comes when you try to apply matrix multiplication rules and such operations and have them be analogous to tensor operations like contractions or tensor multiplications.
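As a small sketch of that issue (shapes and values invented for illustration): contracting one index of a three-index tensor with a vector is an ordinary tensor operation, but it is not the product of two matrices; written with named indices it is a one-liner.

```python
import numpy as np

dim = 4
T = np.random.rand(dim, dim, dim)   # components T^a_{bc} of a 3-index tensor (made up)
v = np.random.rand(dim)             # components v^b of a vector (made up)

# Contract the middle index: S^a_c = T^a_{bc} v^b.
# As arrays this is a sum over one axis of a 3-dimensional block;
# there is no single matrix product that expresses it.
S = np.einsum('abc,b->ac', T, v)

print(S.shape)  # (4, 4)
```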

In any case, I find it much easier to understand tensors as geometrical objects, or as functions of vectors/one forms into scalars. Either of those concepts is more intuitive to me than thinking of tensors as "like matrices".
 
  • #11
Matterwave said:
I've always found it unhelpful to try to associate tensors and matrices. It's much more helpful to me to keep the concepts separate. A tensor is a geometrical object, and a matrix is just an arrangement of numbers. You can arrange the components of a tensor in some coordinate system inside of a matrix. That's about all I would like to say about that. You can arrange the components of a 3-index tensor in a 3-d "cube" matrix, or you can simply write the components out in a regular matrix as well (just have separate sub-blocks of numbers inside the big matrix block, arranged in some manner). The issue comes when you try to apply matrix multiplication rules and such operations and have them be analogous to tensor operations like contractions or tensor multiplications.

In any case, I find it much easier to understand tensors as geometrical objects, or as functions of vectors/one forms into scalars. Either of those concepts is more intuitive to me than thinking of tensors as "like matrices".
This totally depends on what the purpose is. Tensors don't automatically have to involve dual spaces. A tensor product is often merely a solution of a couniversal mapping problem, e.g. to prove the existence of general Clifford, Lie or Graßmann algebras.

Here is an example of a purely coordinate-based usage of tensors.
 
  • #12
Hi.
I think of a tensor as a kind of quantity that is preserved under transformations, not just a collection of numbers.
The product rules for matrices, e.g. matrix times column vector gives a column vector and row vector times matrix gives a row vector, tempt me to think of a matrix as a mixed-index tensor. However, the contraction rules include "column to row" and "row to column", so the matrix product rules do not cover the tensor contraction rules.
 
  • #13
fresh_42 said:
This totally depends on what the purpose is. Tensors don't automatically have to involve dual spaces. A tensor product is often merely a solution of a couniversal mapping problem, e.g. to prove the existence of general Clifford, Lie or Graßmann algebras.

Here is an example of a purely coordinate-based usage of tensors.

Since this is the relativity forum, I assumed the OP was talking about tensors in GR (or SR). In GR, we always have a semi-Riemannian manifold (spacetime) on which we define tensors as geometrical objects, or linear functions of vectors and one forms into scalars. I am not familiar with these other ways of viewing or using tensors, so I can't comment on them. If my comment was not sufficiently mathematically rigorous or general, then I apologize, but I'm coming at this problem from the perspective of GR.
 
  • #14
Matterwave said:
Since this is the relativity forum, I assumed the OP was talking about tensors in GR (or SR). In GR, we always have a semi-Riemannian manifold (spacetime) on which we define tensors as geometrical objects, or linear functions of vectors and one forms into scalars. I am not familiar with these other ways of viewing or using tensors, so I can't comment on them. If my comment was not sufficiently mathematically rigorous or general, then I apologize, but I'm coming at this problem from the perspective of GR.
You're right. I overlooked which forum section this is in. It isn't shown automatically in my browser, so I tend to neglect it. Sorry. I only checked the "I".

Nevertheless, the OP came up with the idea of a numeric scheme, and he was right in that a tensor can be seen as such. I find it as hard to visualize a tensor, e.g. as a four-dimensional geometric object, as it is to think of it numerically, especially since the physical notation often involves a lot of indices, i.e. coordinates. I guess this is a matter of personal preference. This is why I said: "We are talking about coordinate representations here. A tensor is a numerical scheme only in the same sense that a linear function is a matrix. Their definitions don't need such a representation."

My example, by the way, is also one that involves dual spaces. It describes a bilinear mapping ##\Phi## as a sum of tensors ##f_i \otimes g_i \otimes w_i##, in this case the matrix multiplication in ##\mathbb{M}(2,\mathbb{F})^* \otimes \mathbb{M}(2,\mathbb{F})^* \otimes \mathbb{M}(2,\mathbb{F})##. I only wanted to point out that in the end a tensor is a multilinear tool, and one doesn't always need to think of vectors as arrows. Sometimes the coordinates will do.
 

1. What are tensors?

Tensors are mathematical objects that describe multilinear relationships between vectors and other geometric quantities. They are commonly used in physics and engineering to model complex systems.

2. What is the difference between a tensor and a vector?

A tensor is a generalization of a vector, representing not only magnitude and direction but also how quantities vary across different directions. Vectors are considered first-order tensors, while higher-order tensors have additional indices, and hence components, for multiple dimensions.

3. How are tensors used in machine learning?

Tensors are used in machine learning to represent and manipulate multi-dimensional data, such as images or text. They can be used to perform operations like matrix multiplication and convolution, which are essential for training and running deep learning models.
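An illustrative sketch only (assuming NumPy; the shapes are arbitrary): a small batch of RGB images is naturally a 4-dimensional array, and a per-pixel linear layer mixing the colour channels is a contraction over one of its axes.

```python
import numpy as np

# A "tensor" of image data: 8 RGB images, each 32 x 32 pixels with 3 channels.
images = np.random.rand(8, 32, 32, 3)

# A linear map taking the 3 colour channels to 5 feature channels,
# applied at every pixel of every image: contract over the channel axis.
W = np.random.rand(3, 5)
features = np.einsum('bhwc,cf->bhwf', images, W)

print(features.shape)  # (8, 32, 32, 5)
```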

4. Can tensors be visualized?

Yes, tensors can be visualized in several ways depending on the number of dimensions. For example, a 2D tensor can be visualized as a matrix, while a 3D tensor can be visualized as a cube or volume. However, visualizing higher-order tensors can be challenging and often requires specialized software.

5. What is the importance of tensors in physics?

Tensors are crucial in physics because they provide a mathematical framework for describing and analyzing physical quantities, such as velocity, force, and energy. They are used in theories like general relativity and quantum mechanics to model and predict the behavior of complex systems.
