kent davidge said:
Question 1 - I know a tensor is not a matrix. But the values of the components of a tensor of the form ##A_{\mu_1 \mu_2}## can be arranged in exactly the same way as in a usual 2-dimensional matrix. I was wondering if it would be possible to represent an ##A_{\mu_1 \mu_2 \mu_3}## tensor as a 3-dimensional matrix, and likewise (although it cannot be visualized) an ##A_{\mu_1 \dots \mu_\infty}## tensor as an ##\infty##-dimensional matrix.
Question 2 - Now, I've never seen in my linear algebra courses the entries of a matrix ##A## written as ##A_\lambda{}^\rho##. So how would it look if we wished, as in Question 1, to represent the components ##A_\lambda{}^\rho## of a tensor as a matrix?
(Sorry for my poor English.)
The short answer is yes: a matrix is a tensor, although, as you've said, not every tensor is a matrix.
An index ##\mu_\infty## is problematic. One probably wouldn't use coordinates to deal with tensors over infinitely generated modules or vector spaces, or at least not with all coordinates at once.
Since you started with coordinates and marked the question as level "I", my long answer is as follows:
A tensor of rank ##0## is a scalar: a number from the underlying ring or field ##\mathbb{F}##, e.g. ##\mathbb{R}## or ##\mathbb{C}##.
A tensor of rank ##1## is a vector.
A tensor of rank ##2## is a matrix.
A tensor of rank ##3## is a cube of numbers.
etc.
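To make these arrangements concrete: they correspond exactly to n-dimensional arrays, e.g. in NumPy. A minimal sketch, where the shapes and values are arbitrary placeholders:

```python
import numpy as np

rank0 = np.float64(3.14)       # rank 0: a scalar (an array with no indices)
rank1 = np.zeros(4)            # rank 1: a vector, indexed A[mu]
rank2 = np.zeros((4, 4))       # rank 2: a matrix, indexed A[mu1, mu2]
rank3 = np.zeros((4, 4, 4))    # rank 3: a "cube", indexed A[mu1, mu2, mu3]

for t in (rank0, rank1, rank2, rank3):
    print(np.ndim(t), np.shape(t))   # number of indices and their ranges
```

In this picture, the answer to Question 1 is simply that the components of a rank-##k## tensor fill an array with ##k## indices.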
If we consider matrices, for instance, then they can be written as ##\sum_{i,j=1}^n A_{ij} \vec{e}_{ij}##, where ##A_{ij}## denotes the matrix entries and ##\vec{e}_{ij}## the basis matrix with a ##1## at position ##(i,j)## and ##0## elsewhere.
Now every ##\vec{e}_{ij}## can be written as a tensor ##\vec{e}_{ij}=\vec{e}_{i} \otimes \vec{e}_{j} = \vec{e}_{i} \,\cdot\, \vec{e}_{j}^{\tau}##. Here ##\vec{e}_j## is the ##j##-th standard basis vector, written as a column, so that ##\vec{e}_{j}^{\tau} = (0, \dots , 1, \dots , 0)## is the corresponding row vector.
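This decomposition can be checked numerically; a small NumPy sketch, where np.outer forms exactly the product ##\vec{e}_i \cdot \vec{e}_j^{\tau}## and the example matrix is an arbitrary placeholder:

```python
import numpy as np

n = 3
A = np.arange(1.0, 10.0).reshape(n, n)   # arbitrary example matrix

e = np.eye(n)                            # rows are the basis vectors e_1..e_n

B = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        # e_ij = e_i (x) e_j: a 1 at position (i, j), 0 elsewhere
        B += A[i, j] * np.outer(e[i], e[j])

print(np.allclose(A, B))                 # True: sum_{i,j} A_ij e_ij rebuilds A
```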
Sometimes the simple products ##v_1 \otimes \dots \otimes v_k## themselves are referred to as tensors, but rigorously the tensors of rank ##k## are all linear combinations of such products. As the equation with the matrix shows, a simple rank-##2## tensor ##v_1 \otimes v_2## is a matrix of matrix rank ##1##. To get all matrices, and with them all tensors of rank ##2##, one has to allow arbitrary linear combinations.
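This distinction is also easy to see numerically; a sketch with arbitrary example vectors:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
w = np.array([1.0, 0.0, 1.0])

# a single product u (x) v has matrix rank 1 ...
print(np.linalg.matrix_rank(np.outer(u, v)))                   # 1

# ... but a linear combination of such products need not:
print(np.linalg.matrix_rank(np.outer(u, v) + np.outer(w, u)))  # 2
```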
This procedure can be carried out for every (finite) rank ##k##.
The tensor space, or better the tensor algebra, itself is then the sum of all these, i.e. on a vector space ##V## it is ##\mathcal{T}(V)=\mathbb{F} \oplus V \oplus (V \otimes V) \oplus (V \otimes V \otimes V) \oplus \dots##
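As a data-structure sketch (assuming ##\dim V = n## is finite and truncating the infinite sum at degree ##3##), an element of ##\mathcal{T}(V)## is just one coordinate array per degree:

```python
import numpy as np

n = 3   # dim V

# One component per degree; a degree-k component is an array with k indices.
# The values are placeholders; the grading is the point here.
element = [
    np.float64(2.0),       # degree 0: in F
    np.zeros(n),           # degree 1: in V
    np.zeros((n, n)),      # degree 2: in V (x) V
    np.zeros((n, n, n)),   # degree 3: in V (x) V (x) V
]

print([t.size for t in element])   # [1, 3, 9, 27]: dim(V^(x)k) = n^k
```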
One has to be a little bit careful here, since the tensor product of two vector spaces ##U## and ##V## is only the linear span of all elements ##u \otimes v##, each of which can be viewed as a matrix, i.e. a tensor of rank ##2##. By the way, the representation as such a linear combination isn't unique in general, since e.g. ##(r \cdot u) \otimes v = u \otimes (r \cdot v)##.
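This non-uniqueness, too, can be verified directly:

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0])
r = 5.0

# (r*u) (x) v and u (x) (r*v) are different expressions for the same tensor:
print(np.allclose(np.outer(r * u, v), np.outer(u, r * v)))   # True
```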
Other tensor products ##V_1 \otimes \dots \otimes V_k## of possibly different vector spaces likewise yield tensors of fixed rank ##k##.
In the context of physics it also has to be mentioned that some (or all) of these vector spaces might as well be vector spaces of linear functions, i.e. dual spaces.
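Concretely, a rank-##1## tensor over such a dual space is what physicists call a covector, and this is also where Question 2 fits: a mixed tensor ##A_\lambda{}^\rho## lives in ##V^* \otimes V##, and its components still arrange as an ordinary matrix; only the index placement records which factor is the dual one. A sketch with hypothetical values:

```python
import numpy as np

# A linear functional ("covector") on R^3, given by its components phi_lambda;
# applying it to a vector is a contraction, i.e. a row times a column.
phi = np.array([1.0, -2.0, 0.5])   # components phi_lambda
x = np.array([2.0, 1.0, 4.0])      # components x^lambda

print(phi @ x)                     # phi(x) = phi_lambda x^lambda = 2.0

# A mixed rank-2 tensor in V* (x) V: components A_lambda^rho = phi_lambda x^rho,
# stored as a plain (3, 3) matrix despite the mixed index placement.
A = np.outer(phi, x)
print(A.shape)                     # (3, 3)
```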