fog37 said:
Thanks Fresh_42,
Let me summarize what I understood and change notation for more clarity. The two initial vector spaces are ##V## and ##W## and the tensor product space is ##K=V\otimes W##.
1) The elements (vectors) living in the new tensor product space ##K## are objects like ##v_1 \otimes w_2##, where ##v_1## is ANY vector in ##V## and ##w_2## is one specific vector living in ##W##. All possible pairs of vectors from the two spaces give vectors living in ##K##.
##w_2## can also be ANY vector of ##W##. All possible sums and multiples, i.e. all linear combinations of vectors of the form ##v_1 \otimes w_2##, are in ##K##. You cannot switch the two factors, i.e. in general ##v_1 \otimes w_2 \neq w_2 \otimes v_1##. In the coordinate example above, switching them would give the transposed matrix, a ##(3 \times 2)##-matrix instead of a ##(2 \times 3)##-matrix. The sum itself is of course independent of the order of summation, as it is a finite sum: ##\sum_{i=1}^n \sum_{j=1}^m \, c_{ij}\, v_i \otimes w_j##
For example, the vectors ##v_1 \otimes w_2##, ##v_2 \otimes w_6##, ##v_4 \otimes w_1##, ##v_1 \otimes w_9##, ##v_2\otimes w_2##, etc. are just some of the vectors living in ##K##. There are infinitely many such combinations, so infinitely many vectors in ##K##.
Yes, but not only the pure tensors ##v \otimes w##, but also their sums and multiples (see above). You can construct every matrix as a linear combination of them, not only rank ##1## matrices.
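To make this concrete, here is a minimal NumPy sketch (the names `e`, `f`, `C` are just illustrative, not part of the discussion above) showing that an arbitrary ##2\times 3## matrix is a linear combination of the rank ##1## outer products of basis vectors:

```python
import numpy as np

# Standard bases of V = R^2 and W = R^3 (rows of the identity matrices)
e = np.eye(2)
f = np.eye(3)

# An arbitrary element of K = V (x) W in coordinates: any 2x3 matrix
C = np.array([[6., -10., 2.],
              [9., -15., 3.]])

# Rebuild C as the finite sum  sum_{i,j} c_ij (e_i (x) f_j)
M = sum(C[i, j] * np.outer(e[i], f[j])
        for i in range(2) for j in range(3))

print(np.allclose(M, C))  # True: C is a linear combination of rank-1 tensors
```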
2) Each basis vector is also a combination of the basis vectors of the two spaces: ##a_i \otimes b_j## is actually a matrix. Every vector in ##K##, basis vector or not, is a matrix and not a column or row vector (a sequence of numbers).
In coordinates, yes. And with more than two factors, e.g. ##u \otimes v \otimes w##, you get a cube of numbers, and so on.
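A quick sketch of the three-factor case, with NumPy and illustrative vectors `u`, `v`, `w` of my own choosing: the coordinates of ##u \otimes v \otimes w## form a 3-dimensional array, i.e. a cube of numbers.

```python
import numpy as np

u = np.array([1., 2.])        # vector in a 2-dimensional space
v = np.array([3., -5., 1.])   # vector in a 3-dimensional space
w = np.array([0., 4.])        # vector in another 2-dimensional space

# Coordinates of u (x) v (x) w: a 2x3x2 array, a "cube" of numbers
T = np.einsum('i,j,k->ijk', u, v, w)

print(T.shape)     # (2, 3, 2)
print(T[0, 1, 1])  # u[0]*v[1]*w[1] = 1 * (-5) * 4 = -20.0
```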
3) Could you help me with a simple numerical example to illustrate how things work? For instance, let's consider the vector ##v_1 \otimes w_2## where ##v_1=(2,3)## in its basis ##(a_1, a_2)## and ##w_2=(3, -5, 1)## in its basis ##(b_1, b_2, b_3)##. What would the vector ##v_1 \otimes w_2## look like both in vector notation and in matrix notation?
It would be
$$v_1 \otimes w_2 = (2,3) \otimes (3,-5,1) = (2,3)^t \cdot (3,-5,1) = \begin{pmatrix}2 \\ 3 \end{pmatrix} \cdot (3,-5,1) = \begin{pmatrix}6&-10&2\\9&-15&3\end{pmatrix}$$
Maybe I confused the orientation before, but that doesn't change the principle, as long as it's consistent.
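For anyone who wants to check this numerically, the same computation in NumPy (`np.outer` is exactly this column-times-row product):

```python
import numpy as np

v1 = np.array([2., 3.])
w2 = np.array([3., -5., 1.])

# Coordinate matrix of v1 (x) w2: column vector times row vector
print(np.outer(v1, w2))
# [[  6. -10.   2.]
#  [  9. -15.   3.]]
```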
4) When we look at any vector in ##K## and its matrix representation, what should we infer? Should we look at it as a particular mixture, i.e. the possible products, of some of the vectors from the two starting vector spaces? What is the overarching idea again?
This really depends on the context and what you are planning to do. As I've said, tensors have a universal property that allows them to play different roles. In the coordinate representation above, you could view them as a bilinear mapping ##\beta : V \times W \rightarrow \mathbb{R}## with ##\beta(X,Y)= X (v_1 \otimes w_2) Y^t##. The factors can also be taken from the dual vector spaces ##V^* = \{\varphi : V \rightarrow \mathbb{R}\,\vert \, \varphi \, \textrm{ is } \mathbb{R}-\textrm{linear}\}##, or from a mixture of both, which is often the case in physics.
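A rough sketch of that bilinear-map reading, again in NumPy and with hypothetical sample vectors `X` and `Y`: for this particular tensor, ##\beta(X,Y)## comes out as ##(v_1\cdot X)(w_2\cdot Y)##.

```python
import numpy as np

v1 = np.array([2., 3.])
w2 = np.array([3., -5., 1.])
M = np.outer(v1, w2)  # coordinate matrix of v1 (x) w2

def beta(X, Y):
    """The bilinear form beta(X, Y) = X (v1 (x) w2) Y^t."""
    return X @ M @ Y

X = np.array([1., 0.])      # sample vector in V
Y = np.array([0., 1., 0.])  # sample vector in W
print(beta(X, Y))  # -10.0, i.e. (v1 . X) * (w2 . Y) = 2 * (-5)
```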
You can read a bit more about them here:
https://en.wikipedia.org/wiki/Tensor
and here:
https://en.wikipedia.org/wiki/Tensor_algebra (but I would omit the coalgebra and coproduct part).