
Understanding tensor product

  1. Feb 4, 2017 #1
    Hello,

    I have encountered the concept of tensor product between two or more different vector spaces. I would like to get a more intuitive sense of what the final product is.

    Say we have two vector spaces ##V_1## of dimension 2 and ##V_2## of dimension 3. Each vector space has a basis that we have chosen among the many possible ones. For ##V_1## the basis set is ##(a_1, a_2)##.
    For ##V_2## the basis set is ##(b_1, b_2, b_3)##.
    The tensor product between ##V_1## and ##V_2## is $$V_3= V_1 \otimes V_2$$
    What type of vectors live in this new vector space ##V_3##?
    What can we say about the basis for the vector space ##V_3##?
    What is the idea behind taking the tensor product of the two vector spaces? What are we trying to capture?

    Thanks for any insight,
    Fog37
     
  3. Feb 4, 2017 #2

    fresh_42

    Staff: Mentor

    What lives in ##V_3##? All linear combinations of vectors ##v_1 \otimes v_2## with ##v_i \in V_i##.
    A basis is, e.g., ##\{a_i \otimes b_j\,\vert \, i \in \{1,2\} \wedge j \in \{1,2,3\}\}##.
    You can write a tensor ##a_i \otimes b_j## as a ##(2 \times 3)##-matrix of rank ##1##, or in coordinates
    $$ a_i \otimes b_j = \begin{bmatrix}a_{i1}b_{j1}&a_{i1}b_{j2}&a_{i1}b_{j3}\\a_{i2}b_{j1}&a_{i2}b_{j2}&a_{i2}b_{j3}\end{bmatrix} $$
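    In coordinates you can check this with a few lines of numpy (a minimal sketch of my own; the standard bases stand in for ##a_i## and ##b_j##):

```python
import numpy as np

# Standard bases of V1 (dim 2) and V2 (dim 3), as coordinate vectors.
a = [np.array([1, 0]), np.array([0, 1])]
b = [np.array([1, 0, 0]), np.array([0, 1, 0]), np.array([0, 0, 1])]

# Each basis tensor a_i (x) b_j is the rank-1 outer product a_i b_j^t,
# a 2x3 matrix; the six of them together span all 2x3 matrices.
for i in range(2):
    for j in range(3):
        print(f"a_{i+1} (x) b_{j+1} =\n{np.outer(a[i], b[j])}")
```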
    This way tensors can be considered, e.g., as multilinear transformations. Besides this, they have a universal property that allows every algebra to be viewed as a quotient of a tensor algebra, which makes the tensor algebra a kind of prototype on which no relations have yet been imposed.
     
  4. Feb 4, 2017 #3
    Thanks Fresh_42,

    Let me summarize what I understood and change notation for more clarity. The two initial vector spaces are ##V## and ##W## and the tensor product space is ##K=V\otimes W##.

    1) The elements (vectors) living in the new tensor product space ##K## are objects like ##v_1 \otimes w_2##, where ##v_1## is ANY vector in ##V## and ##w_2## is one specific vector living in ##W##. All possible combinations (or permutations) of pairs of vectors in the two spaces are the vectors living in ##K##.

    For example, the vectors ##v_1 \otimes w_2##, ##v_2 \otimes w_6##, ##v_4 \otimes w_1##, ##v_1 \otimes w_9##, ##v_2\otimes w_2##, etc. are just some of the vectors living in ##K##. There are infinitely many combinations, so infinitely many vectors in ##K##.

    2) Each basis vector of ##K## is also a combination of the basis vectors of the two spaces: ##a_i \otimes b_j## is actually a matrix. Every vector in ##K##, basis vector or not, is a matrix and not a column or row vector (a sequence of numbers).

    3) Could you help me with a simple numerical example to illustrate how things work? For instance, let's consider the vector ##v_1 \otimes w_2## where ##v_1=(2,3)## in its basis ##(a_1, a_2)## and ##w_2=(3, -5, 1)## in its basis ##(b_1, b_2, b_3)##. What would the vector ##v_1 \otimes w_2## look like both in vector notation and in matrix notation?

    4) When we look at any vector in ##K## and its matrix representation, what should we infer? Should we look at it as a particular mixture, i.e. the possible products, of some of the vectors from the two starting vector spaces? What is the overarching idea again?

    Thanks for the patience.
    Fog37
     
  5. Feb 4, 2017 #4

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    I assume from your previous posts that this relates to QM. Take the example of two spin-1/2 particles, each of which has its states in a 2D space. The most obvious way to combine these would be the Cartesian product. This would be a 4D space:

    ##U \times V = \lbrace (a, b, c, d): (a, b) \in U, \ (c, d) \in V \rbrace##

    The problem is that a natural basis is ##(1, 0, 0, 0), (0, 1, 0, 0), \dots##

    But these would represent the first particle being spin-up or spin-down and the second particle having no state at all. So, clearly, the Cartesian product doesn't capture the concept of both particles having a state.

    By contrast, the tensor product gives us a 4D space where each basis vector is a combination of one basis vector from each space:

    ##U \otimes V = \lbrace (a, b, c, d) \rbrace = \lbrace a( |+ \rangle \otimes |+ \rangle) + b( |+ \rangle \otimes |- \rangle) + c( |- \rangle \otimes |+ \rangle) + d( |- \rangle \otimes |- \rangle) \rbrace ##

    And this is a vector space where every vector represents a state of both particles.
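    A quick numerical sketch of this (my own illustration, taking ##|+\rangle = (1,0)## and ##|-\rangle = (0,1)## as coordinate vectors; numpy's `kron` implements the tensor product in these coordinates):

```python
import numpy as np

up, down = np.array([1, 0]), np.array([0, 1])   # |+> and |->

# The four tensor-product basis states of the two-particle space:
for n1, s1 in [("+", up), ("-", down)]:
    for n2, s2 in [("+", up), ("-", down)]:
        print(f"|{n1}{n2}> = {np.kron(s1, s2)}")

# A general two-particle state a|++> + b|+-> + c|-+> + d|--> is (a, b, c, d).
```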
     
  6. Feb 4, 2017 #5

    fresh_42

    Staff: Mentor

    On 1): ##w_2## can also be ANY vector of ##W##. All possible sums and multiples, i.e. all linear combinations of vectors of the form ##v_1 \otimes w_2##, are in ##K##. You cannot switch the two factors: in general ##v_1 \otimes w_2 \neq w_2 \otimes v_1##. In the coordinate example above one would get the transposed matrix, and thus a ##(3 \times 2)##-matrix instead of a ##(2 \times 3)##-matrix. The sum itself is of course independent of the order of summation, as it is a finite sum: ##\sum_{i=1}^n \sum_{j=1}^m \, c_{ij}\, v_i \otimes w_j##
    On 2): Yes, but not only the binary products ##v \otimes w## — also their sums and multiples (see above). You can construct every matrix as a linear combination of them, not only rank-##1## matrices.
    In coordinates, yes. And with more factors you get a cube of numbers, and so on.
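    For three factors, for instance (a sketch, again assuming numpy):

```python
import numpy as np

u, v, w = np.array([1, 2]), np.array([3, -5, 1]), np.array([0, 1])
# u (x) v (x) w in coordinates: a 2x3x2 "cube" with entries u_i * v_j * w_k.
cube = np.einsum('i,j,k->ijk', u, v, w)
print(cube.shape)   # (2, 3, 2)
```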
    On 3): It would be
    $$v_1 \otimes w_2 = (2,3) \otimes (3,-5,1) = (2,3)^t \cdot (3,-5,1) = \begin{pmatrix}2 \\ 3 \end{pmatrix} \cdot (3,-5,1) = \begin{bmatrix}6&-10&2\\9 &-15&3\end{bmatrix}$$
    The orientation (which factor gives the rows, which the columns) is just a convention and doesn't change the principle, as long as it is used consistently.
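    One can verify this computation, and the transposition remark from earlier, with numpy (my sketch):

```python
import numpy as np

v1, w2 = np.array([2, 3]), np.array([3, -5, 1])

print(np.outer(v1, w2))
# [[  6 -10   2]
#  [  9 -15   3]]

# Swapping the factors gives the transposed (3x2) matrix:
print(np.array_equal(np.outer(w2, v1), np.outer(v1, w2).T))   # True
```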
    On 4): This really depends on the context and what you are planning to do. As I've said, tensors have a universal property that allows them to play different roles. In the coordinate presentation above, you could view ##v_1 \otimes w_2## as a bilinear mapping ##\beta : V \times W \rightarrow \mathbb{R}## with ##\beta(X,Y)= X \,(v_1 \otimes w_2)\, Y^t##. The factors can also be taken from dual vector spaces ##V^* = \{\varphi : V \rightarrow \mathbb{R}\,\vert \, \varphi \, \textrm{ is } \mathbb{R}\textrm{-linear}\}##, or from a mixture of both, which is often the case in physics.
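    In code, the bilinear-map reading looks like this (a sketch in the same coordinates; `beta` and the test vectors ##X, Y## are my own names):

```python
import numpy as np

M = np.outer([2, 3], [3, -5, 1])   # the tensor v1 (x) w2 as a 2x3 matrix

def beta(X, Y):
    # beta(X, Y) = X . M . Y^t, a bilinear map from V x W to R.
    return X @ M @ Y

X, Y = np.array([1, -1]), np.array([0, 2, 1])
print(beta(X, Y))                        # a single real number
print(beta(2 * X, Y) == 2 * beta(X, Y))  # bilinear in the first slot: True
```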

    You can read a bit more about them here: https://en.wikipedia.org/wiki/Tensor
    and here: https://en.wikipedia.org/wiki/Tensor_algebra (but I would omit the coalgebra and coproduct part).
     
  7. Feb 6, 2017 #6

    lavinia

    Science Advisor

    If the two vector spaces are spaces of functions (with values in the field) defined on two different sets, then the tensor product is simply the product of the two functions: ##(f \otimes g)(x,y) = f(x)g(y)##. Bilinearity of the tensor product follows from bilinearity of multiplication in the field.
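    A toy Python version of this (the two finite sets and the functions on them are my own choices):

```python
# f is a function on the set {0, 1}; g is a function on {0, 1, 2}.
f = {0: 2.0, 1: 3.0}
g = {0: 3.0, 1: -5.0, 2: 1.0}

# (f (x) g)(x, y) = f(x) * g(y), a function on the product of the two sets.
fg = {(x, y): f[x] * g[y] for x in f for y in g}
print(fg[(0, 1)])   # f(0) * g(1) = -10.0
```

    With these values, the table of ##f \otimes g## is exactly the ##2 \times 3## matrix computed earlier in the thread.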

    Any vector space can be interpreted as a space of functions from some set into a field. For instance, a two-dimensional vector space over the complex numbers is the space of all functions from a set with two elements into the complex numbers. A real vector space of dimension 20 is the space of all real-valued functions from a set with 20 elements into the real numbers.

    If, for instance, one writes a vector ##v## in terms of a basis as ##v = \sum_{i}a_{i}x_{i}##, then one is defining a function on the set ##\{x_{1},\ldots,x_{n}\}## by the rule ##f(x_{i}) = a_{i}##. So ##v## can be viewed as the function ##f##. This way of looking at vectors is inherent in the index notation found in physics.
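    For example (a sketch of my own): the vector ##v = 2x_1 + 3x_2## becomes the function below.

```python
# v = 2*x_1 + 3*x_2, viewed as the function f with f(x_i) = a_i.
f = {"x1": 2, "x2": 3}

# The usual coordinate tuple (2, 3) is just this function's table of values:
print([f[x] for x in ("x1", "x2")])   # [2, 3]
```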

    In this thread, you have been considering the tensor product in another way, as the set of all formal linear combinations of symbols ##\sum_{k}a_{k}\, v_{k}\otimes w_{k}##, modulo the relations that make the tensor product bilinear. These are: the symbol ##(v_{1}+v_{2})\otimes w## is the same as ##v_{1}\otimes w + v_{2}\otimes w##, the symbol ##v\otimes(w_{1}+w_{2})## is the same as ##v\otimes w_{1}+ v\otimes w_{2}##, and ##a(v\otimes w)## is the same as ##(av)\otimes w##, which is the same as ##v\otimes (aw)##.
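    These relations are easy to check in coordinates (a sketch, assuming numpy and the matrix picture from earlier in the thread):

```python
import numpy as np

v1, v2 = np.array([1, 0]), np.array([2, 3])
w = np.array([3, -5, 1])
a = 4.0

# (v1 + v2) (x) w == v1 (x) w + v2 (x) w
print(np.allclose(np.outer(v1 + v2, w), np.outer(v1, w) + np.outer(v2, w)))

# a*(v (x) w) == (a*v) (x) w == v (x) (a*w)
print(np.allclose(a * np.outer(v2, w), np.outer(a * v2, w)))
print(np.allclose(a * np.outer(v2, w), np.outer(v2, a * w)))
```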

    BTW:

    There is nothing in the definition of the tensor product that relies on the scalars being in a field. A field has the strong condition that every nonzero scalar has a multiplicative inverse. But the definition of the tensor product only requires commutative addition and multiplication, with multiplication distributing over addition. This works, for instance, with the integers or with the ring ##\mathbb{Z}_4##, in neither of which does every nonzero element have an inverse. Multiplication is usually required to be commutative, but one can also define tensor products when it is not commutative, though the definition needs to be modified somewhat. Tensor products where the scalars are not in a field are important in many areas of mathematics, e.g. in algebraic topology and in homological algebra.
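    A toy check that bilinearity survives with scalars in ##\mathbb{Z}_4## (my own sketch, doing all arithmetic mod 4):

```python
import numpy as np

v, w = np.array([1, 3]), np.array([2, 3, 1])

# The outer product with all arithmetic mod 4, i.e. scalars in Z_4.
t = np.outer(v, w) % 4
print(t)

# 2*(v (x) w) == (2*v) (x) w still holds in Z_4:
print(np.array_equal((2 * t) % 4, np.outer((2 * v) % 4, w) % 4))
```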
     
    Last edited: Feb 6, 2017