
On the Tensor Product

  1. Dec 8, 2004 #1

    In response to some remarks made in the thread "How do particles become entangled?", as well as a number of private messages I have received, I feel there is some need to post some information on the notion of a "tensor product".

    Below, a rather intuitive look at the idea of the "tensor product" is taken. For simplicity, the vector spaces involved are assumed to be finite-dimensional. The infinite-dimensional case can be accommodated with only some minor amendments to the presentation.

    (Note: The usual symbol for the tensor product is an "x" with a "circle" around it, but below I will use the symbol "x".)


    Let U and V be finite-dimensional vector spaces over C with bases {ui} and {vj}, respectively. For each ui and vj , define an object "ui x vj", and construe the full collection of these objects to be a basis for a new vector space W. That is,

    W ≡ {∑ij αij(ui x vj) | αijЄC} ,

    where, by definition,

    if ∑ij αij(ui x vj) = 0 , then αij=0 for all i,j .

    The above then makes W a vector space over C such that

    Dim(W) = Dim(U)∙Dim(V) .
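    A quick numerical illustration of this dimension count (my addition, not part of the abstract construction above): NumPy's np.kron can serve as a concrete model of "x", since kron of a basis vector of U with a basis vector of V gives exactly Dim(U)∙Dim(V) independent vectors.

    ```python
    # Concrete check of Dim(W) = Dim(U)*Dim(V), modelling "x" with np.kron.
    import numpy as np

    dim_U, dim_V = 2, 3
    U_basis = np.eye(dim_U)   # rows are u_1, u_2
    V_basis = np.eye(dim_V)   # rows are v_1, v_2, v_3

    # Form every product u_i x v_j; each lives in C^(dim_U*dim_V).
    W_basis = np.array([np.kron(u, v) for u in U_basis for v in V_basis])

    # The 2*3 = 6 product vectors are linearly independent, so Dim(W) = 6.
    assert W_basis.shape == (dim_U * dim_V, dim_U * dim_V)
    assert np.linalg.matrix_rank(W_basis) == dim_U * dim_V
    ```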

    However ... had we chosen a different set of basis vectors for U or V, then the vector space W thereby obtained would be formally distinct from the one obtained above. There would be no way to 'link' the bases for each of the two W's.

    Let us now introduce some additional 'structure' on the operation "x", such that all W's obtained by the above construction will be formally identical no matter which bases are chosen for U and V. Specifically, we extend the definition of "x" to be bilinear, thus allowing any vector of U to be placed in the left "slot", and any vector of V to be placed in the right "slot". We do this as follows:

    For any u,u'ЄU , v,v'ЄV , and αЄC , let

    (u + u') x v = (u x v) + (u' x v) ,

    u x (v + v') = (u x v) + (u x v') ,

    α(u x v) = (αu) x v = u x (αv) .

    Now all W's are one and the same.
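    The three bilinearity rules can be checked numerically in the same np.kron model of "x" (the vectors and scalar below are arbitrary complex examples of my own choosing):

    ```python
    # Verify the three bilinearity rules for the Kronecker-product model of "x".
    import numpy as np

    rng = np.random.default_rng(0)

    def rand_vec(n):
        # a random complex vector of length n
        return rng.standard_normal(n) + 1j * rng.standard_normal(n)

    u, u2 = rand_vec(2), rand_vec(2)
    v, v2 = rand_vec(3), rand_vec(3)
    a = 2.0 - 0.5j

    # (u + u') x v = (u x v) + (u' x v)
    assert np.allclose(np.kron(u + u2, v), np.kron(u, v) + np.kron(u2, v))
    # u x (v + v') = (u x v) + (u x v')
    assert np.allclose(np.kron(u, v + v2), np.kron(u, v) + np.kron(u, v2))
    # a(u x v) = (au) x v = u x (av)
    assert np.allclose(a * np.kron(u, v), np.kron(a * u, v))
    assert np.allclose(a * np.kron(u, v), np.kron(u, a * v))
    ```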

    The next thing we need is an inner product <∙|∙> on W. Let <∙|∙>1 and <∙|∙>2 be the inner products on U and V, respectively. Then, for any u x v and u' x v' Є W , define

    <u x v|u' x v'> ≡ <u|u'>1∙<v|v'>2 .

    Finally, extend <∙|∙> to the whole of W by "antilinearity" in the first slot and "linearity" in the second slot.

    It now follows that <∙|∙> is an inner product on W.

    Moreover, if {ui} and {vj} are orthonormal bases of U and V respectively, then {ui x vj} is an orthonormal basis of W.
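    Both claims are easy to test numerically in the np.kron model (np.vdot conjugates its first argument, which matches the antilinearity in the first slot):

    ```python
    # Check <u x v|u' x v'> = <u|u'>*<v|v'>, and that orthonormal bases
    # of U and V yield an orthonormal basis {u_i x v_j} of W.
    import numpy as np

    rng = np.random.default_rng(1)

    def rand_vec(n):
        return rng.standard_normal(n) + 1j * rng.standard_normal(n)

    u, u2 = rand_vec(2), rand_vec(2)
    v, v2 = rand_vec(3), rand_vec(3)

    # np.vdot(a, b) = sum(conj(a) * b): antilinear in the first slot.
    lhs = np.vdot(np.kron(u, v), np.kron(u2, v2))
    rhs = np.vdot(u, u2) * np.vdot(v, v2)
    assert np.isclose(lhs, rhs)

    # Orthonormality of the product basis: Gram matrix is the identity.
    W_basis = np.array([np.kron(e, f) for e in np.eye(2) for f in np.eye(3)])
    assert np.allclose(W_basis @ W_basis.conj().T, np.eye(6))
    ```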
    Last edited: Dec 8, 2004
  3. Dec 8, 2004 #2

    Kane O'Donnell


    That's rather enlightening, thanks very much. The tensor product appears to have a very similar construction to that of the direct sum.

    Is it true that the tensor product [tex]U\otimes V[/tex] has basically the same set of basis vectors as the direct sum [tex]U\oplus V[/tex] except where:

    [tex] ((u_1, u_2) | (v_1, v_2))_{\oplus} = (u_1|v_1)_1 + (u_2|v_2)_2 [/tex] ​


    [tex] ((u_1, u_2) | (v_1, v_2))_{\otimes} = (u_1|v_1)_{1}\cdot (u_2|v_2)_2[/tex]​

    or is there a further difference that I have missed?


    Kane O'Donnell
  4. Dec 9, 2004 #3
    In attempting to approach the "direct sum" in a manner akin to that employed above with regard to the "tensor product" (also called the "direct product"), we would be forced to begin by defining objects like "ui [+] 0" and "0 [+] vj", and construe the full set of those objects to be a basis for a new vector space WΣ.


    WΣ ≡ { ∑iαi(ui [+] 0) + ∑jβj(0 [+] vj) | αi , βj Є C } ,

    where, by definition,

    if ∑iαi(ui [+] 0) + ∑jβj(0 [+] vj) = 0 , then αi = βj = 0 for all i,j .

    In this way, WΣ is a vector space over C such that

    Dim(WΣ) = Dim(U) + Dim(V) .

    ... Clearly, the starting point in this construction differs from that of the "tensor product". The difference is such that for a "direct sum" the dimensions of U and V are added, whereas for a "tensor product" those dimensions are multiplied.

    Of course, after the introduction of the appropriate 'structure' on the "[+]" operation, we would then find elements in WΣ of the form "ui [+] vj". However, in contrast to the "tensor product" scenario, these objects would not form a linearly independent set.
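    The contrast can be made concrete (my own illustration, modelling "[+]" as concatenation of coordinate vectors): the dimensions add, and the m∙n vectors "ui [+] vj" sit inside a space of dimension only m + n, so they cannot be linearly independent.

    ```python
    # Direct-sum analogue: dimensions ADD, and the vectors u_i [+] v_j
    # are NOT linearly independent (modelling "[+]" as concatenation).
    import numpy as np

    m, n = 2, 3
    U_basis, V_basis = np.eye(m), np.eye(n)

    # u_i [+] v_j  ->  the concatenation (u_i, v_j) in C^(m+n)
    sums = np.array([np.concatenate([u, v])
                     for u in U_basis for v in V_basis])

    # We built m*n = 6 vectors, but each lives in C^(m+n) = C^5 ...
    assert sums.shape == (m * n, m + n)
    # ... and their rank is only m + n - 1 = 4: linearly dependent.
    assert np.linalg.matrix_rank(sums) == m + n - 1
    ```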
  5. Dec 13, 2004 #4
    To Kane: they are related. In mathematics, the tensor space is defined as the space that makes a certain diagram commute, where the spaces in the diagram are the Cartesian product of n vector spaces V_n, a space W, and the tensor space T. (The maps in the diagram are multilinear maps from the product to W, a map from the product to T, and linear maps from T to W.)
    And that's pretty much all I know about it... When my professor in classical mechanics started to talk about it and wasn't able to explain it properly, I went to a professor in mathematics (I study both physics and math), and he gave me quite a good definition, something like what I wrote above. What Eye wrote, however, is more intuitive (it's the case n = 2, and therefore bilinear maps).
  6. Dec 13, 2004 #5



    That's a nice, clear explanation. Just in case anyone's wondering about the connection between the tensor product of two vector spaces and tensors: given the space V of type (1,0) tensors, the space of type (m,n) tensors is:

    [tex]\underbrace{V\otimes V\otimes.....\otimes V}_m\otimes\underbrace{V^*\otimes V^*\otimes.....\otimes V^*}_n[/tex]
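    In the same np.kron model used earlier in the thread, this (m+n)-fold product space is easy to build explicitly (my illustration; here I take d = Dim(V) = 3 and a type (2,1) tensor space, whose dimension is d^(m+n) = 27):

    ```python
    # The type (m,n) tensor space as an (m+n)-fold Kronecker product:
    # its dimension is d**(m+n), with d = Dim(V).
    from functools import reduce
    from itertools import product
    import numpy as np

    d, m, n = 3, 2, 1
    basis = list(np.eye(d))   # a basis of V (used for V* too in this model)

    # All (m+n)-fold products of basis vectors span the type (m,n) space.
    big_basis = np.array([reduce(np.kron, combo)
                          for combo in product(*([basis] * (m + n)))])

    assert big_basis.shape == (d ** (m + n), d ** (m + n))   # 27 x 27
    assert np.linalg.matrix_rank(big_basis) == d ** (m + n)
    ```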
    Last edited: Dec 13, 2004