How is the Tensor Product Defined and Used in Vector Spaces?


Discussion Overview

The discussion revolves around the definition and application of the tensor product in vector spaces, particularly focusing on finite-dimensional cases. Participants explore the construction of the tensor product, its properties, and its relationship to other mathematical structures such as the direct sum.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant presents an intuitive explanation of the tensor product, defining it through bilinear operations and establishing a vector space W from bases of two finite-dimensional vector spaces U and V.
  • Another participant, Kane O'Donnell, questions the similarities and differences between the tensor product and the direct sum, specifically regarding their basis vectors and the operations involved.
  • A third participant discusses the construction of the direct sum, noting that it involves a different approach than the tensor product, leading to the addition of dimensions rather than multiplication.
  • One participant provides a mathematical definition related to the tensor space and its connection to multilinear transformations, indicating a more abstract understanding of the tensor product.
  • Another participant clarifies the relationship between the tensor product of vector spaces and tensors, specifically mentioning the construction of type (m,n) tensors from the tensor product of vector spaces and their duals.

Areas of Agreement / Disagreement

Participants express varying degrees of understanding and interpretation of the tensor product and its relationship to the direct sum. No consensus is reached on the nuances of these concepts, and multiple competing views remain regarding their definitions and applications.

Contextual Notes

Some limitations in the discussion include the dependence on specific definitions of operations and the assumptions made about the dimensions of the vector spaces involved. The discussion does not resolve the mathematical steps or the implications of the constructions presented.

Eye_in_the_Sky
on the "Tensor Product"

In response to some remarks made in the thread "How do particles become entangled?", as well as a number of private messages I have received, I feel there is some need to post some information on the notion of a "tensor product".

Below, a rather intuitive look at the idea of the "tensor product" is taken. For simplicity, the vector spaces involved are assumed to be finite-dimensional. The infinite-dimensional case can be accommodated with only some minor amendments to the presentation.

(Note: The usual symbol for the tensor product is an "x" with a circle around it, ⊗, but below I will simply write "x".)

----------------------------------------

Let U and V be finite-dimensional vector spaces over C with bases {ui} and {vj}, respectively. For each ui and vj , define an object "ui x vj", and construe the full collection of these objects to be a basis for a new vector space W. That is,

W ≡ { ∑ij αij(ui x vj) | αij ∈ C } ,

where, by definition,

if ∑ij αij(ui x vj) = 0 , then αij = 0 for all i,j .

The above then makes W a vector space over C such that

Dim(W) = Dim(U)∙Dim(V) .
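[Editor's illustration, not part of the original post: the construction above can be modeled concretely in NumPy, taking np.kron(u, v) as the coordinate representation of "u x v". The dimensions 2 and 3 are arbitrary choices for the sketch.]

```python
import numpy as np

dim_U, dim_V = 2, 3
U_basis = np.eye(dim_U)   # rows are the basis {ui}
V_basis = np.eye(dim_V)   # rows are the basis {vj}

# Form all products ui x vj, modeled as Kronecker products.
products = np.array([np.kron(u, v) for u in U_basis for v in V_basis])

# They are linearly independent and span a space of dimension
# Dim(U) * Dim(V), as stated above.
assert products.shape == (dim_U * dim_V, dim_U * dim_V)
assert np.linalg.matrix_rank(products) == dim_U * dim_V
```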

However ... had we chosen a different set of basis vectors for U or V, then the vector space W thereby obtained would be formally distinct from the one obtained above. There would be no way to 'link' the bases for each of the two W's.

Let us now introduce some additional 'structure' on the operation "x", such that all W's obtained by the above construction will be formally identical no matter which bases are chosen for U and V. Specifically, we extend the definition of "x" to be bilinear, thus allowing any vector of U to be placed in the left "slot", and any vector of V to be placed in the right "slot". We do this as follows:

For any u,u' ∈ U , v,v' ∈ V , and α ∈ C , let

(u + u') x v = (u x v) + (u' x v) ,

u x (v + v') = (u x v) + (u x v') ,

α(u x v) = (αu) x v = u x (αv) .

Now all W's are one and the same.
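[Editor's illustration: in the same Kronecker-product model (an assumption of this sketch, not something the post specifies), the three bilinearity rules above can be checked numerically.]

```python
import numpy as np

rng = np.random.default_rng(0)
u, u2 = rng.standard_normal(2), rng.standard_normal(2)
v, v2 = rng.standard_normal(3), rng.standard_normal(3)
a = 2.5

# (u + u') x v = (u x v) + (u' x v)
assert np.allclose(np.kron(u + u2, v), np.kron(u, v) + np.kron(u2, v))
# u x (v + v') = (u x v) + (u x v')
assert np.allclose(np.kron(u, v + v2), np.kron(u, v) + np.kron(u, v2))
# a(u x v) = (au) x v = u x (av)
assert np.allclose(a * np.kron(u, v), np.kron(a * u, v))
assert np.allclose(a * np.kron(u, v), np.kron(u, a * v))
```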

The next thing we need is an inner product <∙|∙> on W. Let <∙|∙>1 and <∙|∙>2 be the inner products on U and V, respectively. Then, for any u x v and u' x v' Є W , define

<u x v|u' x v'> ≡ <u|u'>1∙<v|v'>2 .

Finally, extend <∙|∙> to the whole of W by "antilinearity" in the first slot and "linearity" in the second slot.

It now follows that <∙|∙> is an inner product on W.

Moreover, if {ui} and {vj} are orthonormal bases of U and V respectively, then {ui x vj} is an orthonormal basis of W.
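[Editor's illustration: both claims about the inner product can be verified in the same model. np.vdot conjugates its first argument, matching the antilinearity in the first slot; the complex vectors below are arbitrary test data.]

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.standard_normal(2) + 1j * rng.standard_normal(2)
u2 = rng.standard_normal(2) + 1j * rng.standard_normal(2)
v = rng.standard_normal(3) + 1j * rng.standard_normal(3)
v2 = rng.standard_normal(3) + 1j * rng.standard_normal(3)

# <u x v | u' x v'> = <u|u'>1 * <v|v'>2
lhs = np.vdot(np.kron(u, v), np.kron(u2, v2))
rhs = np.vdot(u, u2) * np.vdot(v, v2)
assert np.allclose(lhs, rhs)

# Orthonormal bases of U and V give an orthonormal basis {ui x vj} of W:
# the Gram matrix of the products is the identity.
prods = np.array([np.kron(e, f) for e in np.eye(2) for f in np.eye(3)])
gram = prods.conj() @ prods.T
assert np.allclose(gram, np.eye(6))
```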
 
That's rather enlightening, thanks very much. The tensor product appears to have a very similar construction to that of the direct sum.

Is it true that the tensor product U⊗V has basically the same set of basis vectors as the direct sum U⊕V, except that

((u1, u2) | (v1, v2))⊕ = (u1|v1)1 + (u2|v2)2

whereas

((u1, u2) | (v1, v2))⊗ = (u1|v1)1 ∙ (u2|v2)2

or is there a further difference that I have missed?

Regards,

Kane O'Donnell
 
In attempting to approach the "direct sum" in a manner akin to that employed above with regard to the "tensor product" (also called the "direct product"), we would be forced to begin by defining objects like "ui [+] 0" and "0 [+] vj", and construe the full set of those objects to be a basis for a new vector space WΣ.

Specifically,

WΣ ≡ { ∑i αi(ui [+] 0) + ∑j βj(0 [+] vj) | αi , βj ∈ C } ,

where, by definition,

if ∑i αi(ui [+] 0) + ∑j βj(0 [+] vj) = 0 , then αi = βj = 0 for all i,j .

In this way, WΣ is a vector space over C such that

Dim(WΣ) = Dim(U) + Dim(V) .


... Clearly, the starting point in this construction differs from that of the "tensor product". The difference is such that for a "direct sum" the dimensions of U and V are added, whereas for a "tensor product" those dimensions are multiplied.

Of course, after the introduction of the appropriate 'structure' on the "[+]" operation, we would then find elements in WΣ of the form "ui [+] vj". However, in contrast to the "tensor product" scenario, these objects would not form a linearly independent set.
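[Editor's illustration: modeling "u [+] v" as the concatenation (u, v) — an assumption of this sketch — both points above can be checked: the dimensions add, and the six elements ui [+] vj sit in a 5-dimensional space, so they cannot be linearly independent.]

```python
import numpy as np

dim_U, dim_V = 2, 3
U_basis, V_basis = np.eye(dim_U), np.eye(dim_V)

# Basis of W_sigma: the ui [+] 0 and 0 [+] vj, giving 2 + 3 = 5 vectors.
basis = [np.concatenate([u, np.zeros(dim_V)]) for u in U_basis] + \
        [np.concatenate([np.zeros(dim_U), v]) for v in V_basis]
assert np.linalg.matrix_rank(np.array(basis)) == dim_U + dim_V

# The elements ui [+] vj exist, but there are 6 of them in a
# 5-dimensional space, so they are NOT linearly independent.
sums = np.array([np.concatenate([u, v]) for u in U_basis for v in V_basis])
assert sums.shape == (dim_U * dim_V, dim_U + dim_V)
assert np.linalg.matrix_rank(sums) < len(sums)
```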
 
To Kane: they are related. In mathematics, the tensor space is defined as the space that makes a certain diagram commute: the spaces in the diagram are the Cartesian product of n vector spaces V_n, a space W, and the tensor space T. (The maps in the diagram are multilinear transformations from the Cartesian product to W, a canonical map from the product to T, and linear transformations from T to W.)
And that's pretty much all I know about it. When my professor in classical mechanics started to talk about it and wasn't able to explain it properly, I went to a professor in mathematics (I study both physics and math), and he gave me quite a good definition, something like what I wrote above. What Eye wrote, however, is more intuitive (it's the case n = 2, and therefore bilinear transformations).
 
That's a nice, clear explanation. In case anyone's wondering about the connection between the tensor product of two vector spaces and tensors: given the space V of type (1,0) tensors, the space of type (m,n) tensors is

V ⊗ V ⊗ ... ⊗ V ⊗ V* ⊗ V* ⊗ ... ⊗ V* (m copies of V, n copies of V*)
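[Editor's illustration: a simple type (1,1) tensor v ⊗ w*, with w* a covector, can be modeled as an outer product, i.e. a matrix; the space of type (m,n) tensors over a d-dimensional V then has dimension d**(m+n). The vectors below are arbitrary test data.]

```python
import numpy as np

d, m, n = 3, 1, 1
v = np.array([1.0, 2.0, 3.0])    # element of V
w = np.array([0.5, -1.0, 2.0])   # element of V* (as a covector)

T = np.outer(v, w)               # a simple type (1,1) tensor: a 3x3 matrix
assert T.shape == (d,) * (m + n)
assert T.size == d ** (m + n)    # dimension of the type (1,1) tensor space
```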
 
