Tensor product of vector spaces

In summary, the tensor product V\otimes W of two vector spaces V and W over a field F is constructed by first forming the free vector space F(V\times W), whose basis is the set of all pairs (v,w) with v in V and w in W, and then taking the quotient by the subspace generated by elements of the form e_{(v+\alpha v',w)} - e_{(v,w)} - \alpha e_{(v',w)} and e_{(v,w+\alpha w')} - e_{(v,w)} - \alpha e_{(v,w')}, which enforce bilinearity. Equivalently, V\otimes W is characterized, uniquely up to isomorphism, by a universal property: every bilinear map from V\times W into a vector space over F factors through V\otimes W via a unique linear map.
  • #1
Fredrik
I'm reading the Wikipedia article, trying to understand the definition of the tensor product [itex]V\otimes W[/itex] of two vector spaces V and W. The first step is to take the cartesian product [itex]V\times W[/itex]. The next step is to define the "free vector space" [itex]F(V\times W)[/itex] as the set of all linear combinations of members of [itex]V\times W[/itex]. But how does that make sense when we haven't even defined the sum of two members of [itex]V\times W[/itex]?

I'm tempted to interpret the linear combination as just a string of text at this point, but then I can't make sense of the claim that the [itex]e_{v\times w}[/itex] are taken to be linearly independent for distinct [itex]v\times w[/itex].

Can someone help me make sense of this definition?
 
  • #2
Fredrik said:
I'm reading the Wikipedia article, trying to understand the definition of the tensor product [itex]V\otimes W[/itex] of two vector spaces V and W. The first step is to take the cartesian product [itex]V\times W[/itex]. The next step is to define the "free vector space" [itex]F(V\times W)[/itex] as the set of all linear combinations of members of [itex]V\times W[/itex]. But how does that make sense when we haven't even defined the sum of two members of [itex]V\times W[/itex]?

I'm tempted to interpret the linear combination as just a string of text at this point, but then I can't make sense of the claim that the [itex]e_{v\times w}[/itex] are taken to be linearly independent for distinct [itex]v\times w[/itex].

Can someone help me make sense of this definition?

Remember that a vector space always involves specifying a field F first. If V and W are vector spaces over a field F, then indeed the Cartesian product of V and W is a vector space over F via the following rule: let v,v' be in V and w,w' be in W, then define (v,w) + (v',w')=(v+v',w+w'), where v+v' is the addition operation in V and w+w' is the addition operation in W.

Let h be a bilinear functional from VxW to F. The tensor product of V and W is the unique space [itex]V \otimes W[/itex] such that there exists a linear functional [itex]h_{*} : V \otimes W \rightarrow F[/itex] such that [itex]h_{*}(x \otimes y) = h(x,y)[/itex]. This space always exists and, as said before, is unique. The construction of the space is what you are considering. I should mention that my definition is a little restrictive: I chose to work with F, but you could extend the definition by replacing F with any vector space over F (note that F is a vector space over itself). The construction is simple:

Consider the subspace Z generated by all elements of the form (v+v', w)-(v,w) -(v',w') and (v,w+w')-(v,w)-(v,w'). Then take the tensor product of V and W to be VxW/Z.
 
  • #3
eastside00_99 said:
let v,v' be in V and w,w' be in W, then define (v,w) + (v',w')=(v+v',w+w') where v+v' is the addition operation in V and w+w' is the addition operation in W.
Is that really the correct rule here? It's a very natural and obvious way to define a vector space structure on the Cartesian product, and it is the way to do it when we're defining the "direct sum" of two vector spaces. But this is the tensor product. This is supposed to be the way we use the Hilbert space of one-particle states in quantum mechanics to construct the Hilbert space of two-particle states (by taking the tensor product of the space of one-particle states with itself). And doesn't this hold for two-particle states?

[tex](|v\rangle+|v'\rangle)(|w\rangle+|w'\rangle)
=|v\rangle|w\rangle+|v\rangle|w'\rangle+|v'\rangle|w\rangle+|v'\rangle|w'\rangle[/tex]​

The left-hand side is what corresponds to (v+v',w+w'), but the right-hand side doesn't look like (v,w) + (v',w'). It has two extra terms.
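In coordinates this expansion is just the bilinearity of the outer product, and it can be checked directly. Here is a small pure-Python sketch (the function names and numbers are mine, chosen for illustration):

```python
# Coordinate check of the two-particle expansion: with |v>|w> realized as
# the outer product of coordinate vectors, (v+v') (x) (w+w') really does
# expand into four terms, so it is not simply "(v+v', w+w')".

def outer(v, w):
    """Outer product of two coordinate vectors, as a nested list."""
    return [[vi * wj for wj in w] for vi in v]

def mat_add(a, b):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def vec_add(v, w):
    return [x + y for x, y in zip(v, w)]

v, v2 = [1.0, 2.0], [3.0, -1.0]
w, w2 = [0.5, 4.0], [2.0, 1.0]

lhs = outer(vec_add(v, v2), vec_add(w, w2))          # (v+v') (x) (w+w')
rhs = mat_add(mat_add(outer(v, w), outer(v, w2)),    # the four-term sum
              mat_add(outer(v2, w), outer(v2, w2)))
assert lhs == rhs
# The two cross terms matter: dropping them gives a different result.
assert lhs != mat_add(outer(v, w), outer(v2, w2))
```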

Thanks for replying by the way. I haven't given much thought to the other things you said yet, but I will.
 
  • #4
Fredrik said:
I'm tempted to interpret the linear combination as just a string of text at this point,
That's a good way to think of it -- one of the most general notions of 'algebraic structure' consists of two parts:

(1) Given any set S, a definition of all valid 'algebraic expressions' generated from S

In this particular case, you can (almost) view F(S) as the set of all textual strings of the form [itex]r_1 s_1 + r_2 s_2 + \cdots + r_n s_n[/itex], where the r's are scalars and the s's are elements of S, together with the textual string 0. You just have to take some care to avoid redundancy; for example, you could insist each of the s's are distinct, none of the r's are zero, and that two strings with permuted terms are the same.

(2) A rule for simplifying 'expressions of expressions'

For example, if v and w are two elements of F(S), you could form the 'formal' expression v + w (i.e. an element of F(F(S))), and there is an obvious way to 'simplify' such an expression back into F(S). The same is true for scalar multiplication, and this gives you a vector space structure on F(S).
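To make the 'string' picture concrete, here is a sketch (my own encoding, not Hurkyl's) in which a formal combination is stored as a dict from elements of S to nonzero coefficients, so the redundancy described above is normalized away automatically:

```python
# F(S) as normalized formal linear combinations: a string
# r1 s1 + ... + rn sn becomes the dict {s_i: r_i}. Merging equal terms
# and dropping zero coefficients handles the redundancy; permuted terms
# compare equal for free because dicts are unordered.

def combo(pairs):
    """Build a normalized formal combination from (scalar, element) pairs."""
    out = {}
    for r, s in pairs:
        out[s] = out.get(s, 0) + r
    return {s: r for s, r in out.items() if r != 0}   # drop zero terms

def add(u, v):
    """Simplify the 'expression of expressions' u + v back into F(S)."""
    return combo([(r, s) for s, r in u.items()] +
                 [(r, s) for s, r in v.items()])

def scale(a, u):
    return combo([(a * r, s) for s, r in u.items()])

# Permuted terms give the same element of F(S):
assert combo([(2, 'x'), (3, 'y')]) == combo([(3, 'y'), (2, 'x')])
# 2x + (-2)x simplifies to the zero combination (the empty dict):
assert add(combo([(2, 'x')]), combo([(-2, 'x')])) == {}
```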
 
  • #5
Sorry, my last sentence should read: Consider the subspace Z generated by all elements of the form (v+v', w)-(v,w)-(v',w)
[not (v+v', w)-(v,w)-(v',w')] and (v,w+w')-(v,w)-(v,w'). Then take the tensor product of V and W to be F(VxW)/Z.

Your question is what addition in VxW is, so that you can form F(VxW). What F(VxW) is, is the vector space over F whose basis is {(v,w) | v in V and w in W}. So, I was wrong about addition. But the two definitions will coincide in the finite case and where V intersect W is the empty set.
 
  • #6
I guess I forgot to explain how what I was talking about (monads) connects with the usual way of specifying a vector space.

The main point is that to say V is a vector space with underlying set V amounts to specifying a map F(V) --> V -- i.e. a way to 'evaluate' a linear combination of elements of V. The expressions v + w and r v are both elements of F(V), so this map does indeed tell you how to add vectors and how to perform scalar multiplication.
 
  • #7
Thanks for the contributions so far, both of you. I still don't get it, but I'll take another look at it tomorrow.
 
  • #8
OK, I think I understand the definition now. This comment was helpful:
Hurkyl said:
In this particular case, you can (almost) view F(S) as the set of all textual strings of the form [itex]r_1 s_1 + r_2 s_2 + \cdots + r_n s_n[/itex], where the r's are scalars and the s's are elements of S, together with the textual string 0. You just have to take some care to avoid redundancy; for example, you could insist each of the s's are distinct, none of the r's are zero, and that two strings with permuted terms are the same.
This means that F(S) (and I assume that by S you mean VxW) isn't the set of all strings that look like linear combinations of members of S. The members of F(S) are some equivalence classes of such strings. In particular s+s' is considered equivalent to s'+s, and (s+s')+s'' is considered equivalent to s+(s'+s''). Wikipedia didn't even mention this equivalence relation. Instead they just said something cryptic about linear independence as an explanation of why they wrote [itex]e_{u\times v}[/itex] instead of [itex](u,v)[/itex].

Wikipedia then defines a second equivalence relation, ~, and defines the tensor product space to be F(VxW)/~. I think I understand that part (now, not yesterday).

I looked up the word "monad" and it was one of those "head asplode" moments. It would probably take a week or more to understand the definition, so I'm not going to try any time soon. I actually find this stuff interesting, but it's just too much work.
 
  • #9
eastside00_99 said:
Let h be a bilinear functional from VxW to F. The tensor product of V and W is the unique space [tex] $ V \otimes W $[/tex] such that there exists a linear functional [tex] $ h_{*} : V \otimes W \rightarrow F $ [/tex] such that [tex] $ h_{*} (x \otimes y) =h(x,y) $ [/tex]. This space always exists and as said before is unique.
I don't understand this. How can we even make sense of the notation [itex]x\otimes y[/itex] when we define [itex]V \otimes W[/itex] as "the unique space such that..."? Hm, I suppose you must have meant "the unique space with members that are equivalence classes of strings of text that look like linear combinations of members of VxW", or something like that.

eastside00_99 said:
Consider the sub space Z generated by all elements of the form (v+v', w)-(v,w) -(v',w)
[not (v+v', w)-(v,w) -(v',w')] and (v,w+w')-(v,w)-(v,w'). Then take the tensor product of V and W to be VxW/Z.
I don't understand this either. I'm not sure what you mean by "generated by" in this context. Is it the same as "spanned by", i.e. that you take elements of that form to be basis vectors of Z? Is this method less complicated than Wikipedia's method?
 
  • #10
Fredrik said:
I don't understand this. How can we even make sense of the notation [itex]x\otimes y[/itex] when we define [itex]V \otimes W[/itex] as "the unique space such that..."? Hm, I suppose you must have meant "the unique space with members that are equivalence classes of strings of text that look like linear combinations of members of VxW", or something like that.


I don't understand this either. I'm not sure what you mean by "generated by" in this context. Is it the same as "spanned by", i.e. that you take elements of that form to be basis vectors of Z? Is this method less complicated than Wikipedia's method?



Technically, you can write [itex]x\otimes y[/itex]. The tensor product is the bilinear map denoted [itex]\otimes[/itex] from VxW to some vector space H over F such that, given another vector space N over F and a bilinear map f: VxW --> N, you have a homomorphism f_* from H to N such that [itex]f = f_{*} \circ \otimes[/itex]. The image of a point (x,y) in VxW under [itex]\otimes[/itex] is denoted by [itex]x\otimes y[/itex]. So technically it is okay to write [itex]x\otimes y[/itex], and it is the correct thing to say, but I probably should have stressed that the tensor product is a surjective bilinear function from VxW to this mysterious space H.

Anyway, the two definitions are equivalent -- i.e., you can show this construction you are considering satisfies the above and is UNIQUE -- so do it whichever way you like, but I would start by thinking functorially about it and then do the construction.

Yes, by generated, I mean the span.
 
  • #11
So a formal definition would look something like this?

If V,W,X,Y are vector spaces over the same field F and there exist two bilinear surjections [itex]f:V\times W\rightarrow X[/itex] and [itex]\otimes:V\times W\rightarrow Y[/itex] and a function [itex]f_*:Y\rightarrow X[/itex] such that [itex]f=f_*\circ\otimes[/itex], then Y is said to be the tensor product space of V and W, and we write [itex]V\otimes W[/itex] instead of Y.

And the uniqueness would mean something like this?

If, in addition to the above, X',Y' are vector spaces over the same field F and there exist two bilinear surjections [itex]f':V\times W\rightarrow X'[/itex] and [itex]\otimes':V\times W\rightarrow Y'[/itex] and a function [itex]f'_*:Y'\rightarrow X'[/itex] such that [itex]f'=f'_*\circ\otimes'[/itex], then Y' is isomorphic to Y.
 
  • #12
Fredrik said:
So a formal definition would look something like this?

If V,W,X,Y are vector spaces over the same field F and there exist two bilinear surjections [itex]f:V\times W\rightarrow X[/itex] and [itex]\otimes:V\times W\rightarrow Y[/itex] and a function [itex]f_*:Y\rightarrow X[/itex] such that [itex]f=f_*\circ\otimes[/itex], then Y is said to be the tensor product space of V and W, and we write [itex]V\otimes W[/itex] instead of Y.

This is very close.

Let V, W, X be vector spaces over F. Let [itex]\tau: V \times W \rightarrow X [/itex] be a bilinear map. We say that [itex]\tau[/itex] is a tensor product if, given any vector space Y over F and any bilinear map [itex] f: V \times W \rightarrow Y [/itex], there exists a linear map [itex] f_{*}: X \rightarrow Y [/itex] such that [itex] f=f_{*}\circ\tau [/itex].

It is easy to show, abstractly--i.e., without constructing the tensor product of V and W--that [itex] \tau [/itex] and X are unique. And so, when we have such a bilinear function [itex] \tau [/itex] and an F-vector space X, we may write [itex]\otimes[/itex] instead of [itex] \tau [/itex] and [itex] V\otimes W[/itex] instead of X. Instead of [itex]\otimes(x,y)[/itex], we write [itex]x\otimes y[/itex].
 
  • #13
I'm going to make things a little more concrete (but still abstract) by outlining the construction of a tensor product space.

First, a little about vector spaces.

A (Hamel) basis for a vector space [itex]V[/itex] is a subset [itex]B[/itex] of [itex]V[/itex] that is linearly independent, and that spans [itex]V[/itex]. Even if [itex]V[/itex] is infinite-dimensional, the concepts of linear independence and span involve only linear combinations of a finite number of vectors. In fact, without extra structure on [itex]V[/itex], it doesn't even make sense to talk about the sum of an infinite number of vectors. An infinite sum is a limit of the sequence of partial sums, and, without extra structure, the concept of limit can't be defined.

If [itex]S[/itex] is a set (of, e.g., distinguishable oranges), then the free vector space [itex]F \left( S \right)[/itex] is the vector space that has [itex]S[/itex] as a basis, i.e., the set of all (formal) finite linear combinations of elements of [itex]S[/itex].

What is a linear combination of oranges?

A concrete, rigorous realization of the above follows.

Let [itex]S[/itex] be a set. Define [itex]F \left( S \right)[/itex] to be the set of functions from [itex]S[/itex] into a field, say [itex]\mathbb{R}[/itex], such that each function is non-zero for only finitely many elements (in general, different elements for different functions). This finiteness will be used to reflect the fact that only sums of finite numbers of vectors are allowed. The definitions

[tex]\left( f + g \right) \left( s \right) := f \left( s \right) + g \left( s \right)[/tex]

[tex]\left( \alpha f \right) \left( s \right) := \alpha f \left( s \right)[/tex]

for [itex]f[/itex] and [itex]g[/itex] in [itex]F \left( S \right)[/itex] and [itex]\alpha \in \mathbb{R}[/itex] give [itex]F \left( S \right)[/itex] vector space structure.

[itex]F \left( S \right)[/itex] is the free vector space on set [itex]S[/itex]. To see how [itex]F \left( S \right)[/itex] captures the idea of linear combinations of set [itex]S[/itex], consider the following functions.

For each [itex]s \in S[/itex], define an element [itex]e_s \in F \left( S \right)[/itex] by

[tex]
e_s \left( s' \right) = \begin{cases} 1 & s = s' \\ 0 & s \neq s' \end{cases}
[/tex]

Clearly, there is a bijection from [itex]S[/itex] to the set of all such functions. But each of these functions does live in vector space [itex]F \left( S \right)[/itex], so when we talk about linear combinations of elements of [itex]S[/itex], we really mean linear combinations of the appropriate functions [itex]e_s[/itex]. It is fairly easy to show that the set of all such [itex]e_s[/itex] is a basis for [itex]F \left( S \right)[/itex].
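The indicator functions above can be written out directly. Here is a minimal Python version (my code, with the finitely-supported functions stored sparsely as dicts):

```python
# Concrete e_s construction: elements of F(S) are finitely-supported
# functions S -> R, stored as dicts, and e_s is the indicator of s.

def e(s):
    return {s: 1.0}

def add(f, g):
    out = dict(f)
    for s, r in g.items():
        out[s] = out.get(s, 0.0) + r
    return {s: r for s, r in out.items() if r != 0.0}

def scale(a, f):
    return {s: a * r for s, r in f.items() if a * r != 0.0}

# The linear combination 2*e_apple + 3*e_orange, evaluated pointwise
# (a "linear combination of oranges", rigorously):
f = add(scale(2.0, e('apple')), scale(3.0, e('orange')))
assert f.get('apple', 0.0) == 2.0
assert f.get('orange', 0.0) == 3.0
assert f.get('pear', 0.0) == 0.0   # zero outside the (finite) support
```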

If [itex]V[/itex] and [itex]W[/itex] are vector spaces, then applying the above to set [itex]V \times W[/itex] produces vector space [itex]F \left( V \times W \right)[/itex]. The tensor product space [itex]V \otimes W[/itex] is found by forming a quotient vector space of the free vector space [itex]F \left( V \times W \right)[/itex] with an appropriate subspace. The subspace acts as the zero vector in the quotient space.

Since [itex]\left( v , w \right)[/itex], [itex]\left( \alpha v' , w \right)[/itex], and [itex]\left( v + \alpha v' , w \right)[/itex] are all distinct elements of [itex]V \times W[/itex], [itex]e_{\left( v , w \right)}[/itex], [itex]e_{\left( \alpha v' , w \right)}[/itex], and [itex]e_{\left( v + \alpha v' , w \right)}[/itex] are linearly independent in [itex]F \left( V \times W \right)[/itex], since they are all basis elements. Consequently,

[tex]e_{\left( v , w \right)} + e_{\left( \alpha v' , w \right)} - e_{\left( v + \alpha v' , w \right)}[/tex]

and, similarly,

[tex]e_{\left( v , w \right)} + e_{\left( v , \alpha w' \right)} - e_{\left( v , w + \alpha w' \right)}[/tex]

are non-zero in [itex]F \left( V \times W \right)[/itex]. But, we want

[tex]v \otimes w + \alpha v' \otimes w - \left( v + \alpha v' \right) \otimes w = 0[/tex]

[tex]v \otimes w + v \otimes \alpha w' - v \otimes \left( w + \alpha w' \right) = 0.[/tex]

Consequently, use

[tex]e_{\left( v , w \right)} + e_{\left( \alpha v' , w \right)} - e_{\left( v + \alpha v' , w \right)}[/tex]

[tex]e_{\left( v , w \right)} + e_{\left( v , \alpha w' \right)} - e_{\left( v , w + \alpha w' \right)}[/tex]

to generate a subspace [itex]U[/itex] of [itex]F \left( V \times W \right)[/itex].

Then [itex]V \otimes W[/itex] is [itex]F \left( V \times W \right) / U[/itex].
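For finite-dimensional coordinate spaces, where the tensor product can be realized concretely by outer products, the two displayed relations really are identities. A quick pure-Python check (my own, not part of the construction above; values chosen arbitrarily):

```python
# In coordinates (V = W = R^2, tensor product realized as outer products,
# i.e. 2x2 matrices), the two relations that U is built to kill vanish.

def outer(v, w):
    return [[vi * wj for wj in w] for vi in v]

def mat_comb(a, m, b, n, c, p):
    """a*m + b*n + c*p entrywise, for 2x2 matrices."""
    return [[a * m[i][j] + b * n[i][j] + c * p[i][j] for j in range(2)]
            for i in range(2)]

alpha = 2.5
v, v2 = [1.0, -2.0], [0.5, 3.0]
w, w2 = [4.0, 1.0], [-1.0, 2.0]

vav2 = [v[i] + alpha * v2[i] for i in range(2)]      # v + alpha*v'
waw2 = [w[i] + alpha * w2[i] for i in range(2)]      # w + alpha*w'

zero = [[0.0, 0.0], [0.0, 0.0]]
# v(x)w + alpha*v'(x)w - (v + alpha*v')(x)w = 0
assert mat_comb(1, outer(v, w), alpha, outer(v2, w), -1, outer(vav2, w)) == zero
# v(x)w + alpha*v(x)w' - v(x)(w + alpha*w') = 0
assert mat_comb(1, outer(v, w), alpha, outer(v, w2), -1, outer(v, waw2)) == zero
```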

Another way to think of quotient vector spaces is in terms of groups. Any vector space is an abelian group with vector addition the group product and the zero vector the group identity. Any subspace is a normal subgroup, and thus can be used to form a quotient group, with the subspace the identity of the quotient group.

If, as in relativity, [itex]V[/itex] and [itex]W[/itex] are both finite-dimensional spaces, then [itex]V \otimes W[/itex] is (naturally) isomorphic to the vector space of bilinear maps from [itex]V* \otimes W*[/itex] to [itex]\mathbb{R}[/itex]. For infinite-dimensional spaces [itex]V \otimes W[/itex] is isomorphic to a proper subspace of bilinear maps from [itex]V* \otimes W*[/itex] to [itex]\mathbb{R}[/itex]. Therefore, this space of bilinear mappings is often taken to be the tensor product space.
 
  • #14
Thanks guys. I haven't tried to understand all the details yet, but I will. George, that's a very good explanation.
 
  • #15
I just noticed this.

George Jones said:
If, as in relativity, [itex]V[/itex] and [itex]W[/itex] are both finite-dimensional spaces, then [itex]V \otimes W[/itex] is (naturally) isomorphic to the vector space of bilinear maps from [itex]V* \otimes W*[/itex] to [itex]\mathbb{R}[/itex]. For infinite-dimensional spaces [itex]V \otimes W[/itex] is isomorphic to a proper subspace of bilinear maps from [itex]V* \otimes W*[/itex] to [itex]\mathbb{R}[/itex]. Therefore, this space of bilinear mappings is often taken to be the tensor product space.

This should read:

If, as in relativity, [itex]V[/itex] and [itex]W[/itex] are both finite-dimensional spaces, then [itex]V \otimes W[/itex] is (naturally) isomorphic to the vector space of bilinear maps from [itex]V* \times W*[/itex] to [itex]\mathbb{R}[/itex]. For infinite-dimensional spaces [itex]V \otimes W[/itex] is isomorphic to a proper subspace of bilinear maps from [itex]V* \times W*[/itex] to [itex]\mathbb{R}[/itex]. Therefore, this space of bilinear mappings is often taken to be the tensor product space.
 
  • #16
eastside00_99 said:
It is easy to show, abstractly--i.e., without constructing the tensor product of V and W--that [itex] \tau [/itex] and X are unique.
I don't see how to prove the uniqueness. Maybe I'm just overlooking something really trivial. Is the idea to prove that the condition [itex]f=f_*\circ\tau=f'_*\circ\tau'[/itex] implies that there exists a linear bijection (i.e. a vector space isomorphism) from X' into X? How do I even begin?

It would be trivial if [itex]f_*[/itex] and [itex]f'_*[/itex] were bijections, but we haven't assumed that they are. Is that implied by something else?
 
  • #17
George Jones said:
Consequently, use

[tex]e_{\left( v , w \right)} + e_{\left( \alpha v' , w \right)} - e_{\left( v + \alpha v' , w \right)}[/tex]

[tex]e_{\left( v , w \right)} + e_{\left( v , \alpha w' \right)} - e_{\left( v , w + \alpha w' \right)}[/tex]

to generate a subspace [itex]U[/itex] of [itex]F \left( V \times W \right)[/itex].

Then [itex]V \otimes W[/itex] is [itex]F \left( V \times W \right) / U[/itex].
I don't really understand this part. (I understand everything before it though, even the details you omitted). This is what you seem to be doing:

We choose one specific member of the field and two specific members of each vector space, and use them to define a two-dimensional subspace U of F(VxW). Then we define two members x and y of F(VxW) to be equivalent if x-y is in U. This means that the equivalence class that x belongs to is the set [x]={x+u|u in U}. Now we define the tensor product space as the set of all equivalence classes, with multiplication by a scalar and addition defined by a[x]=[ax], [x]+[y]=[x+y].

How can I use this to verify e.g. that

[tex](ax)\otimes y=a(x\otimes y)[/tex] ?​

I guess I would have to do it by showing that

[tex]e_{(ax,y)}-ae_{(x,y)}\in U[/tex]​

but that doesn't seem possible since x,y are completely unrelated to the specific v,v',w,w' used in the construction of U. :confused:
 
  • #18
Fredrik said:
I don't really understand this part. (I understand everything before it though, even the details you omitted). This is what you seem to be doing:

We choose one specific member of the field and two specific members of each vector space, and use them to define a two-dimensional subspace U of F(VxW). Then we define two members x and y of F(VxW) to be equivalent if x-y is in U. This means that the equivalence class that x belongs to is the set [x]={x+u|u in U}. Now we define the tensor product space as the set of all equivalence classes, with multiplication by a scalar and addition defined by a[x]=[ax], [x]+[y]=[x+y].

How can I use this to verify e.g. that

[tex](ax)\otimes y=a(x\otimes y)[/tex] ?​

I guess I would have to do it by showing that

[tex]e_{(ax,y)}-ae_{(x,y)}\in U[/tex]​

but that doesn't seem possible since x,y are completely unrelated to the specific v,v',w,w' used in the construction of U. :confused:

I used poor wording, and I wrote the relations down incorrectly. It should read:

Consequently, use *all* elements of the form

[tex]
e_{\left( v , w \right)} + \alpha e_{\left( v' , w \right)} - e_{\left( v + \alpha v' , w \right)}
[/tex]

[tex]
e_{\left( v , w \right)} + \alpha e_{\left( v , w' \right)} - e_{\left( v , w + \alpha w' \right)}
[/tex]

to generate (by taking all possible linear combinations of all such elements) a subspace [itex]U[/itex] of [itex]F \left( V \times W \right)[/itex]. In particular, any of [itex]v[/itex], [itex]v'[/itex], [itex]w[/itex], [itex]w'[/itex] can be zero.

Then [itex]V \otimes W[/itex] is [itex]F \left( V \times W \right) / U[/itex].

Note that if the field is [itex]\mathbb{R}[/itex] or [itex]\mathbb{C}[/itex], then [itex]U[/itex] is infinite-dimensional even if [itex]V[/itex] and [itex]W[/itex] are finite-dimensional.
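To check concretely the example asked about earlier, namely that [itex]e_{(ax,y)}-ae_{(x,y)}\in U[/itex]: since U contains a generator for every choice of v, v', w, and [itex]\alpha[/itex] (including zero vectors), that element is a combination of two generators. A small Python sketch (my own encoding of free vectors as dicts keyed by pairs of coordinate tuples):

```python
# Free vectors in F(VxW) as dicts {(v, w): coefficient}, with v and w
# coordinate tuples. We exhibit e_{(ax,y)} - a*e_{(x,y)} as a linear
# combination of two of the generators of U described above.

def e(v, w):
    return {(v, w): 1.0}

def comb(*terms):
    """Linear combination of free vectors given as (scalar, vec) pairs."""
    out = {}
    for a, f in terms:
        for k, r in f.items():
            out[k] = out.get(k, 0.0) + a * r
    return {k: r for k, r in out.items() if r != 0.0}

def gen_left(v, vp, w, alpha):
    """Generator e_{(v,w)} + alpha*e_{(v',w)} - e_{(v+alpha*v', w)} of U."""
    vsum = tuple(vi + alpha * vpi for vi, vpi in zip(v, vp))
    return comb((1.0, e(v, w)), (alpha, e(vp, w)), (-1.0, e(vsum, w)))

a = 3.0
x, y, zero = (1.0, 2.0), (4.0, -1.0), (0.0, 0.0)

# With v = 0, v' = x, w = y: g1 = e_{(0,y)} + a*e_{(x,y)} - e_{(ax,y)}
g1 = gen_left(zero, x, y, a)
# With v = v' = 0, w = y, alpha = 1: g2 = e_{(0,y)}, so e_{(0,y)} is in U
g2 = gen_left(zero, zero, y, 1.0)

target = comb((1.0, e(tuple(a * xi for xi in x), y)), (-a, e(x, y)))
assert target == comb((1.0, g2), (-1.0, g1))   # hence target lies in U
```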
 
  • #19
Fredrik said:
I don't see how to prove the uniqueness. Maybe I'm just overlooking something really trivial. Is the idea to prove that the condition [itex]f=f_*\circ\tau=f'_*\circ\tau'[/itex] implies that there exists a linear bijection (i.e. a vector space isomorphism) from X' into X? How do I even begin?

It would be trivial if [itex]f_*[/itex] and [itex]f'_*[/itex] were bijections, but we haven't assumed that they are. Is that implied by something else?

It's easy because the tensor product is what is known as a universal repelling object in the category of vector spaces (or, more generally, the category of modules). In fact, you could reformulate the definition using terminology from category theory, which is to be expected. Anyway, the way you do it is to assume that P and P' both satisfy the definition as I gave it before. Remember, this means that there exist bilinear maps [itex] \tau: V\times W \rightarrow P [/itex] and [itex] \tau^{\prime}: V\times W \rightarrow P' [/itex]. Since P and P' both satisfy the definition, there must be a homomorphism f from P to P' such that [itex] f \circ \tau = \tau^{\prime} [/itex] and a homomorphism g from P' to P such that [itex] g \circ \tau^{\prime} = \tau [/itex]. This implies [itex] f\circ g \circ \tau^{\prime}= f \circ \tau =\tau^{\prime} \implies f \circ g = Id_{P'} [/itex] and [itex] g \circ f \circ \tau = g \circ \tau^{\prime} = \tau \implies g \circ f = Id_{P} [/itex]. This says that P and P' are isomorphic. So, perhaps it would be best to say the tensor product is unique up to isomorphism. Really, though, it would be best to draw the commutative diagram(s) in the proof above, and it will become immediately obvious that we have an isomorphism. I don't really feel like going through the effort to create the diagram and then posting it here though.
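For what it's worth, the omitted diagrams can be typeset; this is just the chains of equalities written out in display form (my LaTeX, amsmath assumed):

```latex
% The two triangles from the uniqueness argument: P and P' each satisfy
% the universal property, giving the homomorphisms f and g.
\[
\begin{array}{ccc}
V \times W & \xrightarrow{\;\tau\;} & P \\[2pt]
           & \searrow^{\tau'}       & \big\downarrow f \\[2pt]
           &                        & P'
\end{array}
\qquad\qquad
\begin{array}{ccc}
V \times W & \xrightarrow{\;\tau'\;} & P' \\[2pt]
           & \searrow^{\tau}         & \big\downarrow g \\[2pt]
           &                         & P
\end{array}
\]
% Chasing the diagrams:
\[
f \circ g \circ \tau' = f \circ \tau = \tau'
  \;\Longrightarrow\; f \circ g = \operatorname{id}_{P'},
\qquad
g \circ f \circ \tau = g \circ \tau' = \tau
  \;\Longrightarrow\; g \circ f = \operatorname{id}_{P}.
\]
```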
 
  • #20
eastside00_99 said:
It's easy because the tensor product is what is known as a universal repelling object in the category of vector spaces (or, more generally, the category of modules). In fact, you could reformulate the definition using terminology from category theory, which is to be expected.
That doesn't sound so easy. :smile:

eastside00_99 said:
Since P and P' both satisfy the definition, there must be a homomorphism f from P to P' such that [itex] f \circ \tau = \tau^{\prime} [/itex]
This is not at all obvious to me (even after drawing the diagram). I agree that there must exist a function [itex]f:P\rightarrow P'[/itex] of course, but I don't see how we can choose it so that [itex]f\circ\tau=\tau'[/itex] unless [itex]\tau[/itex] is injective.

If [itex]\tau[/itex] takes two different points of [itex]V\times W[/itex] to the same point [itex]p\in P[/itex], then we can't define f at that point by [itex]f(\tau(v,w))=\tau'(v,w)[/itex] because we don't know which (v,w) to use on the right-hand side.

It seems to me that this is hopeless unless we change the definition so that we require some or all of the functions to be injective.
 
  • #21
Fredrik said:
That doesn't sound so easy. :smile:

This is not at all obvious to me (even after drawing the diagram). I agree that there must exist a function [itex]f:P\rightarrow P'[/itex] of course, but I don't see how we can choose it so that [itex]f\circ\tau=\tau'[/itex] unless [itex]\tau[/itex] is injective.

If [itex]\tau[/itex] takes two different points of [itex]V\times W[/itex] to the same point [itex]p\in P[/itex], then we can't define f at that point by [itex]f(\tau(v,w))=\tau'(v,w)[/itex] because we don't know which (v,w) to use on the right-hand side.

It seems to me that this is hopeless unless we change the definition so that we require some or all of the functions to be injective.

What I mean by easy is that the argument will be a standard one given by the idea of what a universal object is. There isn't any need to be inventive you might say.

The definition states that if P is a tensor product of V and W, given by the bilinear function [itex] \tau [/itex], then for any other bilinear map f from VxW to some other vector space N there exists a homomorphism f_* from P to N such that [itex] f = f_{*}\circ\tau [/itex]. So if we assume P and P' are both tensor products, that means we have two bilinear maps tau and tau^prime from VxW to P and P' respectively. So, in the above notation, we have N=P' and f=tau^prime, and we have a homomorphism f_* from P to P' such that [itex] \tau^{\prime} = f_{*}\circ\tau [/itex]. Applying the same argument with the places of P and P' switched, we see that we have another homomorphism. These homomorphisms satisfy the identities I wrote down in my last post, and those identities tell us that P and P' are isomorphic.

You definitely do not need anything else other than the definition provided earlier to prove uniqueness. It's a completely straightforward argument.
 
  • #22
eastside00_99 said:
This is very close.

Let V, W, X be vector spaces over F. Let [itex]\tau: V \times W \rightarrow X [/itex] be a bilinear map. We say that [itex]\tau[/itex] is a tensor product if, given any vector space Y over F and any bilinear map [itex] f: V \times W \rightarrow Y [/itex], there exists a linear map [itex] f_{*}: X \rightarrow Y [/itex] such that [itex] f=f_{*}\circ\tau [/itex].

Oh, actually, there is one problem with this, and that is that f_* needs to be unique... but with that change it is the correct definition, and uniqueness of the tensor product will follow as in the above proof.
 
  • #23
Fredrik said:
That doesn't sound so easy. :smile:

This is not at all obvious to me (even after drawing the diagram). I agree that there must exist a function [itex]f:P\rightarrow P'[/itex] of course, but I don't see how we can choose it so that [itex]f\circ\tau=\tau'[/itex] unless [itex]\tau[/itex] is injective.

If [itex]\tau[/itex] takes two different points of [itex]V\times W[/itex] to the same point [itex]p\in P[/itex], then we can't define f at that point by [itex]f(\tau(v,w))=\tau'(v,w)[/itex] because we don't know which (v,w) to use on the right-hand side.

Upon rereading your post, I think I see what you are saying... no, it doesn't have to be injective. There is no problem, for there is only one (v,w) in VxW; you choose that one. Again, the reason that [itex]f\circ\tau=\tau'[/itex] holds is that it ***is*** by the definition of the tensor product.
 
  • #24
I'm trying to post a reply, but I keep getting "database error" when I preview. I'll try to break it up in pieces and post them one at a time. (I'll let you know when I've posted the last piece).


eastside00_99 said:
The definition states that if P is a tensor product of V and W, given by the bilinear function [itex] \tau [/itex], then for any other bilinear map f from VxW to some other vector space N there exists a homomorphism f_* from P to N such that [itex] f = f_{*}\circ\tau [/itex]. So if we assume P and P' are both tensor products, that means we have two bilinear maps tau and tau^prime from VxW to P and P' respectively. So, in the above notation, we have N=P' and f=tau^prime, and we have a homomorphism f_* from P to P' such that [itex] \tau^{\prime} = f_{*}\circ\tau [/itex]. Applying the same argument with the places of P and P' switched, we see that we have another homomorphism. These homomorphisms satisfy the identities I wrote down in my last post, and those identities tell us that P and P' are isomorphic.
I appreciate the effort, but you're really just repeating those parts I don't have a problem with, and skipping the part I do have a problem with. I'll try to rephrase my problem:

Suppose that X,Y,Z are sets and

[tex]f:X\rightarrow Y[/tex]
[tex]g:X\rightarrow Z[/tex]

The claim that I haven't been able to accept (that there exists a [itex]g:P\rightarrow P'[/itex] such that [itex]\tau'=g\circ\tau[/itex]) seems to be equivalent to the claim that in this case, there exists a function

[tex]h:f(X)\rightarrow g(X)[/tex]

such that [itex]g=h\circ f[/itex]. However, this is not true in general. To see that it isn't, suppose that [itex]x,x'\in X[/itex], [itex]x\neq x'[/itex] and that [itex]f(x)=f(x')[/itex] but [itex]g(x)\neq g(x')[/itex]. Then we have

[tex]g(x')\neq g(x)=h(f(x))=h(f(x'))=g(x')[/tex]

which is a contradiction.
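The obstruction in this argument can be seen on a two-point example. A toy Python check (sets, names and values are mine, chosen for illustration):

```python
# If f identifies two points that g separates, no h with g = h o f can
# exist; if f separates whatever g separates, h can be built greedily.

X = [1, 2]
f = {1: 'p', 2: 'p'}    # f collapses: f(1) = f(2)
g = {1: 10, 2: 20}      # g separates: g(1) != g(2)

def factorization_exists(f, g, X):
    """Check whether some h: f(X) -> g(X) satisfies g = h o f."""
    h = {}
    for x in X:
        if f[x] in h and h[f[x]] != g[x]:
            return False        # f(x) already forces a different value
        h[f[x]] = g[x]
    return True

assert factorization_exists(f, g, X) is False   # the contradiction above

# It works as soon as f separates whatever g separates:
f2 = {1: 'p', 2: 'q'}
assert factorization_exists(f2, g, X) is True
```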
 
  • #25
Returning to the problem at hand... You have changed the notation a little from post to post, so I'll explain the one I'm using right now:

[tex]\tau:V\times W\rightarrow P[/tex]
[tex]\tau':V\times W\rightarrow P'[/tex]

(Both of the taus are bilinear by definition). For each vector space N and each bilinear [itex]f:V\times W\rightarrow N[/itex], there exist linear functions

[tex]f_*:P\rightarrow N[/tex]
[tex]f'_*:P'\rightarrow N[/tex]

You say that these facts make it obvious that there exists a linear function [itex]g:P\rightarrow P'[/itex], such that [itex]\tau'=g\circ\tau[/itex], but you never explain why you think this function exists. That's the only step I have a problem with. I don't think it's obvious. In fact, I think it's impossible to know that it exists without knowing that [itex]f_*[/itex] is injective.

This is another way to look at it: Consider a diagram with arrows drawn between sets to represent the functions we've been talking about. There's no "path" from P to P' where the arrows all point in the same direction, so we can't construct g as a composition of some of the functions we have defined. And none of the functions have been assumed to be invertible, so we can't reverse the direction of any of the arrows.
 
  • #26
eastside00_99 said:
Upon rereading your post, I think I see what you are saying... no, it doesn't have to be injective. There is no problem, for there is only one (v,w) in VxW; you choose that one. Again, the reason that [itex]f\circ\tau=\tau'[/itex] holds is that it ***is*** the definition of the tensor product.
I probably didn't express myself well enough there. I meant that if [itex]\tau[/itex] takes two different ordered pairs (v,w) and (v',w') to the same point in P (i.e. if [itex]\tau(v,w)=\tau(v',w')[/itex]), then we seem to have a problem. Hm, maybe it's only a problem if [itex]\tau'[/itex] takes the same two ordered pairs to two different points in P' (i.e. if [itex]\tau'(v,w)\neq\tau'(v',w')[/itex]).

(This is the last piece).
 
  • #27
Fredrik: use the fact that P is a tensor product.
 
  • #28
I would say that that's what I'm doing already. I'm trying to use the definition of "tensor product" that Eastside posted to prove that it's unique up to isomorphisms, but that's probably not what you meant.

Perhaps you meant something like this: In QM, a tensor product is used to construct the space of two-particle states from the space of one-particle states, and we certainly don't want [itex]|\alpha\rangle\otimes|\beta\rangle[/itex] to be equal to [itex]|\gamma\rangle\otimes|\delta\rangle[/itex]. To me this is just more evidence that I should go back and change Eastside's definition to require that [itex]\tau[/itex] be injective.
 
  • #29
Fredrik said:
I'm trying to post a reply, but I keep getting "database error" when I preview. I'll try to break it up in pieces and post them one at a time. (I'll let you know when I've posted the last piece).



I appreciate the effort, but you're really just repeating those parts I don't have a problem with, and skipping the part I do have a problem with. I'll try to rephrase my problem:

Suppose that X,Y,Z are sets and

[tex]f:X\rightarrow Y[/tex]
[tex]g:X\rightarrow Z[/tex]

The claim that I haven't been able to accept (that there exists a [itex]g:P\rightarrow P'[/itex] such that [itex]\tau'=g\circ\tau[/itex]) seems to be equivalent to the claim that in this case, there exists a function

[tex]h:f(X)\rightarrow g(X)[/tex]

such that [itex]g=h\circ f[/itex]. However, this is not true in general. To see that it isn't, suppose that [itex]x,x'\in X[/itex], [itex]x\neq x'[/itex] and that [itex]f(x)=f(x')[/itex] but [itex]g(x)\neq g(x')[/itex]. Then we have

[tex]g(x')\neq g(x)=h(f(x))=h(f(x'))=g(x')[/tex]

which is a contradiction.

But all you are really saying is that, if we convince you that the direct product exists, then you will know something about bilinear functions from VxW to other vector spaces, for you supposed that g(x) =/= g(x').

The truth is that your map will hardly ever be injective. It is only injective in the trivial case where both V and W equal {0}. The rest of the time it is a non-injective bilinear map whose image spans the tensor product (and when both spaces have dimension at least 2 it is not even surjective, since a sum of simple tensors need not itself be simple).
 
  • #30
Fredrik said:
You say that these facts make it obvious that there exists a linear function [itex]g:P\rightarrow P'[/itex], such that [itex]\tau'=g\circ\tau[/itex], so you never explain why you think this function exists. That's the only step I have a problem with. I don't think it's obvious. In fact, I think it's impossible to know that it exists without knowing that [itex]f_*[/itex] is injective.

We supposed it existed when we assumed that P and P' were both tensor products of V and W. That is the first step in the proof. So, we assumed that such maps existed from the outset. This is why the proof of uniqueness is so easy and can be done abstractly. Of course, it is all a moot point if the tensor product of V and W does not exist. The construction of the direct product will be the proof that it exists.
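The construction alluded to here starts from the free vector space on VxW: formal linear combinations of pairs, where the pairs are never added to each other but serve only as labels for basis vectors. A minimal sketch (my own toy representation, storing a combination as a dict from pair to coefficient):

```python
# Free vector space F(V x W) over a finite set of pairs:
# a vector is a formal linear combination, stored as {pair: coefficient}.
# Note we never add the pairs themselves -- (v, w) is just a basis label.

def add(a, b):
    out = dict(a)
    for pair, c in b.items():
        out[pair] = out.get(pair, 0) + c
    return out

def scale(c, a):
    return {pair: c * coeff for pair, coeff in a.items()}

# e_{(v,w)} is the basis vector labelled by the pair (v, w)
e1 = {((1, 0), (0, 1)): 1}
e2 = {((0, 1), (0, 1)): 1}

x = add(scale(2, e1), scale(3, e2))
print(x)  # {((1, 0), (0, 1)): 2, ((0, 1), (0, 1)): 3}
```

The tensor product itself is then obtained by quotienting this space by the bilinearity relations, which the sketch does not attempt.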
 
  • #31
You have a real rigorous side to you Fredrik. I like that, but if I may, I will outline all that we have done:

(1) Created a definition for an object which may or may not exist
(2) Proved abstractly that if it does exist, then it is unique up to isomorphism
(3) Noted, as you pointed out, that if it does exist, then it tells us something about all bilinear maps from VxW to any vector space Z

We began this discussion with the construction of the tensor product and I suggested we back up and give the abstract definition first. We have given the abstract definition so that now if we look at the construction we will have proved existence and we will automatically have the conclusions of 2 and 3. This should be enough to convince you that the definition will work if we prove existence.
 
  • #32
Fredrik said:
I would say that that's what I'm doing already. I'm trying to use the definition of "tensor product" that Eastside posted to prove that it's unique up to isomorphisms, but that's probably not what you meant.
The existence of the map you seek is his definition of the tensor product... (at least, the one he gave in post #2)
 
  • #33
First a recap.

The tensor product of vector spaces [itex]V[/itex] and [itex]W[/itex] is a vector space [itex]V \otimes W[/itex] together with a bilinear mapping [itex]\otimes : V \times W \rightarrow V \otimes W[/itex], such that if [itex]X[/itex] is a vector space and [itex]\tau[/itex] is a bilinear mapping from [itex]V \times W[/itex] into [itex]X[/itex], then there exists a unique linear mapping [itex]f[/itex] from [itex]V \otimes W[/itex] into [itex]X[/itex] such that [itex]\tau = f \circ \otimes[/itex].

Note that, as eastside00_99 has said, the existence of such a unique [itex]f[/itex] is part of the definition of tensor product.
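The existence half of this definition can be made concrete in a toy model (my own choices, not part of the thread): take V = W = R^2, realize V ⊗ W as 2x2 arrays with ⊗ the outer product, and build the unique linear f with τ = f ∘ ⊗ from τ's values on basis pairs, extended linearly.

```python
def tensor(v, w):                     # the bilinear map (x): outer product
    return [[vi * wj for wj in w] for vi in v]

def induced(tau_on_basis):
    # tau_on_basis[i][j] = tau(e_i, eps_j); f reads off coordinates t[i][j]
    # and extends linearly (hardcoded for the 2x2 case)
    def f(t):
        return sum(t[i][j] * tau_on_basis[i][j]
                   for i in range(2) for j in range(2))
    return f

# Example bilinear map tau(v, w) = v . w (the dot product), into R
tau = lambda v, w: sum(vi * wi for vi, wi in zip(v, w))
f = induced([[tau([1, 0], [1, 0]), tau([1, 0], [0, 1])],
             [tau([0, 1], [1, 0]), tau([0, 1], [0, 1])]])

v, w = [1.0, 2.0], [3.0, 5.0]
assert f(tensor(v, w)) == tau(v, w)   # tau = f . (x)
```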

Let [itex]V \otimes W[/itex] together with a bilinear mapping [itex]\otimes : V \times W \rightarrow V \otimes W[/itex] be a tensor product.

Let [itex]V \square W[/itex] together with a bilinear mapping [itex]\square : V \times W \rightarrow V \square W[/itex] also be a tensor product.

Now, to prove uniqueness, use this definition of tensor product four times.

First time: take the tensor product to be [itex]V \otimes W[/itex], [itex]X[/itex] to be [itex]V \square W[/itex] with [itex]\tau = \square[/itex]. The definition says that there is a unique linear mapping [itex]f[/itex] from [itex]V \otimes W[/itex] into [itex]V \square W[/itex] such that [itex]\square = f \circ \otimes[/itex].

Second time: take the tensor product to be [itex]V \square W[/itex], [itex]X[/itex] to be [itex]V \otimes W[/itex] with [itex]\tau = \otimes[/itex]. The definition says that there is a unique linear mapping [itex]g[/itex] from [itex]V \square W[/itex] into [itex]V \otimes W[/itex] such that [itex]\otimes = g \circ \square[/itex].

Combining these gives [itex]\otimes = g \circ f \circ \otimes[/itex] and [itex]\square = f \circ g \circ \square[/itex].

Third time: take the tensor product to be [itex]V \otimes W[/itex], [itex]X[/itex] to be [itex]V \otimes W[/itex] with [itex]\tau = \otimes[/itex]. The definition says that there is a unique linear mapping [itex]h[/itex] from [itex]V \otimes W[/itex] into [itex]V \otimes W[/itex] such that [itex]\otimes = h \circ \otimes[/itex]. [itex]h = I_{V \otimes W}[/itex] (the identity on [itex]V \otimes W[/itex]) clearly works. But, since (as above), [itex]\otimes = g \circ f \circ \otimes[/itex], [itex]h = g \circ f[/itex] also works. Because, by definition, [itex]h[/itex] is unique, [itex]I_{V \otimes W} = g \circ f[/itex].

Fourth time: take the tensor product to be [itex]V \square W[/itex], [itex]X[/itex] to be [itex]V \square W[/itex] with [itex]\tau = \square[/itex]. Use the definition to show [itex]I_{V \square W} = f \circ g[/itex].

Therefore, [itex]f : V \otimes W \rightarrow V \square W[/itex] and [itex]g : V \square W \rightarrow V \otimes W[/itex] are both isomorphisms, and [itex]g = f^{-1}[/itex].
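These four applications of the definition can be verified numerically in a toy model (my own choices, not from the post): take V = W = R^2 and two concrete realizations of the tensor product in R^4, one flattening v wᵀ and the other flattening w vᵀ. The induced maps f and g are then both the "matrix transpose" permutation of R^4.

```python
def otimes(v, w):      # flatten v w^T: entry v_i * w_j at index 2*i + j
    return [vi * wj for vi in v for wj in w]

def square(v, w):      # the other realization: flatten w v^T
    return otimes(w, v)

def f(x):              # induced linear map: the transpose permutation of R^4
    return [x[0], x[2], x[1], x[3]]

g = f                  # transposing twice is the identity, so here g = f

v, w = [1.0, 2.0], [3.0, 5.0]
assert square(v, w) == f(otimes(v, w))   # square = f . otimes
assert otimes(v, w) == g(square(v, w))   # otimes = g . square

x = [1.0, 2.0, 3.0, 4.0]
assert g(f(x)) == x                      # g . f = identity, i.e. g = f^{-1}
```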
 
  • #34
Thank you George. That cleared up a lot. Now I understand everything except a detail in the construction, but I'll give that some more thought later today.

Actually all that anyone would have needed to say to get me past my main concern is what you said here:
George Jones said:
take [...] [itex]X[/itex] to be [itex]V \square W[/itex]
I had a feeling I was overlooking something very simple, but for some reason I couldn't see that all I needed to do was to let the "alternative" tensor product be the arbitrary bilinear function mentioned in the definition of the tensor product.

I also appreciate that your post explained why we assume that the functions you called [itex]f[/itex] (the ones eastside called [itex]f_*[/itex]) are unique. (Because it allows us to identify [itex]g\circ f[/itex] and [itex]f\circ g[/itex] with the appropriate identity maps).

I also realized that no bilinear map can be injective, since we want things like B(ax,y)=B(x,ay) to hold.
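This last point is easy to illustrate numerically (toy example of my own, using the outer product as the bilinear map): (2v, w) and (v, 2w) are different points of V x W that land on the same tensor.

```python
def outer(v, w):                 # a bilinear map B(v, w): the outer product
    return [vi * wj for vi in v for wj in w]

v, w = [1.0, 2.0], [3.0, 4.0]
a = 2.0
lhs = outer([a * vi for vi in v], w)   # B(a v, w)
rhs = outer(v, [a * wi for wi in w])   # B(v, a w)

# Same image, different arguments: B cannot be injective.
assert lhs == rhs
assert ([a * vi for vi in v], w) != (v, [a * wi for wi in w])
```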
 

What is a tensor product of vector spaces?

A tensor product of vector spaces is a mathematical construction that combines two vector spaces to create a new vector space. It is denoted by the symbol ⊗, and the element v⊗w generalizes the outer product of two vectors.

How is a tensor product different from a regular product of vectors?

A regular product of vectors, such as the dot product, results in a scalar, while a tensor product results in an element of a new vector space. Also, the tensor product is not commutative: v⊗w and w⊗v are in general different elements, although the spaces V⊗W and W⊗V are naturally isomorphic.

What are some real-world applications of tensor products?

Tensor products are commonly used in physics and engineering, particularly in the study of electromagnetism and quantum mechanics. They are also used in computer science for data compression and in machine learning for feature extraction.

How is a tensor product related to the concept of tensor fields?

A tensor product is used to construct tensor fields, which are mathematical objects that assign a tensor to each point in a given space. Tensor fields are used in physics to describe physical quantities that vary over space and time.

Can a tensor product be applied to more than two vector spaces?

Yes, a tensor product can be applied to any number of vector spaces. For example, the tensor product of three vector spaces would be denoted by V₁⊗V₂⊗V₃. The resulting tensor would have a higher order and represent a more complex relationship between the original vector spaces.
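The higher-order case can be sketched by iterating the outer product (my own toy example): each factor space contributes one index, so an element of V₁⊗V₂⊗V₃ built from vectors of sizes 2, 3 and 2 is a 2x3x2 array of products.

```python
v1 = [1, 2]          # in V1
v2 = [1, 0, -1]      # in V2
v3 = [3, 4]          # in V3

# Order-3 simple tensor v1 (x) v2 (x) v3: entry (i, j, k) is v1[i]*v2[j]*v3[k]
t = [[[a * b * c for c in v3] for b in v2] for a in v1]

print(len(t), len(t[0]), len(t[0][0]))  # 2 3 2
```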
