The Uniqueness of a Tensor Product

SUMMARY

The discussion centers on the uniqueness of tensor products as presented in Bruce N. Cooperstein's "Advanced Linear Algebra (Second Edition)." Specifically, it addresses Lemma 10.1, which asserts that the tensor product of vector spaces is unique up to isomorphism. Participants confirm that if linear maps \( S \) and \( T \) satisfy \( ST = I_V \) and \( TS = I_Z \), then the vector spaces \( V \) and \( Z \) are indeed isomorphic. The conversation also explores the implications of defining a tensor product and the existence of unique linear maps corresponding to multilinear maps.

PREREQUISITES
  • Understanding of vector spaces and linear maps
  • Familiarity with multilinear maps and bilinear maps
  • Knowledge of isomorphisms in linear algebra
  • Basic concepts of tensor products as defined in linear algebra
NEXT STEPS
  • Study the proof of Lemma 10.1 in Cooperstein's "Advanced Linear Algebra"
  • Learn about the universal mapping property of tensor products
  • Explore the construction and properties of bilinear maps
  • Investigate examples of tensor products, such as \( \mathbb{R}^2 \otimes \mathbb{R}^2 \)
USEFUL FOR

Mathematicians, students of linear algebra, and educators seeking a deeper understanding of tensor products and their properties in vector spaces.

Math Amateur
I am reading Bruce N. Cooperstein's book: Advanced Linear Algebra (Second Edition) ... ...

I am focused on Section 10.1 Introduction to Tensor Products ... ...

I need help with the proof of Lemma 10.1 on the uniqueness of a tensor product ... ...

Before proving the uniqueness (up to an isomorphism) of a tensor product, Cooperstein defines a tensor product ... as follows:

View attachment 5380

Cooperstein then states that he is going to show that the tensor product is essentially unique (up to an isomorphism) ... and then presents Lemma 10.1 ... ... as follows ... ...
https://www.physicsforums.com/attachments/5381

My questions are as follows:

Question 1

I am assuming that the point of the Lemma is to show that $$V$$ is isomorphic to $$Z$$ and, I think, also that the isomorphism identifies $$\gamma$$ with $$\delta$$ ... ... is that correct?

Now I am assuming that once we have shown that $$ST = I_V$$ ... ... and $$TS = I_Z$$ ... ...

then we have shown that $$S$$ (and $$T$$ for that matter) is a bijection ... and hence $$V$$ and $$Z$$ are isomorphic ... is that correct?

... ... BUT ... ... what exactly are we showing when we demonstrate that $$T \gamma = \delta$$ ... ... and also that $$S \delta = \gamma$$ ... ... unless it is that, since from (i) $$S$$ and $$T$$ are bijections, the maps $$\delta$$ and $$\gamma$$ correspond to each other under these isomorphisms ...

... ... is that correct?
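(For reference, here is a sketch of how the two composite identities are usually combined; this is an assumption about how Cooperstein's proof is organised, not a quotation from the book.)

Since $$(V, \gamma)$$ is itself a tensor product and $$\gamma$$ is a multilinear map into $$V$$, the defining property gives a unique linear map $$R \ : \ V \longrightarrow V$$ with $$R \circ \gamma = \gamma$$; since $$I_V$$ clearly works, uniqueness says $$I_V$$ is the only such map. But

$$(S \circ T) \circ \gamma = S \circ (T \circ \gamma) = S \circ \delta = \gamma$$

so $$S \circ T = I_V$$, and by the symmetric argument $$T \circ S = I_Z$$. Thus $$S$$ and $$T$$ are mutually inverse isomorphisms between $$V$$ and $$Z$$, and the identity $$T \circ \gamma = \delta$$ says precisely that this isomorphism carries the multilinear map $$\gamma$$ to the multilinear map $$\delta$$.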
Question 2

In Cooperstein's proof of Lemma 10.1 we read:

" ... ... Since $$(V, \gamma)$$ is a tensor product of $$V_1, \ ... \ ... \ , \ V_m$$ over $$\mathbb{F}$$

and $$\delta$$ is a multilinear map from $$V_1, \ ... \ ... \ , \ V_m $$ to $$Z$$, then there exists a

unique linear map $$T \ : \ V \longrightarrow Z$$ such that $$T \gamma = \delta$$ ... ... "

Can someone please explain to me why the above statement is true ... ...

Peter

==========================================================

**** EDIT ****

I have been reflecting on Question 2 ...

... Now ... in Cooperstein's definition of the tensor product ... we read ...
" ... a pair $$(V , \gamma)$$ consisting of a vector space $$V$$ over $$\mathbb{F}$$

and a multilinear map $$\gamma \ : \ V_1 \times \ ... \ \times

V_m \longrightarrow V$$ is a tensor product of

$$V_1, \ ... \ ... \ , \ V_m$$ over $$\mathbb{F}$$ if, whenever $$W$$ is a vector space over $$\mathbb{F}$$

and $$f \ : \ V_1 \times \ ... \ \times V_m \longrightarrow W$$ is a multilinear map,

then there exists a unique bilinear map $$T \ : \ V \longrightarrow W$$ such that $$T \circ \gamma = f$$ ... ... "
Well ... presumably $$Z$$ can stand in for $$W$$ and $$\delta$$ can stand in for $$f$$ ... ... which gives the situation shown in Figure 1 below ... ...

View attachment 5382

The conditions of the definition of the tensor product imply there exists a unique linear map $$T \ : \ V \longrightarrow Z$$ such that $$T \circ \gamma = \delta$$ ... ...
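In symbols, reading the quoted definition with $$W := Z$$ and $$f := \delta$$ (just a restatement of the above, nothing new):

$$\delta \ : \ V_1 \times \ ... \ \times V_m \longrightarrow Z \ \text{ multilinear } \ \Longrightarrow \ \exists \, ! \ \text{ linear } \ T \ : \ V \longrightarrow Z \ \text{ with } \ T \circ \gamma = \delta$$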

Is that correct? Can someone please confirm ... or point out errors and shortcomings ... ?

Peter
 
Let me give an example of a tensor product of $\Bbb R^2 \otimes \Bbb R^2$, namely: $\Bbb R^4$.

Define: $(a,b)\otimes(c,d) = (ac,ad,bc,bd)$.

It is readily seen that:
$e_1\otimes e_1 = (1,0,0,0)$
$e_1\otimes e_2 = (0,1,0,0)$
$e_2\otimes e_1 = (0,0,1,0)$
$e_2 \otimes e_2 = (0,0,0,1)$, and that this is a basis for $\Bbb R^4$.
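Here is a minimal NumPy sketch of this map, for anyone who likes to compute along (the helper name `tensor2` is mine; the flattened outer product reproduces the formula above):

```python
import numpy as np

def tensor2(u, v):
    # (a,b) ⊗ (c,d) = (ac, ad, bc, bd): the outer product of u and v, flattened row-wise
    return np.outer(u, v).ravel()

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

print(tensor2(e1, e1))  # [1. 0. 0. 0.]
print(tensor2(e1, e2))  # [0. 1. 0. 0.]
print(tensor2(e2, e1))  # [0. 0. 1. 0.]
print(tensor2(e2, e2))  # [0. 0. 0. 1.]
```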

We verify that the map $((a,b),(c,d)) \mapsto (a,b)\otimes(c,d)$ is bilinear:

$[(a,b) + (a',b')]\otimes(c,d) = (a+a',b+b')\otimes(c,d) = ((a+a')c,(a+a')d,(b+b')c,(b+b')d)$

$= (ac+a'c,ad+a'd,bc+b'c,bd+b'd) = (ac,ad,bc,bd) + (a'c,a'd,b'c,b'd) = (a,b)\otimes(c,d) + (a',b')\otimes(c,d)$

If $r \in \Bbb R$:

$(r(a,b))\otimes(c,d) = (ra,rb)\otimes(c,d) = ((ra)c,(ra)d,(rb)c,(rb)d) = (r(ac),r(ad),r(bc),r(bd))$

$= r(ac,ad,bc,bd) = r((a,b)\otimes(c,d))$

$(a,b)\otimes[(c,d) + (c',d')] = (a,b)\otimes(c+c',d+d') = (a(c+c'),a(d+d'),b(c+c'),b(d+d'))$

$= (ac+ac',ad+ad',bc+bc',bd+bd') = (ac,ad,bc,bd) + (ac',ad',bc',bd') = (a,b)\otimes(c,d) + (a,b)\otimes(c',d')$, and:

$(a,b)\otimes(r(c,d)) = (a,b)\otimes(rc,rd) = (a(rc),a(rd),b(rc),b(rd)) = (r(ac),r(ad),r(bc),r(bd))$

$= r(ac,ad,bc,bd) = r((a,b)\otimes(c,d))$

(As you can see from this last string of equalities (linearity in the second argument), commutativity of $\Bbb R$'s multiplication is crucial...).
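Continuing the NumPy sketch above, a quick numerical sanity check of these bilinearity identities on arbitrary test vectors:

```python
u, u2, v = np.array([2.0, -1.0]), np.array([0.5, 3.0]), np.array([4.0, 7.0])
r = 2.5

# additivity and homogeneity in the first slot
assert np.allclose(tensor2(u + u2, v), tensor2(u, v) + tensor2(u2, v))
assert np.allclose(tensor2(r * u, v), r * tensor2(u, v))

# additivity and homogeneity in the second slot
assert np.allclose(tensor2(v, u + u2), tensor2(v, u) + tensor2(v, u2))
assert np.allclose(tensor2(v, r * u), r * tensor2(v, u))
```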

So now we have a vector space ($\Bbb R^4$), and a bilinear map $\otimes: \Bbb R^2 \times \Bbb R^2 \to \Bbb R^4$.

Now suppose $B:\Bbb R^2 \times \Bbb R^2 \to V$ is a bilinear map. How should we define the linear map:

$L:\Bbb R^4 \to V$, such that:

$L \circ \otimes = B$?

Since the $e_i \otimes e_j$ ($i,j = 1,2$) form a basis of $\Bbb R^4$, it suffices to define $L$ on them and extend linearly.

So we "do the obvious thing" and define:

$L(e_i\otimes e_j) = B(e_i,e_j)$.
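Concretely, "define $L$ on the basis and extend linearly" amounts to recording the four values $B(e_i,e_j)$ as the columns of a matrix. A hedged sketch continuing the NumPy code above, assuming $V = \Bbb R^n$ so that values of $B$ are arrays (scalars counting as $n = 1$); `induced_map` is just my name for it:

```python
def induced_map(B):
    # matrix of the unique linear L: R^4 -> R^n with L(e_i ⊗ e_j) = B(e_i, e_j);
    # columns are ordered to match the basis e1⊗e1, e1⊗e2, e2⊗e1, e2⊗e2
    cols = [np.atleast_1d(B(a, b)) for a in (e1, e2) for b in (e1, e2)]
    return np.column_stack(cols)  # apply L as: induced_map(B) @ w
```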

Since this might seem like cheating, let's see how this plays out for some definite $B$ and $V$.

We'll use $V = \Bbb R$, and $B(x,y) = \langle x,y\rangle$, the usual dot-product of $\Bbb R^2$:

$B((x,y),(x',y')) = xx'+yy'$.

We have four specific instances of $B$ to compute, first:

$\langle e_1,e_1\rangle = 1$
$\langle e_1,e_2\rangle = 0$
$\langle e_2,e_1\rangle = 0$
$\langle e_2,e_2\rangle = 1$.

Now, by linearity:

$L(a,b,c,d) = L(a(1,0,0,0) + b(0,1,0,0) + c(0,0,1,0) + d(0,0,0,1))$

$= aL(1,0,0,0) + bL(0,1,0,0) + cL(0,0,1,0) + dL(0,0,0,1)$

$ = aL(e_1\otimes e_1) + bL(e_1\otimes e_2) + cL(e_2\otimes e_1) + dL(e_2\otimes e_2)$

$ = aB(e_1,e_1) + bB(e_1,e_2) + cB(e_2,e_1) + dB(e_2,e_2) = a + d$.

In particular, we have:

$(L \circ \otimes)((x,y),(x',y')) = L((x,y)\otimes(x',y')) = L(xx',xy',yx',yy') = xx' + yy' = B((x,y),(x',y'))$.
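Continuing the sketch, the same computation numerically: for the dot product the matrix of $L$ comes out as $(1,0,0,1)$, i.e. $L(a,b,c,d) = a + d$, and $L \circ \otimes$ agrees with $B$ on arbitrary inputs:

```python
dot = lambda x, y: np.dot(x, y)      # B((x,y),(x',y')) = xx' + yy'

M = induced_map(dot)
print(M)                             # [[1. 0. 0. 1.]]  ->  L(a,b,c,d) = a + d

x, y = np.array([3.0, -2.0]), np.array([1.5, 4.0])
assert np.allclose(M @ tensor2(x, y), dot(x, y))   # (L ∘ ⊗) = B on these inputs
```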

Now, for all practical purposes, the tensor product here is the "outer product" of $(a,b)$ and $(c,d)$, and what it does to the "inner product" is turn it into a linear function, namely the TRACE of the $2\times 2$ matrix we would get if we viewed $\otimes$ as the mapping:

$((a,b),(c,d)) \mapsto \begin{bmatrix}ac&ad\\bc&bd\end{bmatrix}$.
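That observation is easy to check numerically as well (again a small sketch with made-up test vectors):

```python
u, v = np.array([2.0, 5.0]), np.array([-1.0, 3.0])
A = np.outer(u, v)                            # the matrix [[ac, ad], [bc, bd]] displayed above
assert np.isclose(np.trace(A), np.dot(u, v))  # trace of the outer product = dot product
```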

There is, to a mathematician, a satisfying symmetry here: my example shows EXISTENCE of a tensor product of $\Bbb R^2$ and $\Bbb R^2$ trivially, and what has to be proven is the universal mapping property. Cooperstein's construction makes the universal mapping property trivial (by definition), and what has to be proven is existence.
 
Deveno said:
Let me give an example of a tensor product of $\Bbb R^2 \otimes \Bbb R^2$, namely: $\Bbb R^4$. ...
Thanks for the help and support Deveno ... ...

Just working through the details of your post now ...

Peter
 
