MHB Tensor Products - Basic Understanding of Cooperstein, Theorem 10.2

  • Thread starter: Math Amateur
  • Tags: Tensor, Theorem
I am reading Bruce N. Cooperstein's book: Advanced Linear Algebra (Second Edition) ... ...

I am focused on Section 10.1 Introduction to Tensor Products ... ...

I need help in order to get a basic understanding of Theorem 10.2 regarding the basis of a tensor product ... ...

My apologies if my previous questions on this matter are similar to earlier questions ... but I have been puzzling over this theorem for some time ... and still do not have a full understanding of it ...

Theorem 10.2 reads as follows:

View attachment 5520

My questions are as follows:

1. What is the nature/form of the elements of $$X'$$ and $$X$$, and how are they related to each other?

2. What is the nature/form of the elements of $$Z$$ and $$Z'$$, and how are they related to each other? And further, what is the form of the non-basis elements of $$Z$$ and $$Z'$$?

I know I have not formulated the question very precisely ... ... :( ... ... but nonetheless I hope someone is able to help ...

Peter

============================================================

*** EDIT ***

I have been reflecting on my own questions above ... here are my thoughts ...

Elements of $$X'$$ would be of the form

$$x = (v_1, v_2, \ ... \ ... \ , v_m)$$ with $$v_i \in \mathcal{B}_i$$

Elements of $$X = V_1 \times V_2 \times \ ... \ ... \ \times V_m$$ would be of the form

$$x = (v_1, v_2, \ ... \ ... \ , v_m)$$ with $$v_i \in V_i$$

and ... ...

... since I imagine $$\mathcal{B}_i \subseteq V_i$$ ... then we have $$X' \subseteq X$$ ... ...

(Now ... can we say any more about the form of the elements of $$X'$$ and $$X$$? Is the above all we can say?)
Now, before outlining the form of the elements of $$Z'$$ ... we just note that we are asked to identify each element $$x = (v_1, v_2, \ ... \ ... \ , v_m) \in X' $$ with $$\chi_x \in Z'$$ ... ...

Now, $$Z'$$ is a vector space over the field $$\mathbb{F}$$, so there will be an operation of addition of elements of $$Z'$$ and a scalar multiplication ... ...
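(If I am reading Cooperstein's notation correctly, $$\chi_x$$ denotes the characteristic function of $$\{ x \}$$ in the free vector space on $$X$$ ... that is

$$\chi_x(y) = \begin{cases} 1 & \text{if } y = x \\ 0 & \text{if } y \neq x \end{cases}$$

... so the functions $$\chi_x$$, for $$x \in X$$, form a basis of $$Z$$ ...)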

So ... if $$x_1 = (v_{11}, v_{21}, \ ... \ ... \ , v_{m1}) \in X'$$ and if $$c_1 \in \mathbb{F}$$ ... ...

... then $$c_1 \chi_{x_1} \in Z'$$ ...

Similarly $$c_2 \chi_{x_2} \in Z'$$ and so on ...

So, by operations of addition we can form elements of the form

$$c_1 \chi_{x_1} + c_2 \chi_{x_2} + \ ... \ ... \ + c_n \chi_{x_n}$$ ... ... ... ... ... (1)


... and (1) above is the general form of elements in $$Z'$$ ...


If we then identify each $$\chi_{x_i}$$ with $$x_i$$, we can view the elements of $$Z'$$ as

$$c_1 (v_{11}, v_{21}, \ ... \ ... \ , v_{m1}) + c_2 (v_{12}, v_{22}, \ ... \ ... \ , v_{m2}) + \ ... \ ... \ + c_n (v_{1n}, v_{2n}, \ ... \ ... \ , v_{mn})$$

BUT ... THEN ... what form do the elements of $$Z$$ have ... especially those that are in $$Z$$ but not in $$Z'$$ ... ?

Can someone please critique my analysis ... and comment on the elements of $$Z$$ ... especially those not in $$Z'$$ ... ...
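(For concreteness ... if my reading of the construction is right ... take $$m = 2$$ and $$V_1 = V_2 = \mathbb{R}^2$$ with standard basis $$\{ e_1, e_2 \}$$. Then $$\chi_{(e_1 + e_2, \ e_1)}$$ is a basis element of $$Z$$, since $$(e_1 + e_2, e_1) \in X$$ ... but it is not in $$Z'$$, since $$e_1 + e_2 \notin \mathcal{B}_1$$ ... and inside $$Z$$ it is NOT equal to $$\chi_{(e_1, e_1)} + \chi_{(e_2, e_1)}$$, because distinct elements of $$X$$ give linearly independent basis vectors of $$Z$$ ... is that right?)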
========================================================

NOTE:

The early pages of Cooperstein's Section 10.1 set up the notation and approach to tensor products, and hence the notation and concepts used in Theorem 10.2 ... ... so I am providing the first few pages of Section 10.1 as follows:
https://www.physicsforums.com/attachments/5521
View attachment 5522
View attachment 5523
View attachment 5524

 
Deveno
Let's use a slightly different notation, so it's easier to track what we're doing.

In what follows, a lower case Roman letter will designate a vector, and a lower case Greek letter will represent a scalar.

We'll further restrict ourselves to the bilinear case of two factors. You may find it an interesting investigation to show that:

$V_1 \otimes (V_2 \otimes V_3) \cong (V_1 \otimes V_2) \otimes V_3$, which will mean that we can extend to an arbitrary multilinear case "one factor at a time".

So we have two vector spaces, $V$ and $W$ (this way I don't have to write annoying subscripts). I will write the (annoyingly huge) free vector space based on the SET $X = V \times W$, as $Z = F(V \times W)$.

The subspace of $Z$ of interest to us is:

$U = \text{span}(S)$ where:

$S = \{(v+v',w)-(v,w)-(v',w),(v,w+w')-(v,w)-(v,w'),\alpha(v,w)-(\alpha v,w),\alpha(v,w)-(v,\alpha w): \alpha \in F, v,v'\in V,w,w' \in W\}$

Now $U$ is pretty big, too (there are a lot of generators, and even more linear combinations of these), so we might hope that the "hugeness" cancels, and $Z/U$ is of a more manageable size. How do we measure the "size" of a vector space? By its dimension, by counting a basis ("counting" is a bit of a stretch here; perhaps "enumerate", "measure", or even "describe" might be a bit more appropriate).

We're going to DEFINE $V \otimes W$ as $Z/U$, and write $v\otimes w$ for the coset $(v,w) + U$.

Claim number 1:

$(v,w) \mapsto v\otimes w$ is a bilinear map $V \times W \to Z/U$.

This means showing:

$(v + v')\otimes w = v\otimes w + v' \otimes w$
$v \otimes (w + w') = v \otimes w + v\otimes w'$
$(\alpha v)\otimes w = v\otimes \alpha w = \alpha (v \otimes w)$

I'll show how we prove the first one (the others are similar).

$(v + v')\otimes w = v\otimes w + v' \otimes w$ means:

$(v+v',w) + U = (v,w) + U + (v',w) + U$ (this is how we defined $\otimes$ above).

Now the RHS is (by the definition of coset addition):

$(v,w) + (v',w) + U$. Now $x + U = y + U$ iff $x - y \in U$. So we need to show that:

$(v+v',w) - (v,w) - (v',w) \in U$. But this is in $S$ (by the way we defined $S$), and thus in $U$.
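A minimal computational sketch of this membership check, representing elements of $Z = F(V \times W)$ as finite formal sums (dictionaries mapping pairs to coefficients); the helper names (`free_add`, `vec_add`, `relator_sum`) are my own illustration, not Cooperstein's:

```python
from collections import defaultdict

def free_add(a, b):
    """Add two formal sums, each a dict {(v, w): coefficient}."""
    out = defaultdict(float, a)
    for pair, coeff in b.items():
        out[pair] += coeff
    return {p: c for p, c in out.items() if c != 0}

def vec_add(u, v):
    """Componentwise sum of two vectors stored as tuples."""
    return tuple(x + y for x, y in zip(u, v))

def relator_sum(v, vp, w):
    """The first kind of generator of S: (v+v', w) - (v, w) - (v', w)."""
    return free_add({(vec_add(v, vp), w): 1.0},
                    {(v, w): -1.0, (vp, w): -1.0})

v, vp, w = (1.0, 0.0), (0.0, 2.0), (3.0, 1.0)
r = relator_sum(v, vp, w)

# r is a NONZERO element of Z -- its three pairs are distinct points of
# V x W, hence linearly independent in the free space -- yet it lies in
# U, so it becomes zero in Z/U.  That is exactly the coset statement
# (v+v') (x) w  =  v (x) w  +  v' (x) w.
print(r)
```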

Claim number 2:

If $Y$ is *any* vector space, and $B: V \times W \to Y$ is *any* bilinear map, there exists a unique (the one and only!) linear map $L: Z/U \to Y$ with $B = L \circ\otimes$.

Well, the "natural" thing to do, is DEFINE:

$L(\sum\limits_i \alpha_i(v_i,w_i) + U) = \sum\limits_i \alpha_i B(v_i,w_i)$.

But...there's a "catch", $L$ is defined via the $v_i$ and $w_i$, and so we need to confirm it is "well-defined" (constant on cosets).

That is, we need to verify that if $\sum\limits_i \alpha_i(v_i,w_i) +U = \sum\limits_j\beta_j(v_j,w_j) + U$, then:

$\sum\limits_i \alpha_iB(v_i,w_i) = \sum\limits_j \beta_jB(v_j,w_j)$.

But the former only happens if $\sum\limits_i \alpha_i(v_i,w_i) - \sum\limits_j \beta_j(v_j,w_j) \in U$.

It suffices to prove this for elements of $S$, since we can just take linear combinations of these to string together "the whole proof" (hint: since $L$ is defined on linear combinations of elements by taking linear combinations of the images of $B$, it is obvious that $L$ is linear). I shall prove it when the difference is of the form:

$(v+v',w) - (v,w) - (v',w)$, you can investigate it for the other three generating forms, and how to use linear combinations to prove it for anything in $U$.

Now if $\sum\limits_i \alpha_i(v_i,w_i) - \sum\limits_j \beta_j(v_j,w_j) = (v+v',w) - (v,w) - (v',w)$

we have $\sum\limits_i \alpha_iB(v_i,w_i) - \sum\limits_j \beta_jB(v_j,w_j) = B(v+v',w) - B(v,w) - B(v',w)$

(this is just applying the defining formula for $L$ to both sides)

$= B(v+v',w) - [B(v,w) + B(v',w)] = B(v+v',w) - B(v+v',w) = 0$ (since $B$ is bilinear), so

$\sum\limits_i \alpha_iB(v_i,w_i) = \sum\limits_j \beta_jB(v_j,w_j)$.
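For instance (to illustrate one of the three remaining generating forms), if the difference is the scalar generator $\alpha(v,w) - (\alpha v, w)$, the same computation reads:

$\sum\limits_i \alpha_iB(v_i,w_i) - \sum\limits_j \beta_jB(v_j,w_j) = \alpha B(v,w) - B(\alpha v,w) = \alpha B(v,w) - \alpha B(v,w) = 0$,

again by the bilinearity of $B$.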

Claims 1 & 2 show we have "a" construction satisfying the UMP of a tensor product, justifying our use of $\otimes$.

So we have shown existence (though not yet uniqueness up to isomorphism; that part is easy) of a tensor product with the desired UMP. Basically, the algebraic strategy here is to "forget how we got here", and just use the UMP. But we'd like to get a handle on "how big our tensor product vector space is". So we need to describe a basis.

Well, the only thing we have that is suitable for use as a basis, are the bases we know exist for $V$ and $W$.

Let's call these bases:

$\mathcal{B}_1 = \{e_1,\dots,e_n\}$ and $\mathcal{B}_2 = \{f_1,\dots,f_m\}$.

We've established we have a bilinear map $V \times W \to V \otimes W$ which sends $(v,w) \mapsto v\otimes w$.

We thus can *restrict* this map to the subset $\mathcal{B}_1 \times \mathcal{B}_2$.

So let's investigate what $Z' = \text{span}(\otimes(\mathcal{B}_1 \times \mathcal{B}_2))$ is.

That is, which elements of $V \otimes W$ can we reach by considering the linear combinations:

$\sum\limits_k \gamma_k (e_{i_k}\otimes f_{j_k})$?

Let's look at the "$V$" side, first.

We have $v = \sum\limits_i \alpha_ie_i$.

Thus for any $f_j$ we have (by the bilinearity of $\otimes$):

$v\otimes f_j = (\sum\limits_i \alpha_ie_i)\otimes f_j = \sum\limits_i \alpha_i (e_i \otimes f_j)$.

Similarly, we can write $w = \sum\limits_j \beta_j f_j$, and thus by bilinearity:

$v \otimes w = v\otimes (\sum\limits_j \beta_j f_j) = \sum\limits_j \beta_j (v \otimes f_j)$.

Hence:

$v \otimes w = \sum\limits_i\sum\limits_j \alpha_i\beta_j (e_i \otimes f_j)$

which is in our span of (simple) tensors of basis vectors. Since the simple tensors $v \otimes w$ span all of $Z/U$, this gives $Z' = Z/U$

(this is the same sum as above if we re-number the $(i,j)$ pairs to be indexed by $k$).

Thus $\dim(V \otimes W) = \dim(V)\dim(W)$ (strictly, the spanning argument only gives "$\leq$"; equality also needs the linear independence of the $e_i \otimes f_j$, which is the other half of Theorem 10.2).
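In coordinates, this dimension count can be sanity-checked with the Kronecker product, which (for finite-dimensional spaces) lists the coefficients $\alpha_i\beta_j$ of $v \otimes w$ in the basis $\{e_i \otimes f_j\}$. A quick numerical check, assuming numpy (my own illustration, not from Cooperstein):

```python
import numpy as np

# v in R^3, w in R^2; np.kron(v, w) lists the products alpha_i * beta_j,
# i.e. the coordinates of v (x) w in the basis {e_i (x) f_j}.
v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0])

t = np.kron(v, w)
assert t.size == v.size * w.size        # dim(V (x) W) = dim(V) * dim(W)

# Bilinearity in the first slot: (v + v') (x) w = v (x) w + v' (x) w
vp = np.array([0.5, -1.0, 2.0])
assert np.allclose(np.kron(v + vp, w), np.kron(v, w) + np.kron(vp, w))

# Scalars slide across the (x): (2v) (x) w = 2 (v (x) w)
assert np.allclose(np.kron(2.0 * v, w), 2.0 * np.kron(v, w))
```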

************************

Finally, after all this, an "easy" example. Let us examine what $\Bbb R \otimes \Bbb R$ could possibly be.

First of all, note that:

$m(x,y) = xy$ is a bilinear map (the expression of its bilinearity is known as the distributive, associative, and commutative laws of multiplication).

Let us suppose we have a bilinear map $B: \Bbb R \times \Bbb R \to V$, for some vector space $V$.

Let us say that $B(1,1) = v_0$. It follows from bilinearity that:

$B(x,y) = xB(1,y) = xyB(1,1) = (xy)v_0$.

Define $L: \Bbb R \to V$ by $L(a) = av_0$. This is clearly a linear map:

$L(a+b) = (a+b)v_0 = av_0 + bv_0 = L(a) + L(b)$
$L(ra) = (ra)v_0 = r(av_0) = rL(a)$;

and we have $B(x,y) = L(xy) = L(m(x,y))$, that is: $B = L \circ m$.

We therefore conclude that $\Bbb R \otimes \Bbb R \cong \Bbb R$, with the tensor product being ordinary real multiplication.
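In the same coordinate picture as before (again just a numpy-based sanity check of my own, not part of the construction): treating $\Bbb R$ as a 1-dimensional space, the Kronecker product of two 1-vectors is the 1-vector holding the ordinary product, so $\Bbb R \otimes \Bbb R$ is again 1-dimensional:

```python
import numpy as np

# R (x) R is 1-dimensional: tensoring the 1-vectors [x] and [y]
# yields the 1-vector [x*y], i.e. ordinary real multiplication.
x, y = 3.0, 7.0
assert np.allclose(np.kron([x], [y]), [x * y])
```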
 
Deveno said:
(post quoted in full above)
Thanks for the substantial help, Deveno ... always a real help in my understanding of mathematics ... so I really appreciate the help ...

... working through your post in detail now ...

Peter
 