Undergrad Basis of a Tensor Product - Theorem 10.2 - Another Question

Summary

The discussion centers on understanding the proof of Theorem 10.2 from Bruce N. Cooperstein's "Advanced Linear Algebra," specifically regarding the existence and mechanics of the multilinear map ##\gamma'## from ##X## to ##Z'##. Participants seek clarification on how ##\gamma'## is defined and how it operates on elements of ##X \setminus X'##. The conversation highlights that ##Z## is defined as a space of functions with finite support, while ##Z'## is the subspace of functions supported in ##X'##. The existence of the multilinear map ##\gamma'## is established through its agreement with a defined map ##\xi##, and ##\gamma'## is shown to be unique because of the basis properties of ##X'##. The discussion emphasizes the importance of understanding these mappings in the context of tensor products.
I am reading Bruce N. Cooperstein's book: Advanced Linear Algebra (Second Edition) ... ...

I am focused on Section 10.1 Introduction to Tensor Products ... ...

I need help with another aspect of the proof of Theorem 10.2 regarding the basis of a tensor product ... ...

Theorem 10.2 reads as follows:

[Image: statement and proof of Theorem 10.2 from Cooperstein]

A diagram involving the mappings ##\iota## and ##\gamma'## is as follows:

[Figure 1: diagram of the mappings ##\iota## and ##\gamma'##]


My questions are as follows:

Question 1

How do we know that there exists a multilinear map ##\gamma' \, : \, X \longrightarrow Z'##?

Question 2

What happens (what are the 'mechanics') under the mapping ##\gamma'## ... to the elements of ##X \setminus X'## (that is, ##X - X'##)? How can we be sure that these elements end up in ##Z'## and not in ##Z \setminus Z'##? (See Figure 1 above.)
Hope someone can help ...

Peter
 

Re question 1:
Let ##V^\dagger\equiv V_1\times\dots\times V_m##, and let's use angle brackets ##\langle\dots\rangle## to enclose components of Cartesian product spaces. Let the elements of ##\mathscr{B}_j## be ##v_{j1},\dots,v_{jn_j}##, where ##n_j## is the dimension of ##V_j##.
You haven't said what ##Z## is, but let's assume it's the infinite-dimensional vector space over the field ##F## with base set
$$\mathscr{B}^Z\equiv \{\langle u_1,\dots,u_m\rangle\ |\ \forall k:\ u_k\in V_k\}.$$
More formally, ##Z## is the set of all functions from ##\mathscr{B}^Z## to ##F## with finite support (i.e. nonzero on only finitely many input values), under pointwise addition and scalar multiplication.
##Z'## is the subset of ##Z## containing only the functions whose support lies in ##X'##, and it is easily shown to be a subspace.
Define the map ##\xi:X'\to Z'## that sends each tuple of basis vectors ##\langle v_{1i_1},\dots,v_{mi_m}\rangle## to the function that returns zero for every input except ##\langle v_{1i_1},\dots,v_{mi_m}\rangle##, for which it returns ##1_F##. Note that the image of ##\xi## is a basis for ##Z'##.
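To make this concrete, here is a rough Python sketch (my own illustration, not Cooperstein's notation; the helper names add, scale and xi, and the label strings, are made up). An element of ##Z'## is stored as a dictionary from tuples of basis-vector labels to scalars, so finite support is automatic, and ##\xi## sends a tuple of basis vectors to the corresponding indicator function.

```python
# Illustrative sketch (my own, not Cooperstein's): an element of Z' is a dict
# mapping tuples of basis-vector labels to scalars; missing keys mean
# coefficient 0, so every element automatically has finite support.

def add(f, g):
    """Pointwise sum of two finitely supported functions."""
    h = dict(f)
    for key, coeff in g.items():
        h[key] = h.get(key, 0) + coeff
    return {k: c for k, c in h.items() if c != 0}

def scale(a, f):
    """Scalar multiple a*f of a finitely supported function."""
    return {k: a * c for k, c in f.items() if a * c != 0}

def xi(basis_tuple):
    """xi sends a tuple of basis vectors (here: their labels) to the function
    that is 1 on that tuple and 0 everywhere else."""
    return {basis_tuple: 1}

# Example: xi(('v11', 'v21')) and xi(('v12', 'v21')) are two basis elements of Z',
# and 2*xi(('v11','v21')) + 3*xi(('v12','v21')) is a typical element of Z'.
z = add(scale(2, xi(('v11', 'v21'))), scale(3, xi(('v12', 'v21'))))
print(z)  # {('v11', 'v21'): 2, ('v12', 'v21'): 3}
```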
Then a map ##\gamma':V^\dagger\to Z'## is multilinear, and agrees with ##\xi## on ##X'##, if and only if it satisfies:
$$\begin{align*}
\gamma'\left(\left\langle \sum_{i_1}a_{1i_1}v_{1i_1},\ \dots\ ,\sum_{i_m}a_{mi_m}v_{mi_m}\right\rangle\right)
&=\sum_{i_1}\sum_{i_2}\cdots\sum_{i_m} \left(\prod_{k=1}^m a_{ki_k}\right)\gamma'\left(\left\langle v_{1i_1},\dots,v_{mi_m}\right\rangle\right)\\
&=\sum_{i_1}\sum_{i_2}\cdots\sum_{i_m} \left(\prod_{k=1}^m a_{ki_k}\right)\xi\left(\langle v_{1i_1},\dots,v_{mi_m}\rangle\right)
\end{align*}$$
where the first equality implements multilinearity and the second implements the requirement to agree with ##\xi##.
Since each ##\mathscr{B}_k## is a basis for ##V_k##, every coordinate of an element of ##V^\dagger## can be written as ##\sum_{i_k}a_{ki_k}v_{ki_k}##, so the formula above defines ##\gamma'## on all of ##V^\dagger##; hence such a map exists. It is unique because the representation ##\sum_{i_k}a_{ki_k}v_{ki_k}## of the ##k##th coordinate of the input to ##\gamma'## is unique.
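If it helps, the displayed formula can also be read as an algorithm, sketched below under the same made-up representation as before: expand each coordinate in its basis, run over all combinations of basis indices, and weight ##\xi## of each basis tuple by the product of the coefficients. The helpers add, scale and xi are repeated so the snippet runs on its own.

```python
from itertools import product

# Continuation of the sketch above (self-contained). A vector of V_k is
# represented by the dict {basis_label: coefficient} of its unique expansion
# in B_k; an element of Z' by a finitely supported dict {basis_tuple: coefficient}.

def add(f, g):
    h = dict(f)
    for key, c in g.items():
        h[key] = h.get(key, 0) + c
    return {k: c for k, c in h.items() if c != 0}

def scale(a, f):
    return {k: a * c for k, c in f.items() if a * c != 0}

def xi(basis_tuple):
    return {basis_tuple: 1}

def gamma_prime(vectors):
    """gamma'(<v_1,...,v_m>): sum of (prod_k a_{k i_k}) * xi(<v_{1 i_1},...,v_{m i_m}>)
    over all combinations of basis indices, as in the displayed formula."""
    result = {}
    for combo in product(*(v.items() for v in vectors)):
        labels = tuple(label for label, _ in combo)
        coeff = 1
        for _, a in combo:
            coeff *= a
        result = add(result, scale(coeff, xi(labels)))
    return result

# Example with m = 2: u = 2*v11 + 3*v12 in V_1, w = 5*v21 in V_2.
u = {'v11': 2, 'v12': 3}
w = {'v21': 5}
print(gamma_prime([u, w]))  # {('v11', 'v21'): 10, ('v12', 'v21'): 15}

# Multilinearity in the first slot: gamma'(<2u, w>) == 2 * gamma'(<u, w>).
print(gamma_prime([scale(2, u), w]) == scale(2, gamma_prime([u, w])))  # True

# Agreement with xi on X': a tuple of single basis vectors maps to its indicator.
print(gamma_prime([{'v11': 1}, {'v21': 1}]) == xi(('v11', 'v21')))  # True
```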
 
Andrew,

I will be going through your post carefully shortly ...

Cooperstein provides a definition of ##Z## in the introduction to Section 10.1, and it also appears in the proof of Theorem 10.1 ...

Relevant text from Cooperstein is as follows:
[Images: relevant text from Cooperstein, Section 10.1, Parts 1-4, including the definition of Z]

Hope that helps ...

Peter
 

