Tensor Product of Spaces 
#1
Mar 11, 2011, 12:20 PM

P: 65

Tell me if this is true:
We are given vector spaces V_{1}, V_{2}, ..., V_{n} of dimensions d_{1}, d_{2}, ..., d_{n} respectively. Let V = V_{1} [tex]\otimes[/tex] V_{2} [tex]\otimes[/tex] ... [tex]\otimes[/tex] V_{n} Claim: Any element v [tex]\in[/tex] V can be represented in the following form: [tex]\sum[/tex]_{i=1...R} (v_{1,i} [tex]\otimes[/tex] ... [tex]\otimes[/tex] v_{n,i}) Where R =  MAX {d_{j}} + [tex]\sum[/tex]_{j=1..n} d_{j} And where v_{j,i} [tex]\in[/tex] V_{j}. In other words, there is an upper bound of R on the number of "elementary" tensors w_{i} needed needed to represent a particular tensor v in V (where an "elementary" tensor w_{i} is one which can be written in the form w_{i} = v_{1} [tex]\otimes[/tex] v_{2} [tex]\otimes[/tex] ... v_{n}, where v_{j} [tex]\in[/tex] V_{j}). The upper bound R is simply the sum of all the d_{j} except for the d_{j0} which is maximal. Is this true? 


#2
Mar 11, 2011, 01:35 PM

Mentor
P: 18,040

Well, you know that the dimension of V is [tex]d_1 d_2 \cdots d_n[/tex]. So I guess that you probably need this many vectors to express an arbitrary vector in V...



#3
Mar 11, 2011, 06:40 PM

P: 65

But I am asking something slightly different. I'll give a concrete example: suppose n = 2, V_{1} = R^{2}, and V_{2} = (R^{3})^{*} (the space of linear functionals on R^{3}). Then V = V_{1} [tex]\otimes[/tex] V_{2} [tex]\cong[/tex] R^{6}. Any tensor in V is a 2x3 matrix:

M = [tex]\left[ \begin{array}{ccc} a & b & c \\ d & e & f \end{array} \right][/tex]

Now, the problem is that not every such matrix is the tensor product of two elements v_{1} and v_{2} of V_{1} and V_{2} respectively. Any maximal-rank matrix can be represented as the sum of exactly two (and no fewer) "elementary" tensors (where an "elementary" tensor is the tensor product of two elements v_{1} and v_{2} of V_{1} and V_{2} respectively; they need not be basis vectors). A matrix of non-maximal rank (i.e., rank strictly less than 2 = min{2, 3}) can be represented as a sum of fewer "elementary" tensors (e.g., one).

V contains elements M in the correct form such that I can put a vector x (a column vector) to the right of M and get another vector back as a result of that "multiplication". If I put a linear functional (row vector) to the left of it, I get back another linear functional (row vector). If I put both a vector to the right and a linear functional to the left, I get back a real number.

The matrix M itself can be thought of as two linear functionals on R^{3} (row vectors) attached to two particular vectors (column vectors) in R^{2}. The topmost row of M is a linear functional attached to the vector (1,0) (in column form), and the bottommost row of M is a linear functional attached to the vector (0,1) (in column form). When you put a vector (in R^{3}) to the right of M, you are using those two linear functionals to determine coefficients a_{1}, a_{2} (for the top and bottom rows, respectively) which enter the sum a_{1}(1,0) + a_{2}(0,1). You can paint a similar picture for multiplying a linear functional (row vector) on the left of M.
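The three kinds of "multiplication" described above can be sketched directly (NumPy assumed; the particular entries of M, x, and f are arbitrary choices for illustration):

```python
import numpy as np

# A 2x3 tensor M in V1 (x) V2, with V1 = R^2 and V2 = (R^3)*.
M = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

x = np.array([1.0, 0.0, 1.0])   # column vector in R^3
f = np.array([1.0, -1.0])       # linear functional (row vector) on R^2

print(M @ x)      # vector in R^2:           [ 4. 10.]
print(f @ M)      # functional on R^3:       [-3. -3. -3.]
print(f @ M @ x)  # real number:             -6.0
```

Each output type matches the description: vector in, vector out on the right; functional in, functional out on the left; both at once gives a scalar.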
In this case, M can be thought of as consisting of three vectors (in R^{2}), each attached to one of the row vectors (1,0,0), (0,1,0), and (0,0,1). However, since those three attached vectors live in the two-dimensional space R^{2}, they cannot be linearly independent. So you really only need two vectors attached (respectively) to two row vectors, although this time the row vectors may not be basis vectors. Either way you think about it, you see that you really only need two bits of "elementary" information to represent any particular matrix M in V, even though V has 6 basis elements. Each "elementary" bit of information can be represented as a pair (x, x') in V_{1} × V_{2} (modulo an equivalence relation). For matrices, the number of "elementary" bits of information you need is just the maximal number of linearly independent rows (or columns) of M. For higher-order tensors, I postulate that it is something a bit more complicated, but that it has an upper bound (which is achieved) of R (where R is defined above).
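The claim that two elementary tensors always suffice for a 2x3 matrix can be checked with the SVD, which writes M as a sum of rank(M) outer products (a sketch, NumPy assumed):

```python
import numpy as np

# Decompose a 2x3 matrix into elementary tensors (outer products)
# via the SVD: M = sum_i s_i * outer(u_i, v_i).
M = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

U, s, Vt = np.linalg.svd(M, full_matrices=False)
r = int(np.sum(s > 1e-12))      # numerical rank; here 2 = min(2, 3)

# Rebuild M from its r elementary tensors u_i (x) v_i.
terms = [s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(r)]
M_rebuilt = sum(terms)

print(r)                          # 2
print(np.allclose(M, M_rebuilt))  # True
```

For n = 2 the minimal number of elementary tensors is exactly the matrix rank; the open question in this thread is what the analogous quantity (tensor rank) looks like for n > 2.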

