Tensor Product of Spaces


by klackity
Tags: product, spaces, tensor
klackity
#1
Mar11-11, 12:20 PM
P: 65
Tell me if this is true:

We are given vector spaces V1, V2, ..., Vn of dimensions d1, d2, ..., dn respectively.

Let [tex]V = V_1 \otimes V_2 \otimes \cdots \otimes V_n[/tex]

Claim: Any element [tex]v \in V[/tex] can be represented in the following form:

[tex]v = \sum_{i=1}^{R} v_{1,i} \otimes v_{2,i} \otimes \cdots \otimes v_{n,i}[/tex]

where [tex]R = \sum_{j=1}^{n} d_j - \max_j \{d_j\}[/tex] and [tex]v_{j,i} \in V_j[/tex].

In other words, there is an upper bound of R on the number of "elementary" tensors [tex]w_i[/tex] needed to represent a particular tensor v in V (where an "elementary" tensor is one which can be written in the form [tex]w_i = v_1 \otimes v_2 \otimes \cdots \otimes v_n[/tex] with [tex]v_j \in V_j[/tex]). The upper bound R is simply the sum of all the [tex]d_j[/tex] except for the maximal one.
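For the n = 2 case the formula gives R = min{d1, d2}, which is easy to sanity-check numerically. A minimal sketch, assuming NumPy (the shapes below are just examples):

[code]
import numpy as np

# For n = 2 the claimed bound is R = d1 + d2 - max(d1, d2) = min(d1, d2).
# The minimal number of elementary (rank-1) tensors needed to write a
# d1 x d2 matrix is its matrix rank, which never exceeds min(d1, d2).
rng = np.random.default_rng(0)
for d1, d2 in [(2, 3), (4, 7), (5, 5)]:
    R = d1 + d2 - max(d1, d2)           # the claimed bound
    M = rng.standard_normal((d1, d2))   # a generic element of V1 (x) V2
    rank = np.linalg.matrix_rank(M)     # minimal number of elementary terms
    assert rank <= R == min(d1, d2)
    print((d1, d2), "bound R =", R, "rank =", rank)
[/code]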

Is this true?
micromass
#2
Mar11-11, 01:35 PM
Mentor
P: 16,703
Well, you know that the dimension of V is [tex]d_1 d_2 \cdots d_n[/tex]. So I guess that you probably need this many vectors to express an arbitrary vector in V...
klackity
#3
Mar11-11, 06:40 PM
P: 65
Quote by micromass:
Well, you know that the dimension of V is [tex]d_1 d_2 \cdots d_n[/tex]. So I guess that you probably need this many vectors to express an arbitrary vector in V...
Well, yes, you need [tex]d_1 d_2 \cdots d_n[/tex] vectors to form a basis for V.

But I am asking something slightly different. I'll give a concrete example:

Suppose n = 2, [tex]V_1 = \mathbb{R}^2[/tex], and [tex]V_2 = (\mathbb{R}^3)^*[/tex] (the space of linear functionals on [tex]\mathbb{R}^3[/tex]).

Then [tex]V = V_1 \otimes V_2 \cong \mathbb{R}^6[/tex].

Any tensor in V is a 2x3 matrix:

M = [tex]\left[ \begin{array}{ccc} a & b & c \\ d & e & f \end{array} \right][/tex]

Now, the problem is that not every such matrix is the tensor product of two elements v1 and v2 of V1 and V2 respectively. Any maximal-rank matrix can be represented as the sum of exactly two (and no fewer) such "elementary" tensors, where an "elementary" tensor is the tensor product of two elements v1 and v2 (not necessarily basis vectors) of V1 and V2 respectively. A matrix of non-maximal rank (i.e., rank strictly less than 2 = min{2,3}) can be represented as a sum of fewer "elementary" tensors (e.g., one).
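The singular value decomposition makes this concrete: it writes M as a sum of rank(M) outer products, i.e. exactly that many elementary tensors. A quick sketch, again assuming NumPy:

[code]
import numpy as np

M = np.array([[1., 2., 3.],
              [4., 5., 6.]])   # maximal rank: rank(M) = 2

# SVD writes M = sum_i s_i * (u_i (x) v_i), one elementary tensor per
# nonzero singular value.
U, s, Vt = np.linalg.svd(M)
r = np.linalg.matrix_rank(M)
terms = [s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(r)]

assert r == 2                        # two elementary tensors, no fewer
assert np.allclose(sum(terms), M)    # and their sum reconstructs M
[/code]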

Each element M of V acts like a matrix: I can put a vector x (a column vector) to the right of M and get another vector back from that "multiplication". If I put a linear functional (a row vector) to the left of it, I get back another linear functional (row vector). If I put both a vector on the right and a linear functional on the left, I get back a real number.

The matrix M itself can be thought of as two linear functionals on R3 (row vectors), each attached to a particular vector (column vector) in R2. The topmost row of M is a linear functional attached to the vector (1,0) (in column form), and the bottommost row is a linear functional attached to the vector (0,1) (in column form). When you put a vector (in R3) to the right of M, you are using those two linear functionals to determine coefficients a1, a2 (for the top and bottom rows, respectively), which enter the sum a1(1,0) + a2(0,1).

You can paint a similar picture for multiplying a linear functional (row vector) on the left of M. In this case, M can be thought of as three vectors (in R2), each attached to one of the row vectors (1,0,0), (0,1,0), and (0,0,1). However, since there are three such vectors in the two-dimensional space R2, they cannot be linearly independent. So you really only need two vectors, attached (respectively) to two row vectors, although this time the row vectors may not be basis vectors.
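Both pictures are easy to check numerically. A small illustration with made-up entries (the specific matrix is just an example):

[code]
import numpy as np

M = np.array([[1., 2., 3.],
              [4., 5., 6.]])
x = np.array([1., -1., 2.])    # a column vector in R^3

# Row picture: the rows of M are functionals on R^3; M x is the combination
# a1*(1,0) + a2*(0,1) with coefficients given by those functionals.
a1, a2 = M[0] @ x, M[1] @ x
assert np.allclose(M @ x, np.array([a1, a2]))

# Column picture: the three columns of M live in R^2, so they must be
# linearly dependent -- here, for instance, col3 = 2*col2 - col1.
assert np.allclose(M[:, 2], 2 * M[:, 1] - M[:, 0])
[/code]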

So either way you think about it, you see that you really only need two bits of "elementary" information to represent any particular matrix M in V, even though there are 6 basis elements to V.

The "elementary" bits of information can be represented as a pair (x,x') in V1xV2 (modulo an equivalence relation).

For matrices, the number of "elementary" bits of information you need is just the maximal number of linearly independent rows (or columns) of M. For higher-order tensors, I postulate that it is something a bit more complicated, but that it has an upper bound (which is achieved) of R (where R is defined above).
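For the record, in the n = 2 case the postulated R reduces to the familiar matrix-rank bound:

[tex]R = d_1 + d_2 - \max\{d_1, d_2\} = \min\{d_1, d_2\},[/tex]

which is the largest possible rank of a [tex]d_1 \times d_2[/tex] matrix, so for matrices the bound is both valid and achieved.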

mathwonk
#4
Mar12-11, 02:41 PM
Sci Advisor
HW Helper
P: 9,428

Well, it looks true for n = 2. Can you use induction?

