Tensor Product of Spaces

In summary, the thread observes that you need [tex]d_1 d_2 \cdots d_n[/tex] vectors to form a basis for V, and asks whether any tensor in V can be expressed in the form [tex]\sum_{i=1}^{R} \left( v_{1,i} \otimes \cdots \otimes v_{n,i} \right)[/tex], where [tex]R = \sum_{j=1}^{n} d_j - \max_j \{d_j\}[/tex].
  • #1
klackity
Tell me if this is true:

We are given vector spaces V1, V2, ..., Vn of dimensions d1, d2, ..., dn respectively.

Let V = V1 [tex]\otimes[/tex] V2 [tex]\otimes[/tex] ... [tex]\otimes[/tex] Vn

Claim: Any element v [tex]\in[/tex] V can be represented in the following form:

[tex]v = \sum_{i=1}^{R} \left( v_{1,i} \otimes \cdots \otimes v_{n,i} \right)[/tex]

Where [tex]R = \sum_{j=1}^{n} d_j - \max_j \{d_j\}[/tex]

And where [tex]v_{j,i} \in V_j[/tex].

In other words, there is an upper bound of R on the number of "elementary" tensors [tex]w_i[/tex] needed to represent a particular tensor v in V (where an "elementary" tensor [tex]w_i[/tex] is one which can be written in the form [tex]w_i = v_1 \otimes v_2 \otimes \cdots \otimes v_n[/tex], where [tex]v_j \in V_j[/tex]). The upper bound R is simply the sum of all the [tex]d_j[/tex] except for the maximal one, [tex]d_{j_0}[/tex].

Is this true?
 
  • #2
micromass
Well, you know that the dimension of V is [tex]d_1d_2...d_n[/tex]. So I guess that you probably need this many vectors to express an arbitrary vector in V...
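
As a quick sanity check of that dimension count, here is a small numpy sketch (the dimensions are an arbitrary illustrative choice): the Kronecker products of basis pairs give [tex]d_1 d_2[/tex] linearly independent vectors, so they form a basis of the tensor product.

```python
import numpy as np

# Dimensions of the factor spaces V1 and V2 (illustrative choice).
d1, d2 = 2, 3

# Standard bases of V1 and V2; rows are basis vectors.
e1 = np.eye(d1)
e2 = np.eye(d2)

# Kronecker products of basis pairs give d1*d2 vectors in V1 (x) V2 ~ R^(d1*d2).
basis = np.array([np.kron(e1[i], e2[j]) for i in range(d1) for j in range(d2)])

# They are linearly independent and span, so dim(V1 (x) V2) = d1*d2 = 6.
print(basis.shape)                   # (6, 6)
print(np.linalg.matrix_rank(basis))  # 6
```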
 
  • #3
klackity
micromass said:
Well, you know that the dimension of V is [tex]d_1d_2...d_n[/tex]. So I guess that you probably need this many vectors to express an arbitrary vector in V...

Well, yes, you need [tex]d_1 d_2 \cdots d_n[/tex] vectors to form a basis for V.

But I am asking something slightly different. I'll give a concrete example:

Suppose n=2 and [tex]V_1 = \mathbb{R}^2[/tex] and [tex]V_2 = (\mathbb{R}^3)^*[/tex] (the space of linear functionals on [tex]\mathbb{R}^3[/tex]).

Then V = V1 [tex]\otimes[/tex] V2 [tex]\cong \mathbb{R}^6[/tex].

Any tensor in V is a 2x3 matrix:

M = [tex]\left[ \begin{array}{ccc} a & b & c \\ d & e & f \end{array} \right][/tex]

Now, the problem is that not every such matrix is the tensor product of two elements v1 and v2 of V1 and V2 respectively. Any maximal-rank matrix can be represented as the sum of exactly two (and no fewer) such "elementary" tensors, where an "elementary" tensor is the tensor product of two elements v1 and v2 (not necessarily basis vectors) of V1 and V2 respectively. A matrix of non-maximal rank (i.e., rank strictly less than 2 = min{2,3}) can be represented as a sum of fewer "elementary" tensors (e.g., one).
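
One concrete way to produce such a two-term decomposition is the singular value decomposition: any 2x3 matrix equals a sum of at most min{2,3} = 2 outer products. A short numpy sketch (the particular matrix is just an example):

```python
import numpy as np

# An arbitrary full-rank 2x3 matrix, playing the role of a tensor in R^2 (x) (R^3)*.
M = np.array([[1., 2., 3.],
              [4., 5., 6.]])

# Thin SVD: M = sum_i s[i] * outer(U[:, i], Vt[i, :]), with at most min(2, 3) = 2 terms.
U, s, Vt = np.linalg.svd(M, full_matrices=False)

# Each summand is an "elementary" tensor: an outer product of a vector in R^2
# with a row vector (a linear functional on R^3).
terms = [s[i] * np.outer(U[:, i], Vt[i]) for i in range(len(s))]

print(len(terms))                   # 2
print(np.allclose(sum(terms), M))   # True
```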

V contains elements M of just the right form: I can put a vector x (a column vector) to the right of M and get another vector back as the result of that "multiplication". If I put a linear functional (a row vector) to the left of it, I get back another linear functional (row vector). If I put both a vector on the right and a linear functional on the left, I get back a real number.

The matrix M itself can be thought of as two linear functionals on R3 (row vectors) attached to two particular vectors (column vectors) in R2. The topmost row of M is a linear functional attached to the vector (1,0) (in column form), and the bottommost row of M is a linear functional attached to the vector (0,1) (in column form). When you put a vector (in R3) to the right of M, you are using those two linear functionals to determine coefficients a1, a2 (for the top and bottom rows, respectively) which factor into the sum a1(1,0) + a2(0,1).

You can also paint a similar picture for thinking about multiplying a linear functional (row vector) on the left of M. In this case, M can be thought of as consisting of three vectors (in R2), each attached to one of the row functionals (1,0,0), (0,1,0), and (0,0,1). However, since those three attached vectors live in R2, they cannot all be linearly independent. So you really only need two vectors attached (respectively) to two linear functionals, although this time the functionals may not be basis vectors.

So either way you think about it, you see that you really only need two bits of "elementary" information to represent any particular matrix M in V, even though V has six basis elements.
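
The row-functional picture can be checked in a few lines of numpy (the matrix and vector are arbitrary illustrative choices): the coefficients a1, a2 come from applying the two rows of M to x, and combining them with the column basis vectors reproduces M x.

```python
import numpy as np

M = np.array([[1., 2., 3.],
              [4., 5., 6.]])   # a tensor in V, viewed as a 2x3 matrix
x = np.array([1., 0., 2.])     # an arbitrary vector in R^3

# Coefficients produced by the two row functionals (the rows of M).
a1 = M[0] @ x   # top row, attached to the column vector (1, 0)
a2 = M[1] @ x   # bottom row, attached to the column vector (0, 1)

# Combining the coefficients with the column basis vectors reproduces M @ x.
reconstructed = a1 * np.array([1., 0.]) + a2 * np.array([0., 1.])
print(np.allclose(M @ x, reconstructed))   # True
```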

The "elementary" bits of information can be represented as a pair (x,x') in V1xV2 (modulo an equivalence relation).

For matrices, the number of "elementary" bits of information you need is just the maximal number of linearly independent rows (or columns) of M. For higher order tensors, I postulate that it is something a bit more complicated, but that it has an upper bound (which is achieved) of R (where R is defined above).
 
  • #4
Well, it looks true for n=2. Can you use induction?
 
  • #5


The tensor product of vector spaces V1, V2, ..., Vn is the set of all linear combinations of elementary tensors [tex]v_1 \otimes v_2 \otimes \cdots \otimes v_n[/tex], where [tex]v_j \in V_j[/tex]. The dimension of this tensor product space is the product of the dimensions of the individual spaces, i.e. [tex]\dim(V) = d_1 d_2 \cdots d_n[/tex], so any element of V is trivially a linear combination of at most that many elementary tensors. The question is whether the much smaller bound R suffices.

For n = 2 the claim is true: there [tex]R = d_1 + d_2 - \max\{d_1, d_2\} = \min\{d_1, d_2\}[/tex], and the minimal number of elementary tensors needed to express a tensor in [tex]V_1 \otimes V_2[/tex] is exactly the rank of the corresponding [tex]d_1 \times d_2[/tex] matrix, which never exceeds [tex]\min\{d_1, d_2\}[/tex].

For n ≥ 3, however, the proposed bound fails in general. A sum of R elementary tensors is specified by roughly [tex]R(d_1 + \cdots + d_n)[/tex] scalar parameters, so the tensors expressible with at most R terms form the image of a polynomial map from a space of that dimension. Whenever [tex]R(d_1 + \cdots + d_n) < d_1 d_2 \cdots d_n[/tex] — for example three factors of dimension 10, where [tex]20 \cdot 30 = 600 < 1000[/tex] — that image cannot fill all of V, so some tensors require more than R elementary terms. The minimal number of terms is the tensor rank, and for order three and higher it behaves far less simply than matrix rank.
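
As a back-of-the-envelope check of that parameter count, here is a tiny Python sketch (the dimensions are an arbitrary illustrative choice):

```python
# Parameter count for R-term decompositions in R^d (x) R^d (x) R^d (illustrative d).
dims = [10, 10, 10]

R = sum(dims) - max(dims)      # the proposed bound: 20
params = R * sum(dims)         # scalars in an R-term decomposition: 600

dim_V = 1
for d in dims:
    dim_V *= d                 # dim V = 1000

# 600 < 1000: tensors expressible with <= R terms form a lower-dimensional
# subset of V, so the bound R cannot cover every tensor when n = 3 and d = 10.
print(R, params, dim_V)        # 20 600 1000
```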
 

1. What is the definition of the Tensor Product of Spaces?

The Tensor Product of Spaces is a construction that combines two vector spaces into a new vector space, denoted by the symbol ⊗. The space V ⊗ W is spanned by elementary tensors v ⊗ w, and it is characterized by the property that bilinear maps on V × W correspond exactly to linear maps on V ⊗ W.

2. How is the Tensor Product of Spaces calculated?

The Tensor Product of Spaces is calculated by taking all tensor products of basis vectors, one from each factor, and combining them using the tensor product symbol ⊗. The resulting vector space has a basis indexed by the Cartesian product of the basis sets of the original spaces, so its dimension is the product of the individual dimensions.
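
As an illustrative numpy sketch of this construction (dimensions chosen arbitrarily), the basis is indexed by pairs (i, j), and the coordinates of a tensor in that basis are just its matrix entries:

```python
import numpy as np

# Standard bases of V1 = R^2 and V2 = R^3; rows are basis vectors.
E1 = np.eye(2)
E2 = np.eye(3)

# The tensor-product basis is indexed by the Cartesian product of index sets.
basis = {(i, j): np.kron(E1[i], E2[j]) for i in range(2) for j in range(3)}

# Any tensor (here a flattened 2x3 matrix) expands in this basis with
# coordinates given by the matrix entries.
M = np.array([[1., 2., 3.],
              [4., 5., 6.]])
expansion = sum(M[i, j] * basis[(i, j)] for i in range(2) for j in range(3))

print(np.allclose(expansion, M.flatten()))   # True
```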

3. What is the significance of the Tensor Product of Spaces?

The Tensor Product of Spaces is significant in many areas of mathematics, including linear algebra, functional analysis, and differential geometry. It is used to define multilinear maps and tensors, which are fundamental concepts in these fields. It also has applications in physics and engineering.

4. How is the Tensor Product of Spaces related to the Outer Product and Inner Product?

The Tensor Product of Spaces is closely related to the Outer Product and Inner Product. The outer product of two column vectors v and w is precisely the elementary tensor v ⊗ w written as a matrix, namely v wᵀ; in that sense the outer product is a special case of the tensor product. The inner product is not a special case but a contraction: it pairs a vector with a linear functional (or, once an inner product is chosen, with another vector) and collapses the result to a scalar. For instance, the trace of the outer product v wᵀ equals the inner product of v and w.
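
A small numpy illustration of that relationship (the vectors are arbitrary): the outer product is an elementary tensor, and contracting it (here, taking the trace) recovers the inner product.

```python
import numpy as np

v = np.array([1., 2.])
w = np.array([3., 4.])

outer = np.outer(v, w)   # elementary tensor v (x) w, a 2x2 matrix
inner = np.dot(v, w)     # scalar: the inner product of v and w

# Contracting the outer product (taking its trace) recovers the inner product.
print(np.isclose(np.trace(outer), inner))   # True
```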

5. Can the Tensor Product of Spaces be extended to more than two vector spaces?

Yes, the Tensor Product of Spaces can be extended to any finite number of vector spaces. The resulting vector space has a basis consisting of tensor products of basis vectors, one from each factor, indexed by the Cartesian product of the basis sets, so its dimension is the product of all the individual dimensions. This is useful in applications where several vector spaces need to be combined.
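
As a brief numpy illustration (dimensions chosen arbitrarily), iterated Kronecker products build an elementary tensor in a three-fold tensor product, and the dimensions multiply:

```python
from functools import reduce
import numpy as np

u = np.array([1., 2.])        # in R^2
v = np.array([1., 0., 1.])    # in R^3
w = np.array([2., 3.])        # in R^2

# Elementary tensor u (x) v (x) w as an iterated Kronecker product.
t = reduce(np.kron, [u, v, w])

print(t.shape)   # (12,) = 2 * 3 * 2, matching dim(V1) * dim(V2) * dim(V3)
```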
