Direct product between vector spaces

In summary, we discussed the tensor product \otimes and how every vector in V \otimes W, with bases (v_i) and (w_j), can be written as a linear combination of the basis elements v_i \otimes w_j, but not in general as a single element of the form v \otimes w, where v and w are elements of V and W respectively. We also looked at examples in R^3 and saw that the tensor product is isomorphic to a vector space of matrices, with a simple characterization of pure tensors: they are the matrices of rank 0 or 1. Most tensors are not pure tensors and cannot be written as an outer product of two vectors. The set of pure tensors is not a vector space, but it is a topological space whose geometry can be studied.
  • #1
Kontilera
Hello! I am currently reading about the analysis of tensors and have now encountered the tensor product, [tex] \otimes [/tex].
I am wondering about the statement that every vector in [tex]V \otimes W [/tex] (with bases (v_i) and (w_j)) can be written as a linear combination of the basis elements [tex]v_i \otimes w_j ,[/tex] but not in general as an element of the form [tex] v \otimes w,[/tex] where v and w are elements of V and W.

Which elements can I not reach in the second way? If we set V = W = R^3, it looks like we are comparing a 6-dimensional space to a 9-dimensional space (true?). In that case, does it have something to do with the symmetric or antisymmetric components of [tex]V \otimes W [/tex] that cannot be reached by [tex] v \otimes w[/tex]?

I am thankful for any help. :)
Best Regards
Kontilera
 
  • #2
Hey Kontilera and welcome to the forums.

Forgive me if this is out of place, but are these spaces completely independent, or do they have some kind of linear dependence between them?

I don't actually know whether you can take tensor products of vector spaces that are linearly dependent in some way, but if you can, it is a red flag to make sure this is handled correctly. (My guess is that such cases aren't considered much, even if they can happen.)
 
  • #3
Hello and thanks! :smile:
They are not completely independent. For example we can consider the cotangentspaces for a smooth manifold M at a point p, where:
[tex]T^*_pM \otimes T^*_pM = T^2_pM.[/tex]
 
  • #4
Let V be a vector space of column vectors, and W be a vector space of row vectors. It's pretty easy to see that [itex]V \otimes W[/itex] is (isomorphic to) a vector space of matrices. The tensor product is just the outer product: [itex]v \otimes w = vw[/itex].

Invoking linear algebra, there is a simple characterization of which matrices are pure tensors -- i.e. of the form [itex]v \otimes w[/itex] -- they are the matrices with rank 0 or 1.
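
To make this concrete, here is a small NumPy sketch (my own illustration, not part of the original posts) of the matrix picture: pure tensors are outer products, which have rank at most 1, while a generic element of [itex]V \otimes W[/itex] is just an arbitrary matrix.

[code]
import numpy as np

# Pure tensor: the outer product v w^T of a column vector v and a row vector w.
v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])
pure = np.outer(v, w)                  # represents the pure tensor v (x) w as a 3x3 matrix
print(np.linalg.matrix_rank(pure))     # 1 -- pure tensors have rank 0 or 1

# Generic tensor: a linear combination of pure tensors, i.e. any 3x3 matrix.
generic = np.outer(v, w) + np.outer(w, v)   # symmetric combination of two pure tensors
print(np.linalg.matrix_rank(generic))       # 2 -- not expressible as a single outer product
[/code]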
 
  • #5
Yeah, I agree that the product of V and W is isomorphic to the space of matrices, but I think you miss my question here. Let's say V = W = R^3. Then we get, as expected, nine basis elements:
[tex]e_i \otimes e_j ,[/tex] where i and j run from 1 to 3.

The elements of this space cannot, however, in general be written as an outer product of two vectors:
[tex] v \in V \;\quad \text{and}\;\quad w \in W .[/tex]

Let us take the standard basis for R^3 in both V and W. We should then be able to find some matrices in [tex] V \otimes W [/tex] that confirm this (i.e. that are not outer products).
As I said, when choosing the vectors v and w with:
[tex] v \in V \quad \text{and} \quad w \in W,[/tex]
we are dealing with 6 degrees of freedom while [tex] V \otimes W [/tex] has nine.

Or have I misunderstood something basic?
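
For a concrete instance of such a matrix (an added worked example, not from the original posts): take
[tex] T = e_1 \otimes e_1 + e_2 \otimes e_2 = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}. [/tex]
If T were a pure tensor [itex]v \otimes w[/itex], its entries would factor as [itex]T_{ij} = v_i w_j[/itex]. Then [itex]v_1 w_1 = 1[/itex] and [itex]v_2 w_2 = 1[/itex] force [itex]v_1, v_2, w_1, w_2[/itex] to all be nonzero, contradicting [itex]v_1 w_2 = 0[/itex]. So T lies in [itex]V \otimes W[/itex] but is not an outer product of two vectors.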
 
  • #6
I think you're counting the degrees of freedom in making a choice from V OR W, rather than from V AND W. You can see the 9 degrees of freedom by seeing the 9 ways to take the tensor product of standard basis vectors.

In fact, if S and T are disjoint sets, then if V is the free vector space generated by S and W is the free vector space generated by T, then [itex]V \otimes W[/itex] is the free vector space generated by [itex]S \times T[/itex], whereas the one generated by [itex]S \cup T[/itex] is [itex]V \oplus W[/itex].

The free vector space generated by a set S is isomorphic to [itex]\mathbb{R}^s[/itex], where s is the number of elements in S -- just imagine that the indices are labeled by the elements of S, rather than by the numbers {1, 2, ..., s}.
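
As a quick dimension count in this example (added for illustration): with [itex]\dim V = m[/itex] and [itex]\dim W = n[/itex],
[tex] \dim(V \otimes W) = mn, \qquad \dim(V \oplus W) = m + n, [/tex]
so for [itex]V = W = \mathbb{R}^3[/itex] the tensor product is 9-dimensional while the direct sum is only 6-dimensional, which matches the 9-versus-6 comparison from earlier in the thread.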



P.S. "direct product" is usually used for [itex]V \oplus W[/itex], not [itex]V \otimes W[/itex].
 
  • #7
Hmm, thanks for the responses. I think this needs some thinking. Are you saying that every tensor in [tex]V \otimes W[/tex] can be written as a tensor product of one vector v in V and another vector w in W? :/

The author said the opposite, so I started wondering what the pattern was among those second-rank tensors, in the standard basis.
 
  • #8
Kontilera said:
Hmm, thanks for the responses. I think this needs some thinking. Are you saying that every tensor in [tex]V \otimes W[/tex] can be written as a tensor product of one vector v in V and another vector w in W? :/
No, I'm saying that every tensor can be written as a linear combination of such tensors.

Most of them cannot be written as pure tensors; e.g. any matrix of rank 2 or more in the example I mentioned earlier.
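
A standard linear-algebra fact that makes this quantitative (an added note, not from the original posts): by the singular value decomposition, any matrix A of rank r can be written as a sum of exactly r outer products,
[tex] A = \sum_{k=1}^{r} \sigma_k \, u_k \otimes v_k, [/tex]
and no fewer, so the rank tells you the minimal number of pure tensors needed to build a given element of [itex]V \otimes W[/itex].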
 
  • #9
Are you asking about how the set of pure tensors looks? It's not a vector space, but it is a topological space and you can ask about its geometry. I think the space is six-dimensional in your example, but that has nothing to do with [itex]V \otimes W[/itex].


(I am still a little confused about what you're asking. I thought you originally were asking the question that is answered by my mention of rank 2+ tensors, but then you said you weren't. But your latest post sounds like you're asking that question again)
 
  • #10
Hurkyl said:
Are you asking about how the set of pure tensors looks? It's not a vector space, but it is a topological space and you can ask about its geometry. I think the space is six-dimensional in your example, but that has nothing to do with [itex]V \otimes W[/itex].
Yeah! I agree, it's the ones that cannot be written as an outer product that I am interested in. As I said in the first post,

"I am wondering about the statement that every vector in: [tex]V \otimes W [/tex] (with the basis (v_i) and (w_i)) can be written as a linear combination of the basis: [tex]v_i \otimes w_i ,[/tex] but not in general as an element of the form:[tex] v \otimes w,[/tex] where w and v are elements i W and V.

Which elements can I not reach in the second way?" :)

Is there any way of determining whether a second-rank tensor can be written as an outer product?
 
  • #11
A matrix is a pure tensor if and only if it has rank 0 or 1.
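
In code, this criterion is essentially a one-liner (again an added NumPy sketch; `is_pure` is just an illustrative helper name, not an established API):

[code]
import numpy as np

def is_pure(T, tol=1e-10):
    """Return True if the matrix T represents a pure tensor, i.e. has rank 0 or 1."""
    return np.linalg.matrix_rank(T, tol=tol) <= 1

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])
print(is_pure(np.outer(v, w)))   # True  -- an outer product has rank 1
print(is_pure(np.eye(3)))        # False -- the identity matrix has rank 3
[/code]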
 
  • #12
Hurkyl said:
A matrix is a pure tensor if and only if it has rank 0 or 1.

Ah okay, thanks.
I will see if I can find some good lecture notes on this stuff. The concept of a 'pure tensor' seems to be the key. I'll come back with questions if there is anything more. :)

All the best!
 

1. What is the direct product between vector spaces?

The direct product between vector spaces is a mathematical operation that combines two or more vector spaces into a new vector space. In this thread the term is used for the tensor product, denoted by the symbol [itex]\otimes[/itex] written between the two spaces; its elements are linear combinations of products of elements from the original vector spaces.

2. How is the direct product between vector spaces different from the direct sum?

The direct sum and the tensor product are both ways of combining vector spaces, but they differ in size and structure: the direct sum [itex]V \oplus W[/itex] is spanned by the union of the two bases and has dimension dim V + dim W, while the tensor product [itex]V \otimes W[/itex] is spanned by all products [itex]v_i \otimes w_j[/itex] of basis vectors and has dimension dim V · dim W. (As noted in the thread, the term "direct product" usually refers to [itex]V \oplus W[/itex]; here it is used loosely for the tensor product.)

3. What are the properties of the direct product between vector spaces?

The tensor product inherits many properties from the original vector spaces, such as linearity, and it is associative. It is also bilinear: it distributes over addition in each factor, and scalars can be moved from one factor to the other.
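
Spelled out (added for clarity), the bilinearity rules are
[tex] (v_1 + v_2) \otimes w = v_1 \otimes w + v_2 \otimes w, \qquad v \otimes (w_1 + w_2) = v \otimes w_1 + v \otimes w_2, \qquad (\lambda v) \otimes w = \lambda (v \otimes w) = v \otimes (\lambda w). [/tex]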

4. How is the direct product between vector spaces used in real-world applications?

The tensor product has many applications in physics, engineering, and computer science. It is commonly used in quantum mechanics to describe composite systems and in signal processing to analyze multidimensional signals, among other areas.

5. Can the direct product between vector spaces be applied to infinite-dimensional vector spaces?

Yes, the construction also applies to infinite-dimensional vector spaces. In that case the resulting space is again infinite-dimensional, and the operation follows the same rules as in the finite-dimensional case.
