What Can Be Said About the Kernel of a Tensor Product of Linear Maps?

In summary, the thread asks whether the kernel of a tensor product of linear maps between vector spaces over a field can be written in terms of the kernels of the individual maps, and in particular whether f_1 and f_2 being injective implies that f_1 ⊗ f_2 is injective. A reply suggests examining the simple case of two linear functionals on the same vector space, where an element of the tensor product can be viewed as a matrix, while noting that this suggestion is an intuitive guess rather than a formal proof.
  • #1
ihggin
Suppose [tex]f_1[/tex] is a linear map between vector spaces [tex]V_1[/tex] and [tex]U_1[/tex], and [tex]f_2[/tex] is a linear map between vector spaces [tex]V_2[/tex] and [tex]U_2[/tex] (all vector spaces over [tex]F[/tex]). Then [tex]f_1 \otimes f_2[/tex] is a linear transformation from [tex]V_1 \otimes_F V_2[/tex] to [tex]U_1 \otimes_F U_2[/tex]. Is there any "nice" way that we can write the kernel of [tex]f_1 \otimes f_2[/tex] in terms of the kernels of [tex]f_1[/tex] and [tex]f_2[/tex]? For example, is it true that [tex]f_1[/tex] and [tex]f_2[/tex] injective implies [tex]f_1 \otimes f_2[/tex] is injective?

I tried assuming that [tex]f_1 \otimes f_2[/tex] applied to a general element [tex]\sum v_1 \otimes v_2[/tex] is zero, but the resulting tensor [tex]\sum f_1(v_1) \otimes f_2(v_2)[/tex] is too complicated for me to draw conclusions about the [tex]v_1[/tex] and [tex]v_2[/tex]. It is obvious that having [tex]v_1 \in \ker f_1[/tex] or [tex]v_2 \in \ker f_2[/tex] in each summand forces the latter tensor to be 0, but what can be said in the other direction?
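This does not settle the general question, but in finite dimensions one can at least experiment: with respect to chosen bases, [tex]f_1 \otimes f_2[/tex] is represented by the Kronecker product of the matrices of [tex]f_1[/tex] and [tex]f_2[/tex], and the rank of a Kronecker product is the product of the ranks. The sketch below uses numpy purely for illustration and checks this on a pair of injective matrices.

[code]
# Numerical sketch, assuming finite-dimensional spaces: f1 (x) f2 is
# represented by the Kronecker product np.kron(A, B), and
# rank(A (x) B) = rank(A) * rank(B), so two full-column-rank (injective)
# matrices give a full-column-rank Kronecker product.
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [2.0, 3.0]])          # 3x2, rank 2: an injective f1
B = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])          # 3x2, rank 2: an injective f2

K = np.kron(A, B)                   # 9x4 matrix representing f1 (x) f2

print(np.linalg.matrix_rank(A))     # 2
print(np.linalg.matrix_rank(B))     # 2
print(np.linalg.matrix_rank(K))     # 4 = 2 * 2: full column rank, so injective
[/code]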
 
  • #2
I suggest looking at a simple case, where f and g are linear functionals on the same vector space. Then they are given by a pair of (dual) vectors, say v and w, and an element of the tensor product of the vector space with itself is essentially a matrix, say A.
Then you will get something like

[tex](f\otimes g)(A)=\langle v,Aw\rangle[/tex]

What can you deduce in this simple case?

I hope my reasoning is roughly correct, but I was making just quick intuitive guesses.
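As a concrete version of this suggestion, here is a sketch assuming a finite-dimensional space, so that elements of [tex]V \otimes V[/tex] become [tex]n \times n[/tex] matrices; numpy and scipy are used only for illustration.

[code]
# Identify an element of V (x) V with an n x n matrix A, and the functionals
# f, g with vectors v, w, so that (f (x) g)(A) = <v, A w>.
import numpy as np
from scipy.linalg import null_space

n = 3
rng = np.random.default_rng(0)
v = rng.standard_normal(n)          # vector representing the functional f
w = rng.standard_normal(n)          # vector representing the functional g

# With row-major flattening of A, the map A -> <v, A w> is given by the
# 1 x n^2 matrix kron(v, w) acting on A.flatten().
M = np.kron(v, w)[None, :]

ker = null_space(M)                  # orthonormal basis of ker(f (x) g)
print(ker.shape[1])                  # n**2 - 1 = 8: a hyperplane of matrices

# "Obvious" kernel elements are pure tensors x (x) y, i.e. rank-one matrices
# outer(x, y) with f(x) = 0 or g(y) = 0, but the kernel also contains
# matrices that are not pure tensors of this form.
x = null_space(v[None, :])[:, 0]     # some x with f(x) = 0
A = np.outer(x, rng.standard_normal(n))
print(np.isclose(v @ A @ w, 0.0))    # True
[/code]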
 

1. What is the kernel of a tensor product?

The kernel of a tensor product of linear maps is the set of all elements of the tensor product space that are mapped to zero. In other words, for maps f_1 and f_2 as above, it is the set of elements of V_1 ⊗ V_2 that f_1 ⊗ f_2 sends to the zero element of U_1 ⊗ U_2.

2. How is the kernel of a tensor product used in scientific research?

The kernel of a tensor product is used in various fields of science, including physics, computer science, and engineering. It is especially useful in the study of linear transformations and vector spaces, and can be applied to problems involving matrices and tensors.

3. Can the kernel of a tensor product be computed?

Yes, for linear maps between finite-dimensional spaces the kernel of a tensor product can be computed with standard linear algebra. With respect to product bases, the tensor product map is represented by the Kronecker product of the two matrices, and its kernel is the null space of that matrix: the set of all vectors it sends to the zero vector.
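A minimal computational sketch of this, assuming the maps are given as matrices (numpy and scipy are used for illustration):

[code]
# The matrix of f1 (x) f2 with respect to product bases is the Kronecker
# product np.kron(A, B); its kernel is the null space of that matrix.
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # rank 1, so ker(A) is one-dimensional
B = np.eye(2)                       # rank 2, so ker(B) = {0}

K = np.kron(A, B)                   # 4x4 matrix of the tensor-product map
ker = null_space(K)                 # orthonormal basis of its kernel

print(ker.shape)                    # (4, 2): the kernel is two-dimensional
[/code]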

4. What is the relationship between the kernel of a tensor product and its dimension?

The dimension of the kernel follows from the rank-nullity theorem: it equals the number of columns of the matrix representing the tensor product map minus the rank of that matrix, where the rank is the number of linearly independent columns.
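For example, a quick check in the same spirit, with made-up small matrices:

[code]
# Rank-nullity for a Kronecker product: dim ker = number of columns - rank.
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])     # 2x3, rank 2
B = np.array([[1.0, 2.0]])          # 1x2, rank 1

K = np.kron(A, B)                   # 2x6, rank 2 * 1 = 2
rank = np.linalg.matrix_rank(K)

print(K.shape[1] - rank)            # 6 - 2 = 4
print(null_space(K).shape[1])       # 4, matching columns minus rank
[/code]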

5. How does the kernel of a tensor product relate to other mathematical concepts?

The kernel of a tensor product is closely related to the concepts of null space and linear independence. It is also connected to eigenvectors and eigenvalues: for square matrices, the eigenvalues of the Kronecker product A ⊗ B are the pairwise products of the eigenvalues of A and B, and the kernel is exactly the eigenspace for the eigenvalue zero.
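As a brief illustration of the eigenvalue statement (a sketch for square matrices, using made-up diagonal examples):

[code]
# The eigenvalues of kron(A, B) are the pairwise products of the eigenvalues
# of A and B; the zero eigenvalues correspond to the kernel.
import numpy as np

A = np.diag([0.0, 2.0])             # eigenvalues 0 and 2
B = np.diag([1.0, 3.0])             # eigenvalues 1 and 3

K = np.kron(A, B)
print(np.sort(np.linalg.eigvals(K)))   # products: 0, 0, 2, 6

# The two zero eigenvalues match a two-dimensional kernel.
print(np.linalg.matrix_rank(K))        # 2, so dim ker = 4 - 2 = 2
[/code]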
