Understanding tensor contraction

In summary, the original poster is teaching themselves algebraic tensors and trying to convince themselves of the commonly used tensor index laws. They are confused about the definition of the transpose of the identity map and the evaluation map.
  • #1
jdstokes
Hi all,

I'm teaching myself the algebraic side of tensors and I was wondering if you would be able to clarify a few things for me.

I prefer to think of a tensor in the set-theoretic manner: as a multilinear map taking several copies of a vector space and its dual space to the base field.

I'm trying to convince myself about all of the commonly used tensor index laws from this. If we consider the basis vectors of each space as tensors themselves, then it is clear that a basis for the (k,l) tensor space can be constructed by suitable tensor products of the basis vectors and their duals.

Consider any basis vector [itex]\hat{\theta}^{(\alpha)}[/itex] of the dual space. This basis vector forms a (0,1) tensor
[itex]\hat{\theta}^{(\alpha)} : T_p \to \mathbb{R}[/itex] such that [itex]\hat{\theta}^{(\alpha)}: \hat{e}_{(\beta)} \mapsto 1[/itex] if [itex]\alpha = \beta[/itex] and zero otherwise.

If we take the tensor product [itex]\hat{\theta}^{(\alpha)} \otimes \hat{e}_{(\beta)}[/itex] we get a (1,1) tensor [itex]T_p^\ast \times T_p \to \mathbb{R}; (\omega,v) \mapsto \hat{\theta}^{(\alpha)}(v)\hat{e}_{(\beta)}(\omega) = v^\alpha \hat{e}_{(\beta)}(\omega) [/itex].

Now I think we can say that [itex]\hat{e}_{(\beta)}(\omega) = \omega_\beta[/itex] if we treat [itex]T_p[/itex] as the dual space of [itex]T_p^\ast[/itex], which gives [itex]\hat{\theta}^{(\alpha)}(v)\hat{e}_{(\beta)}(\omega) = v^\alpha \omega_\beta[/itex].
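
For concreteness, evaluating this tensor on basis elements (just unwinding the definitions above) gives
[tex](\hat{\theta}^{(\alpha)} \otimes \hat{e}_{(\beta)})(\hat{\theta}^{(\mu)}, \hat{e}_{(\nu)}) = \hat{\theta}^{(\alpha)}(\hat{e}_{(\nu)})\,\hat{e}_{(\beta)}(\hat{\theta}^{(\mu)}) = \delta^\alpha_\nu \, \delta^\mu_\beta.[/tex]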

I'm not exactly sure how to show that this is the Kronecker delta. Is this just the way the Kronecker delta is defined?

Thanks
 
  • #2
Well, your trouble is due to the fact that that's not the Kronecker delta!


You have the identity map [itex]V \to V[/itex].

The Kronecker delta is a transpose of the identity map: it's a map [itex]\mathbb{R} \to V^* \otimes V[/itex], which is effectively the same thing as choosing an element of [itex] V^* \otimes V[/itex]. With respect to a basis and its dual, that element is
[tex]\sum_{i = 1}^n \hat{\theta}^i \otimes \hat{e}_i.[/tex]

We get the evaluation map if we transpose the other way to get a map [itex]V^* \otimes V \to \mathbb{R}[/itex].
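
As a sanity check, feeding the element above into that evaluation map contracts each [itex]\hat{\theta}^i[/itex] with its partner [itex]\hat{e}_i[/itex]:
[tex]\sum_{i=1}^n \hat{\theta}^i(\hat{e}_i) = \sum_{i=1}^n 1 = n = \dim V.[/tex]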
 
  • #3
Hi Hurkyl,

Thanks for replying.

First of all, how do you define the transpose of the identity map [itex]\mathrm{id} : V \to V[/itex]?

I assume by [itex]V^\ast \otimes V[/itex] you mean the vector space of (1,1) tensors?

I'm not sure what you mean in that last line. How do you define the evaluation map?
 
  • #4
By the way, how can the Kronecker delta be a map [itex]\mathbb{R} \to V^* \otimes V[/itex]? Isn't it a tensor, and so a map from the vector space and its dual?
 
  • #5
Another related question: if you take a (1,1) tensor [itex]T^\mu_\nu[/itex] and multiply it tensorially with a (1,0) tensor [itex]V^\nu[/itex], you get a (1,0) tensor by tensor contraction, right?

So let's see this (note I'm using Einstein summation everywhere)

[itex](T^\mu_\nu \hat{e}_{(\mu)} \otimes \hat{\theta}^{(\nu)}) \otimes (V^\nu \hat{e}_{(\nu)}) = T^\mu_\nu V^\nu \hat{e}_{(\mu)} \otimes \hat{\theta}^{(\nu)}\otimes \hat{e}_{(\nu)}[/itex]

so one would expect that

[itex]\hat{\theta}^{(\nu)}\otimes \hat{e}_{(\nu)} : (v,\omega) \mapsto 1 \; \forall (v,\omega) \in T_p \times T_p^\ast[/itex]

but this implies [itex]v^\nu \omega_\nu = 1[/itex] which seems like it must be wrong.
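
A quick numeric check of what that summed tensor actually does (a NumPy sketch; the particular components are just for illustration):

[code]
import numpy as np

n = 3
rng = np.random.default_rng(0)
v = rng.standard_normal(n)      # components v^nu of a vector
omega = rng.standard_normal(n)  # components omega_nu of a dual vector

# The summed tensor theta^(nu) (x) e_(nu) has components delta^nu_mu:
# the identity matrix in this basis.
delta = np.eye(n)

# Evaluating it on (v, omega) feeds v into the theta slot and omega
# into the e slot, then sums over nu.
value = np.einsum('ij,i,j->', delta, v, omega)

print(value)             # equals v . omega ...
print(np.dot(v, omega))  # ... which is not 1 in general
[/code]

So numerically the sum returns [itex]v^\nu \omega_\nu[/itex], not the constant 1.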
 
  • #6
Yes; [itex]V^* \otimes V[/itex] would be the space of all (1,1) tensors. But there's a little more information there: it remembers that the dual space is the left factor. In index notation, you'd write an element as [itex]A_i{}^j[/itex]. It is, of course, "naturally isomorphic" to [itex]V \otimes V^*[/itex], but the distinction is there, should it matter to you.

(I'm assuming all vector spaces are finite dimensional)

For [itex]\omega \in V^*[/itex], and [itex]v \in V[/itex], the evaluation map [itex]\epsilon_V : V^* \otimes V \to \mathbb{R}[/itex] is defined by [itex]\epsilon_V(\omega \otimes v) = \omega(v)[/itex].
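
On basis elements, for example, this reads [itex]\epsilon_V(\hat{\theta}^i \otimes \hat{e}_j) = \hat{\theta}^i(\hat{e}_j) = \delta^i_j[/itex].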


If you have a map [itex]A : V \to W^*[/itex], you obtain a corresponding map [itex]B : V \otimes W \to \mathbb{R}[/itex] by:
[tex]B(v \otimes w) = A(v)(w).[/tex]
(note that A(v) is a dual vector on W, so we can evaluate it at w)
And we can go in the reverse direction:
[tex]A(v)(w) = B(v \otimes w).[/tex]
This can be generalized: the "Hom-[itex]\otimes[/itex] adjunction". Its statement involves the vector spaces [itex]\hom(V, W)[/itex] of linear maps from V to W.
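
A minimal sketch of the two directions in code (plain Python, treating vectors as component lists; the particular bilinear map B is just an example):

[code]
# B : V (x) W -> R, determined by its values on elementary tensors v (x) w,
# represented here as a function of the two component lists.
def B(v, w):
    return sum(vi * wi for vi, wi in zip(v, w))  # an example bilinear map

# The corresponding A : V -> W*, via A(v)(w) = B(v (x) w).
def A(v):
    return lambda w: B(v, w)  # A(v) is a dual vector on W

# And back again: B(v (x) w) = A(v)(w).
v, w = [1.0, 2.0], [3.0, 4.0]
assert A(v)(w) == B(v, w)  # both give 11.0
[/code]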

We can also dualize: identifying [itex]\mathbb{R}^*[/itex] with [itex]\mathbb{R}[/itex], the dual of B is a map [itex]B^* : \mathbb{R} \to (V \otimes W)^*[/itex]:
[tex]B^*(1) = B[/tex]



Everything I've said above ignores the question about what tensors really "are". There are natural isomorphisms between, for example,
(1) Elements of [itex]V \otimes W[/itex]
(2) Linear maps [itex]V^* \otimes W^* \to \mathbb{R}[/itex]
(3) Bilinear maps [itex]V^* \times W^* \to \mathbb{R}[/itex]
(4) Linear maps [itex]\mathbb{R} \to V \otimes W[/itex]
(5) Linear maps [itex]V^* \to W[/itex]
(6) Linear maps [itex]W^* \to V[/itex]

I always find the sheer number of ways to interpret a tensor like this somewhat bewildering! :frown: So, you can think of tensors as multilinear maps if you want, but the tensor algebra is somewhat indifferent to the representation.
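
A small NumPy illustration of how several of these views are the same bookkeeping (a sketch; T is an arbitrary example):

[code]
import numpy as np

# An element of V (x) W with dim V = 2, dim W = 3, stored by components.
T = np.arange(6.0).reshape(2, 3)

alpha = np.array([1.0, -1.0])     # a dual vector on V
beta = np.array([2.0, 0.0, 1.0])  # a dual vector on W

# View (3): a bilinear map V* x W* -> R.
print(np.einsum('ij,i,j->', T, alpha, beta))

# View (5): a linear map V* -> W (feed in alpha, get an element of W).
print(np.einsum('ij,i->j', T, alpha))

# View (6): a linear map W* -> V (feed in beta, get an element of V).
print(np.einsum('ij,j->i', T, beta))
[/code]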
 
  • #7
Hurkyl said:
...
Everything I've said above ignores the question about what tensors really "are". There are natural isomorphisms between, for example,
(1) Elements of [itex]V \otimes W[/itex]
(2) Linear maps [itex]V^* \otimes W^* \to \mathbb{R}[/itex]
(3) Bilinear maps [itex]V^* \times W^* \to \mathbb{R}[/itex]
(4) Linear maps [itex]\mathbb{R} \to V \otimes W[/itex]
(5) Linear maps [itex]V^* \to W[/itex]
(6) Linear maps [itex]W^* \to V[/itex]

I always find the sheer number of ways to interpret a tensor like this somewhat bewildering! :frown: So, you can think of tensors as multilinear maps if you want, but the tensor algebra is somewhat indifferent to the representation...

I don't know if you are really confused about tensors or just wanted to show that they are "slippery" and hard to pin down :)
But this is my view:

First you have to define the tensor product of two vector spaces, [itex]V \otimes W[/itex], somehow. One of the common ways is your (3): define [itex]V \otimes W[/itex] as the space of bilinear maps [itex]V^* \times W^* \to \mathbb{R}[/itex].
(If you are playing with vector spaces over the field of real numbers.)

Only after that does your (2) have meaning (remember, [itex]V[/itex] is any vector space, so it can be the dual space of some other space if you wish...), because now you have defined what that [itex]\otimes[/itex] means.

And only after some definition of [itex]V \otimes W[/itex] can you call its elements tensors ... (1).

(4) is related to a common laziness of physicists: if a physicist says tensor, (s)he actually means tensor field...
And a tensor field is defined as a map "from real space to some tensor space".

Numbers (5) and (6) are tensor spaces too, but somewhat trivial ones - they are just (1,1) tensor spaces (relative to the spaces [itex]V[/itex] and [itex]W[/itex]; of course it could be that [itex]W = V^* \otimes V^* \otimes V^*[/itex]).
 
  • #8
I find it interesting that you come up with such vastly different qualitative descriptions of those 6 spaces -- despite the fact that they are all 'the same'.

(In the same sense that V and [itex]V^{**}[/itex] are 'the same')
 
  • #9
Tensors are combinations of vectors and functions on vectors. Contracting means you have one of each type and you evaluate the function on the vector (or multivector).
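
In symbols, for example: contraction sends [itex]\omega \otimes v \mapsto \omega(v)[/itex], or in components [itex]T^\mu{}_\nu \mapsto T^\mu{}_\mu[/itex] (summed over [itex]\mu[/itex]).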
 
  • #10
Hurkyl said:
(In the same sense that V and [itex]V^{**}[/itex] are 'the same')

I guess you mean this:
[itex]\forall A \in V^{**} \exists ! a \in V: \forall \alpha \in V^*: A(\alpha)=\alpha(a)[/itex]

OK, if you have somehow defined the tensor product of two vector spaces - [itex]V \otimes W[/itex] - then by the above, plus the abstractness of all the vector spaces used (V and W can stand for anything), you can do magic... :-)

(And if you define contraction of tensors and use the representation theorem for linear 1-forms, things are even more interesting...)

I think that everything depends on which way you define [itex]V \otimes W[/itex].

And excuse me for my mistake: (4) is of course not a general tensor field; I simply overlooked the linearity of the map.
 
  • #11
mathwonk said:
Tensors are combinations of vectors and functions on vectors. Contracting means you have one of each type and you evaluate the function on the vector (or multivector).
You mean linear functions on vectors - and those I prefer to call forms.
And please don't talk about contraction before you define what a tensor is... Because if you do (and people do), then everything related to tensors shrinks to manipulation of components...
And still I find your description somewhat inaccurate: you should specify what your vectors mean in every single place in your statement. (E.g. [itex]\alpha \in V^*[/itex] is a one-form on [itex]V[/itex] but a vector in [itex]V^*[/itex].)
 

1. What is tensor contraction?

Tensor contraction is an operation from multilinear algebra that pairs one contravariant (upper) index of a tensor with one covariant (lower) index and sums over that pair. It reduces a (k, l) tensor to a (k-1, l-1) tensor; contracting a (1,1) tensor, for instance, yields a single scalar value.
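
For example, contracting the two indices of a (1,1) tensor gives its trace (a NumPy sketch):

[code]
import numpy as np

T = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # a (1,1) tensor T^mu_nu

# Contracting mu with nu sums the diagonal components T^mu_mu.
print(np.einsum('ii->', T))  # 5.0, the trace
[/code]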

2. Why is tensor contraction important?

Tensor contraction is important because it allows for the simplification and manipulation of complex tensor equations. It also plays a crucial role in many areas of physics and engineering, such as in the study of fluid mechanics and electromagnetism.

3. How is tensor contraction different from matrix multiplication?

Matrix multiplication is in fact a special case of tensor contraction: the column index of the first matrix is contracted with the row index of the second, which is why those two dimensions must match. General tensor contraction applies to arrays of any rank, and the only requirement is that the pair of indices being contracted range over the same dimension.
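
A NumPy sketch of matrix multiplication written as an explicit contraction:

[code]
import numpy as np

A = np.random.default_rng(1).standard_normal((2, 3))
B = np.random.default_rng(2).standard_normal((3, 4))

# Contract A's column index j with B's row index j; the contracted
# dimensions (3 here) must match.
C = np.einsum('ij,jk->ik', A, B)
assert np.allclose(C, A @ B)  # agrees with ordinary matrix multiplication
[/code]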

4. What are some real-world applications of tensor contraction?

Tensor contraction has a wide range of applications in various fields, including physics, engineering, and computer science. It is used in the study of fluid dynamics, quantum mechanics, image and signal processing, and machine learning algorithms, among others.

5. Are there any resources available for learning more about tensor contraction?

Yes, there are many resources available for learning more about tensor contraction. These include textbooks, online tutorials, and videos that explain the concept and provide examples of its applications. Additionally, there are also software packages and libraries that can perform tensor contraction calculations and help with understanding the concept in a practical way.
