Having trouble understanding Tensor Contraction

  • Thread starter: paperplane
  • Tags: Contraction, Tensor
AI Thread Summary
Tensor contraction reduces a tensor's rank, transforming a rank-2 tensor in three dimensions, which has nine elements, into a scalar, thereby raising questions about the loss of information. The discussion highlights that while this process seems to discard data, it actually generates new information, similar to how a dot product operates. The legitimacy of tensor contraction is affirmed through its utility in producing invariant quantities, essential in fields like general relativity. The Ricci scalar, resulting from such contractions, provides insights into curvature, although the complete loss of information during this process remains a topic of intrigue. Ultimately, tensor contraction is a fundamental operation that balances information loss with the generation of meaningful scalar values.
paperplane
TL;DR Summary
Having trouble understanding tensor contraction
I'm having trouble understanding tensor contraction. For example, for something like
##A^{\mu\nu}B_{\nu\mu}##, would this be equal to some scalar?
 
Yes.
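A minimal numerical sketch of that double contraction, using hypothetical Cartesian components (so upper and lower index placement does not change the numbers): summing ##A^{\mu\nu}B_{\nu\mu}## over both index pairs is the same as taking the trace of the matrix product, and it leaves a single number.

```python
import numpy as np

# Hypothetical rank-2 tensors in 3 dimensions (Cartesian components,
# so raising/lowering indices does not change the entries).
A = np.arange(9.0).reshape(3, 3)
B = np.random.default_rng(0).normal(size=(3, 3))

# Full contraction A^{mu nu} B_{nu mu}: sum over both index pairs.
scalar = np.einsum('mn,nm->', A, B)

# The same number written as the trace of the matrix product A B.
print(scalar, np.trace(A @ B))
```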
 
Since this topic was just started: I also have problems understanding tensor contraction. Not the process, but the motivation behind it. If you have a tensor of rank 2, let's say in 3 dimensions, then this tensor contains 9 elements. If you contract it to a scalar, you are left with one element. Why is this legitimate? What happens to all the information that the original tensor contained? You don't need it, you don't want it, you just throw it out?
 
Rick16 said:
Since this topic was just started: I also have problems understanding tensor contraction. Not the process, but the motivation behind it. If you have a tensor of rank 2, let's say in 3 dimensions, then this tensor contains 9 elements. If you contract it to a scalar, you are left with one element. Why is this legitimate? What happens to all the information that the original tensor contained? You don't need it, you don't want it, you just throw it out?
The simplest example of tensor contraction is the inner product of a vector with itself, which gives its squared length:
$$a^2 = a^{\mu}a_{\mu}$$That's occasionally useful.
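A minimal numerical sketch of that contraction, assuming a Minkowski metric with signature (+,−,−,−) purely for illustration: four components of ##a^\mu## go in, one frame-independent number comes out.

```python
import numpy as np

# Minkowski metric with signature (+, -, -, -); an assumption for illustration.
eta = np.diag([1.0, -1.0, -1.0, -1.0])

a_up = np.array([2.0, 1.0, 0.0, 0.0])   # contravariant components a^mu
a_down = eta @ a_up                      # lowered components a_mu

# The contraction a^mu a_mu: a single invariant number.
print(a_up @ a_down)   # 2^2 - 1^2 = 3.0
```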
 
PeroK said:
The simplest example of tensor contraction is the inner product of a vector with itself, which gives its squared length:
$$a^2 = a^{\mu}a_{\mu}$$That's occasionally useful.
Thank you! This answer is enormously helpful.
 
This is my pragmatic take on tensors and contractions: they are a way to ensure that you have an object that transforms in a certain way. One special case is the object that does not transform at all, i.e. it is invariant.

For instance, if I know that ##T^{\mu \nu}## is a rank-2 contravariant tensor, and ##a_\alpha## is a rank-1 covariant tensor, then I know that ##T^{\mu \nu}a_\mu## will transform as a rank-1 contravariant tensor, and that ##T^{\mu \nu}a_\mu a_\nu## will be invariant (a scalar).
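A small numerical check of that last claim, sketched for a rotation in ordinary 3D Euclidean space with hypothetical components (for an orthogonal transformation the matrix acting on lower indices coincides with the one acting on upper indices):

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.normal(size=(3, 3))   # components T^{mu nu}
a = rng.normal(size=3)        # components a_mu

# A rotation about the z-axis; since R is orthogonal, R^{-T} = R,
# so upper and lower indices transform with the same matrix here.
th = 0.7
R = np.array([[np.cos(th), -np.sin(th), 0.0],
              [np.sin(th),  np.cos(th), 0.0],
              [0.0,         0.0,        1.0]])

T_new = np.einsum('am,bn,mn->ab', R, R, T)   # T'^{ab} = R^a_m R^b_n T^{mn}
a_new = R @ a                                # a'_a

# T^{mu nu} a_mu a_nu comes out the same in both frames: it is invariant.
print(np.einsum('mn,m,n->', T, a, a))
print(np.einsum('mn,m,n->', T_new, a_new, a_new))
```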
 
Here it is important to be pedantic: ##T^{\mu \nu}## is not a tensor but the contravariant components of a 2nd-rank tensor. If ##v_{\alpha}## are rank-1 covariant tensor components, then ##T^{\mu \nu} v_{\nu}## are contravariant rank-1 tensor components (aka contravariant vector components).

Tensors are multilinear functions, which map ##m## vectors and ##n## dual vectors to the real numbers (taking the case of real vector spaces, as needed in classical mechanics and relativity). They are completely independent of the choice of any basis and the corresponding dual basis. They are invariant under basis transformations, and the rules for how to transform the corresponding components follow from this invariance.
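As a sketch of that last point: for a type-(0,2) tensor ##T## and a change of basis ##\mathbf{e}'_\mu = {\Lambda^\nu}_\mu \mathbf{e}_\nu## (notation chosen here only for illustration), multilinearity alone fixes how the components transform:
$$T'_{\mu\nu} = T(\mathbf{e}'_\mu, \mathbf{e}'_\nu) = {\Lambda^\alpha}_\mu {\Lambda^\beta}_\nu\, T(\mathbf{e}_\alpha, \mathbf{e}_\beta) = {\Lambda^\alpha}_\mu {\Lambda^\beta}_\nu\, T_{\alpha\beta}.$$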
 
vanhees71 said:
Here it is important to be pedantic: ##T^{\mu \nu}## is not a tensor but the contravariant components of a 2nd-rank tensor.
True that :)
 
Rick16 said:
Why is this legitimate?
Is a dot product legitimate?
 
  • #10
Vanadium 50 said:
Is a dot product legitimate?
Actually, comment #4 already answered my question. I did not think of the dot product between two vectors as a tensor contraction. When I take the dot product of a vector with itself, I lose all the directional information of the vector, but I obtain new information. Before #4 I only saw the loss of information as a result of a tensor contraction, and therefore I wondered about the legitimacy of the process. But since new information is produced in the process, I now see the use of it (and the legitimacy).
 
  • #11
I want to come back to my question about the meaning behind tensor contractions. I just finished Susskind’s theoretical minimum volume 4. On page 325 he writes: “I don’t know any particular physical significance or geometric significance to the Ricci tensor or the curvature scalar.” This then means that we also don’t know what information gets lost when the Riemann tensor is contracted to the Ricci tensor and further down to the Ricci scalar. I actually had this contraction in the back of my mind when I asked my question in #3. We don’t know what information gets lost, but we contract anyway and hope for the best, i.e. we hope that the information that we need the resulting tensor to contain does not get kicked out in the process? Is this what happens here? And then we contract even further until we are left with a single number. I find this particularly intriguing. So the Ricci scalar is just some number that says something about the curvature? Is this all that I need to know about it? Is this all that I can know about it?
 
  • #12
Rick16 said:
So the Ricci scalar is just some number that says something about the curvature? Is this all that I need to know about it? Is this all that I can know about it?
Certainly not. For example:
  • In 2D: the full Riemann tensor is expressible entirely in terms of the Ricci scalar.
  • In 3D: the full Riemann tensor is expressible entirely in terms of the Ricci tensor and scalar.
  • In any dimension: "When the scalar curvature is positive at a point, the volume of a small geodesic ball about the point has smaller volume than a ball of the same radius in Euclidean space. On the other hand, when the scalar curvature is negative at a point, the volume of a small ball is larger than it would be in Euclidean space." (https://en.wikipedia.org/wiki/Scalar_curvature); and: "Geometrically, the Ricci curvature is the mathematical object that controls the growth rate of the volume of metric balls in a manifold." (https://mathworld.wolfram.com/RicciCurvatureTensor.html); and: "The Ricci tensor can be characterized by measurement of how a shape is deformed as one moves along geodesics in the space. In general relativity, which involves the pseudo-Riemannian setting, this is reflected by the presence of the Ricci tensor in the Raychaudhuri equation." (https://en.wikipedia.org/wiki/Ricci_curvature)
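For concreteness, the contractions being discussed are (with one common index convention; conventions differ between textbooks)
$$R_{\mu\nu} = {R^\alpha}_{\mu\alpha\nu}, \qquad R = g^{\mu\nu} R_{\mu\nu},$$
and the identity behind the first bullet is that in two dimensions
$$R_{\alpha\beta\gamma\delta} = \frac{R}{2}\left(g_{\alpha\gamma}g_{\beta\delta} - g_{\alpha\delta}g_{\beta\gamma}\right).$$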
 
  • #13
Tensor contraction: if we have the tensor space ##E^p_q## over ##E## for ##p>0## and ##q>0##, then for every ##1 \le i \le p## and ##1 \le j \le q## (##p## counts the upper indices, ##q## the lower ones) there is a linear projection ##C^i_j : E^p_q \to E^{p-1}_{q-1}## such that to every decomposable tensor ##z = v_1 \otimes \dots \otimes v_p \otimes v'^1 \otimes \dots \otimes v'^q## it attaches $$C^i_j(z) := \langle v_i , v'^j \rangle\, v_1 \otimes \dots \otimes v_{i-1} \otimes v_{i+1} \otimes \dots \otimes v_p \otimes v'^1 \otimes \dots \otimes v'^{j-1} \otimes v'^{j+1} \otimes \dots \otimes v'^q.$$
Next question: as someone explained earlier, the curvature (Riemann-Christoffel) tensor is expressible in terms of the Levi-Civita connection; the term "covariant derivative" appears for the first time in the work of G. Ricci.
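A numerical sketch of that projection in a small case: a hypothetical type-(2,1) tensor in three dimensions, stored with axis order (first upper, second upper, lower), contracted over its first upper slot and its lower slot.

```python
import numpy as np

# Hypothetical type-(2,1) tensor: axes (upper index a, upper index b, lower index c).
T = np.arange(27.0).reshape(3, 3, 3)

# The contraction C^1_1 pairs the first upper index with the lower index,
# i.e. it forms T^{ab}_{a}, leaving a type-(1,0) tensor with one free index b.
contracted = np.einsum('aba->b', T)

print(contracted.shape)   # (3,): one upper and one lower slot removed
print(contracted)
```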
 
  • #14
arturwojciechowicz said:
Tensor contraction: if we have the tensor space ##E^p_q## over ##E## for ##p>0## and ##q>0##, then for every ##1 \le i \le p## and ##1 \le j \le q## there is a linear projection ##C^i_j : E^p_q \to E^{p-1}_{q-1}## such that to every decomposable tensor ##z = v_1 \otimes \dots \otimes v_p \otimes v'^1 \otimes \dots \otimes v'^q## it attaches ##C^i_j(z) := \langle v_i , v'^j \rangle\, v_1 \otimes \dots \otimes v_{i-1} \otimes v_{i+1} \otimes \dots##
Next question: as someone explained earlier, the curvature (Riemann-Christoffel) tensor is expressible in terms of the Levi-Civita connection; the term "covariant derivative" appears for the first time in the work of G. Ricci.
See the guide for using Latex for mathematics:

https://www.physicsforums.com/help/latexhelp/
 
  • #15
That's an interesting problem: first tensor contraction, then the Ricci tensor, and most interesting of all the connection. I'll try to be better prepared.
 
