How Does Tensor Index Contraction Work?

SUMMARY

The discussion centers on tensor index contraction, specifically addressing how repeated indices in tensor equations imply summation rather than cancellation. Participants clarify that repeated indices are bound variables, so the number of free indices determines how many scalar equations an expression represents. The Einstein summation convention is emphasized: a matching upper and lower index pair indicates an implied summation. The conversation concludes that contracted indices are summed over rather than erased, and that when a tensor has more than one lower index, the notation must make clear which index is being contracted.

PREREQUISITES
  • Understanding of tensor notation and operations
  • Familiarity with the Einstein summation convention
  • Knowledge of dual vectors and tangent spaces
  • Basic principles of General Relativity
NEXT STEPS
  • Study the implications of the Einstein summation convention in tensor calculus
  • Learn about the properties of dual vectors and their transformations
  • Explore examples of tensor contractions in General Relativity
  • Investigate the role of free and bound indices in tensor equations
USEFUL FOR

Students and professionals in physics, particularly those studying General Relativity, mathematicians working with tensors, and anyone interested in advanced mathematical concepts related to tensor calculus.

whatisreality

Homework Statement


We've been told there's this operation called 'contraction' where if you have a superscript and a subscript that are the same they cancel. I don't understand how that works, partly in the sense that we haven't got round to what the superscripts and subscripts actually mean, and partly because at first, having a superscript and subscript that were repeated meant summing over that index. So I'm not sure now when something has to be summed over and when you can just drop them... it came up in the context of this question:

Given that ##S_{\mu\nu}^{\mu} T^{\nu\rho}_{\rho\sigma} = U_{\sigma}##

How many terms are in each equation in spacetime (i.e. in 4 dimensions)?

Homework Equations

The Attempt at a Solution


If all those indices that are the same can be contracted then I'm left with just ##T_{\sigma} = U_{\sigma}## which is a very reasonable looking equation but there'd only be one term in each equation (and four equations). So which indices are the ones I can't actually contract? Is it the ##\nu## that doesn't go away because it's attached to different tensors? Because in our lecture notes there is an example where the ##\mu## in ##S^{\mu}T_{\mu}## gets contracted.

Thank you for any help, I really appreciate it. :)
 
whatisreality said:
the same they cancel
They do not cancel, but they are summed over. This means that they do not need to appear on both sides of the equation. You cannot just take the indices away from the tensor components.

The sub- vs superscript distinction has to do with which basis vectors you are using (or, more precisely, whether you are dealing with a vector in the tangent space or in its dual - this might depend on your level). This is the reason that you can only contract subscripts with superscripts. Subscripts and superscripts have opposite transformation properties, so the end result will have the appropriate transformation properties given by the free indices.

The contraction ##S_\mu T^\mu## is just the product of a dual vector with a tangent vector, which is a scalar. Scalars do not transform and so the contraction does have the right transformation properties.
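To see the implied sum concretely, here is a minimal numerical sketch (the component values are made up for illustration, and numpy is only used as a convenient way to write out the sum):

```python
import numpy as np

# Made-up components of a dual vector S_mu and a tangent vector T^mu
# in some chosen basis, with the index running over 0..3.
S = np.array([1.0, 2.0, 0.5, -1.0])   # S_mu
T = np.array([3.0, 0.0, 1.0, 2.0])    # T^mu

# S_mu T^mu means "sum over the repeated index mu":
explicit = sum(S[mu] * T[mu] for mu in range(4))

# np.einsum uses the same convention: a repeated label is summed over,
# and no labels after '->' means no free indices remain (a scalar).
contracted = np.einsum('m,m->', S, T)

print(explicit, contracted)  # both give the same single number
```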
 
Have they taught you the Einstein summation convention, which is that, where a variable name (ie letter) occurs both in an upper and a lower index position in a formula, there is an implied summation of the formula over all possible values of that variable?

I don't like the use of the word 'cancel', as it seems to me to imply the wrong thing. We certainly can't just erase the matched indices! What we can say is that we regard any variable names that have a match of the above type and hence generate an implied sum as bound variables, which means they are summed over. So if a formula has ##n## 'free' (ie unbound) variables in index positions, it actually specifies ##k^n## scalar equations where ##k## is the number of dimensions of the manifold - which will be four in General Relativity.

In some cases the number of equations can be further reduced by using additional information, such as symmetry of some of the tensors, but ##k^n## is the starting number.
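As a concrete illustration of the counting (using a simpler equation than the one in the problem statement): in ##A^\mu{}_\nu B^\nu = C^\mu## the index ##\nu## is bound and ##\mu## is free, so in four dimensions it stands for ##4^1 = 4## scalar equations,
$$A^0{}_\nu B^\nu = C^0,\qquad A^1{}_\nu B^\nu = C^1,\qquad A^2{}_\nu B^\nu = C^2,\qquad A^3{}_\nu B^\nu = C^3,$$
each of which is itself a sum of four terms, e.g. ##A^0{}_0 B^0 + A^0{}_1 B^1 + A^0{}_2 B^2 + A^0{}_3 B^3 = C^0##.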
 
Orodruin said:
They do not cancel, but they are summed over. [...]
All right, I need to look into that properly. We haven't discussed what a contraction actually is yet. What about the contraction ##S_{\nu}T^{\nu}_{\sigma}##? I'm probably using that word wrong, but what I mean is, is writing ##S_{\nu}T^{\nu}_{\sigma}## allowed?
 
andrewkirk said:
Have they taught you the Einstein summation convention [...]?
Yes, we've been taught the summation convention. Which is what's confusing me. The variable ##\mu## in my equation above can be dropped or can't be? The examples in my notes suggest it can. But it appears twice, so that means it's summed over. If we just remove them from the equation then you don't know to sum over them anymore. From what Orodruin said the ##\nu## can't be dropped, but again with the ##\rho## it could be.
There's only one free variable, so there are ##4^1 = 4## equations; I think I can see why that is. ##\sigma## isn't summed over, so you can only have one value of it per equation, but you can't just pick one value and ignore the other three possible values of ##\sigma##.
 
whatisreality said:
The variable ##\mu## in my equation above can be dropped or can't be?
It can't be. Even if we adopted a convention that, for a ##\begin{pmatrix}1\\2\end{pmatrix}## tensor with components ##S^\mu{}_{\nu\rho}##, writing ##S_\nu## denotes a component of the result of contracting the upper index with a lower index, it would be ambiguous, because we wouldn't know which of the two lower indices had been contracted - ie whether it meant ##S^\mu{}_{\mu\nu}## or ##S^\mu{}_{\nu\mu}## - and in general that makes a difference to the result.
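A quick numerical sketch of why the two choices differ (made-up components, with numpy only used to carry out the sums):

```python
import numpy as np

# Made-up components of a (1,2) tensor, stored as S[mu, nu, rho] for S^mu_{nu rho}.
rng = np.random.default_rng(0)
S = rng.normal(size=(4, 4, 4))

# Contract the upper index with the first lower index: S^mu_{mu nu}
first = np.einsum('aab->b', S)

# Contract the upper index with the second lower index: S^mu_{nu mu}
second = np.einsum('aba->b', S)

print(np.allclose(first, second))  # generally False: the two contractions differ
```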
whatisreality said:
I'm probably using that word wrong, but what I mean is, is writing ##S_{\nu}T^{\nu}_{\sigma}## allowed?
Yes, it means ##\sum_{\nu=1}^k S_{\nu}T^{\nu}{}_{\sigma}##.

Or, if using an index system that starts at 0 rather than 1 (usual in General Relativity) it would be ##\sum_{\nu=0}^{k-1} S_{\nu}T^{\nu}{}_{\sigma}##, which is a perfectly ordinary scalar formula.

In words, it is the formula for the ##\sigma##-th component of the dual vector that is the result of contracting dual vector ##\tilde S## with the ##\begin{pmatrix}1\\1\end{pmatrix}## tensor ##\mathbf T## on the upper index.
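Written as a small numerical sketch (again with made-up components), the bound index ##\nu## is summed over while the free index ##\sigma## survives:

```python
import numpy as np

# Made-up components: a dual vector S_nu and a (1,1) tensor T^nu_sigma, stored as T[nu, sigma].
S = np.array([1.0, -2.0, 0.0, 3.0])
T = np.arange(16.0).reshape(4, 4)

# S_nu T^nu_sigma: nu is repeated (summed), sigma is free,
# so the result is a 4-component object indexed by sigma.
U = np.einsum('n,ns->s', S, T)

# The same thing written as an explicit sum over the bound index nu:
U_explicit = np.array([sum(S[n] * T[n, s] for n in range(4)) for s in range(4)])

print(np.allclose(U, U_explicit))  # True
```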
 
andrewkirk said:
It can't be. [...]
Ok I think I understand. Definitely not as simple as just scrubbing any two symbols that are the same! I'll take a look at what the actual mathematical operation is. Thank you for your help and for such detailed answers, I really appreciate it!
 
