Tensor Contraction: Contracting ##\mu## with ##\alpha##?

SUMMARY

The discussion focuses on tensor contraction, specifically contracting the indices ##\mu## and ##\alpha## in the context of General Relativity (GR). Participants clarify that contraction means setting two indices equal and summing over them, as in taking a trace, rather than simply cancelling them. The worked examples include the identity ##R^{\alpha}{}_{\beta \alpha \nu} = \delta^{\mu}_{\alpha} R^{\alpha}{}_{\beta \mu \nu}## and contractions of the second Bianchi identity, where the key fact is that the metric is covariantly constant in GR. The final equations walk through the contraction step by step, ending with the contracted identity ##2 \nabla_\nu R^{\nu}{}_{\beta} - \nabla_\beta R = 0##.

PREREQUISITES
  • Understanding of tensor notation and indices in mathematics.
  • Familiarity with General Relativity concepts, particularly covariant derivatives.
  • Knowledge of matrix operations, specifically taking traces.
  • Proficiency in manipulating tensor equations and summation conventions.
NEXT STEPS
  • Study the properties of covariant derivatives in General Relativity.
  • Learn about the implications of metric constancy in tensor calculus.
  • Explore advanced tensor contraction techniques in higher-dimensional spaces.
  • Investigate the role of the Riemann curvature tensor in GR and its contractions.
USEFUL FOR

Mathematicians, physicists, and students specializing in General Relativity, tensor calculus, or advanced mathematical physics will benefit from this discussion.

unscientific
[Attached image: tensor4.png — the passage instructing the reader to contract ##\mu## with ##\alpha##]


What do they mean by 'Contract ##\mu## with ##\alpha##'? I thought only matching upper and lower indices can contract? For example ##A_\mu g^{\mu \nu} = A^\nu##.
 
unscientific said:
What do they mean by 'Contract ##\mu## with ##\alpha##'?
It means "replace ##\mu## by ##\alpha## -- or vice versa".
 
strangerep said:
It means "replace ##\mu## by ##\alpha## -- or vice versa".
How can we simply do that?
 
unscientific said:
How can we simply do that?
Suppose you have an ordinary matrix. If I told you to "take the trace of that matrix", what would that mean to you?
 
strangerep said:
Suppose you have an ordinary matrix. If I told you to "take the trace of that matrix", what would that mean to you?
Summing up the diagonal, i.e. ##\sum M_{ii}##.
 
unscientific said:
Summing up the diagonal, i.e. ##\sum M_{ii}##.
Correct.

So if I have a matrix equation like ##M_{ij} = 0##, then it is also true that ##\sum_i M_{ii} = 0##, or in summation convention notation, ##M_{ii} = 0##. One says that we have contracted ##i## with ##j##, though a slightly more helpful phrase might be to say that we have contracted over ##i,j##.

Similarly in your OP, except that it deals with contraction over 2 indices of a 4th rank tensor instead of a 2nd rank matrix.
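
As a concrete illustration (a minimal numerical sketch using NumPy, which is not part of the original discussion), contraction over a pair of slots of a higher-rank array is exactly the same "sum over a repeated index" operation as the matrix trace:

```python
import numpy as np

rng = np.random.default_rng(0)

# A matrix M_{ij}: contracting i with j is just the trace.
M = rng.normal(size=(4, 4))
assert np.isclose(sum(M[i, i] for i in range(4)), np.trace(M))

# A rank-4 array standing in for R^a_{bcd}: contracting the first and
# third slots (a with c) leaves a rank-2 object, R^a_{bad}.
R = rng.normal(size=(4, 4, 4, 4))
contracted = np.einsum('abad->bd', R)           # sum over the repeated label a
manual = sum(R[a, :, a, :] for a in range(4))   # the same thing as an explicit sum
assert np.allclose(contracted, manual)
```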
 
strangerep said:
Correct.

So if I have a matrix equation like ##M_{ij} = 0##, then it is also true that ##\sum_i M_{ii} = 0##, or in summation convention notation, ##M_{ii} = 0##. One says that we have contracted ##i## with ##j##, though a slightly more helpful phrase might be to say that we have contracted over ##i,j##.

Similarly in your OP, except that it deals with contraction over 2 indices of a 4th rank tensor instead of a 2nd rank matrix.
Do you mind showing the working of how they 'contracted the indices'? I think it's not as simple as cancelling them.
 
unscientific said:
Do you mind showing the working of how they 'contracted the indices'? I think it's not as simple as cancelling them.
$$R^{\alpha}{}_{\beta \alpha \nu} = \delta^{\mu}_{\alpha} R^{\alpha}{}_{\beta \mu \nu} = g_{\alpha \gamma}\, g^{\mu \gamma} R^{\alpha}{}_{\beta \mu \nu}$$
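
The two metric factors on the right are just the Kronecker delta in disguise; spelling the middle step out,
$$g_{\alpha \gamma}\, g^{\mu \gamma} = \delta^{\mu}_{\alpha}, \qquad \delta^{\mu}_{\alpha}\, R^{\alpha}{}_{\beta \mu \nu} = R^{\mu}{}_{\beta \mu \nu} \equiv R^{\alpha}{}_{\beta \alpha \nu},$$
where the last equality is only a relabelling of the dummy index.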
 
unscientific said:
Do you mind showing the working of how they 'contracted the indices'? I think it's not as simple as cancelling them.
Indeed, it is not index "cancellation".

Multiply both sides of the original equation by the 2 factors of ##g## indicated on the rhs of Samalkhaiat's post #8. Then the only remaining "trick" is realizing that you can pass them through the covariant derivative operators -- since the metric is assumed to be covariantly constant in GR, e.g., ##\nabla_\lambda g_{\alpha\gamma} = 0##, etc.
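
Spelled out on a single term (the same manipulation applies to each term of the Bianchi identity):
$$g_{\alpha \gamma}\, g^{\mu \gamma}\, \nabla_\lambda R^{\alpha}{}_{\beta \mu \nu} = \nabla_\lambda \left( g_{\alpha \gamma}\, g^{\mu \gamma}\, R^{\alpha}{}_{\beta \mu \nu} \right) = \nabla_\lambda \left( \delta^{\mu}_{\alpha}\, R^{\alpha}{}_{\beta \mu \nu} \right) = \nabla_\lambda R^{\alpha}{}_{\beta \alpha \nu},$$
which uses nothing beyond ##\nabla_\lambda g_{\alpha\gamma} = 0## and ##\nabla_\lambda g^{\mu\gamma} = 0##.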
 
  • #10
strangerep said:
Indeed, it is not index "cancellation".

Multiply both sides of the original equation by the 2 factors of ##g## indicated on the rhs of Samalkhaiat's post #8. Then the only remaining "trick" is realizing that you can pass them through the covariant derivative operators -- since the metric is assumed to be covariantly constant in GR, e.g., ##\nabla_\lambda g_{\alpha\gamma} = 0##, etc.

So, the working is

$$\nabla_\gamma R^{\mu}{}_{\nu \alpha \beta} + \nabla_\beta R^{\mu}{}_{\nu \gamma \alpha} + \nabla_\alpha R^{\mu}{}_{\nu \beta \gamma} = 0$$

$$\delta^{\alpha}_{\mu} \nabla_\gamma R^{\mu}{}_{\nu \alpha \beta} + \delta^{\alpha}_{\mu} \nabla_\beta R^{\mu}{}_{\nu \gamma \alpha} + \delta^{\alpha}_{\mu} \nabla_\alpha R^{\mu}{}_{\nu \beta \gamma} = 0$$

$$\nabla_\gamma\, g_{\mu \lambda}\, g^{\alpha \lambda} R^{\mu}{}_{\nu \alpha \beta} + \nabla_\beta\, g_{\mu \lambda}\, g^{\alpha \lambda} R^{\mu}{}_{\nu \gamma \alpha} + \nabla_\alpha\, g_{\mu \lambda}\, g^{\alpha \lambda} R^{\mu}{}_{\nu \beta \gamma} = 0$$

$$\nabla_\gamma R^{\alpha}{}_{\nu \alpha \beta} + \nabla_\beta R^{\alpha}{}_{\nu \gamma \alpha} + \nabla_\alpha R^{\alpha}{}_{\nu \beta \gamma} = 0$$

$$\nabla_\gamma\, g^{\nu \gamma} R^{\alpha}{}_{\nu \alpha \beta} + \nabla_\beta\, g^{\nu \gamma} R^{\alpha}{}_{\nu \gamma \alpha} + \nabla_\alpha\, g^{\nu \gamma} R^{\alpha}{}_{\nu \beta \gamma} = 0$$

$$2 \nabla_\nu R^{\nu}{}_{\beta} + \nabla_\beta\, g^{\nu \gamma} R^{\alpha}{}_{\nu \gamma \alpha} = 0$$

Shouldn't it be a ##+## on the second term: ##2 \nabla_\nu R^{\nu}{}_{\beta} + \nabla_\beta R = 0##?
 
  • #11
unscientific said:
So, the working is

$$2 \nabla_\nu R^{\nu}{}_{\beta} + \nabla_\beta\, g^{\nu \gamma} R^{\alpha}{}_{\nu \gamma \alpha} = 0$$

Shouldn't it be a ##+## on the second term: ##2 \nabla_\nu R^{\nu}{}_{\beta} + \nabla_\beta R = 0##?

No, because R is defined to be
$$R = g^{\nu \gamma} R_{\nu \gamma} = g^{\nu \gamma} R^{\alpha}{}_{\nu \alpha \gamma},$$
and you obtain that form from ##R^{\alpha}{}_{\nu \gamma \alpha}## by exchanging the last two indices, hence the minus sign.
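
Making the sign explicit, the antisymmetry of the Riemann tensor in its last two indices gives
$$g^{\nu \gamma} R^{\alpha}{}_{\nu \gamma \alpha} = -\, g^{\nu \gamma} R^{\alpha}{}_{\nu \alpha \gamma} = -R, \qquad \text{so} \qquad 2 \nabla_\nu R^{\nu}{}_{\beta} - \nabla_\beta R = 0,$$
which is the (twice-)contracted Bianchi identity.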
 
