Tensor Contraction: Contracting ##\mu## with ##\alpha##?

unscientific
[Attached image: tensor4.png]


What do they mean by 'Contract ##\mu## with ##\alpha##'? I thought contraction only happens between a repeated upper index and lower index, for example ##A_\mu g^{\mu \nu} = A^\nu##?
 
unscientific said:
What do they mean by 'Contract ##\mu## with ##\alpha##'?
It means "replace ##\mu## by ##\alpha## -- or vice versa".
 
strangerep said:
It means "replace ##\mu## by ##\alpha## -- or vice versa".
How can we simply do that?
 
unscientific said:
How can we simply do that?
Suppose you have an ordinary matrix. If I told you to "take the trace of that matrix", what would that mean to you?
 
strangerep said:
Suppose you have an ordinary matrix. If I told you to "take the trace of that matrix", what would that mean to you?
Summing up the diagonal, i.e. ##\sum M_{ii}##.
 
unscientific said:
Summing up the diagonal, i.e. ##\sum M_{ii}##.
Correct.

So if I have a matrix equation like ##M_{ij} = 0##, then it is also true that ##\sum_i M_{ii} = 0##, or in summation convention notation, ##M_{ii} = 0##. One says that we have contracted ##i## with ##j##, though a slightly more helpful phrase might be to say that we have contracted over ##i,j##.

Similarly in your OP, except that it deals with contraction over 2 indices of a 4th rank tensor instead of a 2nd rank matrix.
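
As a concrete sketch, with ##T## a generic fourth-rank tensor (just a placeholder, not the tensor in your problem), contracting the upper index ##\mu## with the lower index ##\alpha## means setting them equal and summing:
$$T^{\mu}{}_{\nu \alpha \beta} \;\longrightarrow\; \delta^{\alpha}_{\mu}\, T^{\mu}{}_{\nu \alpha \beta} = T^{\alpha}{}_{\nu \alpha \beta} = \sum_{\alpha} T^{\alpha}{}_{\nu \alpha \beta},$$
which leaves a second-rank tensor in the remaining free indices ##\nu, \beta## -- the direct analogue of taking the trace of ##M##.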
 
strangerep said:
Correct.

So if I have a matrix equation like ##M_{ij} = 0##, then it is also true that ##\sum_i M_{ii} = 0##, or in summation convention notation, ##M_{ii} = 0##. One says that we have contracted ##i## with ##j##, though a slightly more helpful phrase might be to say that we have contracted over ##i,j##.

Similarly in your OP, except that it deals with contraction over 2 indices of a 4th rank tensor instead of a 2nd rank matrix.
Do you mind showing the working of how they 'contracted the indices'? I don't think it's as simple as just cancelling them.
 
unscientific said:
Do you mind showing the working of how they 'contracted the indices'? I don't think it's as simple as just cancelling them.
$$R^{\alpha}{}_{\beta \alpha \nu} = \delta^{\mu}_{\alpha}\, R^{\alpha}{}_{\beta \mu \nu} = g_{\alpha \gamma}\, g^{\mu \gamma}\, R^{\alpha}{}_{\beta \mu \nu}$$
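
Here ##g_{\alpha \gamma}\, g^{\mu \gamma} = \delta^{\mu}_{\alpha}## by the definition of the inverse metric, and the Kronecker delta just sets the two indices equal and sums:
$$\delta^{\mu}_{\alpha}\, R^{\alpha}{}_{\beta \mu \nu} = R^{\mu}{}_{\beta \mu \nu} = R^{\alpha}{}_{\beta \alpha \nu}.$$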
 
unscientific said:
Do you mind showing the working of how they 'contracted the indices'? I don't think it's as simple as just cancelling them.
Indeed, it is not index "cancellation".

Multiply both sides of the original equation by the 2 factors of ##g## indicated on the rhs of Samalkhaiat's post #8. Then the only remaining "trick" is realizing that you can pass them through the covariant derivative operators -- since the metric is assumed to be covariantly constant in GR, e.g., ##\nabla_\lambda g_{\alpha\gamma} = 0##, etc.
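
Schematically, with ##T## standing in for any tensor (again just a placeholder), metric compatibility gives
$$\nabla_\lambda\!\left(g_{\alpha\gamma}\, T\right) = \left(\nabla_\lambda g_{\alpha\gamma}\right) T + g_{\alpha\gamma}\, \nabla_\lambda T = g_{\alpha\gamma}\, \nabla_\lambda T,$$
so the factors of ##g## can be moved freely past the ##\nabla##'s.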
 
#10
strangerep said:
Indeed, it is not index "cancellation".

Multiply both sides of the original equation by the 2 factors of ##g## indicated on the rhs of Samalkhaiat's post #8. Then the only remaining "trick" is realizing that you can pass them through the covariant derivative operators -- since the metric is assumed to be covariantly constant in GR, e.g., ##\nabla_\lambda g_{\alpha\gamma} = 0##, etc.

So, the working is

$$\nabla_\gamma R^\mu{}_{\nu \alpha \beta} + \nabla_\beta R^\mu{}_{\nu \gamma \alpha} + \nabla_\alpha R^\mu{}_{\nu \beta \gamma} = 0$$

$$\delta_\mu^\alpha \nabla_\gamma R^\mu{}_{\nu \alpha \beta} + \delta_\mu^\alpha \nabla_\beta R^\mu{}_{\nu \gamma \alpha} + \delta_\mu^\alpha \nabla_\alpha R^\mu{}_{\nu \beta \gamma} = 0$$

$$\nabla_\gamma\, g_{\mu \lambda} g^{\alpha \lambda} R^\mu{}_{\nu \alpha \beta} + \nabla_\beta\, g_{\mu \lambda} g^{\alpha \lambda} R^\mu{}_{\nu \gamma \alpha} + \nabla_\alpha\, g_{\mu \lambda} g^{\alpha \lambda} R^\mu{}_{\nu \beta \gamma} = 0$$

$$\nabla_\gamma R^\alpha{}_{\nu \alpha \beta} + \nabla_\beta R^\alpha{}_{\nu \gamma \alpha} + \nabla_\alpha R^\alpha{}_{\nu \beta \gamma} = 0$$

$$\nabla_\gamma\, g^{\nu \gamma} R^\alpha{}_{\nu \alpha \beta} + \nabla_\beta\, g^{\nu \gamma} R^\alpha{}_{\nu \gamma \alpha} + \nabla_\alpha\, g^{\nu \gamma} R^\alpha{}_{\nu \beta \gamma} = 0$$

$$2 \nabla_\nu R^\nu{}_\beta + \nabla_\beta\, g^{\nu \gamma} R^\alpha{}_{\nu \gamma \alpha} = 0$$

Shouldn't the second term just be ##+\nabla_\beta R##, i.e. ##2 \nabla_\nu R^\nu{}_\beta + \nabla_\beta R = 0##?
 
#11
unscientific said:
So, the working is

$$2 \nabla_\nu R^\nu{}_\beta + \nabla_\beta\, g^{\nu \gamma} R^\alpha{}_{\nu \gamma \alpha} = 0$$

Shouldn't the second term just be ##+\nabla_\beta R##, i.e. ##2 \nabla_\nu R^\nu{}_\beta + \nabla_\beta R = 0##?

No, because ##R## is defined to be
$$R = R^\alpha{}_\alpha = g^{\nu \gamma} R^\alpha{}_{\nu \alpha \gamma},$$
and you can only obtain that form from your second term by exchanging the last two indices of the Riemann tensor, hence the minus sign.
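
Explicitly, using the antisymmetry of the Riemann tensor in its last two indices,
$$g^{\nu \gamma} R^{\alpha}{}_{\nu \gamma \alpha} = -\, g^{\nu \gamma} R^{\alpha}{}_{\nu \alpha \gamma} = -\, g^{\nu \gamma} R_{\nu \gamma} = -R,$$
so the contracted identity reads ##2 \nabla_\nu R^{\nu}{}_{\beta} - \nabla_\beta R = 0##.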
 