Summary: The discussion focuses on tensor contraction, specifically contracting the indices ##\mu## and ##\alpha## in the context of General Relativity (GR). Participants clarify that this process replaces one index with another rather than simply cancelling them. The worked examples include the equation ##R^{\alpha}{}_{\beta \alpha \nu} = \delta^{\mu}_{\alpha} R^{\alpha}{}_{\beta \mu \nu}## and the use of covariant derivatives, emphasizing the covariant constancy of the metric in GR. The final equations illustrate the detailed steps involved in achieving the contraction.

Prerequisites: Mathematicians, physicists, and students specializing in General Relativity, tensor calculus, or advanced mathematical physics will benefit from this discussion.

unscientific (OP): What do they mean by 'Contract ##\mu## with ##\alpha##'? I thought only a matching upper-lower pair of indices can be contracted? For example, ##A_\mu g^{\mu \nu} = A^\nu##.
It means "replace ##\mu## by ##\alpha## -- or vice versa".unscientific said:What do they mean by 'Contract ##\mu## with ##\alpha##'?
unscientific: How can we simply do that?
strangerep: Suppose you have an ordinary matrix. If I told you to "take the trace of that matrix", what would that mean to you?
unscientific: Summing up the diagonal, i.e. ##\sum M_{ii}##.
strangerep: Correct. So if I have a matrix equation like ##M_{ij} = 0##, then it is also true that ##\sum_i M_{ii} = 0##, or in summation-convention notation, ##M_{ii} = 0##. One says that we have contracted ##i## with ##j##, though a slightly more helpful phrase might be to say that we have contracted over ##i,j##. Similarly in your OP, except that it deals with contraction over 2 indices of a 4th-rank tensor instead of a 2nd-rank matrix.

unscientific: Do you mind showing the working of how they 'contracted the indices'? I think it's not so simple as cancelling them.
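To see the parallel concretely (a spelled-out example using standard GR conventions, not itself a post from the thread): contracting the first and third slots of the Riemann tensor is exactly this trace operation, and it produces the Ricci tensor,
$$R_{\beta\nu} = R^{\alpha}{}_{\beta\alpha\nu} = \sum_{\alpha=0}^{3} R^{\alpha}{}_{\beta\alpha\nu}$$
(in four spacetime dimensions), just as ##\sum_i M_{ii}## collapses a matrix to a scalar.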
Samalkhaiat (post #8):
$$R^{\alpha}{}_{\beta \alpha \nu} = \delta^{\mu}_{\alpha} R^{\alpha}{}_{\beta \mu \nu} = g_{\alpha \gamma} g^{\mu \gamma} R^{\alpha}{}_{\beta \mu \nu}$$
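Reading that chain from right to left makes the mechanism explicit (a spelled-out step, using only the definition of the inverse metric): the two metric factors combine into a Kronecker delta, and the delta renames the summed index -- the "replace ##\mu## by ##\alpha##" of the first reply above:
$$g_{\alpha \gamma} g^{\mu \gamma} = \delta^{\mu}_{\alpha}, \qquad \delta^{\mu}_{\alpha} R^{\alpha}{}_{\beta \mu \nu} = R^{\mu}{}_{\beta \mu \nu} = R^{\alpha}{}_{\beta \alpha \nu},$$
where the last equality just relabels the dummy index.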
strangerep: Indeed, it is not index "cancellation". Multiply both sides of the original equation by the 2 factors of ##g## indicated on the rhs of Samalkhaiat's post #8. Then the only remaining "trick" is realizing that you can pass them through the covariant derivative operators -- since the metric is assumed to be covariantly constant in GR, e.g., ##\nabla_\lambda g_{\alpha\gamma} = 0##, etc.
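The "original equation" is not quoted in the thread; from the result in the next post it is presumably the second Bianchi identity. Under that assumption, a sketch of the first contraction:
$$\nabla_\lambda R^{\alpha}{}_{\beta\mu\nu} + \nabla_\mu R^{\alpha}{}_{\beta\nu\lambda} + \nabla_\nu R^{\alpha}{}_{\beta\lambda\mu} = 0.$$
Multiplying by ##g_{\alpha \gamma} g^{\mu \gamma} = \delta^{\mu}_{\alpha}##, passing the covariantly constant metrics through the ##\nabla##'s, and using ##R^{\alpha}{}_{\beta\alpha\nu} = R_{\beta\nu}## together with the antisymmetry ##R^{\alpha}{}_{\beta\lambda\alpha} = -R^{\alpha}{}_{\beta\alpha\lambda} = -R_{\beta\lambda}## gives
$$\nabla_\lambda R_{\beta\nu} + \nabla_\alpha R^{\alpha}{}_{\beta\nu\lambda} - \nabla_\nu R_{\beta\lambda} = 0.$$
A second contraction, this time with ##g^{\beta\nu}##, then leads to the equation in the next post.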
unscientific: So, the working is
$$2 \nabla_\nu R^{\nu}{}_{\beta} + \nabla_\beta \left( g^{\nu \gamma} R^{\alpha}{}_{\nu \gamma \alpha} \right) = 0.$$
Shouldn't it be a ##+## on the second term: ##2 \nabla_\nu R^{\nu}{}_{\beta} + \nabla_\beta R = 0##?
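A sign check on that last question (a spelled-out step using only the antisymmetry of the Riemann tensor in its final index pair): since ##R^{\alpha}{}_{\nu \gamma \alpha} = -R^{\alpha}{}_{\nu \alpha \gamma} = -R_{\nu \gamma}##, we have ##g^{\nu \gamma} R^{\alpha}{}_{\nu \gamma \alpha} = -R##, so the displayed equation is equivalent to
$$2 \nabla_\nu R^{\nu}{}_{\beta} - \nabla_\beta R = 0,$$
i.e. the second term acquires a minus sign once the scalar curvature ##R## is written out explicitly.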