Tensor Contraction: Contracting ##\mu## with ##\alpha##?


Discussion Overview

The discussion centers on tensor contraction, specifically what it means to contract the indices ##\mu## and ##\alpha## in a tensor equation. Participants work through the operation for both matrices and tensors, spelling out the mathematical steps involved and the assumptions behind them.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • Some participants question the meaning of "contract ##\mu## with ##\alpha##", suggesting that it may imply replacing one index with the other.
  • Others argue that contraction is not simply about cancellation of indices, but involves specific mathematical operations, such as taking traces in the context of matrices.
  • A participant provides an example involving a matrix equation to illustrate the concept of contraction over indices.
  • The mechanics of tensor contraction are worked through step by step, with participants supplying the intermediate equations.
  • Some participants highlight the importance of the covariant derivative and the assumption of the metric being covariantly constant in General Relativity when discussing contractions.
  • There is a debate regarding the signs in the final equations, with one participant questioning the correctness of a term and another providing a rationale for the existing formulation.

Areas of Agreement / Disagreement

Participants express differing views on the nature of tensor contraction, with some emphasizing the mathematical operations involved while others focus on the conceptual understanding. The discussion remains unresolved regarding the specifics of the contraction process and the implications of the signs in the equations.

Contextual Notes

The discussion assumes particular definitions of contraction and standard properties of tensors and matrices, and it relies on the General Relativity assumption that the metric is covariantly constant, which may not apply in other contexts.

unscientific

[Attached image: tensor4.png]


What do they mean by 'Contract ##\mu## with ##\alpha##'? I thought only top-bottom indices that are the same can contract? For example ##A_\mu g^{\mu \nu} = A^{\nu}##.
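The example ##A_\mu g^{\mu \nu} = A^{\nu}## can be sketched numerically; a minimal illustration with numpy's einsum, using the Minkowski metric ##\mathrm{diag}(-1,1,1,1)## purely as an example metric:

```python
import numpy as np

# Illustrative only: Minkowski metric (signature -+++) as an example metric.
g_inv = np.diag([-1.0, 1.0, 1.0, 1.0])   # components g^{mu nu}

# A covariant vector A_mu with arbitrary components.
A_lower = np.array([2.0, 3.0, 5.0, 7.0])

# Contraction A_mu g^{mu nu} = A^nu: sum over the repeated index mu.
A_upper = np.einsum('m,mn->n', A_lower, g_inv)

print(A_upper)  # [-2.  3.  5.  7.] -- only the time component flips sign here
```

The repeated index ##\mu## is summed away, leaving a single free upper index ##\nu##.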
 
unscientific said:
What do they mean by 'Contract ##\mu## with ##\alpha##'?
It means "replace ##\mu## by ##\alpha## -- or vice versa".
 
strangerep said:
It means "replace ##\mu## by ##\alpha## -- or vice versa".
How can we simply do that?
 
unscientific said:
How can we simply do that?
Suppose you have an ordinary matrix. If I told you to "take the trace of that matrix", what would that mean to you?
 
strangerep said:
Suppose you have an ordinary matrix. If I told you to "take the trace of that matrix", what would that mean to you?
Summing up the diagonal, i.e. ##\sum M_{ii}##.
 
unscientific said:
Summing up the diagonal, i.e. ##\sum M_{ii}##.
Correct.

So if I have a matrix equation like ##M_{ij} = 0##, then it is also true that ##\sum_i M_{ii} = 0##, or in summation convention notation, ##M_{ii} = 0##. One says that we have contracted ##i## with ##j##, though a slightly more helpful phrase might be to say that we have contracted over ##i,j##.

Similarly in your OP, except that it deals with contraction over 2 indices of a 4th rank tensor instead of a 2nd rank matrix.
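The matrix version of this can be sketched numerically; a minimal illustration with numpy (illustrative only), showing that contracting ##i## with ##j## of a matrix ##M_{ij}## is exactly its trace:

```python
import numpy as np

# "Contracting i with j" of a rank-2 object: set j = i and sum over i.
# If M_ij = 0 holds, the traced equation M_ii = 0 follows trivially.
M = np.arange(9.0).reshape(3, 3)

contracted = np.einsum('ii->', M)  # sum over the now-repeated index i
assert contracted == np.trace(M)   # identical to summing the diagonal
print(contracted)  # 12.0  (0 + 4 + 8)
```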
 
strangerep said:
Correct.

So if I have a matrix equation like ##M_{ij} = 0##, then it is also true that ##\sum_i M_{ii} = 0##, or in summation convention notation, ##M_{ii} = 0##. One says that we have contracted ##i## with ##j##, though a slightly more helpful phrase might be to say that we have contracted over ##i,j##.

Similarly in your OP, except that it deals with contraction over 2 indices of a 4th rank tensor instead of a 2nd rank matrix.
Do you mind showing the working how they 'contracted the indices'? I think it's not so simple as cancelling them.
 
unscientific said:
Do you mind showing the working how they 'contracted the indices'? I think it's not so simple as cancelling them.
[tex]R^{\alpha}{}_{\beta \alpha \nu} = \delta^{\mu}_{\alpha} R^{\alpha}{}_{\beta \mu \nu} = g_{\alpha \gamma} g^{\mu \gamma} R^{\alpha}{}_{\beta \mu \nu}[/tex]
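The identity ##\delta^{\mu}_{\alpha} = g_{\alpha \gamma} g^{\mu \gamma}## used here can be checked numerically; a minimal sketch with numpy, substituting an arbitrary invertible symmetric matrix for the metric (illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
# Any invertible symmetric "metric" works for this identity (illustrative choice).
g = rng.standard_normal((4, 4))
g = g + g.T + 8.0 * np.eye(4)          # symmetric and safely invertible
g_inv = np.linalg.inv(g)               # components g^{mu nu}

# g_{alpha gamma} g^{mu gamma} = delta^mu_alpha
delta = np.einsum('ag,mg->ma', g, g_inv)
print(np.allclose(delta, np.eye(4)))   # True
```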
 
unscientific said:
Do you mind showing the working how they 'contracted the indices'? I think it's not so simple as cancelling them.
Indeed, it is not index "cancellation".

Multiply both sides of the original equation by the 2 factors of ##g## indicated on the rhs of Samalkhaiat's post #8. Then the only remaining "trick" is realizing that you can pass them through the covariant derivative operators -- since the metric is assumed to be covariantly constant in GR, e.g., ##\nabla_\lambda g_{\alpha\gamma} = 0##, etc.
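In symbols, passing the metric through ##\nabla## is just the Leibniz rule combined with metric compatibility; for an arbitrary tensor ##T^\gamma## (illustrative):

[tex]\nabla_\lambda \left( g_{\alpha \gamma} T^\gamma \right) = (\nabla_\lambda g_{\alpha \gamma}) T^\gamma + g_{\alpha \gamma} \nabla_\lambda T^\gamma = g_{\alpha \gamma} \nabla_\lambda T^\gamma ,[/tex]

so factors of the metric can be moved freely inside or outside ##\nabla_\lambda##.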
 
#10
strangerep said:
Indeed, it is not index "cancellation".

Multiply both sides of the original equation by the 2 factors of ##g## indicated on the rhs of Samalkhaiat's post #8. Then the only remaining "trick" is realizing that you can pass them through the covariant derivative operators -- since the metric is assumed to be covariantly constant in GR, e.g., ##\nabla_\lambda g_{\alpha\gamma} = 0##, etc.

So, the working is

[tex]\nabla_\gamma R^\mu _{\nu \alpha \beta} + \nabla_\beta R^\mu _{\nu \gamma \alpha} + \nabla_\alpha R^\mu _{\nu \beta \gamma} = 0[/tex]

[tex]\delta_\mu ^\alpha \nabla_\gamma R^\mu _{\nu \alpha \beta} + \delta_\mu ^\alpha \nabla_\beta R^\mu _{\nu \gamma \alpha} + \delta_\mu ^\alpha \nabla_\alpha R^\mu _{\nu \beta \gamma} = 0[/tex]

[tex]\nabla_\gamma g_{\mu \lambda} g^{\alpha \lambda} R^\mu _{\nu \alpha \beta} + \nabla_\beta g_{\mu \lambda} g^{\alpha \lambda} R^\mu _{\nu \gamma \alpha} + \nabla_\alpha g_{\mu \lambda} g^{\alpha \lambda} R^\mu _{\nu \beta \gamma} = 0[/tex]

[tex]\nabla_\gamma R^\alpha _{\nu \alpha \beta} + \nabla_\beta R^\alpha _{\nu \gamma \alpha} + \nabla_\alpha R^\alpha _{\nu \beta \gamma} = 0[/tex]

[tex]\nabla_\gamma g^{\nu \gamma} R^\alpha _{\nu \alpha \beta} + \nabla_\beta g^{\nu \gamma} R^\alpha _{\nu \gamma \alpha} + \nabla_\alpha g^{\nu \gamma} R^\alpha _{\nu \beta \gamma} = 0[/tex]

[tex]2 \nabla_\nu R^\nu _\beta + \nabla_\beta g^{\nu \gamma} R^\alpha _{\nu \gamma \alpha} = 0[/tex]

Shouldn't it be a ##+## on the second term: ## 2 \nabla_\nu R^\nu _\beta + \nabla_\beta R = 0##?
 
#11
unscientific said:
So, the working is

[tex]2 \nabla_\nu R^\nu _\beta + \nabla_\beta g^{\nu \gamma} R^\alpha _{\nu \gamma \alpha} = 0[/tex]

Shouldn't it be a ##+## on the second term: ## 2 \nabla_\nu R^\nu _\beta + \nabla_\beta R = 0##?

No, because ##R## is defined to be
[tex]R = R^\alpha _\alpha = g^{\nu \gamma} R^\alpha _{\nu \alpha \gamma},[/tex]
and you can obtain that form from the second term by exchanging the last two indices, hence the minus sign.
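Spelling out that exchange, using the antisymmetry of the Riemann tensor in its last two indices:

[tex]g^{\nu \gamma} R^\alpha _{\nu \gamma \alpha} = -g^{\nu \gamma} R^\alpha _{\nu \alpha \gamma} = -g^{\nu \gamma} R_{\nu \gamma} = -R,[/tex]

so the final equation becomes

[tex]2 \nabla_\nu R^\nu _\beta - \nabla_\beta R = 0,[/tex]

which is the contracted Bianchi identity, equivalent to ##\nabla_\nu G^\nu _\beta = 0## for the Einstein tensor ##G^\nu _\beta = R^\nu _\beta - \tfrac{1}{2} \delta^\nu _\beta R##.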
 
