Quick one-line on Tensor Contraction

  1. Feb 28, 2015 #1
    [Attached image "tensor4.png": the second Bianchi identity, ##\nabla_\gamma R^\mu{}_{\nu \alpha \beta} + \nabla_\beta R^\mu{}_{\nu \gamma \alpha} + \nabla_\alpha R^\mu{}_{\nu \beta \gamma} = 0##, followed by the instruction "Contract ##\mu## with ##\alpha##".]

    What do they mean by 'Contract ##\mu## with ##\alpha##'? I thought only a matching upper and lower index can be contracted, for example ##A_\mu g^{\mu \nu} = A^\nu##.
     
  2. Feb 28, 2015 #2

    strangerep

    Science Advisor

    It means "replace ##\mu## by ##\alpha## -- or vice versa".
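    For example:

    [tex] R^\mu{}_{\nu \alpha \beta} \;\longrightarrow\; R^\alpha{}_{\nu \alpha \beta} , [/tex]

    where the repeated ##\alpha## is then summed over.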
     
  3. Feb 28, 2015 #3
    How can we simply do that?
     
  4. Feb 28, 2015 #4

    strangerep

    Science Advisor

    Suppose you have an ordinary matrix. If I told you to "take the trace of that matrix", what would that mean to you?
     
  5. Feb 28, 2015 #5
    Summing the diagonal entries, i.e. ##\sum_i M_{ii}##.
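    For instance:

    [tex] M = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \quad \Rightarrow \quad \sum_i M_{ii} = 1 + 4 = 5 . [/tex]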
     
  6. Feb 28, 2015 #6

    strangerep

    Science Advisor

    Correct.

    So if I have a matrix equation like ##M_{ij} = 0##, then it is also true that ##\sum_i M_{ii} = 0##, or in summation convention notation, ##M_{ii} = 0##. One says that we have contracted ##i## with ##j##, though a slightly more helpful phrase might be to say that we have contracted over ##i,j##.

    Similarly in your OP, except that there the contraction is over two indices of a rank-4 tensor instead of a rank-2 matrix.
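    Written out explicitly in four dimensions, "contract ##\mu## with ##\alpha##" just means setting the two indices equal and summing:

    [tex] R^\alpha{}_{\nu \alpha \beta} \;=\; \sum_{\alpha=0}^{3} R^\alpha{}_{\nu \alpha \beta} \;=\; R^0{}_{\nu 0 \beta} + R^1{}_{\nu 1 \beta} + R^2{}_{\nu 2 \beta} + R^3{}_{\nu 3 \beta} . [/tex]

    In summation convention the ##\sum## is left implicit, and the result here is the Ricci tensor ##R_{\nu \beta}##.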
     
  7. Feb 28, 2015 #7

    Would you mind showing the working of how they 'contracted the indices'? I don't think it's as simple as just cancelling them.
     
  8. Feb 28, 2015 #8

    samalkhaiat

    Science Advisor

    [tex]R^{\alpha}{}_{\beta \alpha \nu} = \delta^{\mu}_{\alpha} R^{\alpha}{}_{\beta \mu \nu} = g_{\alpha \gamma} g^{\mu \gamma} R^{\alpha}{}_{\beta \mu \nu}[/tex]
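    The second equality uses the defining property of the inverse metric:

    [tex] g_{\alpha \gamma}\, g^{\mu \gamma} = \delta^{\mu}_{\alpha} . [/tex]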
     
  9. Feb 28, 2015 #9

    strangerep

    Science Advisor

    Indeed, it is not index "cancellation".

    Multiply both sides of the original equation by the two factors of ##g## indicated on the rhs of samalkhaiat's post #8. Then the only remaining "trick" is realizing that you can pass them through the covariant derivative operators -- since the metric is assumed to be covariantly constant in GR, e.g., ##\nabla_\lambda g_{\alpha\gamma} = 0##, etc.
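    Explicitly, metric compatibility lets the metric factors move through the derivative:

    [tex] g_{\alpha \gamma}\, \nabla_\lambda T^{\alpha}{}_{\beta} \;=\; \nabla_\lambda \left( g_{\alpha \gamma}\, T^{\alpha}{}_{\beta} \right) \;=\; \nabla_\lambda T_{\gamma \beta} , [/tex]

    so multiplying by ##g##'s and contracting commutes with ##\nabla##.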
     
  10. Mar 1, 2015 #10
    So, the working is:

    [tex] \nabla_\gamma R^\mu{}_{\nu \alpha \beta} + \nabla_\beta R^\mu{}_{\nu \gamma \alpha} + \nabla_\alpha R^\mu{}_{\nu \beta \gamma} = 0 [/tex]

    [tex] \delta_\mu^\alpha \nabla_\gamma R^\mu{}_{\nu \alpha \beta} + \delta_\mu^\alpha \nabla_\beta R^\mu{}_{\nu \gamma \alpha} + \delta_\mu^\alpha \nabla_\alpha R^\mu{}_{\nu \beta \gamma} = 0 [/tex]

    Writing ##\delta_\mu^\alpha = g_{\mu \lambda}\, g^{\alpha \lambda}## (with a fresh dummy index ##\lambda##, since ##\gamma## is already in use):

    [tex] g_{\mu \lambda} g^{\alpha \lambda} \nabla_\gamma R^\mu{}_{\nu \alpha \beta} + g_{\mu \lambda} g^{\alpha \lambda} \nabla_\beta R^\mu{}_{\nu \gamma \alpha} + g_{\mu \lambda} g^{\alpha \lambda} \nabla_\alpha R^\mu{}_{\nu \beta \gamma} = 0 [/tex]

    [tex] \nabla_\gamma R^\alpha{}_{\nu \alpha \beta} + \nabla_\beta R^\alpha{}_{\nu \gamma \alpha} + \nabla_\alpha R^\alpha{}_{\nu \beta \gamma} = 0 [/tex]

    Contracting with ##g^{\nu \gamma}##:

    [tex] g^{\nu \gamma} \nabla_\gamma R^\alpha{}_{\nu \alpha \beta} + g^{\nu \gamma} \nabla_\beta R^\alpha{}_{\nu \gamma \alpha} + g^{\nu \gamma} \nabla_\alpha R^\alpha{}_{\nu \beta \gamma} = 0 [/tex]

    The first and third terms are equal (by the symmetries of the Riemann tensor, ##g^{\nu \gamma} R^\alpha{}_{\nu \beta \gamma} = R^\alpha{}_{\beta}##), so

    [tex] 2 \nabla_\nu R^\nu{}_{\beta} + \nabla_\beta\, g^{\nu \gamma} R^\alpha{}_{\nu \gamma \alpha} = 0 [/tex]

    Shouldn't the second term then just be ##+\nabla_\beta R##, i.e. ##2 \nabla_\nu R^\nu{}_{\beta} + \nabla_\beta R = 0##?
     
  11. Mar 1, 2015 #11
    No, because ##R## is defined to be
    [tex] R = R^\alpha{}_{\alpha} = g^{\nu \gamma} R^\alpha{}_{\nu \alpha \gamma} , [/tex]
    whereas your term has the last two indices exchanged: ##g^{\nu \gamma} R^\alpha{}_{\nu \gamma \alpha} = -g^{\nu \gamma} R^\alpha{}_{\nu \alpha \gamma} = -R##. Hence the minus sign.
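    Substituting this back, the twice-contracted Bianchi identity reads

    [tex] 2 \nabla_\nu R^\nu{}_{\beta} - \nabla_\beta R = 0 , \qquad \text{i.e.} \qquad \nabla_\nu R^\nu{}_{\beta} = \tfrac{1}{2} \nabla_\beta R . [/tex]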
     