
Einstein linearized tensor is zero

  1. Mar 8, 2013 #1
    1. The problem statement, all variables and given/known data
    We have the Einstein tensor [itex] G_{αβ} = R_{αβ} - \frac{1}{2}g_{αβ}R [/itex]
    where [itex] R_{\alpha \beta}, R [/itex] are the Ricci tensor and scalar.

    2. Relevant equations
    We want the metric to be a small perturbation of flat space, so [itex] g_{\alpha \beta} = \eta_{\alpha \beta} + h_{\alpha \beta} [/itex], where [itex] h_{\alpha \beta} [/itex] is small.

    By convention we use [itex] \eta [/itex] to raise or lower indices.
    So we can write [itex] R = R^\beta_\beta = \eta^{\alpha \beta} R_{\alpha \beta} [/itex]

    3. The attempt at a solution
    Let's substitute the above into [itex] G_{\alpha \beta} = R_{\alpha \beta} - \frac{1}{2} \eta_{\alpha \beta} R = R_{\alpha \beta} - \frac{1}{2} \eta_{\alpha \beta}\eta^{\alpha \beta} R_{\alpha \beta} = R_{\alpha \beta}( 1 - \frac{1}{2} \eta^{\alpha}_{\beta} ) =
    R_{\alpha \beta}( 1 - \frac{1}{2}tr(\eta) ) =
    R_{\alpha \beta}( 1 - \frac{2}{2} ) = R_{\alpha \beta}( 1 - 1 ) = 0[/itex]

    This cannot be right... I just don't see my mistake...
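A quick numerical sanity check (my own sketch in numpy, not part of the original derivation) shows that for a generic symmetric Ricci tensor the Einstein tensor does not vanish, so the algebra above must contain an error:

```python
import numpy as np

# Sanity check: with eta = diag(-1,1,1,1) and a generic symmetric "Ricci
# tensor" R, the Einstein tensor G_ab = R_ab - (1/2) eta_ab R is NOT zero.
# The correct Ricci scalar is the full contraction R = eta^{ab} R_ab.
rng = np.random.default_rng(0)
eta = np.diag([-1.0, 1.0, 1.0, 1.0])   # Minkowski metric, its own inverse

A = rng.normal(size=(4, 4))
R = A + A.T                            # generic symmetric "Ricci tensor"
R_scalar = np.einsum('ab,ab->', eta, R)

G = R - 0.5 * eta * R_scalar
print(np.allclose(G, 0))               # False: G is not identically zero

# Also: eta_{ab} eta^{ab} = delta^a_a = 4, not tr(eta) = 2
print(np.einsum('ab,ab->', eta, eta))  # 4.0
```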
     
  3. Mar 8, 2013 #2

    TSny

    Homework Helper
    Gold Member

    You can't use ##\alpha## and ##\beta## as dummy summation indices for ##R## since ##\alpha## and ##\beta## are already fixed indices. Note the ambiguity in what is being summed in the last term.
     
  4. Mar 8, 2013 #3
    Yes, I see it. Pretty careless handling of indices. Thanks. So now I am stuck on this problem:

    In that line of thought, how can I show that the Einstein tensor components [itex] G_{00}, G_{0i} [/itex] do not contain second time derivatives of [itex] h [/itex] in linearized theory, when it is given by:

    WRONG (corrected below): [itex]G_{\alpha \beta} = h^{\sigma}_{\mu,\sigma \nu} + h^{\sigma}_{\nu,\sigma \mu} - h_{,\mu \nu} - h_{\mu \nu, \gamma}^{,\gamma} - \eta_{\mu \nu}h^{\mu \nu}_{,\mu \nu} + \eta_{\mu \nu}h^{,\gamma}_{,\gamma}[/itex]

    [itex] G_{\mu \nu} = h^{\sigma}_{\mu,\sigma \nu} + h^{\sigma}_{\nu,\sigma \mu} - h_{,\mu \nu} - h_{\mu \nu, \gamma}^{,\gamma} - \eta_{\mu \nu}h^{\mu \nu}_{,\mu \nu} + \eta_{\mu \nu}h^{,\gamma}_{,\gamma} [/itex]

    The expression was taken from here

    I just cannot see it. Any hints?
     
    Last edited: Mar 8, 2013
  5. Mar 8, 2013 #4

    TSny


    You still have some index problems. Note that you have the fixed indices ##\alpha## and ##\beta## on the left, yet they don't appear on the right. Also, the next-to-last term on the right is incorrect because the indices ##\mu## and ##\nu## each appear three times. Equation 6.8 in your reference also has this mistake.

    Once you get that corrected, all you need to do is write out carefully the expressions for ##G_{00}## and ##G_{0i}## and you'll see that all terms involving second time derivatives will cancel. You'll need to remember that in the linearized theory, raising or lowering the index 0 will just change the sign of the expression. For example ##h^{0}\:_\mu## = ##-h_{0\mu}##.
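The sign rule in the last sentence can be illustrated numerically (my own sketch; the array names are illustrative):

```python
import numpy as np

# In linearized theory indices are raised/lowered with eta = diag(-1,1,1,1),
# so raising a 0-index flips the sign: h^0_mu = eta^{0 nu} h_{nu mu} = -h_{0 mu}.
eta = np.diag([-1.0, 1.0, 1.0, 1.0])   # its own inverse
rng = np.random.default_rng(1)
h = rng.normal(size=(4, 4))
h = h + h.T                             # symmetric perturbation

h_up = np.einsum('an,nm->am', eta, h)   # h^a_mu
print(np.allclose(h_up[0], -h[0]))      # True: the 0-row flips sign
print(np.allclose(h_up[1:], h[1:]))     # True: spatial rows are unchanged
```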
     
    Last edited: Mar 8, 2013
  6. Mar 8, 2013 #5
    I will write it out step by step:

    Riemann tensor: [itex] R_{\alpha \beta \mu \nu} = \frac{1}{2}( h_{\alpha \nu, \beta \mu} + h_{\beta \mu, \alpha \nu} - h_{\alpha \mu, \beta \nu} - h_{\beta \nu, \alpha \mu} )[/itex]

    Ricci tensor: [itex] R_{\beta \nu} = R^{\mu}_{\beta \mu \nu} = \eta^{\alpha \mu} R_{\alpha \beta \mu \nu} = \frac{1}{2}( h^{\mu}_{\nu, \beta \mu} + h^{\alpha}_{\beta, \alpha \nu} - h^{\alpha}_{\alpha, \beta \nu} - h^{,\mu}_{\beta \nu, \mu})[/itex]

    Ricci scalar: [itex] R = R^{\nu}_{\nu} = \eta^{\beta \nu} R_{\beta \nu} = \frac{1}{2}( h^{\mu \beta}_{, \beta \mu} + h^{\alpha \nu}_{, \alpha \nu} - h^{\alpha, \beta }_{\alpha, \beta} - h^{\beta,\mu}_{\beta, \mu}) [/itex]
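These contractions can be checked numerically. In the sketch below (my own illustration, not from the thread), the second derivatives of a quadratic ##h_{\mu\nu}## are stored as a constant array ##D[m,n,a,b] = h_{mn,ab}##, and the Ricci tensor obtained by contracting the linearized Riemann tensor is compared against the displayed formula:

```python
import numpy as np

# Numerical check of the linearized formulas above.  Take h_{mu nu}
# quadratic in x, so its second derivatives D[m,n,a,b] = h_{mn,ab} are
# constants, symmetric in (m,n) and in the derivative pair (a,b).
rng = np.random.default_rng(2)
eta = np.diag([-1.0, 1.0, 1.0, 1.0])  # Minkowski metric, its own inverse

D = rng.normal(size=(4, 4, 4, 4))
D = D + D.transpose(1, 0, 2, 3)       # h_{mn} symmetric
D = D + D.transpose(0, 1, 3, 2)       # partial derivatives commute

# Linearized Riemann: R_{abmn} = (h_{an,bm} + h_{bm,an} - h_{am,bn} - h_{bn,am})/2
Riem = 0.5 * (np.einsum('anbm->abmn', D) + np.einsum('bman->abmn', D)
              - np.einsum('ambn->abmn', D) - np.einsum('bnam->abmn', D))

# Ricci by contraction: R_{bn} = eta^{am} R_{abmn}
Ricci = np.einsum('am,abmn->bn', eta, Riem)

# Ricci from the displayed formula: (h^m_{n,bm} + h^a_{b,an} - h_{,bn} - box h_{bn})/2
term1 = np.einsum('ms,snbm->bn', eta, D)   # h^m_{n,bm}
term2 = np.einsum('as,sban->bn', eta, D)   # h^a_{b,an}
trace = np.einsum('as,asbn->bn', eta, D)   # h_{,bn} = h^a_{a,bn}
box   = np.einsum('gm,bngm->bn', eta, D)   # h_{bn,m}^{,m}
print(np.allclose(Ricci, 0.5 * (term1 + term2 - trace - box)))  # True
```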

    If I learned from my previous mistake, it would be wrong to make changes in the following manner:
    [itex] h^{\alpha, \beta }_{\alpha, \beta} = tr(h)^{,\beta}_{,\beta} = tr(h)^{,\gamma}_{,\gamma} = h^{\sigma, \gamma}_{\sigma, \gamma} [/itex] because those indices are already involved in the initial indexing.
    This would mean it is correct to introduce repeated indices only if they weren't already used at the beginning. So a correct example would be something like this: [itex] T^{\alpha}_{\beta} + tr(h) + tr(\eta) = T^{\alpha}_{\beta} + h^{\gamma}_{\gamma} + \eta^{\sigma}_{\sigma} = T^{\alpha}_{\beta} + h^{\lambda}_{\lambda} + \eta^{\lambda}_{\lambda} [/itex]

    Let's continue with the Einstein tensor: [itex] G_{\beta \nu} = R_{\beta \nu} - \frac{1}{2}\eta_{\beta \nu} R [/itex]. Now we should just substitute the above expressions here, but with some other dummy indices, so it won't be like above but, for example, [itex] R = R^{\gamma}_{\gamma} = \eta^{\sigma \gamma} R_{\sigma \gamma} [/itex]. But like this there won't be any shared indices. I mean the [itex] \mu, \alpha [/itex] indices... or I could add them as dummy indices... I don't know.

    EDIT: Found a similar problem. I am not the only one.

    EDIT: More helpful than Einstein notation seems to be the Ricci calculus.
     
    Last edited: Mar 8, 2013
  7. Mar 8, 2013 #6

    TSny


    I think all of the above is correct.
    The above equation doesn't make sense, because you can't add a scalar quantity like the trace of ##h## to a tensor quantity like ##T^{\alpha}\,_{\beta}##.
    I believe what you're saying here is ok, although I don't know what ##\mu## and ##\alpha## indices you are referring to.
     
  8. Mar 9, 2013 #7
    So I am getting: [itex] G_{\alpha \beta} = R_{\alpha \beta} - \frac{1}{2}\eta_{\alpha \beta}R = \frac{1}{2}(h^{\gamma}_{\alpha, \beta \gamma} + h^{\gamma}_{\beta, \alpha \gamma} - h_{,\alpha \beta} - h^{,\gamma}_{\alpha \beta, \gamma} ) - \frac{1}{2} \eta_{\alpha \beta} \frac{1}{2}( h^{\lambda \sigma}_{,\lambda \sigma} + h^{\lambda \gamma}_{,\lambda \gamma} - h^{,\gamma}_{,\gamma} - h^{,\lambda}_{,\lambda} ) [/itex]

    Since we are in the linear approximation, can we ignore the derivatives of the trace because of the Jacobi formula [itex] h_{,\mu} = h h^{\alpha \beta} h_{\alpha \beta, \mu} [/itex]?

    So we are left with: [itex] G_{\alpha \beta} = \frac{1}{2}(h^{\gamma}_{\alpha, \beta \gamma} + h^{\gamma}_{\beta, \alpha \gamma} - h^{,\gamma}_{\alpha \beta, \gamma} ) - \frac{1}{2} \eta_{\alpha \beta} \frac{1}{2}( h^{\lambda \sigma}_{,\lambda \sigma} + h^{\lambda \gamma}_{,\lambda \gamma} )[/itex]

    Can I do something about the last term? I am looking at Schutz's expression and he has just one such term, [itex] \eta_{\alpha \beta} h^{\mu \nu}_{,\mu \nu} [/itex].

    So at this point I should see that the time derivatives vanish?
     
  9. Mar 9, 2013 #8

    TSny


    In the last term on the right, inside the parentheses, note that the first two terms are identically equal to each other and the last two terms are equal to each other. So, you might as well combine them and get rid of one of the factors of 1/2.
    I don't think this identity is correctly written here, and I don't think you can ignore the derivatives of the trace ##h##. Keeping the derivatives of the trace, you should still be able to show that if you isolate all the terms of ##G_{0 0}## that involve second time derivatives, those terms will cancel out.
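For reference, the Jacobi formula is a statement about determinants, ##(\det g)_{,\mu} = \det(g)\, g^{\alpha\beta} g_{\alpha\beta,\mu}##. A finite-difference sketch (my own, using an arbitrary one-parameter family of metrics) confirms that form:

```python
import numpy as np

# Jacobi formula: d(det g)/dt = det(g) * tr(g^{-1} dg/dt),
# i.e. in index form (det g)_{,mu} = det(g) g^{ab} g_{ab,mu}.
# Check by central finite differences on g(t) = eta + t*A.
rng = np.random.default_rng(3)
eta = np.diag([-1.0, 1.0, 1.0, 1.0])
A = rng.normal(size=(4, 4))
A = A + A.T                                # symmetric perturbation direction

def g(t):
    return eta + t * A

t, dt = 0.1, 1e-6
lhs = (np.linalg.det(g(t + dt)) - np.linalg.det(g(t - dt))) / (2 * dt)
rhs = np.linalg.det(g(t)) * np.trace(np.linalg.inv(g(t)) @ A)
print(np.isclose(lhs, rhs, rtol=1e-5))     # True
```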
     
  10. Mar 13, 2013 #9
    I wasn't sure if I could do that.

    So by getting rid of the one [itex] \frac{1}{2} [/itex] in front of the last term, setting [itex] \alpha = 0 [/itex], and splitting the dummy index [itex] \gamma [/itex] into its time and spatial parts [itex] \gamma = 0, k [/itex], I get
    [itex] G_{0 \beta } = h^{0}_{0,\beta 0} + h^{k}_{0,\beta k} + h^{0}_{\beta,00} + h^{k}_{\beta ,0k} - h_{,0 \beta} - h_{0 \beta,0}^{,0} - h_{0 \beta, k}^{,k} - \eta_{0 \beta} h^{0 0}_{,00} - \eta_{0 \beta} h^{ik}_{,ik} +\eta_{0 \beta} h_{,0}^{,0} + \eta_{0 \beta} h_{,k}^{,k} [/itex]
    Now I will neglect the terms without time derivative and take advantage of the fact [itex] h^0_\alpha = \eta^{0 \mu} h_{0 \alpha} = - h_{0 \alpha}[/itex] and obtain:
    [itex] G_{0 \beta} = -h_{00,\beta 0} - h_{0 \beta,00} - h_{k \beta,0 k} - h_{0 \beta} + h_{0 \beta, 00 } + h_{00, 00} + h_{,00} =\\
    -h_{00,0 0} -h_{00,i 0} - h_{0 \beta,00} - h_{k \beta,0 k} - h_{,0 0} - h_{,0 i} + h_{0 0, 00 } + h_{0 i, 00 } + h_{00, 00} + h_{,00} = -h_{00,i 0} - h_{k \beta,0 k} -h_{,0i} [/itex]
    Does this prove the point? I am not sure, because I expected that there would be no time derivatives.
     
  11. Mar 13, 2013 #10

    TSny


    That's getting close. There will still be an overall factor of 1/2 for the right hand side, but that's not important in what you want to show. Also, you have left out terms of the form ##\eta_{0 \beta} h^{i0}_{,i0}##. But they involve only first order derivatives in time and are not important in what you want to show.
    In going from the first to second line in the quote above, it looks like you have treated ##\beta## as a summation variable on the right. But ##\beta## is a fixed index. What you want to do is write out the first line in the quote above for ##\beta = 0## and show that all second order time derivatives cancel. Then you want to repeat for the case where ##\beta = j## for a fixed spatial index ##j##.
     
  12. Mar 13, 2013 #11
    These second-order derivatives are just [itex] \partial_{0}\partial_{0} := \partial^{2}_{0} [/itex] and don't include [itex] \partial_{something} \partial_{0} \equiv \partial_{0} \partial_{something} [/itex]?

    Somewhat off-topic: I got a bit confused about the following: the metric is [itex] h_{\mu \nu} [/itex], the inverse is [itex] h^{\mu \nu}=\frac{1}{h_{\mu \nu}} [/itex], and the determinant is [itex] h = det(h) [/itex]. But for some reason I got the impression that [itex] h [/itex] is used for the trace too. Is that because I assume it is diagonal, or do I have to read more closely?
     
  13. Mar 13, 2013 #12

    TSny


    Right, as long as "something" is something other than 0.
    Sorry for being pedantic, but the metric is ##g_{\mu\nu} = \eta_{\mu\nu} + h_{\mu\nu}##. So ##h_{\mu\nu}## is only part of the metric.
    The inverse of the matrix of ##h_{\mu\nu}## is not the matrix ##h^{\mu\nu}##. You are probably thinking of the metric matrix ##g_{\mu\nu}##, whose inverse is the matrix ##g^{\mu\nu}##. But that doesn't apply to ##h_{\mu\nu}##. Also, the inverse of the matrix ##h_{\mu\nu}## is not the matrix with elements ##1/h_{\mu\nu}##, unless ##h_{\mu\nu}## is diagonal.
    Here, ##h## represents the trace, not the determinant.
     
  14. Mar 17, 2013 #13
    Yes, that was the hardest for me to understand. Thanks.

    My mistake I.

    My mistake II.

    Mistakes I and II are because I confused [itex] g_{\mu \nu }[/itex] and [itex] h_{\mu \nu} [/itex]. Thanks for pointing these things out to me. A follow-up mistake is that I cannot apply the Jacobi formula for the derivative of a determinant to the trace [itex] h [/itex].

    [itex] 2G_{00} = h^{\gamma}_{0,0\gamma} + h^{\gamma}_{0,0\gamma} - h_{,00} - h^{,\gamma}_{00,\gamma} - \eta_{00} h^{\gamma \sigma}_{,\gamma \sigma} + \eta_{00} h^{,\gamma}_{,\gamma} = -2h_{\gamma 0,0 \gamma} - h_{,00} + h_{00,\gamma \gamma} + h_{\gamma \sigma, \gamma \sigma } + h_{,\gamma \gamma} = \\
    -2h_{00,00} - h_{,00} + h_{00,00} + h_{00,00 } + h_{,00} +.... = 0 + ....[/itex]
    [itex] 2G_{0k} = h^{\gamma}_{0,k\gamma} + h^{\gamma}_{k,0\gamma} - h_{,0k} - h^{,\gamma}_{0k,\gamma} - \eta_{0k} h^{\gamma \sigma}_{,\gamma \sigma} + \eta_{0k} h^{,\gamma}_{,\gamma} = -h_{\gamma 0,k\gamma} -h_{\gamma k,0\gamma} - h_{,0k} + h_{0k,\gamma \gamma} - 0 + 0 = \\
    -h_{0k,00} + h_{0k,00} + ....= 0 + ....[/itex]
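The cancellation worked out above can also be confirmed numerically. The sketch below (my own illustration) stores the constant second derivatives of a quadratic ##h_{\mu\nu}## as ##D[m,n,a,b] = h_{mn,ab}##, builds the linearized ##G_{\mu\nu}##, and checks that ##G_{00}## and ##G_{0i}## are insensitive to the pure ##{}_{,00}## components while the spatial components ##G_{ij}## are not:

```python
import numpy as np

# Check: the linearized G_{0 nu} contains no second TIME derivatives of h.
# Represent the (constant) second derivatives as D[m,n,a,b] = h_{mn,ab}.
rng = np.random.default_rng(4)
eta = np.diag([-1.0, 1.0, 1.0, 1.0])  # Minkowski metric, its own inverse

def linearized_G(D):
    # R_{abmn} = (h_{an,bm} + h_{bm,an} - h_{am,bn} - h_{bn,am}) / 2
    Riem = 0.5 * (np.einsum('anbm->abmn', D) + np.einsum('bman->abmn', D)
                  - np.einsum('ambn->abmn', D) - np.einsum('bnam->abmn', D))
    Ricci = np.einsum('am,abmn->bn', eta, Riem)   # R_{bn} = eta^{am} R_{abmn}
    Rscal = np.einsum('bn,bn->', eta, Ricci)      # R = eta^{bn} R_{bn}
    return Ricci - 0.5 * eta * Rscal              # G_{bn}

D = rng.normal(size=(4, 4, 4, 4))
D = D + D.transpose(1, 0, 2, 3)       # h_{mn} symmetric
D = D + D.transpose(0, 1, 3, 2)       # partial derivatives commute

# Perturb ONLY the pure second-time-derivative slot h_{mn,00}:
P = np.zeros_like(D)
S = rng.normal(size=(4, 4))
P[:, :, 0, 0] = S + S.T

G1, G2 = linearized_G(D), linearized_G(D + P)
print(np.allclose(G1[0], G2[0]))      # True: G_{00}, G_{0i} unaffected
print(np.allclose(G1[1:], G2[1:]))    # False: G_{ij} does contain h_{,00}
```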
     
  15. Mar 17, 2013 #14

    TSny


    OK. However, I would quibble with the way you handled a couple of terms even though you got the correct final result.

    For example, note that you wrote ##h^{\gamma}\,_{0,0\gamma} = -h_{\gamma 0, 0\gamma}##. This is not a correct step. ##\gamma## is a summation index which must be written once up and once down. Thus

    ##h^{\gamma}\,_{0,0\gamma} = h^{0}\,_{0,00} + h^{1}\,_{0,01} + h^{2}\,_{0,02} +h^{3}\,_{0,03} = -h_{00,00} + h_{10,01}+ h_{20,02}+ h_{30,03}.##

    Only the first term on the right changes sign when lowering the upper index.

    Otherwise, it looks good.
     