SR invariance, index notation algebra, ds^{2}

  1. Oct 15, 2016 #1
    I am following some lecture notes looking at the invariance of the interval under a Poincaré transformation acting on flat space-time with the Minkowski metric:

    ##x'^{u} = \Lambda^{u}{}_{a} x^{a} + a^{u}## [1], where ##a^{u}## is a constant vector and ##\Lambda^{u}{}_{a}## is such that it leaves the Minkowski metric ##g_{ab}## invariant.

    By invariance I have:

    ##ds^{2}=g_{uv}dx^{u}dx^{v}=ds'^{2}=dx'^{u}dx'^{v}g_{u'v'}## [2]

    from [1], ##dx'^{u} = \Lambda^{u}{}_{a} dx^{a}##

    Plugging this into [2] I have:

    ##ds^{2}=g_{uv}dx^{u}dx^{v}=g_{uv}\Lambda^{u}{}_{m} dx^{m} \Lambda^{v}{}_{n} dx^{n}##

    and then we can simply cancel ##g_{uv}##. This was my working, and I can't see where it is flawed.

    I am also able to follow the lecture notes, where the only difference is the choice of indices, and get a different expression as follows:


    ##g_{uv}dx^{u}dx^{v}=g_{ab}dx'^{a}dx'^{b}##

    ##=g_{ab}\Lambda^{a}{}_{\theta}dx^{\theta} \Lambda^{b}{}_{\phi} dx^{\phi}##

    Now I rename ##\theta=u, \phi=v## and get

    ##g_{uv}=g_{ab}\Lambda^{a}{}_{u} \Lambda^{b}{}_{v}##, as in my lecture notes.
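
    For concreteness, the interval invariance can be checked numerically, with an illustrative boost along ##x## standing in for ##\Lambda## (a minimal numpy sketch):

    Code:
    import numpy as np

    # Minkowski metric, signature (-,+,+,+), and a boost along x with rapidity 0.5
    g = np.diag([-1.0, 1.0, 1.0, 1.0])
    ch, sh = np.cosh(0.5), np.sinh(0.5)
    L = np.array([[ ch, -sh, 0.0, 0.0],
                  [-sh,  ch, 0.0, 0.0],
                  [0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])

    # an arbitrary displacement dx^u and its transform dx'^u = Λ^u_a dx^a
    dx = np.array([1.0, 0.3, -0.7, 2.0])
    dx_p = L @ dx

    # ds^2 = g_{uv} dx^u dx^v, with the SAME metric components in both frames
    print(np.isclose(dx @ g @ dx, dx_p @ g @ dx_p))  # True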


    Have I broken any index notation rules in the first one, such that my metrics cancel when they should not? I'm struggling to see where I've gone wrong. I know there is a rule that an index shouldn't appear more than twice, but I thought this applied only within a single multiplicative term on one side of the equation, not across separate terms...


    Many thanks in advance.
     
  3. Oct 15, 2016 #2

    strangerep


    Huh? How did you get that? Did you forget some primes somewhere?

    BTW, in your eqn[2], it may be better to write ##g'_{\mu\nu}## rather than ##g_{\mu'\nu'}##.

    What do you mean by "we can simply cancel ##g_{uv}##"? You can't "cancel" ##g_{uv}## here.
     
  4. Oct 16, 2016 #3
    I had ##g_{uv}## twice, without using any primes, because I changed from one frame to the other, consistent with my comment in the original post that the Minkowski metric is unchanged. (Pretty sure this is what my lecturer said...)

    Since it appears on both sides of the equation?
    Oh, thinking of the metric tensor as a matrix, you can't just do this, can you...?
     
  5. Oct 16, 2016 #4

    strangerep


    I'm pretty sure what your lecturer meant is something like: $$\Lambda^T g' \Lambda ~=~ g ~,$$(where I've switched to matrix notation instead of indices).

    I.e., you're supposed to apply the ##\Lambda## transformation to the metric, and then find out that the components of the metric are in fact unchanged if ##\Lambda## satisfies the Lorentz condition ##\Lambda^T g \Lambda = g## (i.e., if ##\Lambda## is pseudo-orthogonal with respect to ##g##).

    That's not the reason. Rather, it's because you're performing a double summation over the indices. (Write out the equation using explicit ``##\Sigma##'' summation notation if this isn't clear.)
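
    For instance, writing the two-dimensional case out with explicit sums (a small sympy sketch, with a generic symmetric metric):

    Code:
    import sympy as sp

    # 2-D case for brevity: components of dx and a generic symmetric metric g
    dx = sp.symbols('dx0 dx1')
    g = sp.Matrix(2, 2, lambda i, j: sp.Symbol(f'g_{min(i, j)}{max(i, j)}'))

    # ds^2 is an explicit double sum over BOTH indices: a single scalar, no free index left
    ds2 = sum(g[u, v] * dx[u] * dx[v] for u in range(2) for v in range(2))
    print(sp.expand(ds2))
    # -> g_00*dx0**2 + 2*g_01*dx0*dx1 + g_11*dx1**2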
     
  6. Oct 17, 2016 #5

    My lecturer just stated, as I did above, that the Minkowski metric is unchanged.

    However, I do see that from my incorrect expression in the OP, by renaming indices ##u \leftrightarrow m##, ##v \leftrightarrow n##, I obtain the result in the lectures. So it was just a matter of renaming dummy indices and not 'cancelling' the ##g_{uv}##...?

    Isn't matrix multiplication a summation? Hence, representing the tensors as matrices, it is the same argument as what you've just said?
     
  7. Oct 17, 2016 #6

    strangerep


    You're making me try to guess what's in your mind, which is rarely reliable. Perhaps you should write it out again here (as equations) properly so I can be sure.

    To "cancel" a matrix from both sides of an equation, you must multiply both sides by the inverse matrix. To do that, you need a free index. E.g., suppose you have an equation like:$$ A^\mu_{~\nu} v^\nu ~=~ z^\mu ~,$$where ##A## is an invertible matrix and ##v,z## are (column) vectors. Then you can multiply both sides by the inverse matrix ##A^{-1}##, as follows:$$(A^{-1})^\lambda_{~\mu} A^\mu_{~\nu} v^\nu ~=~ (A^{-1})^\lambda_{~\mu} z^\mu ~,$$$$\delta^\lambda_{~\nu} v^\nu ~=~ (A^{-1})^\lambda_{~\mu} z^\mu ~,$$$$ v^\lambda ~=~ (A^{-1})^\lambda_{~\mu} z^\mu ~.$$ But to do that, we needed a free index on both sides (in this case "##\mu##"). But in the expressions for ##ds^2## there are no free indices, only dummy ones.
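
    A quick numerical illustration of that step (a numpy sketch; ##A## here is just a generic invertible matrix):

    Code:
    import numpy as np

    # z = A v for a generic invertible matrix A
    rng = np.random.default_rng(0)
    A = rng.normal(size=(4, 4))   # a random matrix is generically invertible
    v = rng.normal(size=4)
    z = A @ v

    # "cancelling" A really means contracting with A^{-1} over the free index
    v_recovered = np.linalg.inv(A) @ z
    print(np.allclose(v_recovered, v))   # True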
     
  8. Oct 21, 2016 #7
    Okay, so literally ##u \leftrightarrow m## and ##n \leftrightarrow v##.
    So instead of ##g_{uv}dx^{u}dx^{v}=g_{uv}\Lambda^{u}{}_{m}dx^{m}\Lambda^{v}{}_{n}dx^{n}##
    I have ##g_{uv}dx^{u}dx^{v}=g_{mn}\Lambda^{m}{}_{u}dx^{u}\Lambda^{n}{}_{v}dx^{v}##
    Then, as my lecture notes did, the ##dx^{u} dx^{v}## cancel from each side of the equation to give the result I was after.

    HOWEVER, I've just realised I have no idea why/how you can cancel the ##dx^{u} dx^{v}##. Since, as you said, the expression contains only dummy indices, the ##dx^{u}## and ##dx^{v}## are four-vectors being summed over together with ##\Lambda## and ##g## on the RHS, whereas they're being summed over with just ##g## on the LHS, so surely you can't just cancel...? I would have thought the exact same reason just discussed above for why ##g## doesn't cancel holds here...
     
  9. Oct 21, 2016 #8

    Ibix


    You don't cancel them. But the expression must be true for arbitrary ##dx^u## and ##dx^v##. The only way that can be true is if the coefficients of each term are the same, which gives you the result that looks like you cancelled the infinitesimal vectors.
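
    That coefficient-matching argument can be made concrete (a small sympy sketch in two dimensions, assuming symmetric coefficient matrices):

    Code:
    import sympy as sp

    dx0, dx1 = sp.symbols('dx0 dx1')
    # two generic symmetric coefficient matrices (2-D for brevity)
    M = sp.Matrix(2, 2, lambda i, j: sp.Symbol(f'M_{min(i, j)}{max(i, j)}'))
    N = sp.Matrix(2, 2, lambda i, j: sp.Symbol(f'N_{min(i, j)}{max(i, j)}'))
    dx = sp.Matrix([dx0, dx1])

    # demand M_ab dx^a dx^b = N_ab dx^a dx^b for ARBITRARY dx:
    diff = sp.expand((dx.T * (M - N) * dx)[0])
    # every polynomial coefficient in dx0, dx1 must vanish separately
    eqs = sp.Poly(diff, dx0, dx1).coeffs()
    print(sp.solve(eqs, [M[0, 0], M[0, 1], M[1, 1]]))
    # -> {M_00: N_00, M_01: N_01, M_11: N_11}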
     
  10. Oct 21, 2016 #9
    Ahh, of course! Thank you!
     
  11. Oct 21, 2016 #10

    strangerep


    @binbagsss : I sense that you're still not getting it properly. So I guess I'll have to lay it out for you...

    ##ds## is a scalar, hence independent of the coordinate system you're using, hence ##ds^2 ~=~ ds'^2##. Writing out both sides of this equation in full, we have:$$ g_{\mu\nu} dx^\mu dx^\nu ~=~ g'_{\mu\nu} dx'^\mu dx'^\nu ~~~~~~ [1]~,$$where the primes denote components of quantities referred to the primed coordinate system, etc.

    The coordinate transformation between the dx's is ##dx'^\beta = \Lambda^\beta_{~\alpha} dx^\alpha##. Applying this to the RHS of eqn[1] gives $$g_{\mu\nu} dx^\mu dx^\nu ~=~ g'_{\mu\nu} \Lambda^\mu_{~\alpha} dx^\alpha \Lambda^\nu_{~\beta}dx^\beta ~~~~~~ [2] ~.$$ Rearranging the RHS, and renaming dummy indices on the LHS, this becomes $$g_{\alpha\beta} dx^\alpha dx^\beta ~=~ \Lambda^\mu_{~\alpha} g'_{\mu\nu} \Lambda^\nu_{~\beta}dx^\alpha dx^\beta ~~~~~~ [3] ~,$$ which implies $$\left( g_{\alpha\beta} - \Lambda^\mu_{~\alpha} g'_{\mu\nu} \Lambda^\nu_{~\beta}\right) dx^\alpha dx^\beta ~=~ 0 ~~~~~~ [4] ~.$$ Now, Ibix's point in post #8 kicks in, allowing us to deduce: $$g_{\alpha\beta} - \Lambda^\mu_{~\alpha} g'_{\mu\nu} \Lambda^\nu_{~\beta} ~=~ 0 ~~~~~~ [5] ~.$$Notice that at this stage we still have a primed ##g'_{\mu\nu}##, since we've only considered an arbitrary coordinate transformation so far.
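
    These steps can be checked numerically for a completely generic (non-Lorentz) ##\Lambda##, where ##g'## genuinely differs from ##g## (a numpy sketch):

    Code:
    import numpy as np

    g = np.diag([-1.0, 1.0, 1.0, 1.0])   # Minkowski metric, (-,+,+,+)
    rng = np.random.default_rng(1)
    L = rng.normal(size=(4, 4))           # generic, NOT a Lorentz transformation
    Linv = np.linalg.inv(L)

    # eqn [5] rearranged: the primed metric components are g' = (Λ^{-1})^T g Λ^{-1}
    g_p = Linv.T @ g @ Linv
    print(np.allclose(L.T @ g_p @ L, g))  # True: eqn [5] holds for any invertible Λ

    # ds^2 is invariant even though the metric components have changed
    dx = rng.normal(size=4)
    dx_p = L @ dx
    print(np.isclose(dx @ g @ dx, dx_p @ g_p @ dx_p))  # True
    print(np.allclose(g_p, g))                         # False: g' ≠ g in general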

    If we now also demand that the transformation be such that the metric components are left unchanged, i.e., if we require ##g_{\alpha\beta} = g'_{\alpha\beta}##, then eqn[5] implies a condition on the matrix ##\Lambda##. I'll leave it to you to deduce that this condition is equivalent to ##\Lambda^T g \Lambda = g##, i.e., ##\Lambda^{-1} = g^{-1}\Lambda^T g## (which is precisely the defining condition of the usual Lorentz transformations).
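
    As a numerical check of that final condition, with an illustrative boost along ##x## (a numpy sketch):

    Code:
    import numpy as np

    g = np.diag([-1.0, 1.0, 1.0, 1.0])
    phi = 1.2                              # rapidity of a boost along x
    ch, sh = np.cosh(phi), np.sinh(phi)
    L = np.array([[ ch, -sh, 0.0, 0.0],
                  [-sh,  ch, 0.0, 0.0],
                  [0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])

    print(np.allclose(L.T @ g @ L, g))                                # True: Λ^T g Λ = g
    print(np.allclose(np.linalg.inv(L), np.linalg.inv(g) @ L.T @ g))  # True: Λ^{-1} = g^{-1} Λ^T g
    print(np.allclose(L.T, np.linalg.inv(L)))                         # False: not Euclidean-orthogonal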
     