SR invariance: index notation algebra for ##ds^{2}##

binbagsss
I am following some lecture notes on the invariance of ##ds^2## under a Poincaré transformation acting on flat space-time with the Minkowski metric:

##x'^{\mu} = \Lambda^{\mu}{}_{\alpha} x^{\alpha} + a^{\mu}## [1], where ##a^{\mu}## is a constant vector and ##\Lambda^{\mu}{}_{\nu}## is such that it leaves the Minkowski metric ##g_{\mu\nu}## invariant.

By invariance I have:

##ds^{2}=g_{\mu\nu}dx^{\mu}dx^{\nu}=ds'^{2}=g_{\mu'\nu'}dx'^{\mu}dx'^{\nu}## [2]

from [1], ##dx'^{\mu} = \Lambda^{\mu}{}_{\alpha}\,dx^{\alpha}##.

Plugging this into [2] I have:

##ds^{2}=g_{\mu\nu}dx^{\mu}dx^{\nu}=g_{\mu\nu}\Lambda^{\mu}{}_{\rho}\,dx^{\rho}\,\Lambda^{\nu}{}_{\sigma}\,dx^{\sigma}##

and then we can simply cancel ##g_{\mu\nu}##. This was my working, and I can't see where it is flawed.

I am also able to follow the lecture notes, where the only difference is the choice of indices, and get a different expression, as follows:


##g_{\mu\nu}dx^{\mu}dx^{\nu}=g_{\alpha\beta}dx'^{\alpha}dx'^{\beta}##

##=g_{\alpha\beta}\Lambda^{\alpha}{}_{\theta}\,dx^{\theta}\,\Lambda^{\beta}{}_{\phi}\,dx^{\phi}##

Now I rename ##\theta=\mu##, ##\phi=\nu## and get

##g_{\mu\nu}=g_{\alpha\beta}\Lambda^{\alpha}{}_{\mu}\Lambda^{\beta}{}_{\nu}##, as in my lecture notes.


Have I broken any index notation rules in the first one, such that my metrics cancel when they should not? I'm struggling to see where I've gone wrong. I know there is a rule that an index shouldn't appear more than twice, but I thought this applied only on the same side of the equation and within the same term, i.e. a 'multiplicative' term, not across separate summed terms...


Many thanks in advance.
 

strangerep

Plugging this into [2] I have:

##ds^{2}=g_{\mu\nu}dx^{\mu}dx^{\nu}=g_{\mu\nu}\Lambda^{\mu}{}_{\rho}\,dx^{\rho}\,\Lambda^{\nu}{}_{\sigma}\,dx^{\sigma}##
Huh? How did you get that? Did you forget some primes somewhere?

BTW, in your eqn[2], it may be better to write ##g'_{\mu\nu}## rather than ##g_{\mu'\nu'}##.

and then we can simply cancel ##g_{\mu\nu}##. This was my working, and I can't see where it is flawed.
What do you mean by "we can simply cancel ##g_{\mu\nu}##"? You can't "cancel" ##g_{\mu\nu}## here.
 
binbagsss
Huh? How did you get that? Did you forget some primes somewhere?

BTW, in your eqn[2], it may be better to write ##g'_{\mu\nu}## rather than ##g_{\mu'\nu'}##.
I had ##g_{\mu\nu}## twice, without using any primes, as I changed from one frame to the other, consistent with my comment in the original post that the Minkowski metric is unchanged. (Pretty sure this is what my lecturer said...)

What do you mean by "we can simply cancel ##g_{\mu\nu}##"? You can't "cancel" ##g_{\mu\nu}## here.
Since it appears on both sides of the equation?
Oh, thinking of the metric tensor as a matrix, you can't just do this, can you...?
 

strangerep

I had ##g_{\mu\nu}## twice, without using any primes, as I changed from one frame to the other, consistent with my comment in the original post that the Minkowski metric is unchanged. (Pretty sure this is what my lecturer said...)
I'm pretty sure what your lecturer meant is something like: $$\Lambda^T g' \Lambda ~=~ g ~,$$(where I've switched to matrix notation instead of indices).

I.e., you're supposed to apply the ##\Lambda## transformation to the metric, and then find out that in fact the components of the metric are unchanged if ##\Lambda## is pseudo-orthogonal with respect to the metric (i.e., if ##\Lambda^T g \Lambda = g##, the Minkowski analogue of the Euclidean orthogonality condition ##\Lambda^T = \Lambda^{-1}##).
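
For concreteness, here is a quick numerical sanity check (a minimal numpy sketch of my own, assuming a standard boost along ##x## and signature ##(-,+,+,+)##):

```python
import numpy as np

# Minkowski metric, signature (-,+,+,+)
g = np.diag([-1.0, 1.0, 1.0, 1.0])

# A boost along x with rapidity phi (so gamma = cosh(phi))
phi = 0.7
L = np.eye(4)
L[0, 0] = L[1, 1] = np.cosh(phi)
L[0, 1] = L[1, 0] = -np.sinh(phi)

# Lambda^T g Lambda reproduces g: the metric components are unchanged.
print(np.allclose(L.T @ g @ L, g))  # True
```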

Since it appears on both sides of the equation?
Oh, thinking of the metric tensor as a matrix, you can't just do this, can you...?
That's not the reason. Rather, it's because you're performing a double summation over the indices. (Write out the equation using explicit "##\Sigma##" summation notation if this isn't clear.)
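
To see numerically why a fully contracted metric cannot be stripped from both sides, here is a tiny counterexample (my own illustration, not from the lecture notes): two different matrices ##A \neq B## can satisfy ##g_{\mu\nu}A^{\mu\nu} = g_{\mu\nu}B^{\mu\nu}##, because each side is just a single number produced by the double sum.

```python
import numpy as np

g = np.eye(2)                          # stand-in for g_{mu nu}
A = np.array([[1.0, 0.0], [0.0, 1.0]])
B = np.array([[2.0, 0.0], [0.0, 0.0]])

# g_{mu nu} A^{mu nu} and g_{mu nu} B^{mu nu}: double sums over both indices
print(np.sum(g * A), np.sum(g * B))    # 2.0 2.0 -- equal scalars
print(np.allclose(A, B))               # False  -- yet A != B
```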
 
binbagsss
I'm pretty sure what your lecturer meant is something like: $$\Lambda^T g' \Lambda ~=~ g ~,$$(where I've switched to matrix notation instead of indices).

I.e., you're supposed to apply the ##\Lambda## transformation to the metric, and then find out that in fact the components of the metric are unchanged if ##\Lambda## is pseudo-orthogonal with respect to the metric (i.e., if ##\Lambda^T g \Lambda = g##, the Minkowski analogue of the Euclidean orthogonality condition ##\Lambda^T = \Lambda^{-1}##).


My lecturer just stated, as I did above, that the Minkowski metric is unchanged.

However, I do see that, starting from my incorrect expression in the OP and renaming indices ##\mu \leftrightarrow \rho##, ##\nu \leftrightarrow \sigma##, I obtain the result in the lectures, so it was just a matter of renaming dummy indices and not 'cancelling' the ##g_{\mu\nu}##...?

That's not the reason. Rather, it's because you're performing a double summation over the indices. (Write out the equation using explicit "##\Sigma##" summation notation if this isn't clear.)
Isn't matrix multiplication a summation? Hence, representing the tensors as matrices, isn't it the same argument as what you've just said?
 

strangerep

However, I do see that, starting from my incorrect expression in the OP and renaming indices ##\mu \leftrightarrow \rho##, ##\nu \leftrightarrow \sigma##, I obtain the result in the lectures, so it was just a matter of renaming dummy indices and not 'cancelling' the ##g_{\mu\nu}##...?
You're making me try to guess what's in your mind, which is rarely reliable. Perhaps you should write it out again here (as equations) properly so I can be sure.

Isn't matrix multiplication a summation? Hence, representing the tensors as matrices, isn't it the same argument as what you've just said?
To "cancel" a matrix from both sides of an equation, you must multiply both sides by the inverse matrix. To do that, you need a free index. E.g., suppose you have an equation like:$$ A^\mu_{~\nu} v^\nu ~=~ z^\mu ~,$$where ##A## is an invertible matrix and ##v,z## are (column) vectors. Then you can multiply both sides by the inverse matrix ##A^{-1}##, as follows:$$(A^{-1})^\lambda_{~\mu} A^\mu_{~\nu} v^\nu ~=~ (A^{-1})^\lambda_{~\mu} z^\mu ~,$$$$\delta^\lambda_{~\nu} v^\nu ~=~ (A^{-1})^\lambda_{~\mu} z^\mu ~,$$$$ v^\lambda ~=~ (A^{-1})^\lambda_{~\mu} z^\mu ~.$$ But to do that, we needed a free index on both sides (in this case "##\mu##"). But in the expressions for ##ds^2## there are no free indices, only dummy ones.
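
The same point in numpy form (a minimal sketch of my own; the random matrix is assumed invertible, which is true for generic draws):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))   # generic matrix, assumed invertible
v = rng.normal(size=4)
z = A @ v                     # the equation A^mu_nu v^nu = z^mu

# The free index lets us "cancel" A by applying its inverse:
print(np.allclose(np.linalg.inv(A) @ z, v))  # True: v recovered

# By contrast, a fully contracted scalar like v @ A @ v has no free
# index left, so there is no analogous way to strip A out of it.
```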
 
binbagsss
You're making me try to guess what's in your mind, which is rarely reliable. Perhaps you should write it out again here (as equations) properly so I can be sure.
Okay, so literally ##\mu \leftrightarrow \rho## and ##\nu \leftrightarrow \sigma##.
So instead of ##g_{\mu\nu}dx^{\mu}dx^{\nu}=g_{\mu\nu}\Lambda^{\mu}{}_{\rho}dx^{\rho}\Lambda^{\nu}{}_{\sigma}dx^{\sigma}##
I have ##g_{\mu\nu}dx^{\mu}dx^{\nu}=g_{\rho\sigma}\Lambda^{\rho}{}_{\mu}dx^{\mu}\Lambda^{\sigma}{}_{\nu}dx^{\nu}##.
Then, as my lecture notes did, the ##dx^{\mu}dx^{\nu}## cancel from each side of the equation to give the result I was after.

HOWEVER, I've just realised I have no idea why/how you can cancel the ##dx^{\mu}dx^{\nu}##. Since, as you said, the expression consists only of dummy indices, that is, the ##dx^{\mu}## and ##dx^{\nu}## are four-vectors being summed over with ##\Lambda## and ##g## on the RHS, whereas they're being summed over with just ##g## on the LHS, surely you can't just cancel...? I would have thought the exact same reason just discussed above, for why ##g## doesn't cancel, holds here...
 

Ibix

You don't cancel them. But the expression must be true for arbitrary ##dx^\mu## and ##dx^\nu##. The only way that can be true is if the coefficients of each term are the same, which gives you the result that looks like you cancelled the infinitesimal vectors.
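
To spell that out (a short argument of my own, writing ##M_{\alpha\beta}## for the difference of the two coefficient matrices and ##e_{\mu}## for the coordinate basis directions): if $$M_{\alpha\beta}\,dx^{\alpha}dx^{\beta} ~=~ 0$$ for arbitrary ##dx##, then choosing ##dx = \epsilon\, e_{\mu}## forces ##M_{\mu\mu} = 0## (no sum), and choosing ##dx = \epsilon\,(e_{\mu}+e_{\nu})## then forces ##M_{\mu\nu}+M_{\nu\mu} = 0##. Since both coefficient matrices here are symmetric, so is ##M##, and hence every component ##M_{\mu\nu}## vanishes.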
 
binbagsss
You don't cancel them. But the expression must be true for arbitrary ##dx^\mu## and ##dx^\nu##. The only way that can be true is if the coefficients of each term are the same, which gives you the result that looks like you cancelled the infinitesimal vectors.
Ahh, of course! Thank you!
 

strangerep

@binbagsss : I sense that you're still not getting it properly. So I guess I'll have to lay it out for you...

##ds## is a scalar, hence independent of the coordinate system you're using, hence ##ds^2 ~=~ ds'^2##. Writing out both sides of this equation in full, we have:$$ g_{\mu\nu} dx^\mu dx^\nu ~=~ g'_{\mu\nu} dx'^\mu dx'^\nu ~~~~~~ [1]~,$$where the primes denote components of quantities referred to the primed coordinate system, etc.

The coordinate transformation between the dx's is ##dx'^\beta = \Lambda^\beta_{~\alpha} dx^\alpha##. Applying this to the RHS of eqn[1] gives $$g_{\mu\nu} dx^\mu dx^\nu ~=~ g'_{\mu\nu} \Lambda^\mu_{~\alpha} dx^\alpha \Lambda^\nu_{~\beta}dx^\beta ~~~~~~ [2] ~.$$ Rearranging the RHS, and renaming dummy indices on the LHS, this becomes $$g_{\alpha\beta} dx^\alpha dx^\beta ~=~ \Lambda^\mu_{~\alpha} g'_{\mu\nu} \Lambda^\nu_{~\beta}dx^\alpha dx^\beta ~~~~~~ [3] ~,$$ which implies $$\left( g_{\alpha\beta} - \Lambda^\mu_{~\alpha} g'_{\mu\nu} \Lambda^\nu_{~\beta}\right) dx^\alpha dx^\beta ~=~ 0 ~~~~~~ [4] ~.$$ Now, Ibix's point in post #8 kicks in, allowing us to deduce: $$g_{\alpha\beta} - \Lambda^\mu_{~\alpha} g'_{\mu\nu} \Lambda^\nu_{~\beta} ~=~ 0 ~~~~~~ [5] ~.$$Notice that at this stage we still have a primed ##g'_{\mu\nu}##, since we've only considered an arbitrary coordinate transformation so far.

If we now also demand that the transformation be such that the metric components are left unchanged, i.e., if we require ##g_{\alpha\beta} = g'_{\alpha\beta}##, then eqn[5] implies a condition on the matrix ##\Lambda##. I'll leave it to you to deduce that this condition reads ##\Lambda^T g \Lambda = g## in matrix notation, i.e., pseudo-orthogonality with respect to ##g## (the Minkowski analogue of ##\Lambda^T = \Lambda^{-1}##), which is satisfied by the usual Lorentz transformations.
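
A quick numerical illustration of the distinction (a minimal numpy sketch of my own, assuming signature ##(-,+,+,+)##; the random matrix is assumed invertible, which is true for generic draws): eqn[5] alone fixes ##g'## in terms of ##g## and ##\Lambda##, but does not force ##g' = g##.

```python
import numpy as np

g = np.diag([-1.0, 1.0, 1.0, 1.0])   # Minkowski metric, signature (-,+,+,+)

rng = np.random.default_rng(1)
L = rng.normal(size=(4, 4))          # generic matrix, assumed invertible
L_inv = np.linalg.inv(L)
g_prime = L_inv.T @ g @ L_inv        # solve g = Lambda^T g' Lambda for g'

print(np.allclose(L.T @ g_prime @ L, g))  # True : eqn [5] holds
print(np.allclose(g_prime, g))            # False: the components changed

# Demanding g' = g on top of eqn [5] is therefore a genuine restriction
# on Lambda: the pseudo-orthogonality condition Lambda^T g Lambda = g.
```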
 
