# Lorentz Invariance

1. Feb 5, 2009

1. The problem statement, all variables and given/known data
Show that $$a^{\mu}b_{\mu} \equiv -a^0b^0 + \vec{a} \bullet \vec{b}$$ is invariant under Lorentz transformations.

2. Relevant equations
$$\Lambda^{\mu}{}_{\nu} \equiv \left( \begin{array}{cccc} \gamma & -\beta \gamma & 0 & 0 \\ -\beta \gamma & \gamma & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{array} \right)$$
$$b^0 = -b_0$$

3. The attempt at a solution

So the only way I could get it to be invariant was by performing a Lorentz transformation on each $$a^{\mu}b_{\mu}$$ component, squaring each, negating the first term, and adding them. I don't know why this works, however:

$$\Lambda^{\mu}{}_{\nu}a^{\mu}b_{\mu} = \left( \begin{array}{cccc} \gamma & -\beta \gamma & 0 & 0 \\ -\beta \gamma & \gamma & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{array} \right) \left( \begin{array}{c} a^0b_0 \\ a^1b_1 \\ a^2b_2 \\ a^3b_3 \end{array} \right) = \left( \begin{array}{c} \gamma (a^0b_0 - \beta a^1b_1) \\ \gamma (a^1b_1 - \beta a^0b_0) \\ a^2b_2 \\ a^3b_3 \end{array} \right)$$
Squaring each term, negating first term squared, and summing:
$$\gamma^2(-(a^0b_0)^2-\beta^2 (a^1b_1)^2 + 2\beta a^1b_1a^0b_0 + (a^1b_1)^2+\beta^2 (a^0b_0)^2 - 2\beta a^1b_1a^0b_0) + (a^2b_2)^2 + (a^3b_3)^2$$
The doubled terms cancel, and the others can be grouped so that they are multiplied by $$1-\beta^2$$, which cancels the $$\gamma^2$$, leaving the original expression again, albeit with the terms squared.

Any help would be great!

2. Feb 6, 2009

### malawi_glenn

$$\Lambda^{\mu}{}_{\nu}a^{\mu}b_{\mu}$$ is nonsense; be careful about how many times each index shows up!

And $$a^{\mu}b_{\mu} \equiv -a^0b^0 + \vec{a} \cdot \vec{b}$$ is not a vector, as you have indicated.

The quantity is a scalar; there is no free index to boost, hence it is invariant under Lorentz transformations.

3. Feb 6, 2009

Ok, so I guess it should have been $$a'^{\mu}b'_{\mu} = \Lambda^{\mu}{}_{\nu}a^{\nu}b_{\nu}$$. I'm aware that the initial expression is a scalar; I was merely arranging it as a vector so I could transform it easily. I then squared each component, negated the first, and summed them. I guess I don't really understand what you mean by boosting an index.

4. Feb 6, 2009

### malawi_glenn

but now you are transforming a, not (ab)...

turning it into a vector? how can you turn a scalar into one vector?

What you can do, I think, if you don't use the argument that it is a scalar with no free index for a LT to act on, is combine a transformed 'a' with a transformed 'b'.

I mean, you are asked to show that a Lorentz scalar is invariant under a Lorentz transformation; there is nothing to be done. Who gave you this 'silly' task?

5. Feb 6, 2009

6. Feb 6, 2009

### malawi_glenn

what they did is

$$(dS')^2 = x^{\mu '} x_{\mu '} = x^{\mu }x_{\nu}\Lambda ^{\mu '}{}_\mu \Lambda ^{\nu }{}_{\mu '} = x^{\mu }x_{\nu}\delta ^\nu {}_\mu = x^{\mu } x_{\mu } = (dS)^2$$

Since we have the following identity:
$$\Lambda ^{\mu}{}_\sigma\Lambda ^{\sigma }{}_{\nu} = \delta ^\mu {}_\nu$$

now try the same with a and b
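The cancellation above rests on a Lorentz transformation composed with its inverse giving the Kronecker delta. Here is a quick pure-Python numerical check of that identity for the x-boost from the problem statement; the value $$\beta = 0.6$$ is illustrative, not from the thread:

```python
import math

def boost(beta):
    """4x4 x-boost matrix Lambda^mu_nu for velocity beta (units with c = 1)."""
    g = 1.0 / math.sqrt(1.0 - beta ** 2)  # gamma factor
    return [[g, -beta * g, 0.0, 0.0],
            [-beta * g, g, 0.0, 0.0],
            [0.0, 0.0, 1.0, 0.0],
            [0.0, 0.0, 0.0, 1.0]]

def matmul(A, B):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

beta = 0.6
# The inverse of a boost by beta is a boost by -beta.
product = matmul(boost(beta), boost(-beta))
# Up to floating-point rounding, product is the 4x4 identity matrix,
# i.e. the Kronecker delta.
```

Reading off `product` entrywise gives the identity matrix, which is the statement $$\Lambda ^{\mu}{}_{\mu '} \Lambda ^{\mu'}{}_\nu = \delta ^\mu {}_\nu$$ in components.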

Last edited: Feb 6, 2009
7. Feb 6, 2009

### Ben Niehoff

The problem is asking you to transform each vector individually, and then show that the dot product is still the same.

The page you linked to shows the longest, most drawn-out way to do this problem. Try doing it in matrix notation. Remember that

$$a_{\mu}b^{\mu} = a^Tb$$

and

$$(AB)^T = B^TA^T$$
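In matrix notation the whole proof reduces to one fact: with the metric $$\eta = \mathrm{diag}(-1,1,1,1)$$, a boost satisfies $$\Lambda^T \eta \Lambda = \eta$$, so $$a'^T \eta b' = a^T \Lambda^T \eta \Lambda b = a^T \eta b$$ by the transpose rule above. A pure-Python sketch checking that fact numerically ($$\beta = 0.5$$ is an illustrative value):

```python
import math

def boost(beta):
    """4x4 x-boost matrix for velocity beta (c = 1)."""
    g = 1.0 / math.sqrt(1.0 - beta ** 2)
    return [[g, -beta * g, 0.0, 0.0],
            [-beta * g, g, 0.0, 0.0],
            [0.0, 0.0, 1.0, 0.0],
            [0.0, 0.0, 0.0, 1.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transpose(A):
    return [list(row) for row in zip(*A)]

# Metric with signature (-,+,+,+), matching the thread's conventions.
eta = [[-1.0, 0.0, 0.0, 0.0],
       [0.0, 1.0, 0.0, 0.0],
       [0.0, 0.0, 1.0, 0.0],
       [0.0, 0.0, 0.0, 1.0]]

L = boost(0.5)
check = matmul(transpose(L), matmul(eta, L))
# check equals eta up to rounding, so a'^T eta b' = a^T eta b for any a, b.
```

This is exactly the $$\gamma^2(1-\beta^2) = 1$$ cancellation from the original post, packaged once and for all into the matrix identity.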

8. Feb 6, 2009

Thanks for all the help, but I'm still confused.

I don't see how I can assume that $$\Lambda^{\mu}{}_{\nu}\Lambda^{\sigma}{}_{\mu} = \delta^{\sigma}{}_{\nu}$$. I could see how I could assume this if what I'm trying to prove were true, but I can't assume it while trying to prove it. How am I supposed to transform a covariant 4-vector? If I transform the contravariant and covariant vectors with a Lorentz transformation and dot them, I don't get back the original expression. I did this out explicitly with matrix multiplication, since I'm not too comfortable with the notation just yet.

9. Feb 6, 2009

### malawi_glenn

$$\Lambda^{\mu}{}_{\nu}\Lambda^{\sigma}{}_{\mu} = \delta^{\sigma}{}_{\nu}$$

This should be covered in your course; otherwise it is not a good course in SR.

Consider a LT which takes x to x' followed by the inverse transformation: the LT which takes x' back to x.

First LT:

$$x^{\mu'} = \Lambda ^{\mu'} {}_\mu x^{\mu} = ( \mu \text{ is dummy index }) = \Lambda ^{\mu'} {}_\nu x^{\nu}$$

The inverse LT:

$$x^{\mu} = \Lambda ^{\mu}{} _{\mu '} x^{\mu '} \Rightarrow$$
$$x^{\mu} = \Lambda ^{\mu}{} _{\mu '} \Lambda ^{\mu'} {}_\nu x^{\nu} = \delta ^\mu {}_\nu x^\nu$$

Now I was perhaps TOO nice

The covariant vector transforms as the inverse of a contravariant:
$$v_{\alpha '} = \Lambda ^{\alpha }{}_{\alpha '} v_{\alpha }$$

Are you telling me that you are learning SR in a course where this has not been introduced? =/

All one needs to do is keep track of the indices: one up and one down contract, and so on.

Last edited: Feb 6, 2009
10. Feb 6, 2009

### Ben Niehoff

To do it algebraically, you need to remember that

$$\gamma = \frac{1}{\sqrt{1-\beta^2}}$$

You will probably get expressions that look different, but will turn out to be equal using the above.

To transform a covariant 4-vector, you can always use the metric like so:

$$v_{\mu} = \eta_{\mu\nu} v^{\nu}$$

$$v'_{\mu} = \eta_{\mu\nu} v'^{\nu} = \eta_{\mu\nu} \Lambda^{\nu}{}_{\sigma} v^{\sigma} = \eta_{\mu\nu} \Lambda^{\nu}{}_{\sigma} \eta^{\sigma\lambda} v_{\lambda}$$

Then what remains is to find

$$\eta_{\mu\nu} \Lambda^{\nu}{}_{\sigma} \eta^{\sigma\lambda}$$

which shouldn't be hard.
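That remaining product can also be checked numerically: for this metric $$\eta$$ is its own inverse, and $$\eta \Lambda \eta$$ turns out to flip the sign of $$\beta$$, i.e. it is the inverse boost, which is how a covariant vector transforms. A pure-Python sketch ($$\beta = 0.4$$ is an illustrative value):

```python
import math

def boost(beta):
    """4x4 x-boost matrix for velocity beta (c = 1)."""
    g = 1.0 / math.sqrt(1.0 - beta ** 2)
    return [[g, -beta * g, 0.0, 0.0],
            [-beta * g, g, 0.0, 0.0],
            [0.0, 0.0, 1.0, 0.0],
            [0.0, 0.0, 0.0, 1.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Metric with signature (-,+,+,+); note eta is its own inverse.
eta = [[-1.0, 0.0, 0.0, 0.0],
       [0.0, 1.0, 0.0, 0.0],
       [0.0, 0.0, 1.0, 0.0],
       [0.0, 0.0, 0.0, 1.0]]

beta = 0.4
mixed = matmul(eta, matmul(boost(beta), eta))  # eta * Lambda * eta
inverse_boost = boost(-beta)
# mixed matches boost(-beta) entrywise: lowering both slots of the boost
# with the metric yields the inverse transformation.
```

Since $$\eta$$ only flips the sign of the time row and time column, the off-diagonal $$-\beta\gamma$$ entries change sign while the diagonal is untouched, giving $$\Lambda(-\beta)$$.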

Last edited: Feb 6, 2009
11. Feb 6, 2009

Ok, I think I've got it! My professor didn't explain it quite like that; the way you put it makes much more sense.
So this is what I've got:

$$a'^{\mu} = \Lambda^{\mu}{}_{\nu}a^{\nu} \text{ and } b'_{\mu} = \Lambda^{\mu}_{\nu}b_{\nu}$$
$$a'^{\mu}b'_{\mu} = \Lambda^{\mu}{}_{\nu}\Lambda^{\nu}_{\sigma}a^{\nu}b_{\sigma} = \delta^{\mu}{}_{\sigma}a^{\nu}b_{\sigma} = a^{\mu}b_{\mu}$$
One question: the second part of the question asks to show how the first part implies that the covariant transforms as the inverse. So it seems I'm assuming this for the first part. Is there any way I could avoid this?

12. Feb 6, 2009

### malawi_glenn

ok, I'll give you a hint:

$$a^\mu b_\mu = a^{\mu '} b_{\mu '}$$, since that is what you want to prove.

Now you know that $$\Lambda ^{\mu}{}_{\mu '}\Lambda ^{\mu ' }{}_{\nu} = \delta ^\mu {}_{\nu}$$

So just consider the contravariant vector a, insert this delta, and then you'll find that ab is invariant only if the covariant vector b transforms according to the inverse LT.

Last edited: Feb 6, 2009
13. Feb 6, 2009

### malawi_glenn

That is wrong: you have 3 $$\nu$$'s, which is complete nonsense, and also 2 $$\sigma$$'s as a covariant index... be careful!

14. Feb 6, 2009

### malawi_glenn

You will probably love this text:

http://www.teorfys.uu.se/people/minahan/Courses/SR/tensors.pdf [Broken]

Last edited by a moderator: May 4, 2017
15. Feb 6, 2009

Thanks for that last link, it has cleared up a whole mess of confusion. I think I finally have it this time:
$$a^{\mu '} = \Lambda^{\mu '}{}_{\mu}a^{\mu}$$
$$b_{\mu '} = \Lambda^{\mu}{}_{\mu '}b_{\mu}$$
Since it's covariant, it transforms with the inverse matrix.

Then:
$$a^{\mu '}b_{\mu '} = \Lambda^{\mu '}{}_{\mu}a^{\mu}\Lambda^{\nu}{}_{\mu '}b_{\nu} = \delta^{\nu}{}_{\mu}a^{\mu}b_{\nu} = a^{\mu}b_{\mu}$$
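As a numerical sanity check of this result, one can boost sample components of $$a^{\mu}$$ with $$\Lambda$$ and of $$b_{\mu}$$ with the inverse (for a boost matrix, the inverse is just $$\beta \to -\beta$$) and compare the contractions. The four-vector components and $$\beta = 0.3$$ below are made up for illustration:

```python
import math

def boost(beta):
    """4x4 x-boost matrix for velocity beta (c = 1)."""
    g = 1.0 / math.sqrt(1.0 - beta ** 2)
    return [[g, -beta * g, 0.0, 0.0],
            [-beta * g, g, 0.0, 0.0],
            [0.0, 0.0, 1.0, 0.0],
            [0.0, 0.0, 0.0, 1.0]]

def apply(M, v):
    """Contract a 4x4 matrix with a 4-component vector of numbers."""
    return [sum(M[i][j] * v[j] for j in range(4)) for i in range(4)]

a_up = [1.0, 2.0, 3.0, 4.0]    # sample contravariant components a^mu
b_down = [5.0, 6.0, 7.0, 8.0]  # sample covariant components b_mu

beta = 0.3
a_up_prime = apply(boost(beta), a_up)       # a'^mu = Lambda^mu'_mu a^mu
b_down_prime = apply(boost(-beta), b_down)  # b'_mu = (inverse Lambda) acting on b
# (the x-boost matrix is symmetric, so applying the transpose of the
# inverse is the same as applying the inverse itself)

dot = sum(x * y for x, y in zip(a_up, b_down))
dot_prime = sum(x * y for x, y in zip(a_up_prime, b_down_prime))
# dot_prime agrees with dot up to rounding: a^mu b_mu is Lorentz invariant.
```

The two contractions agree to floating-point precision, mirroring the index calculation: the boost acting on a and the inverse boost acting on b collapse to the Kronecker delta.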

Thanks so much for your help.

16. Feb 6, 2009

### malawi_glenn

Now it looks much better! Good job! I hope you understood that you can also start from the delta and then show that the quantity ab is invariant under a LT only if the covariant b transforms as the inverse LT.