
Lorentz Invariance

1. Homework Statement
Show that [tex]a^{\mu}b_{\mu} \equiv -a^0b^0 + \vec{a} \bullet \vec{b} [/tex] is invariant under Lorentz transformations.


2. Homework Equations
[tex]\Lambda^{\mu}{}_{\nu} \equiv \left(
\begin{array}{cccc}
\gamma & -\beta \gamma & 0 & 0 \\
-\beta \gamma & \gamma & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{array} \right)
[/tex]
[tex]
b^0 = -b_0
[/tex]

3. The Attempt at a Solution

So the only way I could get it to be invariant was to perform a Lorentz transformation on the column of products [tex]a^{\mu}b_{\mu}[/tex], square each component, negate the first, and add them. I don't know why this works, however:

[tex]
\Lambda_{\nu}^{\mu}a^{\mu}b_{\mu} = \left(
\begin{array}{cccc}
\gamma & -\beta \gamma & 0 & 0 \\
-\beta \gamma & \gamma & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{array} \right) \left(
\begin{array}{c}
a^0b_0 \\
a^1b_1 \\
a^2b_2 \\
a^3b_3
\end{array} \right) = \left(
\begin{array}{c}
\gamma (a^0b_0 - \beta a^1b_1) \\
\gamma (a^1b_1 - \beta a^0b_0) \\
a^2b_2 \\
a^3b_3
\end{array} \right)
[/tex]
Squaring each component, negating the square of the first, and summing:
[tex]
\gamma^2(-(a^0b_0)^2-\beta^2 (a^1b_1)^2 + 2\beta a^1b_1a^0b_0 + (a^1b_1)^2+\beta^2 (a^0b_0)^2 - 2\beta a^1b_1a^0b_0) + (a^2b_2)^2 + (a^3b_3)^2
[/tex]
The cross terms cancel, and the others can be grouped so that they are multiplied by [tex]1-\beta^2[/tex], which cancels the [tex]\gamma^2[/tex], and you are left with the original expression again, albeit with each term squared.

Any help would be great!
 

malawi_glenn

[tex] \Lambda^{\mu}{}_{\nu}a^{\mu}b_{\mu} [/tex] is nonsense; be careful about how many times each index shows up!

And [tex] a^{\mu}b_{\mu} \equiv -a^0b^0 + \vec{a} \cdot \vec{b} [/tex] is not a vector as you have indicated.

The quantity is a scalar; there is no free index you can boost, hence it is invariant under a Lorentz transformation.
 
Thanks for the reply.

OK, so I guess it should have been [tex]a'^{\mu}b'_{\mu} = \Lambda^{\mu}_{\nu}a^{\nu}b_{\nu}[/tex]. I'm aware that the initial equation is a scalar; I was merely turning it into a vector so I could easily transform it. I then squared each component, negated the first, and summed them. I guess I don't really understand what you mean by boosting an index.
 

malawi_glenn

But now you are transforming a, not (ab)...

Turning it into a vector? How can you turn a scalar into a vector?

What you can do, I think, if you don't use the argument that it is a scalar with no free index for a LT to act on, is combine a transformed 'a' with a transformed 'b'.

I mean, you are asked to show that a Lorentz scalar is invariant under a Lorentz transformation; there is nothing to be done. Who gave you this 'silly' task?
 

malawi_glenn

What they did is

[tex] (dS')^2 = x^{\mu '} x_{\mu '} = x^{\mu }x_{\nu}\Lambda ^{\mu '}{}_\mu \Lambda ^{\nu }{}_{\mu '} = x^{\mu }x_{\nu}\delta ^\nu {}_\mu = x^{\mu } x_{\mu } = (dS)^2[/tex]

since we have the following identity:
[tex] \Lambda ^{\mu}{}_\sigma\Lambda ^{\sigma }{}_{\nu} = \delta ^\mu {}_\nu [/tex]

Now try the same with a and b.
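A quick numerical sanity check of that identity and of the invariance, using the boost matrix from the problem statement (just a sketch in Python; the helper names are made up for illustration):

[code]
import numpy as np

# Metric with signature (-,+,+,+), matching b^0 = -b_0 in the problem.
eta = np.diag([-1.0, 1.0, 1.0, 1.0])

def boost(beta):
    """Boost along x with speed beta, as given in the problem statement."""
    gamma = 1.0 / np.sqrt(1.0 - beta**2)
    L = np.eye(4)
    L[0, 0] = L[1, 1] = gamma
    L[0, 1] = L[1, 0] = -beta * gamma
    return L

beta = 0.6
L = boost(beta)

# The inverse Lorentz transformation is the same boost with beta -> -beta.
assert np.allclose(boost(-beta) @ L, np.eye(4))

# Invariance of the interval: x'^mu x'_mu = x^mu x_mu.
x = np.array([2.0, 1.0, -3.0, 0.5])   # arbitrary contravariant components
xp = L @ x
assert np.isclose(xp @ eta @ xp, x @ eta @ x)
print("identity and invariance both check out")
[/code]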
 

Ben Niehoff

The problem is asking you to transform each vector individually, and then show that the dot product is still the same.

The page you linked to shows the longest, most drawn-out way to do this problem. Try doing it in matrix notation. Remember that

[tex]a_{\mu}b^{\mu} = a^Tb[/tex]

and

[tex](AB)^T = B^TA^T[/tex]
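For what it's worth, the matrix route can be sketched like this, writing both vectors by their contravariant components and assuming the standard defining property [tex]\Lambda^T \eta \Lambda = \eta[/tex] of a Lorentz transformation (with the metric [tex]\eta = \mathrm{diag}(-1,1,1,1)[/tex] implied by the problem):

[tex]
a^{\mu}b_{\mu} = a^T \eta\, b, \qquad a' = \Lambda a, \quad b' = \Lambda b \;\Rightarrow\; a'^T \eta\, b' = a^T \Lambda^T \eta \Lambda\, b = a^T \eta\, b,
[/tex]

where the transpose rule [tex](AB)^T = B^T A^T[/tex] is what moves [tex]\Lambda^T[/tex] to the left.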
 
Thanks for all the help, but I'm still confused.

I don't see how I can assume that [tex]\Lambda^{\mu}{}_{\nu}\Lambda^{\sigma}{}_{\mu} = \delta^{\sigma}{}_{\nu}[/tex]. I could see how I could assume this if what I'm trying to prove were true, but I can't assume it while I'm trying to prove it. How am I supposed to transform a covariant 4-vector? If I transform the contravariant and covariant vectors with a Lorentz transformation and dot them, I don't get back the original equation. I did this out explicitly with matrix multiplication, since I'm not too comfortable with the notation just yet.
 

malawi_glenn

[tex]
\Lambda^{\mu}{}_{\nu}\Lambda^{\sigma}{}_{\mu} = \delta^{\sigma}{}_{\nu}
[/tex]

This should be covered in your course; otherwise it is not a good course in SR.

Consider a LT which takes x to x' followed by the inverse transformation: the LT which takes x' back to x.

First LT:

[tex] x^{\mu'} = \Lambda ^{\mu'} {}_\mu x^{\mu} = ( \mu \text{ is dummy index }) = \Lambda ^{\mu'} {}_\nu x^{\nu} [/tex]

The inverse LT:

[tex] x^{\mu} = \Lambda ^{\mu}{} _{\mu '} x^{\mu '} \Rightarrow [/tex]
[tex] x^{\mu} = \Lambda ^{\mu}{} _{\mu '} \Lambda ^{\mu'} {}_\nu x^{\nu} = \delta ^\mu {}_\nu x^\nu [/tex]


Now I was perhaps TOO nice

The covariant vector transforms as the inverse of a contravariant:
[tex] v_{\alpha '} = \Lambda ^{\alpha }{}_{\alpha '} v_{\alpha } [/tex]

Are you telling me that you are learning SR in a course where this has not been introduced? =/

All one needs to do is keep track of the indices: one up and one down contract, and so on.
 

Ben Niehoff

To do it algebraically, you need to remember that

[tex]\gamma = \frac{1}{\sqrt{1-\beta^2}}[/tex]

You will probably get expressions that look different, but will turn out to be equal using the above.

To transform a covariant 4-vector, you can always use the metric like so:

[tex]v_{\mu} = \eta_{\mu\nu} v^{\nu}[/tex]

[tex]v'_{\mu} = \eta_{\mu\nu} v'^{\nu} = \eta_{\mu\nu} \Lambda^{\nu}{}_{\sigma} v^{\sigma} = \eta_{\mu\nu} \Lambda^{\nu}{}_{\sigma} \eta^{\sigma\lambda} v_{\lambda}[/tex]

Then what remains is to find

[tex]\eta_{\mu\nu} \Lambda^{\nu}{}_{\sigma} \eta^{\sigma\lambda}[/tex]

which shouldn't be hard.
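A small symbolic check of what that product works out to, using the boost from the problem statement (a sketch with SymPy; the variable names are just for illustration):

[code]
import sympy as sp

beta = sp.symbols('beta', positive=True)
gamma = 1 / sp.sqrt(1 - beta**2)

# Boost from the problem statement and the metric with signature (-,+,+,+).
Lam = sp.Matrix([[gamma, -beta*gamma, 0, 0],
                 [-beta*gamma, gamma, 0, 0],
                 [0, 0, 1, 0],
                 [0, 0, 0, 1]])
eta = sp.diag(-1, 1, 1, 1)

# eta_{mu nu} Lambda^{nu}_{sigma} eta^{sigma lambda}, written as a matrix
# product (eta is its own inverse, so the raised-index eta is the same matrix).
result = sp.simplify(eta * Lam * eta)

print(result)                     # the same boost with beta -> -beta
print(sp.simplify(result * Lam))  # the identity matrix
[/code]

The product is the boost with [tex]\beta \to -\beta[/tex], i.e. the inverse transformation, which is what multiplies the covariant components.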
 
OK, I think I've got it! My professor didn't explain it quite like that; the way you put it makes much more sense.
So this is what I've got:

[tex]
a'^{\mu} = \Lambda^{\mu}{}_{\nu}a^{\nu} \text{ and } b'_{\mu} = \Lambda^{\mu}_{\nu}b_{\nu}
[/tex]
[tex]
a'^{\mu}b'_{\mu} = \Lambda^{\mu}{}_{\nu}\Lambda^{\nu}_{\sigma}a^{\nu}b_{\sigma} = \delta^{\mu}{}_{\sigma}a^{\nu}b_{\sigma} = a^{\mu}b_{\mu}
[/tex]
One question: the second part of the question asks to show how the first part implies that the covariant transforms as the inverse. So it seems I'm assuming this for the first part. Is there any way I could avoid this?
 

malawi_glenn

OK, I'll give you a hint:

Start with
[tex] a^\mu b_\mu = a^{\mu '} b_{\mu '}[/tex] since that is what you want to prove.

Now you know that [tex]
\Lambda ^{\mu}{}_{\mu '}\Lambda ^{\mu ' }{}_{\nu} = \delta ^\mu {}_\nu
[/tex]

So just consider the contravariant vector a, insert this delta, and then you'll find that ab is invariant only if the covariant vector b transforms according to the inverse LT.
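Spelled out, the insertion goes roughly like this (a sketch, using the same primed-index conventions as above):

[tex]
a^{\mu}b_{\mu} = \delta ^{\mu}{}_{\nu}\, a^{\nu} b_{\mu} = \Lambda ^{\mu}{}_{\mu '}\Lambda ^{\mu '}{}_{\nu}\, a^{\nu} b_{\mu} = \left( \Lambda ^{\mu '}{}_{\nu} a^{\nu} \right)\left( \Lambda ^{\mu}{}_{\mu '} b_{\mu} \right) = a^{\mu '}\left( \Lambda ^{\mu}{}_{\mu '} b_{\mu} \right),
[/tex]

so demanding [tex]a^{\mu}b_{\mu} = a^{\mu '}b_{\mu '}[/tex] for arbitrary a forces [tex]b_{\mu '} = \Lambda ^{\mu}{}_{\mu '} b_{\mu}[/tex], i.e. the covariant vector transforms with the inverse LT.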
 

malawi_glenn

[tex]
a'^{\mu}b'_{\mu} = \Lambda^{\mu}{}_{\nu}\Lambda^{\nu}_{\sigma}a^{\nu}b_{\sigma} = \delta^{\mu}{}_{\sigma}a^{\nu}b_{\sigma} = a^{\mu}b_{\mu}
[/tex]
One question: the second part of the question asks to show how the first part implies that the covariant transforms as the inverse. So it seems I'm assuming this for the first part. Is there any way I could avoid this?

That is wrong: ν appears three times, which is complete nonsense, and σ also appears twice in the same position... be careful!
 

malawi_glenn

You will probably love this text:

http://www.teorfys.uu.se/people/minahan/Courses/SR/tensors.pdf
 
Thanks for that last link; it has cleared up a whole mess of confusion. I think I finally have it this time:
[tex]
a^{\mu '} = \Lambda^{\mu '}{}_{\mu}a^{\mu}
[/tex]
[tex]
b_{\mu '} = \Lambda^{\mu}{}_{\mu '}b_{\mu}
[/tex]
Since b is covariant, it transforms with the inverse matrix.

Then:
[tex]
a^{\mu '}b_{\mu '} = \Lambda^{\mu '}{}_{\mu}a^{\mu}\Lambda^{\nu}{}_{\mu '}b_{\nu} = \delta^{\nu}{}_{\mu}a^{\mu}b_{\nu} = a^{\mu}b_{\mu}
[/tex]
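As a final sanity check, the whole statement can be verified numerically, transforming the covariant components of b with the inverse boost as above (a sketch in Python; the variable names are made up for illustration):

[code]
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])  # signature (-,+,+,+)

beta = 0.8
gamma = 1.0 / np.sqrt(1.0 - beta**2)
Lam = np.array([[gamma, -beta*gamma, 0, 0],
                [-beta*gamma, gamma, 0, 0],
                [0, 0, 1, 0],
                [0, 0, 0, 1]])
Lam_inv = np.linalg.inv(Lam)          # equals the boost with beta -> -beta

rng = np.random.default_rng(0)
a_up = rng.normal(size=4)             # contravariant components a^mu
b_up = rng.normal(size=4)             # contravariant components b^mu
b_down = eta @ b_up                   # covariant components b_mu

# a'^mu = Lambda^mu_nu a^nu ;  b'_mu = (Lambda^{-1})^nu_mu b_nu
a_up_prime = Lam @ a_up
b_down_prime = Lam_inv.T @ b_down

# The contraction a^mu b_mu is unchanged, and equals -a^0 b^0 + a.b
assert np.isclose(a_up_prime @ b_down_prime, a_up @ b_down)
assert np.isclose(a_up @ b_down, -a_up[0]*b_up[0] + a_up[1:] @ b_up[1:])
print("a^mu b_mu is invariant:", a_up @ b_down)
[/code]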

Thanks so much for your help.
 

malawi_glenn

Now it looks much better! Good job! I hope you understood that you can also start from the delta and then show that the quantity ab is invariant under a LT only if the covariant b transforms as the inverse LT.
 
