zacl79
A 'cause' occurs at point 1 (x1, t1) and its 'effect' occurs at point 2 (x2, t2), as measured by observer O. Use the Lorentz transformation to find t'2 - t'1 as measured by O', and show that t'2 - t'1 >= 0; that is, observer O' can never see the effect before the cause.
I know that it is possible to prove this, but I'm having some difficulty doing so.
I use:
t'1 = gamma(t1 - u x1/c^2) => this goes to zero if x1 and t1 are 0?
t'2 = gamma(t2 - u x2/c^2)
Working through this I get:
t'2 - t'1 = gamma(t2 - u x2/c^2)
Now I don't think that this is the correct proof that I require.
Any help with where I have gone wrong, or with anything I am overlooking, would be greatly appreciated.
Thanks
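
Below is a minimal LaTeX sketch of how the subtraction can be carried through without setting x1 and t1 to zero. It assumes (as the word 'cause' implies, though the problem does not state it explicitly) that t2 >= t1 in frame O, that the influence travels no faster than light, i.e. |x2 - x1| <= c(t2 - t1), and that the relative frame speed satisfies |u| < c.

% Minimal sketch: difference of the transformed times, then a causality bound.
% Assumptions (not explicit in the problem): t2 >= t1 in O, |x2 - x1| <= c(t2 - t1), |u| < c.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Subtracting the two transformed times,
\begin{align}
  t'_2 - t'_1
    &= \gamma\!\left(t_2 - \frac{u x_2}{c^2}\right)
     - \gamma\!\left(t_1 - \frac{u x_1}{c^2}\right) \\
    &= \gamma\left[(t_2 - t_1) - \frac{u\,(x_2 - x_1)}{c^2}\right].
\end{align}
Causality in frame $O$ gives $t_2 - t_1 \ge 0$ and $|x_2 - x_1| \le c\,(t_2 - t_1)$,
and the frames move at relative speed $|u| < c$, so
\begin{equation}
  \frac{u\,(x_2 - x_1)}{c^2}
    \le \frac{|u|\,|x_2 - x_1|}{c^2}
    \le \frac{|u|}{c}\,(t_2 - t_1)
    \le t_2 - t_1,
\end{equation}
which makes the bracket non-negative and hence $t'_2 - t'_1 \ge 0$.
\end{document}

The step the attempt above skips is subtracting the two full expressions first; setting x1 = t1 = 0 only covers the special case where the cause sits at the origin.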