Not sure if this should be in the homework section or not, but in any case...

I'm having difficulty understanding the outputs from the Lorentz transform.

Example problem:

The Earth and Sun are 8.3 light-minutes apart. Ignore their relative motion for this problem and assume they live in a single inertial frame, the Earth-Sun frame. Events A and B occur at t = 0 on the Earth and at t = 2 minutes on the Sun, respectively. Find the time difference between the events according to an observer moving at u = 0.8c from Earth to Sun. Repeat for an observer moving in the opposite direction at u = 0.8c.

So plugging in Δx = 498 light-seconds, u = 0.8c, and Δt = 120 s, the answer Δt(observer) = −464 s, or about −7.7 minutes, pops out, so I've got the math right. But I don't have an intuitive sense of what that "answer" means. Is it the amount of time shown on a clock on board the ship since the ship launched at t = 0, or...?
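For reference, here's the sanity check I did of the arithmetic (just my own quick script, working in units where c = 1, so distances are in light-seconds and times in seconds):

```python
import math

# Work in units where c = 1: distance in light-seconds, time in seconds.
dx = 498.0   # Earth-Sun separation: 8.3 light-minutes = 498 light-seconds
dt = 120.0   # event B happens 2 minutes after event A
u = 0.8      # observer speed as a fraction of c

gamma = 1.0 / math.sqrt(1.0 - u**2)

# Lorentz transformation of the time interval: dt' = gamma * (dt - u*dx/c^2)
dt_prime = gamma * (dt - u * dx)

print(gamma)     # about 1.667
print(dt_prime)  # -464.0, i.e. about -7.7 minutes
```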

I broke the equation down into the gamma factor and the... other factor (not sure what to call it), and they are 1.6667 and −278.4 s respectively. The −278.4 s seems to represent the meeting point between the ship and the light sphere from the event in the stationary frame, but I'm not positive. I can accept the 1.6667 being the stretch/skew of the axes of the worldline from t to t', assuming that is correct as well.
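Breaking it into those two factors in code (and also doing the reversed-direction case the problem asks for, which I assume just means flipping the sign of u):

```python
import math

dx, dt, u = 498.0, 120.0, 0.8   # light-seconds, seconds, fraction of c

gamma = 1.0 / math.sqrt(1.0 - u**2)   # ~1.6667, the stretch factor
bracket = dt - u * dx                 # -278.4 s, the "other factor"
print(gamma * bracket)                # -464.0 s overall, as before

# Observer moving Sun-to-Earth instead: flip the sign of u.
bracket_rev = dt + u * dx             # 518.4 s
print(gamma * bracket_rev)            # 864.0 s, i.e. 14.4 minutes
```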

Any help is appreciated.

Tom.