Homework Help: Special relativity, delay of a clock in a plane

1. Aug 15, 2010

fluidistic

1. The problem statement, all variables and given/known data
A plane is moving at 600 m/s with respect to the ground. According to clocks on the ground, how much time must elapse for the plane's clock to fall behind by 2 microseconds?

2. Relevant equations
Lorentz transformations.

3. The attempt at a solution
Let O be a reference frame on the ground and O' be a reference frame on the plane.
v=600 m/s. If I'm not mistaken, they're asking for $$t_B-t_A$$ such that $$(t_B-t_A)-(t_B'-t_A')=2 \times 10 ^{-6}s$$. (*)
What I've done so far is $$t_B'-t_A'=\gamma \left [ t_B-t_A +\frac{v}{c^2}(x_A-x_B) \right ]$$, replacing $$x_A-x_B$$ by $$v(t_A-t_B)$$, then solving for $$t_B-t_A$$ in (*). I find that it comes out to exactly $$1000000s$$, or 11 days, 13 hours, 46 minutes and 40 s. That seems too big to me. Do you get a different answer?
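As a quick numerical sanity check, the same arithmetic can be sketched in Python (variable names are mine; the approximation step uses the standard low-speed expansion $$1-1/\gamma \approx v^2/2c^2$$):

```python
import math

v = 600.0          # plane speed relative to the ground, m/s
c = 299792458.0    # speed of light, m/s (exact SI value)
delay = 2e-6       # required lag of the plane's clock, s

# Exact: the ground-frame time T satisfies T - T/gamma = delay
gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
T_exact = delay / (1.0 - 1.0 / gamma)

# Low-speed approximation: 1 - 1/gamma ~ v^2 / (2 c^2)
T_approx = delay * 2.0 * c ** 2 / v ** 2

print(T_exact, T_approx)   # both on the order of 1e6 s
```

Both versions land near $$10^6$$ seconds, matching the hand calculation above.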

2. Aug 16, 2010

collinsmark

3. Aug 16, 2010

fluidistic

Oh ok. Thanks a lot for the confirmation.

4. Aug 16, 2010

collinsmark

Be careful with your precision, though. The speed of light isn't exactly 3.000000 × 10^8 m/s, so I don't think you should quote the time down to the very second. But yes, something around 1.0 × 10^6 seconds is the answer I got, is what I meant.
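The precision point can be made concrete: redoing the same approximate calculation with the rounded value of c versus the exact SI value shifts the answer by roughly 0.1% (about 1400 s), so quoting the result to the second isn't justified. A minimal sketch:

```python
delay = 2e-6   # desired lag of the plane's clock, s
v = 600.0      # plane speed, m/s

# Low-speed approximation: T ~ delay * 2 c^2 / v^2,
# once with the rounded and once with the exact speed of light.
T_rounded = delay * 2.0 * (3.0e8) ** 2 / v ** 2
T_exact_c = delay * 2.0 * 299792458.0 ** 2 / v ** 2

print(T_rounded)   # exactly 1.0e6 s with c = 3e8
print(T_exact_c)   # a bit less, about 9.986e5 s
```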