keeperofthekeys

In 1971, four portable atomic clocks were flown around the world in jet aircraft, two eastbound and two westbound, to test the time dilation predictions of relativity.

*a)* If the westbound plane flew at an average speed of 1500 km/h relative to the surface, how long would it have to fly for the clock on board to lose 1 s relative to the reference clock on the ground?

*b)* In the actual experiment the plane flew around the Earth once, and the observed discrepancy between the clocks was 273 ns. What was the plane's average speed?

For the first part, I converted 1500 km/h to 416.6 m/s, then put it in terms of c, getting v = 1.389x10^-6 c. I then used the standard time dilation equation t = t'/sqrt(1 - (v/c)^2) with v/c = 1.389x10^-6.

I then set t - t' = 1 s, solved for t' = t - 1, and substituted that into the equation.

Solving, I got 1x10^12 s, or 31,688 years. Is this a reasonable answer?
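
In case it helps, here is a quick Python sanity check of my part *a* arithmetic (it only follows my own setup as described above, with the rounded c = 3x10^8 m/s, so it is not claiming to be the intended solution):

```python
import math

c = 3.0e8                    # rounded speed of light, m/s (as used above)
v = 1500 / 3.6               # 1500 km/h in m/s (~416.7 m/s)
beta = v / c                 # ~1.389e-6

# Time dilation: t' = t*sqrt(1 - beta^2), and I want t - t' = 1 s,
# so t = 1 / (1 - sqrt(1 - beta^2)).  Rewriting the denominator as
# beta^2 / (1 + sqrt(1 - beta^2)) avoids the precision loss in the
# subtraction of two nearly equal numbers.
denom = beta**2 / (1.0 + math.sqrt(1.0 - beta**2))
t = 1.0 / denom

year = 365.25 * 24 * 3600    # seconds per year
print(f"t = {t:.3e} s  (~{t / year:,.0f} years)")
```

This prints a value close to 1x10^12 s, i.e. a few tens of thousands of years, so at least the order of magnitude of my answer is consistent with my own setup.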

I think my method must be wrong, because I followed similar steps for part *b*. After eliminating t' from the equation as I did in part *a*, I set t = 40075160 m / v, where 40075160 m is the circumference of the Earth. My equation was as follows:

(1/sqrt(1 - v^2/c^2)) * (40075160/v - 273x10^-9 s) = 40075160/v

I solved for v, but got 1226.2 m/s, which is a speed I don't believe any aircraft has held in sustained flight even today, let alone in 1971. Where am I going wrong?
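
For completeness, here is the same kind of numerical check for my part *b* equation (again it just solves the equation I wrote above by bisection, so it only reproduces my suspicious ~1226 m/s; it doesn't tell me whether the equation itself is the right one to use):

```python
import math

c = 3.0e8                # rounded speed of light, m/s (as used above)
L = 40075160.0           # circumference of the Earth, m
dt = 273e-9              # observed clock discrepancy, s

# My part (b) equation: (1/sqrt(1 - v^2/c^2)) * (L/v - dt) = L/v
# f(v) is the left side minus the right side; bisect for the root.
def f(v):
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    return gamma * (L / v - dt) - L / v

lo, hi = 1.0, 10000.0    # f(lo) < 0 and f(hi) > 0, so the root is bracketed
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if f(lo) * f(mid) <= 0:
        hi = mid
    else:
        lo = mid

v_root = 0.5 * (lo + hi)
print(f"v = {v_root:.1f} m/s  (~{v_root * 3.6:.0f} km/h)")
```

It comes out around 1226 m/s (roughly 4400 km/h), matching my algebra, which is why I suspect the setup of the equation rather than the arithmetic.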