Stoonroon

Consider an experiment like that of Vessot-Levine, but suppose the clock is simply dropped from rest at "apogee" over a large non-rotating planet with no atmosphere, and that the time on the clock is set to zero at the moment of release. At rigidly built observing stations alongside the path, and at the surface, the elapsed time on the falling clock is recorded. The clock's frequency is affected both by its speed and by its location in the gravitational field. (The frequency of the Vessot-Levine clock was monitored from a distance using a clever first-order Doppler-canceling system, but they did not measure elapsed proper time.)

My question is: how would these idealized proper-time observations differ from the predictions of Newton's theory, in which all clocks always tick at the same rate? Or do the combined effects of the acquired falling speed and spacetime curvature (i.e., the difference between proper and coordinate distance) "conspire" to make the Newtonian equation match the general relativistic prediction?
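To make the comparison concrete, here is a numerical sketch of what I have in mind (all numbers are mine, chosen for illustration, in geometrized units G = c = 1). It uses the standard result that radial free fall from rest at radius R in Schwarzschild geometry obeys (dr/dτ)² = 2M(1/r − 1/R), which is term-for-term the Newtonian energy equation with coordinate time replaced by proper time τ, while Schwarzschild coordinate time t satisfies dt/dτ = E/(1 − 2M/r) with E = √(1 − 2M/R):

```python
import math

# Illustrative numbers (my choice), geometrized units G = c = 1.
M = 1.0            # planet mass
R = 100.0 * M      # release radius ("apogee"), clock starts at rest
r_obs = 10.0 * M   # Schwarzschild r of an observing station on the path

E = math.sqrt(1.0 - 2.0 * M / R)  # conserved energy per unit mass

def integrate(n=100_000):
    """Integrate proper time tau and coordinate time t from R down to r_obs.

    Substituting u = sqrt(R - r) turns dtau = dr / sqrt(2M(1/r - 1/R))
    into the smooth integrand dtau = 2 sqrt(r R / (2M)) du, so a simple
    midpoint rule converges quickly despite the turning point at r = R.
    """
    tau = t = 0.0
    u_max = math.sqrt(R - r_obs)
    du = u_max / n
    for i in range(n):
        u = (i + 0.5) * du
        r = R - u * u
        dtau = 2.0 * math.sqrt(r * R / (2.0 * M)) * du
        tau += dtau
        t += E * dtau / (1.0 - 2.0 * M / r)   # dt = E dtau / (1 - 2M/r)
    return tau, t

tau, t = integrate()

# Newtonian fall time in closed form (cycloid parametrization).
eta = math.acos(2.0 * r_obs / R - 1.0)
t_newton = math.sqrt(R**3 / (8.0 * M)) * (eta + math.sin(eta))

print(f"proper time on falling clock: {tau:.2f}")
print(f"Newtonian elapsed time:       {t_newton:.2f}")
print(f"Schwarzschild coordinate t:   {t:.2f}")
```

As expected from the geodesic equation above, the proper time read off the falling clock at a station of Schwarzschild radius r matches the Newtonian elapsed time at that r, while the coordinate time t comes out larger; the remaining question is how the station clocks, which tick at the rate set by their own positions, compare with both.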