- #1
psmitty
Suppose we have two clocks on the surface of the Earth: one stationary, and one moved very slowly along the surface in the direction of Earth's rotation. Once the moved clock completes a full circle and stops where it started, it should be running ahead of the stationary clock (if I've got this right).
In an inertial frame the two clocks would remain synchronized in the limit of low relative speed, but on Earth, which is a non-inertial frame, a transported clock goes out of sync no matter how slowly you move it.
Also, if you move it halfway around in one direction and then bring it back, it returns to sync with the stationary clock.
Should this be considered time dilation, or some sort of change in simultaneity? I mean, for a slowly moving clock, this time difference seems to be a function of Earth's rotation speed and the clock's traveled distance along the surface.
So how do I get this function for time difference?
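For scale, here is my rough numerical attempt. I'm assuming (this is my guess, not a derivation) that for slow eastward transport along the equator the discrepancy grows linearly with distance as Δt = ωRd/c², where ω is Earth's angular velocity, R its equatorial radius, and d the distance traveled along the surface:

```python
import math

# Assumed formula: for a clock transported slowly eastward along the equator,
# the accumulated time difference relative to a stationary surface clock is
# dt = omega * R * d / c^2 (my assumption, to be checked).

omega = 2 * math.pi / 86164.0   # Earth's sidereal angular velocity, rad/s
R = 6.378e6                     # Earth's equatorial radius, m
c = 2.998e8                     # speed of light, m/s

def transport_discrepancy(d):
    """Time difference in seconds after transporting the clock a distance d
    (meters) eastward along the equator, using the assumed linear formula."""
    return omega * R * d / c**2

# One full trip around the equator:
full_circle = 2 * math.pi * R
print(transport_discrepancy(full_circle))  # on the order of 2e-7 s (~200 ns)
```

If that formula is right, the effect after a full circuit is a couple of hundred nanoseconds, which seems consistent with it depending only on the rotation rate and the distance traveled, not on how slowly the clock is carried.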