- #1

DanMP


- TL;DR Summary
- In this thread I want to learn and discuss how time dilation in a planet-moon system is calculated, and how the prediction can be tested accurately.

Wikipedia describes time dilation as follows:

Time dilation is the difference in elapsed time as measured by two clocks, either due to a relative velocity between them (special relativity) or due to a difference in gravitational potential between their locations (general relativity).
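For reference, in the weak-field, slow-motion limit both contributions combine into a single expression for the rate of a clock's proper time $\tau$ against coordinate time $t$ (a standard approximation; the symbols $\Phi$ and $v$ are introduced here, not taken from the quote):

```latex
\frac{d\tau}{dt} \approx 1 + \frac{\Phi}{c^{2}} - \frac{v^{2}}{2c^{2}}
```

where $\Phi \le 0$ is the Newtonian gravitational potential at the clock's location and $v$ is its coordinate speed: a higher (less negative) potential makes the clock run faster, while motion makes it run slower.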

As far as I know, in a planet-moon system the difference in elapsed time between a clock on the planet and a clock on its moon is calculated from GR proper time, so it is not obvious how, or even whether, the moon's orbital velocity around the planet influences the total time difference between the two clocks. Due to the difference in gravitational potential, the clock on the moon should run faster than the one on the planet, but the difference in velocity may slightly reduce the difference in elapsed time. In this thread I want to learn the size of this reduction, if any. One way to find it is to calculate the total difference between the clocks in a situation where the moon is not orbiting (it is hovering at a fixed distance) and compare that result with the normal result, with the moon orbiting the planet. To simplify the calculation, we may take the moon's orbit to be circular and place each clock at one of the poles. To have some numerical values we can use the Earth-Moon system, but that is not essential, because there is nothing special about this particular case.
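As a rough check of the sizes involved, here is a sketch of the weak-field estimate for the Earth-Moon comparison described above. The constants, the circular-orbit speed taken as $\sqrt{GM_\oplus/d}$, and the neglect of the Sun's potential, tidal terms, and barycenter motion are my simplifying assumptions, not values from the thread:

```python
# Weak-field estimate of the clock-rate difference between a clock at a lunar
# pole and a clock at an Earth pole (surface velocity is zero at a pole).

C = 299_792_458.0          # speed of light, m/s
GM_EARTH = 3.986004e14     # Earth's GM, m^3/s^2
GM_MOON = 4.9028e12        # Moon's GM, m^3/s^2
R_EARTH = 6.3568e6         # Earth's polar radius, m
R_MOON = 1.7371e6          # Moon's polar radius, m
D = 3.844e8                # mean Earth-Moon distance, m

def rate_offset(phi, v):
    """Fractional clock rate relative to coordinate time, d(tau)/dt - 1,
    in the weak-field, slow-motion limit: phi/c^2 - v^2/(2 c^2)."""
    return phi / C**2 - v**2 / (2 * C**2)

# Clock on Earth's pole: Earth's own potential plus the Moon's potential
# at Earth's location; no velocity term at the pole.
phi_earth = -GM_EARTH / R_EARTH - GM_MOON / D
earth_rate = rate_offset(phi_earth, 0.0)

# Clock on the Moon's pole: Moon's own potential plus Earth's potential at
# the Moon's distance; the pole shares the Moon's orbital velocity.
v_orbit = (GM_EARTH / D) ** 0.5         # circular-orbit speed, ~1.02 km/s
phi_moon = -GM_MOON / R_MOON - GM_EARTH / D
moon_rate_orbiting = rate_offset(phi_moon, v_orbit)
moon_rate_hovering = rate_offset(phi_moon, 0.0)   # "hovering" moon, v = 0

SECONDS_PER_DAY = 86400.0
gain_orbiting = (moon_rate_orbiting - earth_rate) * SECONDS_PER_DAY
gain_hovering = (moon_rate_hovering - earth_rate) * SECONDS_PER_DAY

print(f"Moon clock gains {gain_orbiting * 1e6:.1f} us/day (orbiting)")
print(f"Moon clock gains {gain_hovering * 1e6:.1f} us/day (hovering)")
print(f"Velocity term reduces the gain by "
      f"{(gain_hovering - gain_orbiting) * 1e6:.2f} us/day")
```

With these numbers the lunar clock comes out roughly 56 microseconds per day ahead of the Earth clock, and the orbital-velocity term trims that gain by only about half a microsecond per day, i.e. by less than one percent, which gives a feel for the size of the reduction the question asks about.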

If we want to test such a prediction experimentally, we could carry a deep space atomic clock on a crewed mission to the Moon and back, a space-based Hafele-Keating experiment. Or we could compare the clocks remotely, as is done for GPS clocks. In any case, the experimental part was briefly discussed in another thread, so in this thread I want to learn what exactly the theory of relativity predicts.