I have a simple question. If we put a clock in a satellite in orbit for some amount of time, and then brought it back to Earth, would the clock's reading of elapsed time be less than that of a clock that stayed on Earth? (This is strictly ignoring any general relativistic time dilation!)
In other words, I know that the clock on the moving satellite appears to tick less frequently due to special relativistic effects when viewed from Earth, but is that clock actually ticking more slowly? That is, will less time have passed for the clock that was in orbit (after correcting for GR time dilation) once it is brought back to Earth?
[What I am really asking, in terms of the theory: from the viewpoint of the satellite, it would appear that the clock on Earth is the one in motion. In that respect, an observer on the satellite would expect less time to have passed on the Earth clock upon returning. Both cannot be true! So, it must be merely a perceived effect, with neither clock actually ticking more slowly than the other (due to relative velocity).]
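For context on the size of the effect being asked about, here is a back-of-the-envelope sketch of the special-relativistic deficit alone, assuming an ISS-like orbital speed of roughly 7.7 km/s (that speed is my assumption, not part of the question) and ignoring GR as the question stipulates:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def sr_deficit(v: float, coordinate_time: float) -> float:
    """Proper time (s) a clock moving at constant speed v falls behind
    a stationary clock over the given coordinate time, SR only."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return coordinate_time * (1.0 - 1.0 / gamma)

# Assumed low-Earth-orbit speed ~7660 m/s, over one day (86400 s):
deficit = sr_deficit(7660.0, 86400.0)
print(f"SR deficit: {deficit * 1e6:.1f} microseconds per day")
```

This gives on the order of tens of microseconds per day, which is why the effect is measurable with atomic clocks even though it is imperceptible to us.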