Markus Kahn
- Homework Statement
- The Global Positioning System (GPS) consists of at least 24 satellites that are orbiting the Earth at
a distance of ##h = 26'600## km from its center with a velocity of ##v\approx 3.9## km/s. All the satellites carry atomic clocks which are synchronized such that they all show the same (GPS) time. At certain time intervals, the satellites simultaneously emit a signal which carries their orbital data and the time ##t_e## when the signal was emitted.
Due to relativistic effects, the satellite clocks will run at a different rate than clocks on Earth. Assuming ##v^2, |\phi| \ll c^2##, we have ##g_{00}=1+\frac{2 \phi}{c^{2}},\ g_{i j}=-\delta_{i j}## up to ##\mathcal{O}(\frac{1}{c^{3}})## in the Newtonian limit. Thinking as an observer far away from Earth (neglecting Earth's motion), compute the relation between infinitesimal elements ##d\tau_E## of Earth coordinate time and ##d\tau_S## of satellite coordinate time. By expanding up to ##\mathcal{O}(\frac{1}{c^{3}})##, find by how much a satellite clock runs faster or slower than a clock on Earth. What is the absolute error after one day? Compare the SR and GR effects.
- Relevant Equations
- Nothing given.
I'm a bit lost on how exactly to start this exercise... As far as I understand, we first need to determine ##d\tau_E## and ##d\tau_S##.
First question: Since we can neglect the Earth's movement, can I also neglect the movement of the satellite with respect to the far-away observer? If so, I don't really get this exercise, since then the signal would not be delayed, just redshifted by the gravitational field.
If I can't neglect the movement of the satellite, I still think the following should hold:
$$d\tau_E = \sqrt{g_{00}(\vec{r}_E)}dt_E\quad \text{and}\quad d\tau_S = \sqrt{g_{00}(\vec{r}_S)}dt_S,$$
which would express the proper times in terms of the coordinate times in the rest frames of the Earth and the satellite. The issue is that I don't really know where to go from here. Is this even the right idea, or do I need to start somewhere completely different?
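Just to have something to compare my eventual analytic answer against, I put together a quick numerical sketch of the two effects I expect to show up (SR time dilation from the orbital speed, and the GR rate difference from the gravitational potential). This is only a sanity check: it assumes standard values for ##GM## and the Earth's radius, which the problem doesn't give, and it neglects the rotation speed of a clock on the Earth's surface.

```python
# Order-of-magnitude sanity check (my own sketch, not part of the problem).
# GM and R_E are standard values I assumed; the problem only gives r_S and v_S.
GM = 3.986e14   # m^3/s^2, Earth's gravitational parameter (assumed)
R_E = 6.371e6   # m, Earth's radius (assumed)
c = 2.998e8     # m/s, speed of light
r_S = 2.66e7    # m, satellite orbital radius (given: 26'600 km)
v_S = 3.9e3     # m/s, satellite orbital speed (given)

# Newtonian potential phi = -GM/r at the Earth's surface and at the satellite
phi_E = -GM / R_E
phi_S = -GM / r_S

# GR effect: the satellite clock ticks faster because it sits higher
# in the potential (phi_S > phi_E)
gr_rate = (phi_S - phi_E) / c**2     # fractional rate difference, > 0

# SR effect: the satellite clock ticks slower because it moves
sr_rate = -v_S**2 / (2 * c**2)       # fractional rate difference, < 0

day = 86400.0  # seconds per day
print(f"GR:    {gr_rate * day * 1e6:+.1f} microseconds/day")
print(f"SR:    {sr_rate * day * 1e6:+.1f} microseconds/day")
print(f"total: {(gr_rate + sr_rate) * day * 1e6:+.1f} microseconds/day")
```

This gives roughly ##+46\ \mu##s/day from GR and ##-7\ \mu##s/day from SR, i.e. a net drift of about ##+38\ \mu##s/day, which matches the figure usually quoted for GPS. So if my ##d\tau## relation above is on the right track, the expansion should land near these numbers.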