I was watching QI (http://www.youtube.com/watch?v=gU7McFm_cKQ&t=7m09s), and Prof. Brian Cox said that time runs roughly 38000 ns per day faster on GPS satellites than on the ground. From that he concluded that, since light travels roughly 1 foot per nanosecond, GPS would accumulate a positional error of 38000 feet per day if relativistic effects weren't compensated. The same conclusion is drawn here: http://www.astronomy.ohio-state.edu/~pogge/Ast162/Unit5/gps.html

This kind of reasoning seems incorrect to me, because the GPS satellites are all located in a very similar gravitational field, and the receiver's ground clock is constantly reset to follow the more accurate time signals from the satellites (http://electronics.howstuffworks.com/gadgets/travel/gps3.htm), so the absolute time on the ground would not matter. As a result, the position error of GPS should be much lower than 38000 feet per day. What do you think?
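As a side note, the "38000 feet per day" figure quoted in the video is internally consistent arithmetic, whatever one thinks of the interpretation. A quick sketch, using the commonly cited ~38 microseconds/day net relativistic offset:

```python
# Rough check of the figure quoted in the video, assuming the commonly
# cited net relativistic clock offset of ~38 microseconds per day.
C = 299_792_458.0          # speed of light, m/s
FT_PER_M = 1 / 0.3048      # feet per metre

drift_per_day_s = 38e-6                 # clock offset accumulated per day, s
error_m = C * drift_per_day_s           # light-travel distance of that drift
error_ft = error_m * FT_PER_M

print(f"{error_m:.0f} m per day ~= {error_ft:.0f} ft per day")
# roughly 11 km, i.e. ~37000 ft: consistent with the "38000 feet" quote
```

Whether that clock drift would actually appear as a position error is exactly what this thread disputes.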
I think I don't understand your point. As described in the articles, the relativistic effects are compensated, so the error is indeed much less. Are you trying to claim that the compensation is not necessary?
Yes, pretty much. Actually, I was trying to say that the estimate of a 38000 feet per day position error drift, if the relativistic effects weren't compensated, is wrong. The error would never accumulate like that over time.
But is this reasoning justified?
Let's try to be more clear. There's an oscillator of some sort (cesium, rubidium, it varies) on the satellite clocks. N cycles of this frequency are considered to be a second. The value of N on the satellite clocks has deliberately been set to an incorrect value, or rather a value that would not be correct on a ground clock, to keep the space-clocks synchronized with accurate ground clocks. If you did not compensate the oscillators, the primary effect would be that the space-clocks would not stay synchronized with their ground counterparts. While the usual GPS receivers do not use accurate ground clocks, such accurate atomic clocks (of the same sort that are on the spacecraft) do exist. It would be possible to find a software work-around for the fact that the space-clocks were not keeping sync with the ground clocks, but the difference in synchronization would be obvious and significant, and of the magnitude quoted in the literature.
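For the curious, this deliberate "mis-set" can be made concrete. A sketch using the numbers from Ashby's GPS review (nominal clock frequency 10.23 MHz, fractional offset about -4.4647e-10):

```python
# Sketch of the deliberate offset applied to the satellite oscillators,
# using the fractional frequency offset quoted in Ashby's GPS review.
F_NOMINAL_HZ = 10.23e6                 # nominal GPS clock frequency
FRACTIONAL_OFFSET = -4.4647e-10        # (f_satellite - f_nominal) / f_nominal

f_actual = F_NOMINAL_HZ * (1 + FRACTIONAL_OFFSET)
print(f"oscillator set to {f_actual:.5f} Hz")   # ~10229999.99543 Hz

# Without the offset, the accumulated time error per day would be:
SECONDS_PER_DAY = 86_400
drift_s = abs(FRACTIONAL_OFFSET) * SECONDS_PER_DAY
print(f"uncompensated drift: {drift_s * 1e6:.1f} us/day")   # ~38.6 us/day
```

So the satellites literally broadcast at 10.22999999543 MHz so that, as seen from the ground, they tick at 10.23 MHz.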
And in case anyone is interested in the gory detail of this, one can read this paper: http://www.emis.ams.org/journals/LRG/Articles/lrr-2003-1/download/lrr-2003-1BW.pdf Zz.
I think what the OP means is that a GPS receiver constantly syncs its clock with the clocks on the satellites anyway, so the receiver's clock is accurate enough for any kind of useful positioning. As I understand it, he is asking: given the way a standard GPS receiver works, shouldn't there be no build-up of positioning error even if the relativistic effects aren't accounted for? If the error isn't accounted for, it has very little time to rack up, and since the average clock in a GPS receiver isn't very accurate anyway, would the error caused by relativity be of any significance between synchronizations?
As I understand it, the clock in the receiver doesn't need to be that good, as it is basically only comparing the relative times / phases of the signals received from all the satellites it can see. It's the relative times that give the position.
Thanks. That is precisely what I meant. It seems that the huge position error accumulation estimate of 38000 feet per day is nonsense, because it is based on the assumption that the receiver uses absolute local time, so that the error accumulates. In reality the receiver resets its clock very frequently using the signal from one of the satellites. So the errors would be caused by desynchronization between the satellite clocks, rather than desynchronization between satellite time and ground time. Because the satellites are all located at a similar distance from the center of the Earth, and thus in a similar gravitational field, and are moving with similar velocities, the relativistic effects would also be similar on all satellites. So relativistic effects wouldn't cause the satellites to desync from each other, and that is all that counts in the receiver's position calculation.
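A toy numerical check of this cancellation argument (a sketch with made-up satellite geometry, not real ephemerides): if every satellite clock carries the same offset, and the receiver solves for its own clock bias as a fourth unknown, the recovered position is unchanged and the common offset is absorbed into the bias term.

```python
# Toy demo: a common, identical offset on all satellite clocks is absorbed
# by the receiver's solved-for clock bias. Geometry is invented.
import numpy as np

C = 299_792_458.0  # m/s

sats = np.array([              # four made-up satellite positions, metres
    [15e6,  10e6, 20e6],
    [-12e6, 18e6, 19e6],
    [20e6, -14e6, 18e6],
    [-16e6, -9e6, 21e6],
])
receiver = np.array([1e6, 2e6, 6.37e6])   # made-up receiver position

def solve(pseudoranges):
    """Gauss-Newton solve for (x, y, z, c*dt) from four pseudoranges."""
    est = np.array([0.0, 0.0, 6.4e6, 0.0])   # start near Earth's surface
    for _ in range(25):
        d = np.linalg.norm(sats - est[:3], axis=1)   # geometric ranges
        residual = pseudoranges - (d + est[3])
        # Jacobian: d(model)/d(pos) = -unit vector to satellite, d/d(bias) = 1
        J = np.hstack([-(sats - est[:3]) / d[:, None], np.ones((4, 1))])
        est += np.linalg.lstsq(J, residual, rcond=None)[0]
    return est

true_ranges = np.linalg.norm(sats - receiver, axis=1)

# Case 1: perfect satellite clocks.
fix_a = solve(true_ranges)

# Case 2: every satellite clock fast by the same 38 us, so every
# pseudorange is inflated by the same c * 38e-6 ~ 11.4 km.
common_bias = C * 38e-6
fix_b = solve(true_ranges + common_bias)

print(np.linalg.norm(fix_a[:3] - fix_b[:3]))  # position difference: ~0 m
print(fix_b[3])                               # absorbed bias: ~11392 m
```

Of course this only shows that a single, identical offset cancels at one epoch; it says nothing about satellite-to-satellite differences or satellite position errors, which is where much of the disagreement in this thread lies.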
BTW, the paper describes the relative time shift between the orbital and ground-based reference frames (pages 15-16), but does not state how the GPS position error would have behaved if the clock frequency weren't compensated according to Eq. 36. It could be that the compensation was put in place to simplify syncing with the ground station, and because GPS is also used for time transfer, so it wouldn't be nice if time ran slightly faster on the satellites than on the ground. But positioning could work nearly as well without the compensation. The paper doesn't state the opposite.
This is puzzling. Are you saying that even if there is a timing error, there would be NO positioning error? Try it. It takes something with velocity v a time t to travel a distance x. If all you have are v and t, and you wish to use those to find x, are you saying that an error in t will NOT produce an error in x? Again, as I've asked earlier, show your own calculation to prove your point. All you have done so far is make some vague, hand-waving arguments. This will be the last time I ask before this thread becomes a speculative, unverified topic in violation of the PF Rules that you agreed to. Zz.
[itex]x = 20200\ km = 20\,200\,000\ m[/itex] (approximate height of the orbit)
[itex]c = 3 \cdot 10^{8}\ m/s[/itex]
[itex]\Delta \tau = 38\ \mu s/day = \frac{38 \cdot 10^{-6}\ s}{24 \cdot 60 \cdot 60\ s} = 4.4 \cdot 10^{-10}\ s/s[/itex] (relativistic time drift on a satellite compared to the ground; taken from the literature)
[itex]t_{without\: relativistic\: effects} = x / c[/itex]
[itex]t_{measured} = t_{without\: relativistic\: effects} + t_{without\: relativistic\: effects} \cdot \Delta \tau = t_{without\: relativistic\: effects} (1 + \Delta \tau)[/itex]
[itex]x_{measured} = c \cdot t_{measured} = c \cdot t_{without\: relativistic\: effects} (1 + \Delta \tau) = c \cdot (x / c) (1 + \Delta \tau) = x (1 + \Delta \tau)[/itex]
[itex]\Delta x = x_{measured} - x = x \cdot \Delta \tau = 20.2 \cdot 10^{6}\ m \cdot 4.4 \cdot 10^{-10} = 8.9 \cdot 10^{-3}\ m = 8.9\ mm[/itex]

This distance error caused by the relativistic time drift is negligible - under one cm. Even if the error were 10 times as large, GPS would still be just as usable. The point of my argument was that there is no systematic error accumulation. Brian Cox claimed that the error accumulation is [itex]\Delta\epsilon = c \cdot \Delta \tau = 38000\ feet/day[/itex], where [itex]\Delta \tau = 38\ \mu s/day[/itex]. But since the absolute ground time is being synced constantly (a link supporting that is in the first post), actually [itex]\Delta \tau = 0\ \mu s/day[/itex] and [itex]\Delta\epsilon = c \cdot 0 = 0\ feet/day[/itex].

BTW ZapperZ, I haven't waved my hand once, only slapped my forehead a couple of times after reading your posts in this thread.
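The arithmetic in the post above can be checked with a two-line script (same values as in the post):

```python
# Re-running the arithmetic from the post above, with its own values.
x = 20_200_000.0              # m, approximate height of the orbit
dtau = 38e-6 / 86_400         # fractional drift, ~4.4e-10 s/s
dx = x * dtau                 # one-way ranging error over one light path
print(f"{dx * 1000:.1f} mm")  # ~8.9 mm
```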
A simple page I found talking about GPS: http://www.kowoma.de/en/gps/positioning.htm The clock in the receiver isn't accurate enough to measure the signal travel time with high accuracy anyway; it just uses relative times, hoping the clock is stable enough that its rate doesn't change too much and the relative time proportions stay accurate. It then calculates the position and also corrects its clock by making the spheres of the distances from the satellites intersect. So it seems that the receiver isn't just synchronizing its clock from time to time; it is actually figuring out the correct time from the relative times of the signals for each position calculation, otherwise it would have much less accuracy. This should mean that there would be no error build-up.
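A rough feel for why the receiver must re-derive its clock on every fix, rather than merely sync "from time to time" (the ~1e-6 fractional stability for a cheap quartz crystal is my assumption, not a figure from the linked page):

```python
# Why the receiver clock must be re-solved on each fix: assume a typical
# cheap quartz oscillator with ~1e-6 fractional frequency error.
C = 299_792_458.0             # m/s
quartz_frac = 1e-6            # assumed fractional error of a cheap crystal
fix_interval = 1.0            # s, one position fix per second

clock_err = quartz_frac * fix_interval   # time error accrued between fixes
range_err = C * clock_err                # equivalent ranging error
print(f"{range_err:.0f} m")              # ~300 m per second if uncorrected
```

Hundreds of metres of equivalent ranging error per second would accrue if the receiver trusted its own clock, which is why the clock offset is treated as a fourth unknown in every position solution.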
Why would that be? It seems to me the poster has a point. If the clocks are frequently synchronized then, even if we were to remove the adjustment for relativistic effects, systematic clock errors would not accumulate. Do you disagree with that? Furthermore, if this is actually true, it would be interesting to know how much it would actually matter if we left out the relativistic corrections.
I'm REALLY glad you guys didn't design the GPS system since I prefer driving on the road to driving through corn fields and buildings which is what would happen if the relativistic effects were not accounted for.
Did you use a smaller font for all the reasoning backing up your argument? Because I cannot see any. To prove your point, you'd have to give at least some - in particular, show that the two simple calculations in post #13 are wrong, incomplete, or based on wrong assumptions. Even Einstein had to back his theories up with reasoning for others to accept them. (And I believe in you. You are just like Einstein.)
What they're saying is: if satellite and ground-based clocks are initially synched, separated, and put into operation without correcting for relativistic effects, the accuracy of the system would degrade at about 1 foot per nanosecond of accumulated clock offset. That's a fact. Read Ashby's paper, or better yet do this project on the GPS: Student project on the Global Positioning System [Taylor and Wheeler, Exploring Black Holes] http://www.eftaylor.com/download.html#general_relativity
Here's a video by the Perimeter Institute regarding GPS & GR/SR https://www.youtube.com/watch?v=zQdIjwoi-u4 Source- http://www.perimeterinstitute.ca/Perimeter_Inspirations/General/Perimeter_Inspirations/