As far as I understand it, each satellite transmits a signal that the receiver picks up. This signal contains (at least) a satellite identifier, a time-stamp of when the signal was sent, and the satellite's position at that time. The receiver calculates its distance from the satellite by comparing the time-stamp to its own internal clock (distance = speed of light x time difference). With three different satellite signals, the intersection of three spheres can be found, and therefore the position of the receiver.

So the accuracy of the receiver's clock relative to the satellites' clocks would seem to be critical - in fact, fundamental to the accuracy of the distance measurement. But if the receiver does not know exactly how far away a satellite is, how could it accurately compensate for the delay in receiving a time-synchronisation signal? Assuming the receiver does not have an on-board atomic clock, whatever clock it does have will need to be sync'ed, probably very regularly.

So the question is: how is the receiver's clock sync'ed to the GPS satellites' clocks, given that the receiver does not know how far away the satellite is and so cannot correct for the time delay? Sounds like a chicken-and-egg problem to me!
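To put a number on just how critical that clock accuracy is, here is a minimal sketch of the distance-from-time-stamp calculation I described above (my own illustration with made-up timing values, not real receiver code):

```python
# Sketch of distance = speed of light x time difference, and the range
# error caused by a small receiver clock offset. Values are illustrative.

C = 299_792_458.0  # speed of light in m/s

def range_from_timestamps(t_sent: float, t_received: float) -> float:
    """Distance = speed of light x (receive time - send time)."""
    return C * (t_received - t_sent)

# A GPS satellite orbits at roughly 20,200 km, so its signal takes on
# the order of 0.067 s to reach the ground:
t_flight = 0.0674
true_range = range_from_timestamps(0.0, t_flight)

# Now suppose the receiver's clock is off by just 1 microsecond:
clock_error = 1e-6
measured_range = range_from_timestamps(0.0, t_flight + clock_error)

# That 1 us clock offset becomes a range error of roughly 300 metres:
error_m = measured_range - true_range
print(f"range error from 1 us clock offset: {error_m:.0f} m")
```

So even a microsecond of clock error shifts the computed sphere by about 300 m, which is why the synchronisation question above seems so fundamental to me.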