How does a GPS receiver calculate time delay?

AI Thread Summary
GPS receivers calculate time delay by using signals from multiple satellites, each equipped with atomic clocks. When a signal is received, it includes the satellite's time and position, allowing the receiver to determine its own location. The receiver compares its internal clock to the time encoded in the signals to compute the distance to each satellite. By receiving signals from at least four satellites, the receiver can correct for any discrepancies in its clock, aligning it with the satellites' time. This process enables the receiver to accurately determine both its position and the current time.
aadarsh9
The GPS satellite has an atomic clock which synchronizes with the receiver. But when the receiver gets the time, it is delayed because the signal had to travel. So when it's 08h 00min 05s at the satellite, the receiver maybe gets 08h 00min 03s. The travel time of the signal is 2 s, but how can the receiver calculate this time difference without knowing the real time?
 
1st: The travel time is much less than 2 s (for a GPS satellite it's on the order of 70 ms).
2nd: The receiver can figure out where it is (that's the whole point of the GPS system, isn't it?) and can correct for the delay.
 
Well, figuring out where you are requires the travel time of the signal in the first place. If the GPS receiver worked two-way, I could understand how it would work: the satellite could bounce the receiver's signal back, the way sonar does, to measure the round-trip time. But the GPS unit is only a receiver, and I can't figure out how it can find the travel time of the signal. The only thing I can say is that whoever invented it is a genius. Can anyone help me understand it, please?
 
The GPS satellites have their own clocks that they reference when they send a signal. This signal includes the time it was sent and the position of the satellite. The receiver uses signals from multiple satellites to calculate where it's at. The receiver does not communicate back to the satellite.

See here: http://www.physics.org/article-questions.asp?id=55
 
@Drakkith
I knew that but how does it calculate the time of travel of the signal if the gps receiver only receives signals?
 
In a nutshell, the receiver looks at incoming signals from four or more satellites and gauges its own inaccuracy. In other words, there is only one value for the "current time" that the receiver can use. The correct time value will cause all of the signals that the receiver is receiving to align at a single point in space. That time value is the time value held by the atomic clocks in all of the satellites. So the receiver sets its clock to that time value, and it then has the same time value that all the atomic clocks in all of the satellites have.

Still haven't grasped the concept... I've read a lot of material on the internet, and it's either too complicated or too childish!
 
Going over the article as a whole may help, especially the 2nd page where it explains trilateration. Read through the whole article a few times and if you still have problems come back.
 
The delay cannot be calculated if the satellite is not perfectly synchronised with the receiver. The article states that both start transmitting the PRN at midnight, but their clocks need to be synchronised first. The time is sent via radio waves, so there will be a delay and they will not be synchronised. Actually, they will never be synchronised, because any information the satellite sends arrives with a delay.
 
  • #10
That description is incorrect. The "long, digital pattern" mentioned there is the P/Y signal used only by the military GPS receivers; this sequence is secret and cannot be used by civilian receivers. There is also the C/A signal that uses a very short digital sequence, which repeats every millisecond. It is not secret and it is used by civilian receivers. Because it is so short, a receiver can "lock into" it fairly quickly. As soon as the lock-in is achieved, the receiver can read the navigation message from the signal. The message is composed of a number of frames, and each frame contains the satellite's local time at the time the frame is sent. The difference between the receiver's local time and the satellite's local time is the delay.
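The arithmetic behind that last sentence is just a subtraction and a multiplication. A short sketch (the timestamps below are invented for illustration; a real navigation message carries week numbers and seconds-of-week, not bare seconds):

```python
# The "delay" described above is the difference between the receiver's own
# clock reading at reception and the transmit time stamped in the frame.
C = 299_792_458.0   # speed of light, m/s

t_tx = 0.000000     # transmit time stamped in the navigation frame (s), invented
t_rx = 0.072345     # receiver's local clock at reception (s), invented

delay = t_rx - t_tx           # on the order of 70 ms for a real GPS satellite
pseudorange = C * delay       # "pseudo" because the unknown receiver clock
                              # error is still folded into this distance
print(delay, pseudorange)
```

The result is called a *pseudo*range precisely because, as the post says, the receiver's clock error is still mixed into it.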

This delay, however, has an indefinite error, which includes the error in the receiver's local time. The receiver obtains position and time signals from multiple satellites and solves a system of equations, which reveals the local time error so that the local clock can be adjusted. The process is then repeated many times, which minimizes the error in time (and position).
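A minimal numerical sketch of that system of equations, solved by Gauss-Newton iteration for the three position coordinates plus the clock bias. The satellite positions, receiver position, and clock error below are all invented, and real receivers additionally correct for atmospheric and relativistic effects:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

# Four hypothetical satellite positions (metres, Earth-centred frame)
sats = np.array([
    [15_600e3,  7_540e3, 20_140e3],
    [18_760e3,  2_750e3, 18_610e3],
    [17_610e3, 14_630e3, 13_480e3],
    [19_170e3,    610e3, 18_390e3],
])
true_pos = np.array([-40e3, 10e3, 6_370e3])   # receiver truly here (invented)
true_bias_s = 3.0e-3                          # receiver clock is 3 ms off (invented)

# Measured pseudoranges = true range + c * (receiver clock error)
pr = np.linalg.norm(sats - true_pos, axis=1) + C * true_bias_s

# Unknowns: x, y, z, and b = c * (clock error). Start at Earth's centre, zero bias.
x = np.zeros(4)
for _ in range(15):
    ranges = np.linalg.norm(sats - x[:3], axis=1)
    # Jacobian: unit vectors from satellites toward the estimate, and d/db = 1
    J = np.hstack([(x[:3] - sats) / ranges[:, None], np.ones((4, 1))])
    x += np.linalg.lstsq(J, pr - (ranges + x[3]), rcond=None)[0]

print(x[:3])      # recovered receiver position (metres)
print(x[3] / C)   # recovered clock error (seconds)
```

Note that the clock error comes out of the solve as a by-product, which is exactly why a GPS receiver ends up knowing the time as accurately as it knows its position.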
 
  • #11
The receiver compares the time encoded in the satellite's signal to its own internal clock to compute a distance to that satellite.

If the receiver gets a signal from 1 satellite, it can determine that it is somewhere on the surface of a sphere of a given radius. It also knows that this measurement comes with an error of unknown magnitude, so by itself it's pretty useless.

If the receiver gets 2 signals it can compute 2 spheres which will intersect on a circle. It now knows it is somewhere on that circle but still knows nothing about the error of the measurement.

When 3 signals are present the location can be narrowed to a pair of points. One of these 2 points is usually not near the surface of the Earth and so is discarded as not plausible. If the receiver had a perfect clock that would be all that was necessary.

A 4th signal will disagree with the other 3 unless the receiver's clock and the satellites' clocks are perfectly synchronized. The receiver can now adjust its own clock until all 4 spheres intersect at 1 point. This not only gives the receiver its exact location, but also the exact time.

When 5 or more signals are available the receiver can compute its location based on every possible combination of 4 of them. The variance between computed locations gives an indication of the accuracy of the measurement.
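The "every combination of 4" idea can be sketched numerically. Everything below is invented for illustration: the satellite positions, the receiver position, the clock error, and the 5 m pseudorange noise. Each 4-satellite subset is solved by a small Gauss-Newton routine, and the scatter between the resulting fixes hints at the measurement accuracy:

```python
from itertools import combinations
import numpy as np

C = 299_792_458.0  # speed of light, m/s

# Five hypothetical satellite positions (metres)
sats = np.array([
    [15_600e3,  7_540e3, 20_140e3],
    [18_760e3,  2_750e3, 18_610e3],
    [17_610e3, 14_630e3, 13_480e3],
    [19_170e3,    610e3, 18_390e3],
    [13_200e3, 11_100e3, 19_800e3],
])
true_pos = np.array([-40e3, 10e3, 6_370e3])   # invented receiver position
bias_m = C * 3.0e-3                           # clock error, expressed in metres

rng = np.random.default_rng(0)
pr = (np.linalg.norm(sats - true_pos, axis=1)
      + bias_m + rng.normal(0.0, 5.0, len(sats)))  # 5 m measurement noise

def fix(sat_subset, pr_subset):
    """Gauss-Newton solve for (x, y, z, clock bias in metres)."""
    x = np.zeros(4)
    for _ in range(15):
        ranges = np.linalg.norm(sat_subset - x[:3], axis=1)
        J = np.hstack([(x[:3] - sat_subset) / ranges[:, None],
                       np.ones((len(sat_subset), 1))])
        x += np.linalg.lstsq(J, pr_subset - (ranges + x[3]), rcond=None)[0]
    return x[:3]

# One position fix per 4-satellite combination (5 choose 4 = 5 fixes)
fixes = np.array([fix(sats[list(idx)], pr[list(idx)])
                  for idx in combinations(range(len(sats)), 4)])
print(fixes.std(axis=0))   # spread between fixes, per axis, in metres
```

The spread grows with the pseudorange noise and with poor satellite geometry, which is the same effect real receivers report as dilution of precision.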
 
