PaulMakesThings
This is just out of curiosity, because the system works either way; I just wonder why.
I have a high-precision GPS system: two antennas placed about 2 meters apart, with cables about 8 meters long running back to an embedded computer. Besides giving you the position, it uses the relative positions of the antennas to give you a heading.
Now, as I understand it, GPS uses the time of flight of the satellite signals to trilaterate your position. But wouldn't the time difference between the two antennas be thrown off by the time it takes the signal to travel through that 8 m cable? Does it use the timestamps in the signals to somehow compare them without using its own reference time? Presumably the signals arrive at the antennas with the correct timing. Does the extra time added by the cable simply cancel out, or is it a more active process than that?
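To make the question concrete, here's a small numeric sketch of what I imagine "cancelling out" would look like. Everything in it is my own assumption: the made-up ranges, the delay values, and the idea that the receiver differences measurements between satellites (double differencing). The point is just that a cable delay biases every satellite's measurement at that antenna by the same amount, so it looks like a receiver clock offset:

```python
import numpy as np

c = 299_792_458.0  # speed of light, m/s

# Made-up geometric ranges (m) from antenna A to four satellites
rho_A = np.array([20_100_000.0, 21_350_000.0, 22_800_000.0, 20_950_000.0])
# Antenna B is ~2 m away, so its ranges differ by small amounts (also made up)
rho_B = rho_A + np.array([1.2, -0.7, 0.4, -1.5])

# Per-antenna biases: receiver clock error plus cable delay (~8 m of coax is roughly 40 ns)
clk_A, cable_A = 3.0e-6, 40e-9  # seconds
clk_B, cable_B = 3.0e-6, 41e-9  # slightly different cable

# Measured pseudoranges: geometry plus c * (clock + cable), the same bias for every satellite
P_A = rho_A + c * (clk_A + cable_A)
P_B = rho_B + c * (clk_B + cable_B)

# Single difference between antennas: the clock and cable terms lump into one common offset
sd = P_B - P_A

# Double difference (difference across satellites): that common offset is identical for
# every satellite, so it cancels exactly and only the geometry (the 2 m baseline) remains
dd = sd[1:] - sd[0]
dd_geom = (rho_B[1:] - rho_A[1:]) - (rho_B[0] - rho_A[0])

print(np.allclose(dd, dd_geom))  # True: the cable delay never reaches the baseline solution
```

If something like that is what the receiver actually does, then the 8 m of cable only shifts a common bias and the heading solution never sees it, but I'd like to confirm that's really the mechanism.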
Just wondering how this works.
Thanks