How does Radar work?

Radar can only measure distances out to a certain maximum range; beyond that, a target is out of range. The distance to a target is determined by the time it takes the signal to travel to the target and back.
  • #1
Hey

I've been wondering for a while about radar. I believe it works by bouncing a radio signal off an object and receiving the reflected radio wave at a receiver, right?

My question is... since radar works at (I believe) short distances measured in miles, wouldn't the time it took the radar signal to go from transmitter to reflecting object and back to the receiver be infinitely small? The signal travels at the speed of light, doesn't it?

How are we able to measure this tiny timespan and use it, together with the known speed of light, to measure distance? How were they able to do this without computers in the 1940s, when radar was invented?

Thanks
 
  • #2
A radar emits a single short pulse of energy, then listens for a relatively long period of time. The initial pulse duration may be on the order of a few microseconds. For scale, the round trip to a target 300 miles away takes roughly 3 milliseconds.

Yes, the transmitted pulse travels at the speed of light. The range to an object is found by computing the distance light travels in half the time it takes the signal to return to the receiver. The factor of one half is there because the transmitted pulse must travel out to the object and the reflected signal must then travel back to the receiver.
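
To put numbers on that, here is a minimal Python sketch of the range rule just described (range = speed of light × half the round-trip time). The 3.2 ms delay is only an illustrative value for a target roughly 300 miles away, not a figure from any particular radar.

```python
# Minimal sketch of the range rule above: range = c * (round-trip time) / 2

C = 299_792_458.0  # speed of light, m/s

def range_from_echo_delay(round_trip_seconds):
    """Distance to the target, given the time between transmit and echo."""
    return C * round_trip_seconds / 2.0

# A target about 300 miles away returns its echo after roughly 3.2 ms.
delay = 3.2e-3  # seconds
r = range_from_echo_delay(delay)
print(f"{delay * 1e3:.1f} ms round trip -> {r / 1000:.0f} km (~{r / 1609.344:.0f} miles)")
```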

The direction the radar antenna is pointing gives the bearing to the object, and the time for the signal to return gives the range.

This information is displayed on a device called a repeater, the TV-like display with the rotating line (called a sweep). The returned signals show up as bright spots along the sweep. If an object is beyond the range of the radar (as determined by the listening time), it shows up on the repeater as out-of-sync noise. This can all be done with analog circuitry; computers are not necessary.
 
  • #3
To answer the main point of confusion, though: no, it isn't an "infinitesimally small" amount of time - at least not anymore. Our ability to measure such short intervals has gotten pretty good.

Consider this: a decent computer these days completes 3 billion calculations per second. In the time it takes to perform one of those calculations, light travels only about 10 cm. It would be relatively trivial for such a machine to detect objects as close as a few meters. Of course, most radars are analog, but the concept is the same. See the wiki link: http://en.wikipedia.org/wiki/Radar
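
As a quick numeric check of those figures (assuming the 3 GHz clock rate mentioned above, which is just an example), here is how far light moves in one clock tick and the range step that corresponds to:

```python
# Back-of-the-envelope check of the timing claim above: how far light travels
# in one tick of an assumed 3 GHz clock, and the corresponding range step
# (halved, because the echo makes a round trip).

C = 299_792_458.0   # speed of light, m/s
CLOCK_HZ = 3e9      # assumed 3 GHz clock, as in the post above

tick = 1.0 / CLOCK_HZ              # one clock period, seconds
light_per_tick = C * tick          # distance light covers in one tick
range_step = light_per_tick / 2.0  # out-and-back halves the resolution

print(f"one tick: {tick * 1e9:.2f} ns")
print(f"light per tick: {light_per_tick * 100:.1f} cm")
print(f"range step: {range_step * 100:.1f} cm")
```
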
In most cases, the receiver cannot detect a return while the signal is still being sent out. Through the use of a device called a duplexer, the radar switches between transmit and receive at a predetermined rate. The minimum range is therefore the pulse duration multiplied by the speed of light, divided by two. To detect closer targets, one must use a shorter pulse length.
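
Here is a small sketch of that minimum-range limit; the pulse durations used are only illustrative:

```python
# Sketch of the minimum-range limit described above: while the pulse is still
# leaving the antenna the receiver is switched off, so anything closer than
# (pulse duration * c / 2) cannot be seen.

C = 299_792_458.0  # speed of light, m/s

def minimum_range(pulse_duration_seconds):
    """Closest detectable range for a given pulse duration."""
    return C * pulse_duration_seconds / 2.0

print(f"1 us pulse -> min range ~{minimum_range(1e-6):.0f} m")  # roughly 150 m
print(f"5 us pulse -> min range ~{minimum_range(5e-6):.0f} m")  # roughly 750 m
```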

A similar effect imposes a maximum range as well. If the return from a target arrives after the next pulse has already been sent out, the receiver again cannot tell which pulse the echo belongs to. To maximize range, one wants a longer time between pulses, the inter-pulse time.
[basically what Integral just said]
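
And a matching sketch of the maximum unambiguous range set by the time between pulses; the 1 ms inter-pulse time is just an example value:

```python
# Sketch of the maximum-range limit described above: the echo has to return
# before the next pulse goes out, so R_max = (inter-pulse time * c) / 2.

C = 299_792_458.0  # speed of light, m/s

def max_unambiguous_range(inter_pulse_seconds):
    """Farthest range whose echo still arrives before the next pulse."""
    return C * inter_pulse_seconds / 2.0

pri = 1e-3  # 1 ms between pulses (a 1 kHz pulse repetition frequency)
print(f"{pri * 1e3:.0f} ms between pulses -> "
      f"~{max_unambiguous_range(pri) / 1000:.0f} km unambiguous range")
```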
 
