How does Radar work?

  • Thread starter OSalcido
  • #1

Main Question or Discussion Point

Hey

I've been wondering for a while about radar. I believe it works by bouncing a radio signal off an object and receiving the reflected radio wave at a receiver, right?

My question is: since radar works (I believe) at short distances measured in miles, wouldn't the time it takes the radar signal to go from the transmitter to the reflecting object and back to the receiver be infinitesimally small? The signal travels at the speed of light, doesn't it?

How are we able to measure this tiny timespan and use it, together with the known speed of light, to measure distance? How were they able to achieve this without computers in the 1940s, when radar was invented?

Thanks
 

Answers and Replies

  • #2
Integral
Staff Emeritus
Science Advisor
Gold Member
A radar emits a single short pulse of energy, then listens for a relatively long period of time. The initial pulse duration may be on the order of a few microseconds. IIRC, a listening time of roughly 3 milliseconds corresponds to about a 300 mile range.

Yes, the transmitted pulse travels at the speed of light. The range to an object is found by computing the distance light can travel in half the time required for the signal to return to the receiver. This is because the transmitted pulse must first travel out to the object, and the reflected signal must then return to the receiver.
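
To put numbers on that, here is a minimal sketch (Python, purely illustrative - the delay value is just an example, not a figure from the thread) of turning a measured echo delay into a range:

```python
# Illustrative sketch: target range from a measured radar echo delay.
C = 299_792_458.0            # speed of light in m/s
METERS_PER_MILE = 1609.344

def range_from_echo(round_trip_seconds):
    # The pulse travels out to the target and back, so the one-way
    # range is half the distance light covers in the round-trip time.
    return C * round_trip_seconds / 2.0

delay = 3.2e-3  # seconds; an example echo delay of about 3.2 ms
print(range_from_echo(delay) / METERS_PER_MILE)  # roughly 300 miles
```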

The direction the radar antenna is pointing gives the direction to the object; the time for the signal to return gives the range.

This information is displayed on a device called a repeater - the TV-like display with the rotating line (called a sweep). The returned signals show up as bright spots along the sweep. If an object is beyond the range of the radar (as determined by the wait time), it shows up on the repeater as out-of-sync noise. This can all be done with analog circuitry; computers are not necessary.
 
  • #3
russ_watters
Mentor
To answer the main point of confusion, though: no, it isn't an "infinitesimally small" amount of time - at least not anymore. Our ability to measure such small intervals has gotten pretty good.

Consider this: a decent computer these days completes 3 billion calculations per second. In the time it takes to perform just one of those calculations, light travels only about 10 cm. It would be relatively trivial for such a machine to detect objects as close as a few meters away. Of course, most radars are analog, but the concept is the same. See the wiki link: http://en.wikipedia.org/wiki/Radar
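
A quick sanity check of that figure (a throwaway sketch; the 3 GHz clock rate is simply the "3 billion calculations per second" quoted above):

```python
# Rough check: how far does light travel during one cycle of a 3 GHz clock?
C = 299_792_458.0        # speed of light in m/s
clock_rate_hz = 3e9      # "3 billion calculations per second"

cycle_time = 1.0 / clock_rate_hz          # about 0.33 nanoseconds
print(C * cycle_time * 100.0)             # about 10 (centimeters)
```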
In most cases, the receiver cannot detect a return while a pulse is being sent out. Through the use of a device called a duplexer, the radar switches between transmitting and receiving at a predetermined rate. The minimum range is the pulse length multiplied by the speed of light, divided by two. In order to detect closer targets, one must use a shorter pulse length.

A similar effect imposes a maximum range as well. If the return from a distant target comes in after the next pulse has been sent out, once again the receiver cannot tell the difference - it looks like a close-in echo of the new pulse. In order to maximize range, one wants to use longer times between pulses - a longer inter-pulse time.
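
Both of those limits are simple time-times-speed-of-light relationships. Here is a minimal sketch of the two formulas (Python; the pulse length and inter-pulse time are example values of mine, not figures from the thread):

```python
# Illustrative sketch: range limits set by the radar's pulse timing.
C = 299_792_458.0  # speed of light in m/s

def min_range(pulse_length_s):
    # While the pulse is still being transmitted the receiver is switched
    # off, so the closest detectable target is (pulse length * c) / 2.
    return C * pulse_length_s / 2.0

def max_unambiguous_range(inter_pulse_time_s):
    # An echo must arrive before the next pulse goes out, or the receiver
    # cannot tell which pulse it belongs to.
    return C * inter_pulse_time_s / 2.0

print(min_range(1e-6))                 # 1 microsecond pulse -> ~150 m
print(max_unambiguous_range(1e-3))     # 1 ms between pulses -> ~150 km
```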
[basically what Integral just said]
 
