Discovering the Wonders of Radar: A Closer Look at How it Works

  • Thread starter: OSalcido
  • Tags: Radar Work
SUMMARY

Radar technology operates by emitting a short pulse of radio energy, which reflects off objects and returns to a receiver. The time taken for the signal to travel to the object and back is measured to calculate distance, utilizing the known speed of light. Early radar systems, developed in the 1940s, relied on analog circuitry and devices like diplexers to switch between transmitting and receiving signals. Modern advancements allow for precise measurements of very short time intervals, enhancing radar's effectiveness in detecting objects at varying distances.

PREREQUISITES
  • Understanding of radar principles and signal processing
  • Familiarity with the speed of light and its implications in distance measurement
  • Knowledge of analog circuitry and its applications in radar systems
  • Basic concepts of pulse modulation and timing in signal transmission
NEXT STEPS
  • Research the principles of radar signal processing and pulse compression techniques
  • Learn about the functionality and design of diplexers in radar systems
  • Explore the evolution of radar technology from analog to digital systems
  • Investigate modern radar applications in various fields, including aviation and automotive safety
USEFUL FOR

Engineers, radar technicians, and students in electronics or telecommunications who seek to understand radar technology and its applications in real-world scenarios.

OSalcido
Hey

I've been wondering for a while about radar. I believe it works by bouncing a radio signal off an object and picking up the reflected radio wave at a receiver, right?

My question is: since radar works at (I believe) short distances measured in miles, wouldn't the time it takes the radar signal to go from the transmitter to the reflecting object and back to the receiver be infinitely small? The signal travels at the speed of light, doesn't it?

How are we able to measure such a tiny timespan and convert it to distance using the known speed of light? And how was this achieved without computers in the 1940s, when radar was invented?

Thanks
 
A radar emits a single short pulse of energy, then listens for a relatively long period of time. The initial pulse duration may be on the order of a few microseconds; the round trip for a 300-mile range takes about 3.2 milliseconds.

Yes, the transmitted pulse travels at the speed of light. The range to an object is found by computing the distance light travels in half the time required for the signal to return to the receiver. The factor of two arises because the transmitted pulse must travel out to the object and the reflected signal must then return to the receiver.
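The half-time computation above can be sketched in a few lines of Python (the function name and the 1 ms example are illustrative, not from the thread):

```python
C = 299_792_458.0  # speed of light in m/s

def range_from_echo(round_trip_seconds: float) -> float:
    # The pulse travels out and back, so the one-way range is the
    # speed of light times half the round-trip time.
    return C * round_trip_seconds / 2.0

# An echo arriving 1 ms after transmission puts the target
# roughly 150 km (about 93 miles) away.
print(range_from_echo(1e-3))
```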

The direction the radar antenna is pointing gives the bearing to the object; the time for the signal to return gives the range.

This information is displayed on a device called a repeater, the TV-like display with a rotating line (called the sweep). Returned signals appear as bright spots along the sweep. If an object lies beyond the radar's range (as set by the listening time), it shows up on the repeater as out-of-sync noise. All of this can be done with analog circuitry; computers are not necessary.
 
To answer the main point of confusion, though: no, it isn't an "infinitely small" amount of time, at least not anymore. Our ability to measure such short intervals has gotten quite good.

Consider this: a decent computer these days runs at about 3 billion clock cycles per second. In one cycle, light travels only about 10 cm, so it would be relatively trivial for such a machine to detect objects as close as a few meters. Of course, most radars are analog, but the concept is the same. See the Wikipedia link: http://en.wikipedia.org/wiki/Radar
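The clock-cycle comparison can be made concrete with a short sketch (the 3 GHz figure is taken from the paragraph above; variable names are mine):

```python
C = 299_792_458.0   # speed of light in m/s
CLOCK_HZ = 3e9      # a 3 GHz processor clock, as in the example above

cycle_time = 1.0 / CLOCK_HZ        # duration of one clock cycle, in seconds
light_per_cycle = C * cycle_time   # distance light covers in one cycle

# Light covers roughly 10 cm per cycle, so even targets a few
# metres away correspond to many cycles of such a clock.
print(light_per_cycle)
```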
In most cases, the receiver cannot detect a return while the signal is being sent out. Through the use of a device called a diplexer (also commonly called a duplexer), the radar switches between transmit and receive at a predetermined rate. The minimum range equals the pulse length multiplied by the speed of light, divided by two; to detect closer targets, one must use a shorter pulse.

A similar effect imposes a maximum range as well. If the return from a target arrives while the next pulse is being sent out, the receiver again cannot tell the difference. To maximize range, one wants a longer time between pulses, the inter-pulse time.
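Both limits described above follow the same c·t/2 relation; a minimal sketch under that assumption (function names are mine, not from the thread):

```python
C = 299_792_458.0  # speed of light in m/s

def min_range(pulse_seconds: float) -> float:
    # The receiver is switched off while the pulse is still leaving,
    # so nothing closer than c * tau / 2 can be detected.
    return C * pulse_seconds / 2.0

def max_unambiguous_range(inter_pulse_seconds: float) -> float:
    # An echo must arrive before the next pulse is transmitted, so the
    # farthest unambiguous target is c * T / 2 away.
    return C * inter_pulse_seconds / 2.0

# A 1 microsecond pulse blinds the radar inside roughly 150 m;
# 1 ms between pulses limits unambiguous range to roughly 150 km.
print(min_range(1e-6), max_unambiguous_range(1e-3))
```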
[basically what Integral just said]
 
