Swamping in Satellite Communications

AI Thread Summary
Swamping in satellite communications occurs when a strong local transmitter desensitizes a receiver, preventing it from detecting weaker signals from distant sources. This phenomenon can happen even with line-of-sight propagation, as the local receiver's automatic gain control (AGC) may reduce sensitivity to the weaker signals. The discussion highlights that using separate uplink and downlink frequencies is crucial to avoid these issues and engineering complications. Additionally, modulation types affect how signals are processed, with FM receivers exhibiting a capture effect that prioritizes stronger signals. Overall, the conversation emphasizes the importance of frequency management in satellite communication systems to mitigate swamping effects.
elemis
What would happen if an uplink and a downlink had the same frequency? I know 'swamping' would occur, but what IS swamping? Would stationary waves be set up?
 
The local strong-signal transmitter would totally desensitise the receiver, and as a result the receiver would "hear" little to nothing of the weak signal coming from a distance.

i.e. the transmitter would "swamp" the receiver

cheers Dave
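To put rough numbers on this, here is a back-of-the-envelope sketch in Python. Every figure in it is an assumption chosen only for illustration (a shared 6 GHz frequency, a geostationary satellite at roughly 36,000 km, +50 dBm satellite EIRP, a 50 dBi ground dish, a 100 W ground transmitter and 70 dB of antenna/filter isolation into its own receiver), not data from any real system.

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB (Friis)."""
    c = 3.0e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

freq_hz = 6.0e9            # assumed frequency shared by uplink and downlink
geo_distance_m = 3.6e7     # roughly the geostationary range, 36 000 km

# Wanted signal: assumed satellite EIRP of +50 dBm, free-space loss down to
# the ground, then an assumed 50 dBi receive dish at the ground station.
sat_eirp_dbm  = 50.0
dish_gain_dbi = 50.0
wanted_dbm = sat_eirp_dbm - fspl_db(geo_distance_m, freq_hz) + dish_gain_dbi

# Interference: the station's own 100 W transmitter on the same frequency,
# with an assumed 70 dB of antenna/filter isolation into its own receiver.
local_tx_dbm = 10 * math.log10(100) + 30     # 100 W = +50 dBm
isolation_db = 70.0
leakage_dbm  = local_tx_dbm - isolation_db

print(f"wanted signal at the receiver: {wanted_dbm:6.1f} dBm")
print(f"own-transmitter leakage:       {leakage_dbm:6.1f} dBm")
print(f"leakage is ~{leakage_dbm - wanted_dbm:.0f} dB stronger than the wanted signal")
```

With these assumed numbers the station's own transmitter arrives at its receiver roughly 80 dB (a factor of 100 million in power) above the wanted downlink signal, which is the imbalance behind the "swamping".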
 
davenn said:
The local strong-signal transmitter would totally desensitise the receiver, and as a result the receiver would "hear" little to nothing of the weak signal coming from a distance.

i.e. the transmitter would "swamp" the receiver

cheers Dave
I'm not sure I fully comprehend.

You're saying that the transmitter would somehow overpower a receiver thousands of kilometres away from it?

How is that possible when microwaves are line of sight?
 
No... the local transmitter would desense the local receiver, so that the local receiver, be it the ground station or the one in the satellite, wouldn't be able to hear anything from the distant station.

Each local transmitter would desense each local receiver, ok? :)

Being line-of-sight is irrelevant

Dave
 
Is it mainly because the receiver usually has AGC, which turns the gain down in the presence of a strong signal at the desired frequency? So it won't "see" the weaker signal from farther away?
 
That would also play a part

So it all goes to show why they use separate uplink and downlink freqs so they don't have all those engineering hassles :)

Dave
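Here is a minimal sketch of that AGC mechanism, with made-up numbers: the AGC is assumed to set the receiver gain from the total input power, so a strong co-channel signal pulls the gain down and the weak signal all but vanishes at the output. The power levels, target level and maximum gain are assumptions, not figures from any particular receiver.

```python
import math

def db_to_mw(dbm):
    return 10 ** (dbm / 10)

def agc_gain_db(total_in_dbm, target_out_dbm=-10.0, max_gain_db=100.0):
    """Very simple AGC model: choose the gain that brings the total input
    power up to the target output level, limited by the maximum gain."""
    return min(max_gain_db, target_out_dbm - total_in_dbm)

weak_dbm  = -100.0     # wanted signal from the distant station (assumed)
noise_dbm = -110.0     # receiver noise floor (assumed)

for strong_dbm in (None, -30.0):     # without / with a strong local signal
    powers_mw = [db_to_mw(weak_dbm), db_to_mw(noise_dbm)]
    if strong_dbm is not None:
        powers_mw.append(db_to_mw(strong_dbm))
    total_dbm = 10 * math.log10(sum(powers_mw))
    gain = agc_gain_db(total_dbm)
    label = "no strong signal   " if strong_dbm is None else f"strong at {strong_dbm:.0f} dBm "
    print(f"{label}: AGC gain {gain:5.1f} dB, weak signal at output {weak_dbm + gain:6.1f} dBm")
```

With no strong signal present the gain sits near 90 dB and the weak signal comes out near the target level; with a -30 dBm co-channel signal present the AGC backs the gain off to about 20 dB and the weak signal ends up roughly 70 dB below the output level, effectively invisible.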
 
One ground station could overpower the satellite's receiver while it was receiving another station, if the interfering station was also within the beamwidth of the satellite's antenna and much stronger.

Stations on the ground are unlikely to interfere directly with each other, since both would be using dish antennas pointing upward toward the satellite, so neither would be in the other's antenna pattern.
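A rough sketch of why the two uplink dishes barely couple: compare a dish's boresight gain with its gain well off axis. The calculation below uses the standard aperture-gain formula and a generic sidelobe envelope of about 29 - 25*log10(theta) dBi (a commonly used earth-station reference pattern); the diameter, frequency and efficiency are assumptions for illustration.

```python
import math

def dish_boresight_gain_dbi(diameter_m, freq_hz, efficiency=0.6):
    """Peak gain of a parabolic dish from the aperture-gain formula."""
    wavelength = 3.0e8 / freq_hz
    return 10 * math.log10(efficiency * (math.pi * diameter_m / wavelength) ** 2)

def offaxis_gain_dbi(theta_deg):
    """Rough sidelobe envelope, 29 - 25*log10(theta) dBi, floored at -10 dBi.
    Treat the exact numbers as an assumption, not a spec."""
    return max(29 - 25 * math.log10(theta_deg), -10.0)

freq_hz = 6.0e9
print(f"boresight gain of a 3 m dish at 6 GHz: {dish_boresight_gain_dbi(3.0, freq_hz):.1f} dBi")
for theta in (5, 30, 90):      # angle from boresight toward another ground station
    print(f"gain {theta:3d} degrees off axis: {offaxis_gain_dbi(theta):6.1f} dBi")
```

A neighbouring ground station sits tens of degrees (often close to 90 degrees) away from the dish's boresight, so each station presents the other with something like 50 dB less gain than it presents to the satellite, on top of whatever terrain blockage exists.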
 
davenn said:
That would also play a part

So it all goes to show why they use separate uplink and downlink freqs so they don't have all those engineering hassles :)

Dave

What is the other reason for desensitization?
 
yungman said:
What is the other reason for desensitization?

It depends on the type of modulation used.

FM receivers have a capture effect where only the strongest signal is followed. So, you don't even hear the weaker signal. (A numerical sketch of this appears after this post.)

Data signals use a lot of different modulation systems, but if you don't get FM capture, the best you will get is corrupted data where you get data from both ground transmissions.

A satellite will usually get a very weak signal from both ground stations, because of the distance, and most receivers have delayed AGC, meaning that no AGC is applied for very weak signals in order to get best sensitivity.
So, AGC desensitization due to ground stations is not normally a problem.

Single sideband transmissions would have the problem of just producing two voices on top of each other with the stronger one being louder. So it would be hard to hear the weaker one while the stronger one was talking.
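The capture effect mentioned above can be seen numerically. The sketch below (using numpy and scipy, with assumed carrier frequency, deviation and amplitudes) puts two FM signals on the same carrier, makes one about 10 dB stronger, demodulates the sum with an ideal instantaneous-frequency detector, and checks which station's audio the output follows.

```python
import numpy as np
from scipy.signal import hilbert

fs = 1_000_000                    # sample rate, Hz (assumed)
t  = np.arange(0, 0.02, 1 / fs)   # 20 ms of signal
fc = 100_000                      # common carrier frequency, Hz (assumed)
kf = 5_000                        # peak frequency deviation, Hz (assumed)

m1 = np.sin(2 * np.pi * 300 * t)  # stronger station's audio: 300 Hz tone
m2 = np.sin(2 * np.pi * 700 * t)  # weaker station's audio: 700 Hz tone

# Two FM signals on the SAME carrier; the first is about 10 dB stronger.
phase1 = 2 * np.pi * fc * t + 2 * np.pi * kf * np.cumsum(m1) / fs
phase2 = 2 * np.pi * fc * t + 2 * np.pi * kf * np.cumsum(m2) / fs
rx = 3.0 * np.cos(phase1) + 1.0 * np.cos(phase2)

# Ideal FM demodulator: instantaneous frequency of the analytic signal.
inst_phase = np.unwrap(np.angle(hilbert(rx)))
inst_freq  = np.diff(inst_phase) * fs / (2 * np.pi) - fc

# Which station's audio does the demodulated output follow?
c1 = np.corrcoef(inst_freq, m1[:-1])[0, 1]
c2 = np.corrcoef(inst_freq, m2[:-1])[0, 1]
print(f"correlation with the stronger station's audio: {c1:.2f}")  # should be near 1
print(f"correlation with the weaker station's audio:   {c2:.2f}")  # should be much smaller
```

With a 3:1 amplitude ratio the weaker signal can never pull the combined phasor all the way around, so the resultant phase (and hence the demodulated audio) tracks the stronger transmission; that geometry is the capture effect.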
 
