RF Amplification: Is it a Bad Idea?

  • Thread starter: robotnut
  • Tags: Amplification, RF
Summary
Using a 1-watt RF amplifier in a home Wi-Fi setup has resulted in worse performance than a stock 1-watt bridge, despite both devices using the same Atheros chip. The user initially experienced weak signal strength over a 1.2 km line-of-sight path and tried to improve it with a two-way amplifier that adds 10 dB of receive gain. However, the amplifier appears to amplify noise as well, leading to worse reception and speed. The stock bridge ultimately provided better throughput and a stable connection. This raises the question of whether RF amplifiers are generally ineffective or whether this particular amplifier is faulty.
robotnut
My home is about 1.2 km LOS from work, and I want to use my work's internet connection at home. I have a pole available as well. Right now, from rooftop to rooftop, I do get a signal, but it's weak and drops out sometimes. I'm using a 200 mW Wi-Fi bridge. So, on a friend's recommendation, I got a 1-watt amplifier to stay license-free. It's a two-way amp that also adds 10 dB of gain on the receive side. The manufacturer says it's linear, giving me 1 watt from a 200 mW input. The issue, however, is that with the amp, reception and speed get even worse. I tried reducing my power to 100 mW, which should reduce the output of the amp as well, but no luck. The performance is worse with the amp and directional antennas than with no amp and 9 dBi omnis.
Anyway, since I haven't been able to make it work, I got a new 1-watt bridge, and that works very well. I'm getting 38 Mbps throughput and a full RSSI indicator with dishes on both rooftops.
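For context, here is a rough free-space link budget for the 1.2 km hop. This is only a sketch: the 2.4 GHz operating frequency and the 24 dBi dish gain are assumed values for illustration, not figures from the post.

Code:
# Rough free-space link budget for the 1.2 km hop described above.
# Assumptions (not from the thread): 2.4 GHz band, 9 dBi omnis vs 24 dBi
# dishes, and no cable/connector losses.
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB (distance in metres, frequency in Hz)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

def rx_power_dbm(tx_dbm, tx_gain_dbi, rx_gain_dbi, path_loss_db):
    """Received power = transmit power + both antenna gains - path loss."""
    return tx_dbm + tx_gain_dbi + rx_gain_dbi - path_loss_db

loss = fspl_db(1200, 2.4e9)            # about 101.6 dB over 1.2 km at 2.4 GHz
tx_200mw = 10 * math.log10(200)        # 200 mW is roughly 23 dBm
tx_1w    = 10 * math.log10(1000)       # 1 W is 30 dBm

print(f"FSPL over 1.2 km: {loss:.1f} dB")
print(f"200 mW + 9 dBi omnis  : {rx_power_dbm(tx_200mw, 9, 9, loss):.1f} dBm")
print(f"1 W    + 24 dBi dishes: {rx_power_dbm(tx_1w, 24, 24, loss):.1f} dBm")

On paper both setups look comfortable in free space, so in practice Fresnel-zone clearance, cable losses, and the receiver's noise performance usually matter more than raw transmit power.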

The question: is this a bad amp, or is trying to use RF amps generally a bad idea? It's funny how the stock 1-watt bridge works like a charm, but the amplified one doesn't at all, even though they're both built around the same Atheros chip. I guess a two-way amp also amplifies noise?
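On the "amplifies noise" question: the receive side really can get worse if the external amplifier's noise figure is higher than that of the radio's own front end, because the first stage in a receive chain dominates the overall noise figure. Below is a minimal sketch of the Friis cascade formula using hypothetical noise-figure and gain numbers, not specs of the actual amp or bridge in this thread.

Code:
# Friis formula for the cascaded noise figure of a receive chain.
# The noise-figure and gain values are hypothetical examples only.
import math

def db_to_lin(db: float) -> float:
    return 10 ** (db / 10)

def lin_to_db(lin: float) -> float:
    return 10 * math.log10(lin)

def cascaded_nf_db(stages):
    """stages: list of (noise_figure_dB, gain_dB), first entry is first in the chain."""
    total_f = 0.0
    gain_product = 1.0
    for i, (nf_db, g_db) in enumerate(stages):
        f = db_to_lin(nf_db)
        if i == 0:
            total_f = f                      # first stage contributes its full noise factor
        else:
            total_f += (f - 1) / gain_product  # later stages are divided by preceding gain
        gain_product *= db_to_lin(g_db)
    return lin_to_db(total_f)

radio_alone = cascaded_nf_db([(4, 0)])           # radio front end only, assumed NF 4 dB
with_amp    = cascaded_nf_db([(6, 10), (4, 0)])  # external amp (NF 6 dB, 10 dB gain) ahead of it

print(f"Radio alone : {radio_alone:.1f} dB noise figure")
print(f"Amp + radio : {with_amp:.1f} dB noise figure")

With these example numbers, the amp raises the overall receive noise figure by roughly 2 dB despite adding 10 dB of gain. Transmit-side non-linearity distorting the OFDM waveform is another common way an external amp can hurt throughput.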
 
