
Can an antenna reduce the radio signal strength when it receives?

  1. Nov 28, 2016 #1
    • When an antenna receives a radio signal, does it reduce the actual signal strength, or does the strength depend only on dispersion loss and dielectric absorption along the path?
    • If the antenna doesn't reduce the radio signal's strength (power), how does it generate a current in the receiving antenna circuit? How would the antenna then not violate conservation of energy, if the radio signal loses no power even while generating a current in the antenna?
    • If the antenna does reduce the signal strength, do more receiving antennas mean the transmitter must transmit more power?
  3. Nov 28, 2016 #2


    Science Advisor, Gold Member
    Yes, when an antenna receives a radio signal it is taking energy away from the radio wave. For most practical systems this is a tiny amount of energy, though. When I tune my FM radio to a signal, it has no significant effect on the signal strength available to my next-door neighbor.

    I'm not sure what you are talking about here - I thought you were discussing a receive antenna. What transmitter are you referring to?
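To put that "tiny amount" in perspective, here is a rough back-of-envelope sketch (my own illustrative numbers, not figures from the thread): the fraction of a transmitter's radiated power that one receive antenna captures is roughly its effective aperture divided by the area of the sphere the wave has spread over by the time it arrives.

```python
import math

def capture_fraction(freq_hz, gain, distance_m):
    """Fraction of isotropically radiated power captured by a receive
    antenna of the given gain at the given distance (free space)."""
    wavelength = 299_792_458.0 / freq_hz
    # Effective aperture of the receive antenna: A_e = G * lambda^2 / (4*pi)
    aperture = gain * wavelength**2 / (4 * math.pi)
    # The radiated power has spread over a sphere of radius `distance_m`
    sphere_area = 4 * math.pi * distance_m**2
    return aperture / sphere_area

# Illustrative example: a half-wave dipole (gain ~1.64) tuned to a
# 100 MHz FM station, 10 km from the transmitter
frac = capture_fraction(100e6, 1.64, 10_000)
print(f"fraction of radiated power captured: {frac:.2e}")
```

With these assumed numbers the captured fraction comes out on the order of one part in a billion, which is why one listener has no measurable effect on the neighbors.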

  4. Nov 28, 2016 #3


    Gold Member

    The energy has already been emitted so it is irrelevant to the transmitter if you collect that energy or it just continues on by. This will not be true if you are close enough for near field effects to come into play. Then you are affecting the characteristics of the transmitting antenna.

  5. Nov 29, 2016 #4
    So suppose I have a satellite transmitter whose power needs to cover only an area set by the inverse-square (1/R^2) dispersion loss. If I increase the number of receiving antennas, does my transmitting power need to be increased, since each antenna takes energy from the signal? In another scenario, suppose I have a closed room and a transmitter with a fixed power. If the number of antennas in the room increases, does the power received by each antenna decrease?
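For the inverse-R^2 part of the question, a quick numeric sketch (illustrative values of mine, not from the thread) using the standard free-space (Friis) formula shows both the 1/R^2 falloff and why even many receive antennas absorb a negligible fraction of the transmitted power.

```python
import math

def received_power_w(p_tx_w, freq_hz, distance_m, g_tx=1.0, g_rx=1.0):
    """Free-space (Friis) received power:
    P_r = P_t * G_t * G_r * (lambda / (4*pi*R))^2."""
    wavelength = 299_792_458.0 / freq_hz
    return p_tx_w * g_tx * g_rx * (wavelength / (4 * math.pi * distance_m))**2

p_10km = received_power_w(100.0, 100e6, 10_000)  # 100 W transmitter, 10 km away
p_20km = received_power_w(100.0, 100e6, 20_000)  # doubling the distance...
print(p_10km / p_20km)  # ...quarters the received power (inverse-square law)

# Even 1000 such antennas together absorb a tiny fraction of the 100 W radiated:
print(1000 * p_10km / 100.0)
```

With these assumed numbers, a thousand antennas at 10 km still capture well under a millionth of the radiated power, so the transmitter does not need to be turned up to serve them.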
  6. Nov 29, 2016 #5


    Science Advisor, Gold Member, 2017 Award

    Not in a measurable way (well, I would be surprised if it could be) ... there is still much more energy bypassing all those antennas (not being captured) than is captured.
    Now if you have antennas in a line, that is different, as the rear antenna will be in the shadow of the front antenna (the one closer to the transmitter) and it will of course receive less signal.

    Read JasonRF's post again, and his first comment ... that is your answer.
  7. Nov 29, 2016 #6


    Science Advisor, Gold Member, 2017 Award

    That general principle certainly applies in most cases. There is an interesting exception to it, though. Medium Frequency (vertically polarised, of course) ground waves propagate along the ground and follow the curve of the Earth. The energy does not tend to carry on out along a tangent; instead the wave front has a forward tilt, due to resistive losses in the Earth, so energy gets directed downwards, out of the region just above the surface. Cities, with lots of steel structures (and receiving antennae), have a definite shadow downstream from a transmitter, but the signal strength rises back up after a few km to what you'd expect if the city were not there.
    This forward tilt of a VP wave can be exploited by long-wire antennae, supported on a series of short poles, run radially in the direction of travel of the wave. Google "Beverage antenna": an excellent, cheap and cheerful directional MF antenna (OK if you happen to have a few hundred metres of land available).