I expect that others have already asked and answered this question, but I could not find it with Google searches. My reasoning about this apparent antenna reciprocity violation is below.
Since antenna reciprocity states that an antenna has the same characteristics whether used as a transmit antenna or as a receive antenna, it seems to follow that if an antenna with 28 dB gain (very focused) transmits a radio signal to an identical 28 dB gain antenna used as a receive antenna, separated by 10 wavelengths, the received signal would be larger than the original signal by 14 dB.
Obviously, if this could happen we would be getting energy for free. The math is very simple: two antennas separated by 10 wavelengths in free space attenuate the signal by only about 42 dB (any off-the-shelf online calculator can verify this, so I won't show the math unless requested), yet the combined gain of the TX and RX antennas is 56 dB (28 + 28). Thus 56 dB of gain minus 42 dB of loss gives a net result of 14 dB of free energy, if this were actually achievable.
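To make the arithmetic explicit, here is a short Python sketch of the link budget I have in mind, using the standard Friis free-space path loss formula FSPL(dB) = 20·log10(4πd/λ) and the gain and distance values quoted above:

```python
import math

# Friis free-space path loss between isotropic antennas:
#   FSPL(dB) = 20 * log10(4 * pi * d / wavelength)
# Separation is given in wavelengths, so d / wavelength = 10.

d_over_lambda = 10   # separation: 10 wavelengths
g_tx_db = 28.0       # transmit antenna gain (dB)
g_rx_db = 28.0       # receive antenna gain (dB)

fspl_db = 20 * math.log10(4 * math.pi * d_over_lambda)
net_db = g_tx_db + g_rx_db - fspl_db

print(f"FSPL at 10 wavelengths: {fspl_db:.1f} dB")  # ~42.0 dB
print(f"Net link budget:       {net_db:+.1f} dB")   # ~+14.0 dB
```

Running this gives roughly 42.0 dB of path loss and a net of +14.0 dB, matching the numbers above.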
Note that I chose the value of 10 wavelengths to keep the receive antenna in the far field. Unless I am missing something, the most probable violation is in the antenna reciprocity assumption, and obviously not in the conservation of energy assumption. Any thoughts to help me see what I am doing wrong with this scenario would be appreciated.