Electromagnetic waves: Antenna length

AI Thread Summary
Antenna length correlates with the wavelength of the electromagnetic waves an antenna receives, with an effective antenna typically being about a quarter of a wavelength long. TV and radio antennas are similar in length because their frequency bands are close together, while older cell phone antennas are shorter because they operate at higher frequencies, around 1 GHz. The amplitude of the waves does not directly influence antenna size; antenna gain and efficiency are the more relevant factors. Modern cell phones use miniaturized antennas that can be hidden inside the device without compromising functionality. Understanding these principles is essential for teaching electromagnetic radiation effectively.
ap_cycles
Hi,

I came across these two questions on a website:

1. Why are the antennas on TVs and radios about the same length?
2. Why is the antenna on a cellphone (i.e. the older models) shorter than a radio antenna?

My answers to these two questions are: Firstly, the waves that the cellphone detects have a shorter wavelength than those detected by the radio or TV. If I am not wrong, the length of an antenna is about one quarter of the wavelength of the wave it receives.

Secondly, the amplitude of the waves. One could argue that the longer the antenna, the larger the amplitude of the wave it receives, and that the maximum amplitude (i.e. maximum energy) the cellphone receives is smaller than that received by the TV or radio?

What do forum members think of my answers? I am quite confident of my first answer, though not so sure about the second. By the way, I am a high school physics teacher about to deliver this topic on EM radiation soon.

(Off topic: do most modern cellphones have an antenna anyway? If the answer is yes, won't hiding them within the chassis of the phone defeat the original purpose of a protruding antenna?)
 
You are correct that a good antenna will be about a quarter wavelength long for a monopole (an antenna above a ground plane), or two quarter-wavelength elements for a dipole (two quarter-wave conductors opposed to each other). But for the TV versus radio point, it depends on which radio band you are asking about. The US FM band is indeed close in wavelength to the US VHF TV bands, so antennas for those are similar in size. The US AM band is at a much lower frequency, around 1 MHz, so the transmitting towers are large, and the receiving antennas are ferrite rods wrapped with coils. Those AM receiving antennas are not very efficient, but that's okay because AM transmitters blast their signals at high power levels.

For cell phones, yes, their frequency bands are near 1 GHz (as opposed to the roughly 100 MHz range for FM and VHF TV), so their antennas will be shorter. And in the smaller/newer cell phones, they are using miniature ferrite antennas. There are also some cool new advances with high-dielectric-constant antennas for miniaturized applications -- great stuff.
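A quick numerical sketch of the quarter-wave rule (the frequencies below are rough, representative values, not exact band edges):

Code:
# Quarter-wave antenna length for a few representative bands.
# The band frequencies used here are illustrative values only.
C = 3.0e8  # speed of light, m/s

bands_hz = {
    "AM broadcast (~1 MHz)": 1.0e6,
    "FM broadcast (~100 MHz)": 100.0e6,
    "VHF TV (~200 MHz)": 200.0e6,
    "Older cell phone (~900 MHz)": 900.0e6,
}

for name, f in bands_hz.items():
    wavelength = C / f          # lambda = c / f
    quarter = wavelength / 4.0  # quarter-wave monopole length
    print(f"{name}: lambda = {wavelength:7.2f} m, lambda/4 = {quarter:6.3f} m")

This gives roughly 75 m for an AM quarter-wave element (hence tall transmitting towers and ferrite loop receivers instead of whips), about 0.4-0.75 m for FM and VHF TV whips, and only about 8 cm at 900 MHz, which is why a cell phone antenna can be so short.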

You can probably learn a lot by just reading through the antenna page at wikipedia.org. There are lots of links out of that page that are also good.
 
The amplitude of the wave has no bearing on the size of the antenna. The amplitude of the electromagnetic wave is NOT a disturbance in physical space. The only factor we might consider related to amplitude is the gain of the antenna. That is, we may wish to design a more efficient antenna (which usually requires a larger antenna) to receive a signal that is weak compared with the various noise sources.
 
The minimum desired amplitude of a received signal is roughly 10 times kTB, where k is Boltzmann's constant, T is the temperature in kelvin, and B is the receiver bandwidth in Hz. At room temperature, kTB is roughly -114 dBm per MHz of receiver bandwidth. The noise figure of the receiver should be less than 3 dB (i.e. the noise at the receiver input should be less than 3 dB above kTB). The length of the antenna relative to a quarter wavelength is an important factor in the antenna's sensitivity.

Bob S

Note:
dBm = dB referenced to 1 milliwatt
0 dBm = 1 mW
10 dBm = 10 mW
20 dBm = 100 mW
etc.
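To make the kTB figure concrete, here is a small sketch (assuming the usual 290 K reference temperature behind the -114 dBm per MHz number):

Code:
import math

# Thermal noise floor kTB, expressed in dBm, for a 1 MHz receiver bandwidth.
K_BOLTZMANN = 1.380649e-23  # J/K
T = 290.0                   # reference temperature, K
B = 1.0e6                   # bandwidth, Hz

noise_watts = K_BOLTZMANN * T * B
noise_dbm = 10.0 * math.log10(noise_watts / 1e-3)  # dB relative to 1 mW
print(f"kTB over 1 MHz: {noise_dbm:.1f} dBm")      # about -114 dBm

# "Roughly 10 times kTB" corresponds to a +10 dB margin above this floor.
print(f"Rough minimum desired signal: {noise_dbm + 10.0:.1f} dBm")  # about -104 dBm

This reproduces the -114 dBm per MHz figure quoted above and shows what the factor-of-ten margin means in dBm terms.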
 