# Intensity and frequency of radio waves

Hi, I am not a physicist, and I have been trying to understand some basic concepts about electromagnetic waves in the context of telecommunications.

Now, this is what I know so far: the energy of an electromagnetic wave is proportional to its frequency (E = h*f), and this is basically the energy carried by a single photon.

The intensity of an electromagnetic wave is the amount of power transferred per unit area, and can be described as the number of photons (per unit area, per unit time) multiplied by the energy per photon.

If these assumptions are correct, then the intensity could be increased in two ways: by increasing the frequency (energy per photon) or by increasing the voltage used to generate the wave (number of photons). Either of these would transfer more energy and therefore yield a higher intensity.
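The two knobs described above can be sketched numerically. This is only an illustration of the poster's model, intensity = photon flux × energy per photon; the flux and frequency values are made-up assumptions:

```python
# Sketch of the intensity model in the question (illustrative, not a
# real transmitter): intensity = photon flux * energy per photon.
PLANCK = 6.62607015e-34  # Planck constant h, in J*s

def photon_energy(frequency_hz):
    """Energy of one photon, E = h*f, in joules."""
    return PLANCK * frequency_hz

def intensity(photon_flux, frequency_hz):
    """Power per unit area (W/m^2) = photons/(m^2*s) * energy per photon."""
    return photon_flux * photon_energy(frequency_hz)

base = intensity(1e20, 100e6)  # assumed 1e20 photons/(m^2*s) at 100 MHz
double_flux = intensity(2e20, 100e6)  # more photons (e.g. more drive power)
double_freq = intensity(1e20, 200e6)  # more energetic photons
```

Doubling either the flux or the frequency doubles the intensity in this model, which is the symmetry the question is asking about.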

Is this reasoning correct?

jtbell
Mentor
> Now, this is what I know so far: the energy of an electromagnetic wave is proportional to its frequency (E = h*f), and this is basically the energy carried by a single photon.
I would say, rather, that the energy of a single quantum (photon) of an electromagnetic wave is proportional to its frequency via E = hf.

Usually, an electromagnetic wave "contains" many many many [...] many many photons, so its (total) energy is some very very very [...] very very large multiple of hf.
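To get a feel for how large that multiple is, here is a back-of-envelope calculation; the 1 W transmitter power and 100 MHz frequency are illustrative assumptions, not figures from the thread:

```python
# Back-of-envelope: photons emitted per second by a radio transmitter.
# The 1 W and 100 MHz values are assumed for illustration only.
PLANCK = 6.62607015e-34   # Planck constant h, in J*s

power_w = 1.0             # assumed transmitter power in watts (J/s)
frequency_hz = 100e6      # assumed 100 MHz (FM broadcast band)

energy_per_photon = PLANCK * frequency_hz         # ~6.6e-26 J per photon
photons_per_second = power_w / energy_per_photon  # on the order of 1e25

print(f"{photons_per_second:.2e} photons per second")
```

Roughly 10^25 photons every second even for a modest 1 W source, which is why the classical wave picture works so well at radio frequencies.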

So is it logical to think that, by increasing the frequency, the intensity at a distance of A meters would be greater than with a lower frequency?

sophiecentaur