Intensity and frequency of radio waves

  • Thread starter mariano54
  • #1
Hi, I am not a physicist and have been trying to understand some basic concepts about electromagnetic waves in the context of telecommunications.

Now, this is what I know so far: the energy of an electromagnetic wave is proportional to its frequency (E = h*f), and this is basically the energy carried by a single photon.

The intensity of an electromagnetic wave is the power transferred per unit area, and can be described as the number of photons (per second, per unit area) * the energy per photon.

If these assumptions are correct, then the intensity could be increased in two ways: by increasing the frequency (more energy per photon) or by increasing the voltage used to generate the wave (more photons). Either of these would yield more energy transferred and therefore a higher intensity.

Is this reasoning correct?
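To make the reasoning concrete, here is a minimal sketch of the relationship I have in mind, assuming intensity is just photon flux times energy per photon (the flux and frequency values are made up for illustration):

```python
# Sketch: intensity (W/m^2) = photon flux (photons/s/m^2) * energy per photon (J),
# with energy per photon = h * f (Planck's relation).

PLANCK_H = 6.626e-34  # Planck constant, J*s

def intensity(photon_flux_per_m2, frequency_hz):
    """Intensity in W/m^2 from photon flux (photons/s/m^2) and frequency (Hz)."""
    return photon_flux_per_m2 * PLANCK_H * frequency_hz

# Same (made-up) photon flux at two different frequencies:
flux = 1e20           # photons per second per square metre
f_low = 100e6         # 100 MHz
f_high = 2.4e9        # 2.4 GHz

print(intensity(flux, f_low))   # lower intensity
print(intensity(flux, f_high))  # higher intensity for the same photon flux
```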
 

Answers and Replies

  • #2
jtbell
Mentor
Now, this is what I know so far: the energy of an electromagnetic wave is proportional to its frequency (E = h*f), and this is basically the energy carried by a single photon.
I would say, rather, that the energy of a single quantum (photon) of an electromagnetic wave is proportional to its frequency via E = hf.

Usually, an electromagnetic wave "contains" many many many [...] many many photons, so its (total) energy is some very very very [...] very very large multiple of hf.
 
  • #3
mariano54
So is it logical to think that by increasing the frequency, the intensity at a distance of A meters would be greater than with a lower frequency?
 
  • #4
sophiecentaur
Science Advisor
Gold Member
So is it logical to think that by increasing the frequency, the intensity at a distance of A meters would be greater than with a lower frequency?
Yes - if you made sure that your transmitter was radiating the same number of photons per second. That is an unlikely scenario, because transmitters tend to operate on the basis of volts and amps - i.e. watts - in which case, for a given radiated power, the number of photons would be inversely proportional to the frequency of the transmission.
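A quick numerical sketch of that inverse relationship, assuming an idealised transmitter that radiates all of its power at a single frequency (the power and frequencies below are arbitrary example values):

```python
# For a fixed radiated power P, the photon emission rate is N = P / (h * f),
# so doubling the frequency halves the number of photons per second.

PLANCK_H = 6.626e-34  # Planck constant, J*s

def photons_per_second(power_watts, frequency_hz):
    """Photon emission rate for a given radiated power and frequency."""
    return power_watts / (PLANCK_H * frequency_hz)

P = 10.0  # watts of radiated power (example value)
print(photons_per_second(P, 100e6))  # ~1.5e26 photons/s at 100 MHz
print(photons_per_second(P, 200e6))  # half as many at 200 MHz
```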
 
  • #5
It is not difficult to calculate the number of photons that must be emitted per second to deliver a particular power.
If a light bulb emits 100 W (100 J/s) of yellow light, then by using E = hf (with f = the frequency of yellow light) you can calculate the number of yellow photons emitted per second.
If it is 100 W of gamma radiation, you will get a vastly different answer. You can usually detect individual photons of gamma radiation (clicks on a Geiger counter), but you cannot easily detect individual yellow photons.
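Here is the same comparison worked through numerically. The frequencies are representative choices (roughly 580 nm yellow light and a roughly 1 MeV gamma photon), not exact values from the post above:

```python
# Photon emission rate N = P / (h * f) for 100 W of yellow light vs. gamma radiation.

PLANCK_H = 6.626e-34   # Planck constant, J*s
P = 100.0              # emitted power, watts

f_yellow = 5.2e14      # ~580 nm yellow light, Hz
f_gamma = 2.4e20       # ~1 MeV gamma photon, Hz

n_yellow = P / (PLANCK_H * f_yellow)
n_gamma = P / (PLANCK_H * f_gamma)

print(f"yellow photons per second: {n_yellow:.2e}")  # ~2.9e20
print(f"gamma photons per second:  {n_gamma:.2e}")   # ~6.3e14
```

The gamma source emits roughly a million times fewer photons per second for the same power, which is why individual gamma photons are so much easier to detect.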
 
