- #1
mariano54
- 21
- 0
Hi, I am not a physicist and have been trying to understand some basic concepts about electromagnetic waves in the context of telecommunications.
Now, this is what I know so far: the energy of an electromagnetic wave is proportional to its frequency (E = h*f), and this is basically the energy carried by a single photon.
The intensity of an electromagnetic wave is the amount of power transferred per unit area, and can be described as the number of photons times the energy per photon.
If these assumptions are correct, then the intensity could be increased in two ways: by increasing the frequency (energy per photon), or by increasing the voltage used to generate the wave (number of photons). Either of these would yield more energy transferred and therefore higher intensity.
Is this reasoning correct?
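To sanity-check my understanding with concrete numbers, here is a quick Python sketch of the relation intensity = photon flux × energy per photon. The example values (a 2.4 GHz carrier and an intensity of 1 W/m²) are just numbers I picked, not from any reference:

```python
# Sanity check: intensity = (photons per second per area) * (energy per photon).
# Example values (2.4 GHz, 1 W/m^2) are arbitrary illustrative choices.

h = 6.626e-34  # Planck's constant in J*s

def photon_energy(frequency_hz):
    """Energy of a single photon, E = h * f, in joules."""
    return h * frequency_hz

def photon_flux(intensity_w_per_m2, frequency_hz):
    """Photons per second per square metre needed to carry a given intensity."""
    return intensity_w_per_m2 / photon_energy(frequency_hz)

e = photon_energy(2.4e9)      # energy of one 2.4 GHz photon (~1.6e-24 J)
n = photon_flux(1.0, 2.4e9)   # photon flux for 1 W/m^2 at 2.4 GHz
print(e, n)
```

The huge photon flux that comes out is why, at radio frequencies, the wave picture works fine and individual photons are irrelevant in practice.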