Do electromagnetic waves lose energy as they increase their distance?

1. May 31, 2010

mapa

If electromagnetic waves lose energy as they increase their distance, is there an
equation that reflects the loss of energy?

Last edited: Jun 1, 2010
2. Jun 1, 2010

Nabeshin

The wave itself does not lose energy, but since it necessarily spreads out, this will result in a smaller energy/area as the distance from the source increases.

3. Jun 1, 2010

mapa

Would converting the electromagnetic wave to light conserve the wave's energy, since it would no longer be spreading out?

4. Jun 1, 2010

Studiot

It's true that as the wave spreads out the energy density decreases.
So the further the receiver is from the source the lower the available energy density is.

In a pure vacuum the energy density decreases inversely as the square of the distance from the source. This is known as the inverse square law. But none is actually lost.
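The inverse square law above can be sketched in a few lines. This is a minimal illustration assuming an isotropic point source, whose radiated power spreads over a sphere of area 4πr²; the function name `intensity` is just for this example:

```python
import math

def intensity(power_watts, distance_m):
    """Intensity (W/m^2) of an isotropic point source in vacuum:
    the radiated power spreads over a sphere of area 4*pi*r^2."""
    return power_watts / (4 * math.pi * distance_m ** 2)

# Doubling the distance quarters the intensity; the total power
# through any enclosing sphere stays the same -- none is lost.
i1 = intensity(100.0, 1.0)
i2 = intensity(100.0, 2.0)
print(i1 / i2)  # approximately 4.0
```

Note the total energy is conserved: only the energy per unit area falls off with distance.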

But it also depends upon the medium the EM wave is travelling through and on the frequency of the EM wave itself. Some frequencies will interact with some transmission mediums and be attenuated or changed to other frequencies.

There is no single simple equation that applies to all EM frequencies. We often use an attenuation coefficient valid over a small range of frequencies. This appears as a ratio (a multiplicative factor per unit length), not a linear relationship.