
Do electromagnetic waves lose energy as they increase their distance?

  1. May 31, 2010 #1
    If electromagnetic waves lose energy as they increase their distance, is there an
    equation that reflects the loss of energy?
    Last edited: Jun 1, 2010
  3. Jun 1, 2010 #2



    The wave itself does not lose energy, but since it necessarily spreads out, this will result in a smaller energy/area as the distance from the source increases.
  4. Jun 1, 2010 #3
    Would converting the electromagnetic wave into a beam of light conserve the wave's energy, since it is no longer spreading out?
  5. Jun 1, 2010 #4
    It's true that as the wave spreads out, the energy density decreases.
    So the further the receiver is from the source, the lower the available energy density is.

    In a pure vacuum the energy density decreases inversely as the square of the distance from the source. This is known as the inverse square law. But none is actually lost.
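    The inverse square law above can be sketched in a few lines of Python. This is a minimal illustration assuming an isotropic point source in vacuum; the function name and parameters are just for demonstration:

    ```python
    import math

    def intensity(power_watts, distance_m):
        # The total radiated power spreads over a sphere of area 4*pi*r^2,
        # so intensity (power per unit area) falls off as 1/r^2.
        # No energy is lost; it is simply spread over a larger area.
        return power_watts / (4 * math.pi * distance_m ** 2)

    # Doubling the distance quarters the intensity:
    s1 = intensity(100.0, 1.0)  # W/m^2 at 1 m from a 100 W source
    s2 = intensity(100.0, 2.0)  # W/m^2 at 2 m
    print(s1 / s2)  # ratio is 4
    ```

    Note that integrating the intensity over the whole sphere at any radius returns the full 100 W, which is the sense in which "none is actually lost."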

    But it also depends upon the medium the EM wave is travelling through and on the frequency of the EM wave itself. Some frequencies will interact with some transmission mediums and be attenuated or changed to other frequencies.

    There is no single simple equation applicable to all EM frequencies. In practice we use an attenuation coefficient valid over a small range of frequencies; it enters as a ratio (an exponential decay with path length), not a linear relationship.
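    For attenuation in a medium, a common model is the Beer–Lambert law, where intensity decays exponentially with path length. A minimal sketch (the coefficient value and names here are illustrative, not for any particular medium):

    ```python
    import math

    def attenuated_intensity(i0, alpha_per_m, path_m):
        # Beer-Lambert law: I(x) = I0 * exp(-alpha * x),
        # where alpha is the attenuation coefficient of the medium
        # at the wave's frequency. Unlike inverse-square spreading,
        # this energy really is absorbed or scattered by the medium.
        return i0 * math.exp(-alpha_per_m * path_m)

    # With alpha = ln(2) per metre, intensity halves every metre:
    print(attenuated_intensity(8.0, math.log(2), 1.0))  # about 4.0
    ```

    In a real link budget both effects multiply: the inverse-square spreading loss and the medium's exponential attenuation.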