
Electromagnetic wave attenuation

  1. Oct 8, 2009 #1
    Hi,
    I am just curious; do EM waves attenuate in a vacuum? If yes, how does this happen? Also, how do they faint through a medium?
     
  3. Oct 9, 2009 #2
    EM waves do not attenuate at all in a vacuum for a simple reason - conservation of momentum and energy. EM waves carry energy and if they attenuated this energy would have to be transferred to something else, but in a vacuum there is nothing else!

    I assume by "faint" you mean dissipate. EM waves interact with the charged particles in the medium (usually electrons and protons), exchanging momentum and energy. The majority of the energy goes into random motion of the particles in the medium; that is, it is dissipated as heat.
     
  4. Oct 9, 2009 #3
    But what about free-space path loss (http://en.wikipedia.org/wiki/Free-space_path_loss)? Isn't the spreading out of an E/M wave considered to be a type of attenuation?
     
  5. Oct 10, 2009 #4
    The propagation constant is complex: [tex]\gamma = \alpha + i\beta[/tex], where [tex]\alpha[/tex] is the attenuation constant (real part) and [tex]\beta[/tex] is the phase constant (imaginary part). For a vacuum, the attenuation constant is zero.

    Power loss due to spreading is not considered attenuation, since conservation of energy still holds for a given solid angle.
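    As a quick numerical sketch of how the attenuation constant acts on a plane wave (the values of [tex]\alpha[/tex] and the path length here are purely illustrative, not from the thread):

    ```python
    import math

    # Plane-wave field amplitude decays as exp(-alpha * z);
    # power therefore decays as exp(-2 * alpha * z).
    alpha = 0.1   # attenuation constant in nepers per metre (illustrative)
    z = 10.0      # path length in metres (illustrative)

    field_ratio = math.exp(-alpha * z)       # remaining fraction of field amplitude
    power_ratio = math.exp(-2 * alpha * z)   # remaining fraction of power
    loss_db = -10 * math.log10(power_ratio)  # absorptive loss expressed in dB

    print(field_ratio)  # field falls to about 37% of its initial value
    print(loss_db)      # about 8.7 dB of absorptive loss

    # In a vacuum alpha = 0, so nothing is lost along the path.
    vacuum_ratio = math.exp(-0 * z)
    print(vacuum_ratio)  # 1.0
    ```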
     
  6. Oct 11, 2009 #5
    So that makes the equation [tex]\text{Propagation Constant} = i \times \text{Phase Constant}[/tex]. Does this explain a point's energy loss as the E/M wave spreads out?
     
  7. Oct 11, 2009 #6
    No ... that equation just ensures that the wave propagates without attenuation.

    A sphere has a surface area of [tex]4\pi r^2[/tex]. As the wave spreads, r increases, so the surface area increases. The initial point-source energy is now spread over a larger area.
     
  8. Oct 11, 2009 #7
    Ok, got it now. But, looking along a single path, as the wave propagates it does lose energy, doesn't it?
     
  9. Oct 12, 2009 #8

    Born2bwire (Science Advisor, Gold Member)

    Not in a lossless medium. In a lossless medium, the wave's energy gets spread out as the wavefront expands out in space, which is the free space loss factor. Only in a lossy medium will the wave actually be attenuated as it propagates.
     
  10. Oct 12, 2009 #9
    Yes, I can understand that. However, what happens when we measure the energy at points lying on a line through the E/M wave's source? Won't we see a reduction in energy as we move further away from the source?
     
  11. Oct 12, 2009 #10
    Hello petermer-
    EM waves do not "lose energy" in a vacuum, but they do disperse, due to the initial beam divergence. Consider a laser beam with a divergence of 1 minute of arc trying to illuminate a spot on the moon. By the time it gets there, the laser beam is 70 miles wide. It is also hard to communicate with spacecraft leaving the solar system, because the radio communication beam has diverged significantly and the power density (watts per square meter) is extremely low.
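    Bob S's laser figure checks out with small-angle geometry (taking the Earth-Moon distance as roughly 384,000 km):

    ```python
    import math

    distance_m = 3.84e8                        # Earth-Moon distance, ~384,000 km
    divergence_rad = (1 / 60) * math.pi / 180  # 1 arcminute converted to radians

    # Small-angle approximation: spot width ~ distance * divergence angle.
    spot_width_m = distance_m * divergence_rad
    spot_width_miles = spot_width_m / 1609.34

    print(spot_width_miles)  # roughly 70 miles, as stated above
    ```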
    Bob S
     
  12. Oct 12, 2009 #11

    sophiecentaur (Science Advisor, Gold Member)

    It's interesting that losses due to absorption grow exponentially with path distance, whilst 'spreading losses' (inverse square) are less severe for long paths.

    I.e. the absorptive loss in dB is proportional to the path length (a fixed number of dB per metre), but the spreading loss only increases by 6 dB every time you double the distance.

    The loss through cable or even optical fibre will always, eventually, be worse than the loss through 'free space' as distances get bigger and bigger. Very lucky for space exploration.
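    sophiecentaur's point, that linear-in-dB absorptive loss eventually overtakes logarithmic spreading loss, can be illustrated numerically (0.2 dB/km is in the right range for good optical fibre, but treat both figures as illustrative):

    ```python
    import math

    def absorptive_loss_db(db_per_km, distance_km):
        """Loss through an absorbing medium: grows linearly in dB with distance."""
        return db_per_km * distance_km

    def spreading_loss_db(distance_km, reference_km=1.0):
        """Inverse-square spreading: +6 dB each time the distance doubles."""
        return 20 * math.log10(distance_km / reference_km)

    fibre_1000 = absorptive_loss_db(0.2, 1000)  # 200 dB
    space_1000 = spreading_loss_db(1000)        # 60 dB

    print(fibre_1000, space_1000)

    # Doubling the distance adds ~6 dB of spreading loss ...
    per_doubling = spreading_loss_db(2.0) - spreading_loss_db(1.0)
    print(per_doubling)  # ~6.02 dB
    # ... but a fixed 0.2 dB for every extra kilometre of fibre, so the fibre
    # always loses at long range.
    ```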
     
  13. Oct 12, 2009 #12

    Born2bwire (Science Advisor, Gold Member)

    Any finite source will eventually succumb to space loss, so depending on how you plot your line of measurements, you will measure successively decreasing energy levels.
     
  14. Oct 13, 2009 #13
    Ok, you all covered my question, thanks. I've got another relevant question though: do parabolic antennae (in vacuum) have zero spreading, and hence zero energy loss along a path through the source point?
     
  15. Oct 13, 2009 #14

    sophiecentaur (Science Advisor, Gold Member)

    No. You may produce a nominally parallel beam by putting the feed at the focus of the dish but it must diverge due to diffraction. In the end, the inverse square law will always kick in because the source will behave like a point when you are far enough away.
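    The diffraction-limited divergence described here can be estimated from the usual circular-aperture limit, half-angle ~ 1.22 λ/D (the frequency and dish size below are illustrative assumptions, not from the thread):

    ```python
    import math

    wavelength = 0.03    # ~10 GHz microwave link, lambda ~3 cm (illustrative)
    dish_diameter = 3.0  # dish aperture in metres (illustrative)

    # Diffraction limit for a circular aperture: half-angle ~ 1.22 * lambda / D.
    theta = 1.22 * wavelength / dish_diameter  # radians

    # Even this nominally "parallel" beam spreads: width after 100 km of vacuum.
    distance = 1e5  # metres
    beam_width = dish_diameter + 2 * theta * distance

    print(theta)       # ~0.0122 rad
    print(beam_width)  # ~2.4 km wide after 100 km
    ```

    A bigger dish or shorter wavelength narrows the beam, but for any finite aperture theta stays nonzero, which is why the inverse-square law always wins in the end.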
     
  16. Oct 14, 2009 #15
    But how can the inverse square law hold when talking about a line?
     
  17. Oct 14, 2009 #16

    sophiecentaur (Science Advisor, Gold Member)

    What I mean is that the 'parallel' beam will, in fact, end up diverging as if from a point some (possibly large) distance behind the actual source. The inverse-square spreading will act as if the source were at this point. At a large enough receiving distance, the difference between the actual distance and the 'virtual' distance becomes negligible.
     
  18. Oct 14, 2009 #17

    Born2bwire (Science Advisor, Gold Member)

    Any finite source will "look" like a point source from far enough away. Eventually, no matter how highly directional the original beam was, it will always be hit with space loss factor. No finite source can give you a perfectly focused beam of radiation.
     
  19. Oct 15, 2009 #18

    sophiecentaur (Science Advisor, Gold Member)

    It is not necessary for the radiation to be spreading out in a sphere - just a cone will do for the inverse square law to apply. After all, parts of a spherical surface don't 'know' what the others are doing.
     