Energy of an electromagnetic wave

  1. Mar 27, 2013 #1
    So I've seen this equation used to calculate the energy associated with an electromagnetic wave:

    E = hf

    E is energy, h is a constant, and f is frequency. So the energy is related to frequency alone. However, why wouldn't the amplitude of the wave have an effect on the energy? I don't get it. If an extremely intense radio wave requires much more energy to generate than, say, the waves from your cell phone, why doesn't this equation show the effect of amplitude?
  3. Mar 27, 2013 #2
    Because when you take amplitude into account, you are effectively adding up quanta of E = hf. Here E is the energy of one quantum of the EM wave, i.e. one photon. A higher amplitude means more photons, and their energies add up.
  4. Mar 27, 2013 #3
    Do you know exactly how they add up?
  5. Mar 27, 2013 #4


    Staff: Mentor

    They add linearly. Ten times as many photons is ten times as much energy. Is that what you were asking?
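    To make "they add linearly" concrete, here is a minimal Python sketch; the 1 GHz frequency is an arbitrary example value, not something from the thread:

```python
# Photon energy is E = h*f; the energy of N photons adds linearly: E_total = N*h*f.
h = 6.62607015e-34   # Planck's constant, J*s (exact SI value)
f = 1e9              # example frequency: a 1 GHz radio wave, in Hz

E_photon = h * f         # energy of a single photon, in joules
E_ten = 10 * E_photon    # ten photons carry exactly ten times the energy

print(E_photon)          # energy per photon
print(E_ten / E_photon)  # ratio of total to single-photon energy: 10.0
```

    So a more intense wave at the same frequency just means a larger photon count N, with total energy N*h*f.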
  6. Mar 27, 2013 #5
    Yes, that was what I was asking, in a way. If you had an amplitude that's 10x that of one photon, would you add ten photons? I'm a noob.
  7. Apr 9, 2013 #6
    The energy in an electromagnetic field is proportional to the square of the field strength (the amplitude). So for coherent light, doubling the amplitude takes four times the energy, and therefore four times the number of photons, all else being equal. The rules for incoherent light are different: there we work with time-averaged values, and the intensities of independent sources simply add, so combining two equal incoherent sources takes just double the energy and double the photons.
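    The amplitude-squared scaling for the coherent case can be sketched in a few lines of Python; the proportionality constant, frequency, and amplitude below are arbitrary placeholder values chosen only to illustrate the ratios:

```python
# For a coherent wave, total energy scales as the square of the amplitude:
# E_total = k * A**2, with k an (arbitrary, setup-dependent) constant here.
h = 6.62607015e-34   # Planck's constant, J*s
f = 5e14             # example optical frequency, Hz
k = 1e-20            # placeholder proportionality constant, J per (amplitude unit)^2

A = 2.0              # example field amplitude, arbitrary units
E1 = k * A ** 2           # energy at amplitude A
E2 = k * (2 * A) ** 2     # doubling the amplitude gives four times the energy

N1 = E1 / (h * f)    # photon count at amplitude A
N2 = E2 / (h * f)    # four times as many photons at amplitude 2A

print(E2 / E1)       # 4.0
print(N2 / N1)       # 4.0
```

    The same frequency f appears in both counts, so the per-photon energy hf never changes; only the number of photons tracks the amplitude.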