So I've seen this equation used to calculate the energy associated with an electromagnetic wave: E = hf, where E is energy, h is Planck's constant, and f is frequency. So the energy depends on frequency alone. But why wouldn't the amplitude of that wave have an effect on the energy? I don't get it. If you have an extremely intense radio wave that takes much more energy to generate than, say, your cell phone's signal, why doesn't this equation show the effect of amplitude?
Because amplitude comes in through the number of quanta. E = hf is the energy of a single photon, i.e. one quantum of the EM field. A wave with higher amplitude carries more photons, and their energies add up.
They add linearly. Ten times as many photons is ten times as much energy. Is that what you were asking?
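To make "they add linearly" concrete, here is a small sketch: it computes the single-photon energy hf and then how many photons make up a given total field energy. The frequency and the 1 pJ beam energy are illustrative numbers I've assumed, not figures from the thread.

```python
# Photon energy E = h*f, and how many photons add up to a given total energy.
h = 6.62607015e-34   # Planck constant, J*s
f = 100e6            # assumed example: a 100 MHz radio wave, in Hz

E_photon = h * f             # energy of one photon (one EM quantum), in J
beam_energy = 1e-12          # assumed total field energy: 1 picojoule
n_photons = beam_energy / E_photon   # photon energies add linearly

print(E_photon)   # ~6.6e-26 J per photon
print(n_photons)  # ~1.5e13 photons
```

Ten times the total energy at the same frequency is simply ten times `n_photons`, which is the linear addition described above.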
Yes, that was what I was asking... in a way. If you had an amplitude 10x the magnitude of one photon, would you add ten photons? I'm a noob.
The energy in an electromagnetic field is proportional to the square of the field strength (amplitude). So for coherent light, doubling the amplitude takes four times the energy and four times the number of photons, all else being equal. The rules for incoherent light are different: doubling the mean amplitude then takes just double the energy, and we use time-averaged values.
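The coherent-light case above can be sketched numerically: energy goes as amplitude squared, so the photon count does too. The frequency and the proportionality constant `k` are arbitrary illustrative choices, since the scaling ratio is all that matters here.

```python
# Coherent light: energy ~ amplitude**2, so photon count scales the same way.
h = 6.62607015e-34   # Planck constant, J*s
f = 5e14             # assumed example: visible light, in Hz

def photon_count(amplitude, k=1e-10):
    """Field energy proportional to amplitude squared; k is an arbitrary
    proportionality constant chosen for illustration."""
    energy = k * amplitude**2
    return energy / (h * f)   # number of quanta of energy h*f

n1 = photon_count(1.0)
n2 = photon_count(2.0)
print(n2 / n1)  # doubling the amplitude quadruples the photon count: 4.0
```

The ratio is exactly 4 regardless of `k` or `f`, which is the "four times the energy and four times the number of photons" statement above.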