Hello. I'm trying to understand electromagnetic radiation. Any help would be appreciated.

From my understanding, a photon is the basic "unit" of electromagnetic radiation, with an energy corresponding to hν. Additionally, radios transmit electromagnetic waves by sending an alternating current through an antenna, and the power radiated as EM waves is a function of the power supplied by the radio. As a thought experiment: if we continuously reduce the power supplied to the antenna, will we reach a point where only one photon is emitted (corresponding to E = hν, i.e. P = hν/t)? Will no radio waves be emitted below that power? And what happens if the antenna receives 1.5 times that power (E = 1.5hν, P = 1.5hν/t)? Will only one photon be emitted, with the remaining energy lost as heat?
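A quick back-of-the-envelope check of the scales involved may help here. The sketch below (in Python; the 100 MHz frequency and 1 W transmitter power are illustrative values, not from the question) computes the energy of a single photon and how many photons per second an ordinary transmitter would emit:

```python
# Sanity check of E = h*f and P = h*f/t for a radio signal.
# The 100 MHz frequency and 1 W power are illustrative choices.
h = 6.62607015e-34   # Planck constant, J*s
f = 100e6            # frequency, Hz (FM broadcast band)

E_photon = h * f              # energy of one photon, J
P_one_per_sec = E_photon      # power if one photon is emitted per second, W
rate_at_1W = 1.0 / E_photon   # photons per second at 1 W

print(f"Photon energy:        {E_photon:.3e} J")
print(f"Power for 1 photon/s: {P_one_per_sec:.3e} W")
print(f"Photons/s at 1 W:     {rate_at_1W:.3e}")
```

At roughly 10^25 photons per second from a 1 W transmitter, the granularity of the emission is far below anything an ordinary receiver resolves, which is why the question only becomes interesting at extremely low powers.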

Thanks.

Recognitions: Gold Member, Science Advisor

Yes, if you reduce the power enough you will see granularity in the signal, because the radiation must be emitted in discrete quanta of energy.

Regarding your other questions, one thing to note is that the antenna is itself excited by an electromagnetic wave. Any AC signal in an electrical circuit propagates as electromagnetic waves; these waves excite the currents in the circuit (or the voltages that cause the buildup of charge). So for you to send any amount of power to the antenna at all, you must send at least one photon's worth of energy, otherwise there is no wave to excite the currents on the antenna.

The antenna itself dissipates energy through conductive losses. Efficiency is further reduced by energy reflected off the antenna due to impedance mismatch, and radiated power is reduced again because some energy is invariably trapped in the near field of the antenna instead of propagating out into space. On a quantum level, these inefficiencies will probably act as modifiers to the statistical behavior of the transmitted signal.

It is really not useful to talk about photons on an individual basis; any meaningful analysis and measurement is done statistically, over a large number of samples. In some instances a photon sent to the antenna will not be transmitted: its energy will be lost as heat through conductive losses, or its energy may be trapped in the near field. But over a statistical ensemble of photons, I would expect the measurements to reflect the effective losses and inefficiencies of the antenna. Still, this is a bit of mixing with classical theory; when it comes to single-photon behavior, we need to think purely in the quantum regime for true accuracy.
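One way to picture the statistical behavior described above is to treat each photon delivered to the antenna as radiated with some probability η (the antenna's radiation efficiency) and otherwise lost to heat or the near field. This is only a classical toy model, not a quantum treatment, and the efficiency and photon count below are made-up illustrative values:

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

eta = 0.7            # assumed radiation efficiency (illustrative)
n_photons = 100_000  # photons delivered to the antenna

# Each photon is independently radiated (probability eta) or lost
# (conductive heating, impedance mismatch, near-field trapping).
radiated = sum(random.random() < eta for _ in range(n_photons))

print(f"Radiated fraction: {radiated / n_photons:.3f}")  # close to eta
```

Over a large ensemble the radiated fraction converges on η, which is the sense in which the antenna's classical losses would show up as modified photon statistics.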
 It is probably useful to consider the 1420-MHz (L-band) microwave signal as an electromagnetic wave, and also as the quantum hyperfine transition in hydrogen (like in interstellar gas). On one hand, antennas can easily transmit and receive these electromagnetic waves. On the other hand, hydrogen atoms can receive and emit the signal only as discrete quanta (photons). As the voltage on a 1420-MHz transmitting antenna is reduced, eventually the antenna will transmit a few photons per second. Bob S
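To put rough numbers on the example above (the 1420 MHz frequency is from the post; the photon rate is an illustrative stand-in for "a few per second"), the power at which such an antenna would emit only a few photons per second is extraordinarily small:

```python
h = 6.62607015e-34   # Planck constant, J*s
f = 1.42e9           # 1420 MHz hydrogen-line frequency, Hz

E_photon = h * f       # energy of one photon, J (~9.4e-25 J)
rate = 3               # "a few" photons per second (illustrative)
P = rate * E_photon    # corresponding radiated power, W

print(f"Photon energy at 1420 MHz: {E_photon:.2e} J")
print(f"Power for {rate} photons/s: {P:.2e} W")
```

For comparison, room-temperature thermal noise power in a 1 Hz bandwidth is kT ≈ 4e-21 W, so a few-photons-per-second signal at this frequency sits far below the thermal noise floor.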
