Hi. I'd like to learn how to calculate the probability of a photon being emitted from a radio antenna when the radiated energy per wavelength (i.e., per oscillation period) is below the energy of a single photon.

Let's assume the electrical thermal noise is insignificant: the antenna temperature could be sufficiently low, or the antenna could have enough resistance from resistors that the thermal noise current, I_noise = sqrt(4*k*T*B/R), is negligible. I'm talking about placing actual resistors in the antenna, not the antenna's radiation resistance; higher resistor values decrease the antenna's thermal noise current.

Now suppose the antenna's AC current is set at a level that would radiate one photon's worth of energy per wavelength on average. Then we cut the power in half, so the antenna radiates only half the required photon energy per wavelength. If we assume no appreciable thermal noise or quantum fluctuations, the antenna would emit no photons. But if quantum fluctuations can somehow occasionally push it past the minimum threshold to emit a photon, how can I calculate that probability?
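To make the setup concrete, here's a rough Python sketch of the numbers I have in mind. The frequency, temperature, bandwidth, and resistance values are just hypothetical examples I picked, and the Poisson formula at the end is only my naive guess at how the probability might work, which is exactly the part I'm asking about:

```python
import math

# Physical constants
k_B = 1.380649e-23    # Boltzmann constant, J/K
h   = 6.62607015e-34  # Planck constant, J*s

# Hypothetical example numbers, chosen only for concreteness
f = 1e6   # antenna drive frequency, Hz (1 MHz radio)
T = 0.1   # antenna temperature, K (chosen low)
B = 1e3   # measurement bandwidth, Hz
R = 1e6   # added series resistance from actual resistors, ohms

# Thermal noise current from the formula in my question;
# high R and low T make this negligible
I_noise = math.sqrt(4 * k_B * T * B / R)
print(f"Thermal noise current: {I_noise:.3e} A")

# Energy of one photon at this frequency
E_photon = h * f
print(f"Photon energy: {E_photon:.3e} J")

# Radiated power that delivers one photon energy per period
# on average: P * (1/f) = E_photon  =>  P = E_photon * f
P_one = E_photon * f
print(f"Power for one photon per period: {P_one:.3e} W")

# Cut the power in half: half a photon's energy per period
P_half = P_one / 2
mean_photons_per_period = P_half / (E_photon * f)  # = 0.5

# My naive guess (the thing I'm asking about): if photon counts
# were Poisson-distributed with this mean, the chance of at least
# one photon in a given period would be
p_at_least_one = 1 - math.exp(-mean_photons_per_period)
print(f"Poisson guess, P(>=1 photon per period): {p_at_least_one:.3f}")
```

With a mean of 0.5 photons per period, that guess gives roughly a 39% chance per period, but I don't know whether a Poisson assumption is actually the right model here, or how to justify it from the quantum fluctuations.

Thanks for any help. I appreciate it.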