Using classical (that is, non-quantum) electrodynamics, one can predict that a charged particle accelerated by a nonuniform electric field will radiate. This can be modeled (although not without problems, such as unphysical runaway solutions) by the Abraham–Lorentz force, a dissipative force proportional to the rate of change of the acceleration.

Now, a general argument known as the fluctuation-dissipation theorem says that if a system has a dissipative force, then there must be a corresponding fluctuation. If one applies the fluctuation-dissipation theorem to the Abraham–Lorentz force, one would presumably get a fluctuation in the electromagnetic field. If this is true, it's sort of amazing, because the Abraham–Lorentz force is derived from classical electrodynamics, yet it seems to point to a result from quantum field theory: that the electromagnetic field inherently has fluctuations.

Does anyone know of a paper working out the fluctuations predicted by the fluctuation-dissipation theorem applied to the electromagnetic field? How do the predicted fluctuations compare with the predictions of QED?
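To make the question concrete, here is a rough sketch of how I imagine the argument going (my notation, Gaussian units; treating the radiation reaction as an effective frequency-dependent friction, which glosses over the runaway-solution issues, and using the quantum form of the fluctuation-dissipation theorem, whose prefactor conventions vary between references):

```latex
% Abraham–Lorentz force on a charge q of mass m:
%   F_AL = (2q^2 / 3c^3) \dot{a} = m\tau \dot{a},  with  \tau = 2q^2 / (3 m c^3).
F_{\mathrm{AL}} = \frac{2q^2}{3c^3}\,\dot{a} = m\tau\,\dot{a}

% For motion at frequency \omega, this acts like a friction force with an
% effective damping coefficient
\gamma(\omega) = \tau\,\omega^{2}

% The quantum fluctuation-dissipation theorem then relates this damping to a
% force-noise spectral density (symmetrized correlator; conventions vary):
S_F(\omega) \sim m\,\gamma(\omega)\,\hbar\omega\,
              \coth\!\left(\frac{\hbar\omega}{2 k_B T}\right)

% In the limit T \to 0 the \coth factor goes to 1, leaving a nonzero
% zero-point contribution
S_F(\omega) \sim m\,\tau\,\hbar\,\omega^{3} = \frac{2 q^{2} \hbar\,\omega^{3}}{3 c^{3}}
```

The point of the sketch is the last line: even at zero temperature, the dissipation seems to demand a residual force fluctuation, which is what I would want to compare against the QED vacuum-fluctuation prediction.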