
Now, a general argument known as the fluctuation-dissipation theorem says that if a system experiences a dissipative force, then there must be a corresponding fluctuating force. If one applies the fluctuation-dissipation theorem to the Abraham-Lorentz (radiation reaction) force, I would expect it to predict a fluctuation in the electromagnetic field.
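To make the question concrete, here is the sort of estimate I have in mind, using the standard quantum form of the fluctuation-dissipation theorem (I'm not confident about signs and factors of 2):

```latex
% Abraham-Lorentz force on a charge e (SI units):
%   F_{\rm AL} = m\tau\,\dot a, \qquad \tau = \frac{e^2}{6\pi\epsilon_0 m c^3}.
% For motion at frequency \omega this acts like a friction with an
% effective, frequency-dependent damping rate
%   \gamma(\omega) = \tau\,\omega^2 ,
% so the fluctuation-dissipation theorem would give a force-noise spectrum
%   S_F(\omega) = m\,\gamma(\omega)\,\hbar\omega
%                 \coth\!\left(\frac{\hbar\omega}{2k_B T}\right)
%   \;\xrightarrow{\,T\to 0\,}\;
%   \frac{e^2\hbar\,|\omega|^3}{6\pi\epsilon_0 c^3},
% i.e. a spectrum proportional to \hbar|\omega|^3 surviving at zero
% temperature -- suggestively the same scaling as the zero-point
% fluctuations of the electromagnetic field.
```

If that's roughly right, the nonzero $T \to 0$ limit is exactly the part that looks like it's anticipating vacuum fluctuations.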

If this is true, it's rather remarkable, because the Abraham-Lorentz force is derived from *classical* electrodynamics. Yet it seems to point to a result from quantum field theory: that the electromagnetic field inherently has fluctuations.

Does anyone know of a paper working out the fluctuations predicted by applying the fluctuation-dissipation theorem to the Abraham-Lorentz force? How do the predicted fluctuations compare with the predictions of QED?