Usually, when studying the thermodynamics and statistical mechanics of a macroscopic body, one uses a heat-bath model to define a temperature. In quantum mechanics one may assume that the heat bath has an arbitrarily low temperature.
In quantum electrodynamics the electromagnetic field exhibits vacuum fluctuations (we know from the Casimir effect that these can have measurable consequences).
Is it possible, and does it make sense, to introduce a kind of "heat bath generated from these vacuum fluctuations"? Or is it standard to regularize the vacuum fluctuations so that one always has T = 0 in the vacuum? Does that mean that zero-point energy never contributes to the temperature (as in a Fermi gas, where the energy is huge even at T = 0)?
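To make the Fermi-gas comparison concrete, here is a standard textbook result (a sketch, not part of the original question): the ground-state energy of an ideal three-dimensional Fermi gas is large even though the temperature and entropy both vanish,

[tex]E_0 = \frac{3}{5} N \,\varepsilon_F, \qquad T = 0, \quad S = 0,[/tex]

where [itex]\varepsilon_F[/itex] is the Fermi energy. A large zero-point (ground-state) energy therefore does not by itself imply a nonzero temperature.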
I do not even have a reasonable starting point; consider the partition function
[itex]Z = \text{tr}\,e^{-\beta H}[/itex]
Here, again, the temperature is introduced by hand rather than generated by fluctuations.
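To illustrate why the zero-point energy drops out of the thermodynamics (a sketch for a single field mode of frequency [itex]\omega[/itex], with [itex]\hbar = k_B = 1[/itex]): the partition function above gives

[tex]Z = \sum_{n=0}^{\infty} e^{-\beta\omega\left(n+\frac{1}{2}\right)} = \frac{e^{-\beta\omega/2}}{1 - e^{-\beta\omega}},[/tex]

so the free energy is

[tex]F = -\frac{1}{\beta}\ln Z = \frac{\omega}{2} + \frac{1}{\beta}\ln\!\left(1 - e^{-\beta\omega}\right).[/tex]

The zero-point term [itex]\omega/2[/itex] is independent of [itex]T[/itex], so it cancels in the entropy [itex]S = -\partial F/\partial T[/itex] and in the heat capacity: the vacuum energy shifts [itex]F[/itex] by a constant but generates no thermal behavior. This is at least consistent with the "T = 0 in the vacuum" reading of the question.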