petergreat
If a quantum system is subjected to a time-dependent Hamiltonian $H(\lambda)$ with a single parameter $\lambda$, then its entropy does not change, provided $\lambda$ is varied slowly enough between two values. How can this be proved rigorously? The adiabatic theorem (http://en.wikipedia.org/wiki/Adiabatic_theorem) is not enough, since this system is different in that it constantly re-thermalizes to a Boltzmann distribution during the process.
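To make the claim concrete, here is one way to state it formally (the notation is my own, not from any particular textbook): the system is assumed to remain in the instantaneous Gibbs state, and the claim is that its von Neumann entropy is invariant in the quasi-static limit.

$$\rho(t) = \frac{e^{-\beta(t)\,H(\lambda(t))}}{Z(t)}, \qquad Z(t) = \operatorname{Tr}\, e^{-\beta(t)\,H(\lambda(t))},$$

$$S(t) = -k_B \operatorname{Tr}\bigl[\rho(t)\ln\rho(t)\bigr] = -k_B \sum_n p_n(t)\ln p_n(t), \qquad p_n = \frac{e^{-\beta E_n(\lambda)}}{Z}.$$

For an isolated system the adiabatic theorem would give $\dot p_n = 0$ directly, and hence $\dot S = 0$; here the re-thermalization keeps $\rho$ Gibbsian, so $\beta(t)$ adjusts along the way, and showing $\dot S \to 0$ as $\dot\lambda \to 0$ requires a separate argument.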
I know this is a standard topic in any statistical physics course, but I haven't seen one which is mathematically fully convincing to me.