# Does QM allow for entropy decrease of a closed system?

1. Oct 7, 2016

### Grinkle

I recently posted a statement that according to the 2nd law of thermodynamics, entropy always increases in a closed system, no matter how small the time interval is that one looks at.

I think that is true for classical thermodynamic models, please correct me if I am wrong.

Does quantum mechanics allow for or predict occasional entropy decreases of a closed system?

Maybe QM doesn't say anything one way or the other about entropy and the 2nd law?

2. Oct 7, 2016

### Khashishi

The von Neumann entropy is constant for a closed system.
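This is easy to check numerically (my own sketch, not from the thread): a closed system evolves unitarily, ρ → UρU†, which preserves the eigenvalues of ρ and hence S(ρ) = -Tr(ρ ln ρ). A minimal numpy illustration with an arbitrary made-up Hamiltonian:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                      # convention: 0 ln 0 = 0
    return float(-np.sum(p * np.log(p)))

# A mixed two-level state (probabilities 0.7 and 0.3)
rho = np.diag([0.7, 0.3]).astype(complex)

# Unitary U = exp(-iHt) for an arbitrary Hermitian H, via eigendecomposition
H = np.array([[1.0, 0.5], [0.5, -1.0]], dtype=complex)
w, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * w * 2.0)) @ V.conj().T   # evolution for t = 2

rho_t = U @ rho @ U.conj().T
print(von_neumann_entropy(rho), von_neumann_entropy(rho_t))  # identical
```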

3. Oct 7, 2016

### Grinkle

Thanks - that gives me a place to start reading. From an initial browse, the water gets really deep in a hurry ...

4. Oct 7, 2016

### newjerseyrunner

For a quantum closed system, the act of knowing the entropy changes it.

My knowledge of von Neumann entropy is limited, so someone please correct me if I'm wrong, but I believe the entropy increase comes from wave function collapse (decoherence). The collapses are one-way and have no quantum fuzziness, so the entropy can never decrease.

5. Oct 8, 2016

### Truecrimson

To decohere, the system must exchange information with an environment so it is no longer a closed system.

6. Oct 8, 2016

### Grinkle

Surely particles within a closed system decohere and exchange information with each other?

7. Oct 8, 2016

### stevendaryl

Staff Emeritus
Sort of. Decoherence is in some sense subjective. The actual physics involves a number of subsystems. There's the main system of interest (say, a single particle, or an atom or molecule), then there are whatever measuring devices are being used. Then there is the "environment", which is typically the electromagnetic field (but might also include particles in the atmosphere where the experiment takes place). Then there are the experimenters. Eventually, almost everything in the universe can be affected by the system of interest.

Since nobody wants to (or is able to) compute the quantum mechanical state of the whole universe, we tend to split up the universe into (1) the system of interest, and (2) everything else. Once the system of interest becomes irreversibly entangled with everything else, we say that decoherence has happened. At that point, rather than describing the system of interest as a pure state, we describe it as a mixed state (a density matrix), and the effect of everything else is only described statistically.

If you have a closed system of, say, 10 billion particles, it's impractical to describe the whole shebang as a pure state (although in principle, it could be done), so we would in practice use mixed states. I don't think that there is an objective sense in which "decoherence has happened"; it really depends on the experimenter deciding what is the "system of interest", and how entangled things have to be to be "irreversibly entangled". It's the same sort of fuzzy criterion in classical physics when you ask whether a process is irreversible. For a process involving two or three or twenty particles, no interaction is irreversible. But when you have processes involving millions or billions of particles, then at some point, it becomes more practical to consider some of them irreversible.
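The pure-state-to-mixed-state step can be made concrete with a partial trace. A small sketch (the two-qubit setup and labels are mine): a "system of interest" A entangled with "everything else" B is described, on its own, by the reduced density matrix obtained by tracing out B, and that reduced state is mixed even though the joint state is pure:

```python
import numpy as np

# Joint pure state |psi> = (|00> + |11>)/sqrt(2); qubit A = system, qubit B = rest
psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho_AB = np.outer(psi, psi.conj())

# Partial trace over B: reshape to indices (a, b, a', b') and sum over b = b'
rho_A = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(rho_A)                              # I/2: maximally mixed
print(np.trace(rho_AB @ rho_AB).real)     # purity 1.0 (AB is pure)
print(np.trace(rho_A @ rho_A).real)       # purity 0.5 (A is mixed)
```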

8. Oct 8, 2016

### Grinkle

So QM models end up with the same increasing entropy arrow-of-time arising from complexity as classical models?

I had a vague picture of a closed system of particles (10 billion is a fine number to use) at complete classical thermal equilibrium, so 'heat dead' could describe this small universe. I wondered if quantum fluctuations could take the entire system to a point where classical energy movement via thermal differential might again happen. From your post, I take away that it's not prohibited, but not any likelier than observing the shattered wine glass spontaneously re-assembling itself in the classical models.

9. Oct 8, 2016

### stevendaryl

Staff Emeritus
The classical and the quantum case are not exact analogies, but are very similar. In the same way that you can't have classical irreversibility with a small number of particles, you can't have quantum decoherence with a small number of particles, either. But whether a number is "small" or not is subjective, and reflects your ability to calculate as much as it does anything objective about the system.

10. Oct 8, 2016

### Strilanc

No, that's not true at all.

What you might be thinking of is that, if you measure a system, the revealed information removes some of your uncertainty about the state and lowers the von Neumann entropy of your estimate of the system's state. But if you ignore the measurement result then of course your estimate's von Neumann entropy won't decrease. Actually, an ignored measurement will often increase your estimate's von Neumann entropy, because measurement can turn quantum superposition into classical noise.

The von Neumann entropy you compute for a system does depend on how much you know, but that's just because entropy is a quantification of the uncertainty of an estimate. This is also true of classical entropy and classical systems.
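The ignored-measurement point can be checked directly. A sketch (state and basis chosen by me): measuring |+> in the computational basis and discarding the outcome maps ρ to P0 ρ P0 + P1 ρ P1, which erases the off-diagonal terms and raises the von Neumann entropy from 0 to ln 2:

```python
import numpy as np

def entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)   # |+> = (|0>+|1>)/sqrt(2)
rho = np.outer(plus, plus.conj())                     # pure state: S = 0

# Computational-basis measurement with the result thrown away:
# superposition becomes classical noise
P0 = np.diag([1.0, 0.0]).astype(complex)
P1 = np.diag([0.0, 1.0]).astype(complex)
rho_after = P0 @ rho @ P0 + P1 @ rho @ P1

print(entropy(rho), entropy(rho_after), np.log(2))    # 0.0, ln 2, ln 2
```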

11. Oct 9, 2016

### Simon Phoenix

Stevendaryl has provided an excellent answer in terms of large numbers of particles and made the important point that decoherence is a property of a subsystem. I want to go the other way and look at what happens when we only have a small number of things. Let's keep it very simple and consider only 2.

In quantum optics there is an idealized model of a 2 level atom interacting with a single field mode in a high-Q cavity. It's known as the Jaynes-Cummings model (JCM) and in it both the atom and the field are treated quantum mechanically. It turns out that it isn't as fearsome as it sounds and it's a nice simple model that can be solved exactly.

The high-Q part is important because it's a model that supposes there is no dissipation, no leakage of radiation from the cavity, to the 'outside' world. Hence it's a closed quantum system consisting of 2 'objects', the atom and the field. So it is very much an idealized model.

OK - let's suppose we have our 2 level atom prepared in an excited state and the field in the vacuum state - that's our initial condition. As time develops the atom and field interact and the atom 'decays' to its ground state and the field picks up a photon, and then the process reverses - the atom picks up the photon and makes the transition to the excited state with the field returning to the vacuum. This oscillation between |e>|0> and |g>|1> continues indefinitely in this model for these initial conditions.

What do we mean by 'subsystem' here? Well if we only look at the properties of the atom, completely ignoring what's happening with the field, then we're treating the atom as a subsystem. In this perspective the atom begins as a pure state, evolves to a mixed state, and then evolves back to a pure state as the interaction proceeds - and this oscillation between pure and mixed just keeps on going.

In terms of the entropies (the von Neumann entropies) we have the total entropy of atom + field S(AF), and the subsystem entropies S(A) and S(F). The subsystem entropies are the entropies we get when we don't 'know' anything about the other subsystem. Quantum entropies, as determined by the von Neumann entropy, are not quite the same as classical entropies. In our idealized model, for example, the total entropy S(AF) is a constant [the total entropy of a closed system is constant] and because we started in the pure state |e>|0> the total entropy remains zero throughout the entire evolution. It's this possibility of pure states in QM that gives us the differences between classical entropies and von Neumann entropies.

The subsystem entropies are, however, time-dependent in the JCM. The atomic entropy oscillates between 0 and ln 2, as does the field entropy. Here's the thing: in the JCM, because the total system is in a pure state (always), the atom and field subsystem entropies are always equal to one another.

This is actually a general property. If we begin with 2 quantum systems A and B, where A could be comprised of any number of quantum objects, as could B, then if the total [AB] system is initially prepared in a pure state, and A and B interact (or component parts of A interact with component parts of B), then the subsystem entropies S(A) and S(B) are identically equal throughout the entire evolution - the total entropy remains constant ([AB] is a closed system) - but the subsystem entropies are time-dependent in such a way that they are equal to one another.
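On resonance, with the initial condition |e>|0>, the JCM state is (up to phase conventions, and with my parametrization of the coupling) |psi(t)> = cos(gt)|e,0> - i sin(gt)|g,1>, so both subsystem entropies can be computed explicitly. A numpy sketch showing S(A) = S(F), oscillating between 0 and ln 2:

```python
import numpy as np

def entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

def subsystem_entropies(gt):
    c, s = np.cos(gt), np.sin(gt)
    # basis order |e,0>, |e,1>, |g,0>, |g,1> (atom index first)
    psi = np.array([c, 0, 0, -1j * s], dtype=complex)
    rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
    S_atom  = entropy(np.trace(rho, axis1=1, axis2=3))   # trace out field
    S_field = entropy(np.trace(rho, axis1=0, axis2=2))   # trace out atom
    return S_atom, S_field

for gt in (0.0, np.pi / 8, np.pi / 4):
    sa, sf = subsystem_entropies(gt)
    print(f"gt = {gt:.3f}: S(A) = {sa:.4f}, S(F) = {sf:.4f}")  # always equal
```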

In order to get the 'decoherence' we have to think of A (or B) as a very large system, and then we would say, for example, that the interaction of A with an appropriately large system B leads to 'decoherence' of A - of course the total [AB] system has not 'decohered' - it's still in its pure state - but by this partitioning into 'system of interest' (such as A) plus 'large' system (such as B) we can figure out the properties of A by 'ignoring and smoothing' the details of what's happening with B and the [AB] system. The 'ignoring' is what we do when we just consider A on its own; the 'smoothing' is like a classical coarse-graining procedure - and it's this process that leads to decoherence and irreversibility. It is, of course, a fudge - but for all practical purposes it's a damned good fudge.

13. Oct 9, 2016

### kith

There's an interesting difference between classical statistical physics and QM regarding the entropy of subsystems: in QM, if you have two systems which are entangled, the entropy of each of the subsystems can be larger than the entropy of the whole system (for a pure entangled state, the whole has zero entropy while each part does not).

14. Oct 9, 2016

### Simon Phoenix

Yes, the QM entropy is a fascinating thing which gives rise to some curious properties - all traceable, I believe, to the existence of pure states in QM which have zero entropy.

For example, for classical entropy we have the mutual information being given by
I(A;B) = H(A) + H(B) - H(A,B) = H(A) - H(A|B) = H(B) - H(B|A)
the last two relations say that the mutual information between A and B is the difference between the uncertainty in A alone and the uncertainty in A given knowledge of B (and vice versa). If knowledge of B does not give us any information about A (that is, does not reduce our uncertainty in A) then the two systems are uncorrelated.

In QM these last two relationships fail. The mutual information in QM is given by S(A) + S(B) - S(A,B), which has the same form as the first classical relationship. But I(A;B) is not always equal, in QM, to S(A) - S(A|B) or S(B) - S(B|A).

For me the most spectacular difference is expressed by the wonderful Araki-Lieb inequality, which states that for 2 quantum systems A and B the entropies are bounded by
| S(A) - S(B) | ≤ S(A,B) ≤ S(A) + S(B)
The equivalent classical inequality for entropies is
max[ H(A), H(B) ] ≤ H(A,B) ≤ H(A) + H(B)
because the joint entropy, classically, can't be smaller than either of the 2 subsystem entropies - but it can be for quantum systems.

I've written it using S in the quantum case to emphasize that the von Neumann entropy cannot always be so simply related to uncertainty as it is in the classical case. In QM the conditional entropy can be negative - which makes it somewhat difficult to interpret the quantum von Neumann entropy purely in terms of uncertainties in the way we do for classical entropies.
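Both points can be illustrated with a Bell state (my choice of example): the joint entropy is zero while each subsystem entropy is ln 2, which sits inside the Araki-Lieb bounds but violates the classical lower bound, and the conditional entropy S(A|B) = S(A,B) - S(B) comes out negative:

```python
import numpy as np

def entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # Bell state
rho_AB = np.outer(psi, psi.conj())
rho4 = rho_AB.reshape(2, 2, 2, 2)

S_AB = entropy(rho_AB)
S_A = entropy(np.trace(rho4, axis1=1, axis2=3))           # trace out B
S_B = entropy(np.trace(rho4, axis1=0, axis2=2))           # trace out A

print(f"S(A,B) = {S_AB:.4f}, S(A) = {S_A:.4f}, S(B) = {S_B:.4f}")
print(f"S(A|B) = S(A,B) - S(B) = {S_AB - S_B:.4f}")       # negative: -ln 2
```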

Intriguing, but fundamental, I feel.