2Tesla

**Wavefunction collapse ==> increase in entropy??**

I just read an article in Scientific American by Sean Carroll, called something like "Does Time Run Backward in Other Universes?". In it, he says that the reason wavefunctions only collapse and never un-collapse is that collapse represents an increase in entropy, so by the second law of thermodynamics a wavefunction can never un-collapse. This is strange to me for two reasons:

(1) Why does the 2nd law even apply, unless there are "hidden variables"? If the wavefunction encodes all the information that can be known about the system, then there is no underlying microstate, and hence no counting of microstates per macrostate for entropy to be based on.

(2) How can the collapse of a wavefunction cause an increase in entropy, anyway? It seems to me that, depending on how you look at it, the entropy either stays the same (since we are simply going from one state to another; the first may be a superposition of basis vectors, but we can always change basis so that it isn't) or even decreases (since we are losing information about how the state was prepared).
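For what it's worth, here is a quick numerical check of the entropy bookkeeping I have in mind (my own toy example, not anything from the article), using the von Neumann entropy S(ρ) = −Tr(ρ ln ρ). A pure superposition state has zero entropy, but if "collapse" is instead treated as replacing the state with a classical 50/50 ensemble of outcomes, the entropy jumps to ln 2 — which I suppose is the sense in which collapse could "increase entropy":

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop zero eigenvalues (0 * ln 0 -> 0)
    return float(-np.sum(evals * np.log(evals)))

# Pure superposition |+> = (|0> + |1>) / sqrt(2): a pure state, entropy 0
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(plus, plus)

# Post-measurement ensemble: |0> or |1>, each with probability 1/2
rho_mixed = 0.5 * np.diag([1.0, 1.0])

print(von_neumann_entropy(rho_pure))   # 0 (pure state)
print(von_neumann_entropy(rho_mixed))  # ln 2 ≈ 0.693 (maximally mixed qubit)
```

But if collapse instead takes the superposition to a single definite pure state (say |0>), the entropy is again zero, which is exactly why I don't see where the increase comes from.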

Note: this is not the only thing I found strange in the article :)

Any help with my understanding would be much appreciated.