Wave Function Collapse and Entropy

In summary, within the naive approach to quantum mechanics, measurement affects the entropy of a system in two opposing ways: decoherence increases it, while collapse reduces it. The two effects cancel, so the entropy is unchanged after a measurement. This is because a projective measurement on a pure state yields a pure state again, and the entropy of every pure state is zero; it can be seen by combining the decoherence and collapse steps shown in Table 1. However, if a true collapse does exist, then the entropy may decrease after a measurement, as shown in Tegmark's "observation" row of that table.
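As a quick numerical illustration of the entropy values involved, here is a minimal Python/NumPy sketch (the single-qubit states and the helper function are illustrative choices, not taken from the thread; entropies are in nats) comparing a pure state with a proper mixture:

[code]
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), in nats, from the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                      # drop zero eigenvalues; 0*ln(0) -> 0
    return float(-np.sum(p * np.log(p)))

# Pure superposition |+> = (|0> + |1>)/sqrt(2): entropy 0
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(plus, plus)

# 50/50 incoherent (proper) mixture of |0> and |1>: entropy ln 2
rho_mixed = np.diag([0.5, 0.5])

print(von_neumann_entropy(rho_pure))     # ~0.0
print(von_neumann_entropy(rho_mixed))    # ~0.693 = ln 2
[/code]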
  • #1
stevendaryl
I don't want to argue about whether the notion of "wave function collapse" is a good way of understanding quantum mechanics, or not. For the purposes of this discussion, let's just adopt uncritically the naive approach to quantum mechanics, that:

  1. Between measurements, the system evolves according to Schrodinger's equation.
  2. A measurement always produces an eigenvalue of the operator corresponding to the quantity being measured.
  3. Immediately after a measurement, the wavefunction "collapses" to an eigenstate of that operator corresponding to the eigenvalue that you measured.

My question is: how is the entropy of a system affected by measurement? There is a sense in which it acts like a random perturbation, and so I would think that it would increase the entropy, but on the other hand, the state becomes more definite after a measurement, which would make me think that the entropy has been lowered.

Does my question make any sense, and if so, does it have a standard answer?
 
  • #2
http://en.wikipedia.org/wiki/Density_matrix#Entropy
"This entropy can increase but never decrease with a projective measurement, however generalised measurements can decrease entropy.[6][7] The entropy of a pure state is zero, while that of a proper mixture always greater than zero. Therefore a pure state may be converted into a mixture by a measurement, but a proper mixture can never be converted into a pure state. Thus the act of measurement induces a fundamental irreversible change on the density matrix; this is analogous to the "collapse" of the state vector, or wavefunction collapse."

Reference 6 of the Wikipedia quote is Nielsen and Chuang.
Theorem 11.9 (p515): Projective measurements increase entropy.
Exercise 11.15 (p515): Generalised measurements can decrease entropy.
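Both cited statements can be checked with a small Python/NumPy sketch. The generalised-measurement operators below (M1 = |0><0|, M2 = |0><1|) are one standard textbook-style choice, assumed here for concreteness rather than quoted from the book:

[code]
import numpy as np

def S(rho):
    """von Neumann entropy -Tr(rho ln rho), in nats."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

ket0 = np.array([[1.0], [0.0]])
ket1 = np.array([[0.0], [1.0]])

# Non-selective projective measurement of sigma_z on the pure state |+>:
# rho -> P0 rho P0 + P1 rho P1; the entropy cannot decrease (Theorem 11.9).
plus = (ket0 + ket1) / np.sqrt(2)
rho = plus @ plus.T
P0, P1 = ket0 @ ket0.T, ket1 @ ket1.T
rho_proj = P0 @ rho @ P0 + P1 @ rho @ P1
print(S(rho), S(rho_proj))          # 0.0 -> ln 2: entropy increased

# A generalised measurement that decreases entropy (cf. Exercise 11.15):
# M1 = |0><0|, M2 = |0><1|, satisfying M1.T @ M1 + M2.T @ M2 = I.
# Acting non-selectively on the maximally mixed state it outputs the pure state |0><0|.
M1, M2 = ket0 @ ket0.T, ket0 @ ket1.T
rho_mix = 0.5 * np.eye(2)
rho_gen = M1 @ rho_mix @ M1.T + M2 @ rho_mix @ M2.T
print(S(rho_mix), S(rho_gen))       # ln 2 -> 0.0: entropy decreased
[/code]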
 
  • #3
It may help you to propose some example. What system? How exactly does the collapse happen? Which kind of entropy do you have in mind? Without that, I am afraid your question is too general.
 
  • #4
Entropy is a statistical concept: S = k ln W (to quote Boltzmann), where W represents the number of possible microstates corresponding to the macroscopic state of a system. For a single particle, W = 1, so S = 0.
I have always found it useful to recall that mathematically a single 'particle' is best represented by a tensor field in Minkowski space. That representation simultaneously satisfies both QM and GR.
When a 'measurement' is performed on a 'particle' or a 'wave', a tensor operator is applied to the tensor field to extract the desired value from the set of variables participating in the tensor field. Whether the value is appropriate for a 'particle' or a 'wave' depends on the effect the operator has on the tensor.
In any case, the Dirac mathematics is used: one takes the product of the conjugate tensor with the result of applying the operator to the original tensor, and integrates it fully over Minkowski space to produce the value sought. Whichever operator or field is used, the operation irreversibly alters the original tensor (in 3-D this is called 'wave collapse') into something that resembles either a standing wave or a moving particle.
 
  • #5
I discuss the issue of 'collapse' in detail in my new book in terms of the Transactional Interpretation,

www.cambridge.org/9780521764155

In my proposal, wf 'collapse' is a form of spontaneous symmetry breaking. This would decrease the entropy relative to the perfectly symmetric state if you consider it analogous to the case, say, of a crystal forming in an amorphous solid.
 
  • #6
rkastner said:
I discuss the issue of 'collapse' in detail in my new book in terms of the Transactional Interpretation,

www.cambridge.org/9780521764155

In my proposal, wf 'collapse' is a form of spontaneous symmetry breaking. This would decrease the entropy relative to the perfectly symmetric state if you consider it analogous to the case, say, of a crystal forming in an amorphous solid.

I read about the Transactional Interpretation decades ago in Analog magazine. Its inventor, John Cramer, was also a columnist in that magazine. I thought it sounded fascinating, but I haven't heard anything much about it since, so I assumed that it wasn't taken seriously by most physicists.
 
  • #7
Hi Steven,

It is beginning to get more attention. I have been publishing papers on this; many of them are available on arxiv.org. Re the book, I have some introductory and preview material at my website, http://transactionalinterpretation.org/

Best regards
Ruth Kastner
 
  • #9
atyy said:
http://en.wikipedia.org/wiki/Density_matrix#Entropy
"This entropy can increase but never decrease with a projective measurement, however generalised measurements can decrease entropy.[6][7] The entropy of a pure state is zero, while that of a proper mixture always greater than zero. Therefore a pure state may be converted into a mixture by a measurement, but a proper mixture can never be converted into a pure state. Thus the act of measurement induces a fundamental irreversible change on the density matrix; this is analogous to the "collapse" of the state vector, or wavefunction collapse."

Reference 6 of the Wikipedia quote is Nielsen and Chuang.
Theorem 11.9 (p515): Projective measurements increase entropy.
Exercise 11.15 (p515): Generalised measurements can decrease entropy.
This increase of entropy by projective measurement assumes that a true collapse does not exist.
But if it does, then it actually DECREASES entropy. See e.g.
http://arxiv.org/abs/1108.3080
TABLE 1
 
  • #10
This thread is interesting, because opposite answers are given. I think given that stevendaryl explicitly asks about collapse, the answer is that entropy is the same after a measurement. When we perform a projective measurement on a pure state, the outcome is a pure state again and the (von Neumann) entropy of all pure states is zero.

Such a measurement can be divided into two parts: decoherence and collapse. The process of decoherence increases the entropy of the system, and collapse reduces it. Nielsen and Chuang (cited by atyy) only consider the first part, so there, entropy increases. But this doesn't produce a single outcome. Tegmark's "observation" in Table 1 (cited by Demystifier) starts with the decohered state, so there, entropy decreases. But this doesn't cover the complete measurement process, where we want to start with a pure state. If you combine the processes shown in Table 1, my point that the entropy is the same after the measurement is nicely illustrated.
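For a single qubit, the two stages can be made explicit with a small Python/NumPy sketch (the 50/50 superposition and the "up" outcome are arbitrary illustrative choices):

[code]
import numpy as np

def S(rho):
    """von Neumann entropy -Tr(rho ln rho), in nats."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

# Stage 0: pure superposition (|up> + |down>)/sqrt(2)
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(psi, psi)

# Stage 1: decoherence removes the off-diagonal terms in the measurement basis
rho_decohered = np.diag(np.diag(rho_pure))

# Stage 2: collapse selects a single outcome, say "up"
rho_collapsed = np.diag([1.0, 0.0])

# 0.0 -> ln 2 -> 0.0: decoherence raises the entropy, collapse brings it back to zero
print(S(rho_pure), S(rho_decohered), S(rho_collapsed))
[/code]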

stevendaryl said:
[...] but on the other hand, the state becomes more definite after a measurement, which would make me think that the entropy has been lowered.
It does become more definite, but only with respect to one basis; it becomes less definite with respect to other bases (consider spin measurements, for example). So I don't think we should use this kind of reasoning.
 
  • #11
Demystifier said:
This increase of entropy by projective measurement assumes that a true collapse does not exist.
But if it does, then it actually DECREASES entropy. See e.g.
http://arxiv.org/abs/1108.3080
TABLE 1

That's the basis for my original question. It seemed that if a particle was in a mixture of a spin-up state and a spin-down state, which is a high-entropy state, and then I measure the spin and find that it is spin-down, then afterward it's in a low-entropy state (pure spin-down).
 
  • #12
kith said:
This thread is interesting, because opposite answers are given. I think given that stevendaryl explicitly asks about collapse, the answer is that entropy is the same after a measurement. When we perform a projective measurement on a pure state, the outcome is a pure state again and the (von Neumann) entropy of all pure states is zero.

Okay, but in classical thermodynamics, the entropy of a system is roughly the log of the volume in phase space of the set of points consistent with macroscopic observables. Can this classical notion of entropy be affected by measurements?
 
  • #13
stevendaryl said:
It seemed that if a particle was in a mixture of a spin-up state and a spin-down state, which is a high-entropy state, and then I measure the spin and find that it is spin-down, then afterward it's in a low-entropy state (pure spin-down).
That's right if you talk about an incoherent mixture of states (which corresponds to a diagonal density matrix). There, the situation is analogous to classical statistical mechanics.

But in this situation, there's no collapse. Collapse occurs only for superpositions, which are pure states and have zero entropy (analogous to points in phase space). If your initial state is a superposition of a spin-up state and a spin-down state with respect to the z-axis, you can always find a measurement direction such that your state is an eigenstate of the corresponding spin operator. A measurement in this direction always yields the same result, hence your initial state is already a low-entropy state. It's not meaningful to assign different entropies to superpositions and eigenstates, because the property of being an eigenstate depends on the basis. Every state is a superposition with respect to some basis.
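A short Python/NumPy illustration of this basis dependence (a sketch; the state |+x> and the non-selective measurements along x and z are assumed for concreteness):

[code]
import numpy as np

def S(rho):
    """von Neumann entropy -Tr(rho ln rho), in nats."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

def measure(rho, projectors):
    """Non-selective projective measurement: rho -> sum_i P_i rho P_i."""
    return sum(P @ rho @ P for P in projectors)

# |+x> = (|0> + |1>)/sqrt(2): an eigenstate of sigma_x, a superposition wrt sigma_z
psi = np.array([[1.0], [1.0]]) / np.sqrt(2)
rho = psi @ psi.T

plus_x  = np.array([[1.0], [ 1.0]]) / np.sqrt(2)
minus_x = np.array([[1.0], [-1.0]]) / np.sqrt(2)
Px = [plus_x @ plus_x.T, minus_x @ minus_x.T]    # sigma_x eigenprojectors
Pz = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]  # sigma_z eigenprojectors

print(S(measure(rho, Px)))   # 0.0: measuring along x leaves the state pure
print(S(measure(rho, Pz)))   # ln 2: measuring along z destroys the coherence
[/code]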
 
  • #14
All the discussion above seems to be about von Neumann entropy. The von Neumann entropy is zero for any pure state. Hence, if the initial and final states are pure states (whatever the unobserved intermediate states are), then the entropy of the final state is the same as that of the initial state.

Then why does entropy in Nature increase with time? Because the entropy responsible for that is NOT the von Neumann entropy. To get an entropy which increases with time, one must introduce some COARSE GRAINING. It is the corresponding coarse grained entropy (and not the von Neumann entropy) which increases with time.
 
  • #15
Demystifier said:
All the discussion above seems to be about von Neumann entropy. The von Neumann entropy is zero for any pure state. Hence, if the initial and final states are pure states (whatever the unobserved intermediate states are), then the entropy of the final state is the same as that of the initial state.

Then why does entropy in Nature increase with time? Because the entropy responsible for that is NOT the von Neumann entropy. To get an entropy which increases with time, one must introduce some COARSE GRAINING. It is the corresponding coarse grained entropy (and not the von Neumann entropy) which increases with time.

My original post about entropy was not about von Neumann entropy, but about thermodynamic entropy.
 
  • #16
Demystifier said:
All the discussion above seems to be about von Neumann entropy. The von Neumann entropy is zero for any pure state. Hence, if the initial and final states are pure states (whatever the unobserved intermediate states are), then the entropy of the final state is the same as that of the initial state.

Then why does entropy in Nature increase with time? Because the entropy responsible for that is NOT the von Neumann entropy. To get an entropy which increases with time, one must introduce some COARSE GRAINING. It is the corresponding coarse grained entropy (and not the von Neumann entropy) which increases with time.

Would this be the same as considering that the density matrix is a reduced density matrix (ie. we can only access the state of a subsystem)? So for a sequence of pure states of the universe, the von Neumann entropy of the reduced density matrix would still increase?

(I guess I'm asking if integrating out is a good enough form of coarse graining.)
 
  • #17
atyy said:
Would this be the same as considering that the density matrix is a reduced density matrix (ie. we can only access the state of a subsystem)? So for a sequence of pure states of the universe, the von Neumann entropy of the reduced density matrix would still increase?
If there was nothing resembling the wave function collapse, then one could say that entropy of the subsystem increases, due to decoherence. But the fact is that something resembling the wave function collapse does exist (which decoherence by itself cannot explain, which is why decoherence does not completely solve the measurement problem). For that matter it is not important whether the collapse is related to consciousness (von Neumann), or happens spontaneously (GRW), or is only an illusion (many worlds, Bohmian, etc.), as long as it exists at least in the FAPP sense.
 
  • #18
atyy said:
(I guess I'm asking if integrating out is a good enough form of coarse graining.)
I think it's not.
 
  • #19
Demystifier said:
If there was nothing resembling the wave function collapse, then one could say that entropy of the subsystem increases, due to decoherence. But the fact is that something resembling the wave function collapse does exist (which decoherence by itself cannot explain, which is why decoherence does not completely solve the measurement problem). For that matter it is not important whether the collapse is related to consciousness (von Neumann), or happens spontaneously (GRW), or is only an illusion (many worlds, Bohmian, etc.), as long as it exists at least in the FAPP sense.

Demystifier said:
I think it's not.

I guess it's not obvious to me that the reduced density matrix doesn't involve collapse, since I've seen it said that the Born rule is implicitly used in getting it.
 
  • #20
Demystifier said:
If there was nothing resembling the wave function collapse, then one could say that entropy of the subsystem increases, due to decoherence.
But in most cases, there is nothing resembling wave function collapse. Decoherence occurs whenever the interaction between systems leads to entanglement and not only during measurements.

I'm still wondering how exactly this is related to the entropy increases we observe. If we start with two pure states, I guess that any fundamental interaction that leads to maximal entanglement should begin to disentangle the systems afterwards. So I would expect an oscillating entropy for the systems (in classical mechanics, no entropy change arises from such a situation). Which of course would call for an explanation why our observations always take place in the rising entropy domain.
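This expectation can be checked in a toy model (a Python/NumPy sketch; the exchange coupling and the initial product state |01> are arbitrary assumptions): two qubits evolving unitarily under an interaction Hamiltonian, with the entanglement entropy of one qubit tracked over time. It does oscillate between 0 and ln 2:

[code]
import numpy as np

def S(rho):
    """von Neumann entropy -Tr(rho ln rho), in nats."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

# Two qubits coupled by an exchange interaction H = g(|01><10| + |10><01|),
# basis order |00>, |01>, |10>, |11>; the coupling g and the initial state are arbitrary.
g = 1.0
H = np.zeros((4, 4), dtype=complex)
H[1, 2] = H[2, 1] = g
E, V = np.linalg.eigh(H)

psi0 = np.zeros(4, dtype=complex)
psi0[1] = 1.0                                           # product state |01>

for t in np.linspace(0.0, np.pi, 9):
    U = V @ np.diag(np.exp(-1j * E * t)) @ V.conj().T   # exp(-iHt)
    psi = U @ psi0
    rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
    rho_A = np.trace(rho, axis1=1, axis2=3)              # trace out qubit B
    print(f"t = {t:.2f}   S_A = {S(rho_A):.3f}")          # oscillates: 0 -> ln 2 -> 0 -> ...
[/code]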
 
  • #21
atyy said:
I guess it's not obvious to me that the reduced density matrix doesn't involve collapse, since I've seen it said that the Born rule is implicitly used in getting it.
If you do an experiment, you get one definite outcome for your observable. The density matrix contains the probabilities for all possible outcomes, so it isn't the final state you perceive.
 
  • #22
atyy said:
I guess it's not obvious to me that the reduced density matrix doesn't involve collapse, since I've seen it said that the Born rule is implicitly used in getting it.
Mathematically, the reduced density matrix is obtained by partial tracing, which technically does not depend on the Born rule. The Born rule only serves as a motivation for doing the partial trace, but formally you can do the partial trace even without such a motivation.

A Born-rule-independent motivation for doing the partial trace is the fact that the evolution of the resulting object (reduced density matrix) does not depend on the whole Hamiltonian, but only on the Hamiltonian for the subsystem.
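A concrete illustration (a Python/NumPy sketch; the Bell state is just an example): the partial trace is a plain sum over the environment indices, with no Born rule invoked, and for a maximally entangled pure state it yields a maximally mixed reduced state:

[code]
import numpy as np

def S(rho):
    """von Neumann entropy -Tr(rho ln rho), in nats."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

# Bell state (|00> + |11>)/sqrt(2): the total state is pure, so S(rho_AB) = 0
psi = np.zeros(4)
psi[0] = psi[3] = 1.0 / np.sqrt(2)
rho_AB = np.outer(psi, psi)

# The partial trace over B is just a sum over the B indices
rho_A = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(S(rho_AB))   # 0.0
print(S(rho_A))    # ln 2: the subsystem alone is maximally mixed
[/code]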
 
  • #23
kith said:
So I would expect an oscillating entropy for the systems (in classical mechanics, no entropy change arises from such a situation).
In both quantum and classical mechanics (based on the deterministic Schrodinger and Newton equations, respectively), it is true that entropy will start to decrease after a certain time, and the system will return arbitrarily close to its initial state. However, in both cases, the typical time needed for such a return is many orders of magnitude larger than the age of the Universe.

kith said:
Which of course would call for an explanation why our observations always take place in the rising entropy domain.
That's easy to explain. If we happen to live in an entropy-decreasing era, we will naturally redefine the sign of time accordingly, i.e., re-interpret it as an era in which entropy is increasing. That's because our brain, and ability to remember, is also determined by the direction in which entropy increases.

The good question is why the direction in which entropy increases is everywhere the same, i.e., why it is not the case that entropy increases in one subsystem and decreases in another? The answer is that it is interaction between the subsystems which causes them to have the same direction of the entropy increase:
http://arxiv.org/abs/1011.4173v5
 
  • #24
Demystifier said:
In both quantum and classical mechanics (based on the deterministic Schrodinger and Newton equations, respectively), it is true that entropy will start to decrease after a certain time, and the system will return arbitrarily close to its initial state.
Are you talking about the von Neumann and Liouville entropies or about some coarse-grained entropy? I don't see how the Liouville entropy may increase for a subsystem if it remains constant for the whole system.

Demystifier said:
If we happen to live in an entropy-decreasing era, we will naturally redefine the sign of time accordingly, i.e., re-interpret it as an era in which entropy is increasing. That's because our brain, and ability to remember, is also determined by the direction in which entropy increases.
Interesting, I haven't thought along these lines before. Can you recommend a not too technical article which expands on this? Also thanks for the link to your article, it looks quite promising. /edit: I'm skimming it right now and it probably simply is this reference I asked about ;-).
 
  • #25
kith said:
Are you talking about the von Neumann and Liouville entropies or about some coarse-grained entropy? I don't see how the Liouville entropy may increase for a subsystem if it remains constant for the whole system.
I am talking about coarse-grained entropy, of course.

kith said:
Interesting, I haven't thought along these lines before. Can you recommend a not too technical article which expands on this?
For example, Hawking explains it in his "Brief History of Time". I guess I cannot recommend anything less technical than that.

A much more refined analysis of that stuff, but still very non-technical, is the book:
H. Price, Time's Arrow and Archimedes' Point

Another good related non-technical book is:
D. Z. Albert, Time and Chance

There is also a good non-technical chapter on that in:
R. Penrose, The Emperor's New Mind
 
  • #26
Demystifier said:
Mathematically, the reduced density matrix is obtained by partial tracing, which technically does not depend on the Born rule. The Born rule only serves as a motivation for doing the partial trace, but formally you can do the partial trace even without such a motivation.

A Born-rule-independent motivation for doing the partial trace is the fact that the evolution of the resulting object (reduced density matrix) does not depend on the whole Hamiltonian, but only on the Hamiltonian for the subsystem.

If the Born rule isn't used, couldn't one just give an arbitrary reweighting of the sum over the environment and still get an object defined only on the subsystem? (I.e., is the averaging over the environment unique if the Born rule isn't used?)
 
  • #27
Demystifier said:
The good question is why the direction in which entropy increases is everywhere the same, i.e., why it is not the case that entropy increases in one subsystem and decreases in another? The answer is that it is interaction between the subsystems which causes them to have the same direction of the entropy increase:
http://arxiv.org/abs/1011.4173v5

To me, the mystery of entropy is not just that the arrows of time for all parts of the universe are the same, but that the thermodynamic arrow of time is aligned with the cosmological arrow of time. That is, for all parts of the universe, entropy decreases in the direction of the Big Bang.
 
  • #28
kith said:
If you do an experiment, you get one definite outcome for your observable. The density matrix contains the probabilities for all possible outcomes, so it isn't the final state you perceive.

Well, a nondeterministic theory cannot possibly describe the final outcome, it can only describe the set of possibilities and their associated probabilities.
 
  • #29
stevendaryl said:
Well, a nondeterministic theory cannot possibly describe the final outcome, it can only describe the set of possibilities and their associated probabilities.

The recipe of using the reduced matrix may not imply the collapse interpretation, but it seems that is as close as you can get.

The collapse interpretation says that initially the system is in some state [itex]\vert \Psi\rangle[/itex]. You perform an experiment to measure some observable with eigenvalues [itex]\lambda[/itex] and corresponding eigenstates [itex]\vert \Psi_\lambda\rangle[/itex] (for simplicity, assume non-degeneracy). Then the results are that afterward:

For every value of [itex]\lambda[/itex], there is a probability of [itex]\vert \langle \Psi \vert \Psi_\lambda\rangle \vert^2[/itex] that the system is in state [itex]\vert \Psi_\lambda\rangle[/itex]

This is captured by the density matrix formalism as the transition

[itex]\vert \Psi \rangle \langle \Psi \vert \Rightarrow \sum_\lambda \vert \langle \Psi \vert \Psi_\lambda\rangle \vert^2 \vert \Psi_\lambda \rangle \langle \Psi_\lambda \vert[/itex]
 
  • #30
I guess thinking about it classically, Demystifier's argument must be right. Measurement gives us more information, which is a reduction in entropy. Entropy increases when we forget, according to Landauer's exorcism of Maxwell's demon.

I guess what's not obvious to me is - how much coarse graining do we need, since the partial trace in getting the reduced density matrix is a form of coarse graining?
 
  • #31
The von Neumann entropy and the Shannon entropy are then the same if we average over a time that is not small compared with the relaxation time of the subsystems but is small compared with the relaxation time of the complete system.

When the decoherence process is finished, the off-diagonal terms of the density operator are null, so the entropy has grown. But once the collapse is taken into account, the entropy can decrease again: in the extreme case to a pure state, or at least to a mixture with no more entropy than before the decoherence process started. If the measurement is not ideal, the entropy can still increase, though obviously not up to the entropy at the end of the decoherence process, because in that case no information would be gained and there would be no measurement at all.
 
  • #32
Of course, the wave function is different depending on whether we consider the collapse or not, and so is its subsequent evolution. But the situations differ: if you consider the measurement as having taken place, the posterior wavefunction gives the probabilities conditioned on the measurement result; if you don't consider the collapse and instead consider the total evolution, you only obtain the probabilities of measurement outcomes, not conditioned on any particular result. All the problems are solved by considering the wavefunction to be only an instrument for calculating the probabilities of a measurement, relative to the previous information, and not a real state of the system. The measurement problem doesn't exist under this view, and collapse is perfectly OK.
 
  • #33
StarsRuler said:
Of course, the wave function is different depending on whether we consider the collapse or not, and so is its subsequent evolution. But the situations differ: if you consider the measurement as having taken place, the posterior wavefunction gives the probabilities conditioned on the measurement result; if you don't consider the collapse and instead consider the total evolution, you only obtain the probabilities of measurement outcomes, not conditioned on any particular result. All the problems are solved by considering the wavefunction to be only an instrument for calculating the probabilities of a measurement, relative to the previous information, and not a real state of the system. The measurement problem doesn't exist under this view, and collapse is perfectly OK.

I don't think that viewing the wave function as merely a matter of information is very satisfying. For one thing, there is interference between alternatives. What does it mean for information to interfere with other information? For another, when people talk about information, it's usually the case that they make a distinction between the state of the world and our information about that state. Bell's theorem shows that there is no sensible notion of "state" that the wave function could be about.

Something that I've thought about that in some ways makes the information-theoretic view more palatable to me, personally, is the "consistent histories" approach. You give up on describing what the state of the world is and the dynamics of that world state, and instead view the object of interest to be histories of observations. Quantum mechanics tells us the relative probabilities of those histories.
 
  • #34
stevendaryl said:
The recipe of using the reduced matrix may not imply the collapse interpretation, but it seems that is as close as you can get.

The collapse interpretation says that initially the system is in some state [itex]\vert \Psi\rangle[/itex]. You perform an experiment to measure some observable with eigenvalues [itex]\lambda[/itex] and corresponding eigenstates [itex]\vert \Psi_\lambda\rangle[/itex] (for simplicity, assume non-degeneracy). Then the results are that afterward:

For every value of [itex]\lambda[/itex], there is a probability of [itex]\vert \langle \Psi \vert \Psi_\lambda\rangle \vert^2[/itex] that the system is in state [itex]\vert \Psi_\lambda\rangle[/itex]

This is captured by the density matrix formalism as the transition

[itex]\vert \Psi \rangle \langle \Psi \vert \Rightarrow \sum_\lambda \vert \langle \Psi \vert \Psi_\lambda\rangle \vert^2 \vert \Psi_\lambda \rangle \langle \Psi_\lambda \vert[/itex]

See the density matrix above: in the transactional interpretation, what you get due to responses from all available absorbers is precisely that density matrix. Each weighted projection operator is an incipient transaction, and its weight is the Born Rule. This is true collapse in the sense that there is no causal account for why one of the projection operators becomes the actualized outcome. However, TI provides a clear physical account of measurement as formulated in von Neumann's 'process 1' (the transition from pure state to density matrix), whereas no other interpretation can give an account of this other than a decision-theoretic one. The Bohmian theory appears to offer a solution, but it is far from clear that having a 'particle' in one of the channels translates to actualization of that outcome (see, e.g., the critique by Wallace and Brown, 2005 -- ref on request).

What I can't understand is why there seems to be so much resistance to this obvious solution. Yes, it involves including advanced field solutions (with negative energies), but you can't get away from that in the relativistic domain anyway. So the basic message is that in order to solve the measurement problem, you need to take into account relativistic processes (emission and absorption) in a direct-action picture (basic field propagation is time/energy-symmetric). This is very natural, since it is the most general theoretical formulation. Note that you can regain the empirical asymmetry of radiative processes with appropriate boundary conditions. The existence of those BC then becomes theoretically falsifiable, which makes it a stronger theory methodologically -- in contrast to the standard approach, in which an asymmetric field theory is simply assumed ad hoc.

See my new book on TI, in particular Chapters 3 and 4, for how TI solves the measurement problem. Sorry the book is rather pricey, but you can get it at many libraries and on interlibrary loan if interested. Also, I will provide specially discounted, autographed copies for students with documented financial hardship. Contact me through my website to apply for this discount.
http://transactionalinterpretation.org/2012/10/09/to-contact-me/
 
  • #35
I don't know the details of the Bohmian interpretation. But there is no QFT version of this interpretation. I don't know whether that is because it is not possible, or because it is simply not clear whether it is possible, but it is an open question.
 
