# Quantum measurement and entropy

In summary: the entropy of the universe has not changed, but the entropy of the single qubit being measured has decreased.

#### gptejms

A quantum system goes from an uncertain to a certain state upon measurement. This indicates a decrease of entropy--is there a corresponding increase of entropy elsewhere (environment/observer)? Is there any work done on the system in the act of measurement?

gptejms said:
A quantum system goes from an uncertain to a certain state upon measurement. This indicates a decrease of entropy

From the perspective of the single variable being measured, the prior and posterior distributions can have a different entropy, but what variables are involved in your definition of the entropy of a "quantum system"? Are you only considering a single variable?

You can have a multitude of quantum systems (say qubits) that are being measured. The system's state is much more uncertain (bigger phase space?) before measurement than it is upon measurement.

gptejms said:
A quantum system goes from an uncertain to a certain state upon measurement. This indicates a decrease of entropy--is there a corresponding increase of entropy elsewhere (environment/observer)? Is there any work done on the system in the act of measurement?
The answer to both questions is no. This is consistent with the 2nd law of thermodynamics, provided that you only compare one measured state with another measured state.

Above you compared a high entropy state before measurement with the final low entropy measured state. Those states belong to different categories (the second is measured while the first is not), so they should not be compared in a verification of the 2nd law.

It depends also on which notion of entropy you consider. From an information-theoretical point of view the von Neumann entropy is a natural concept, and that's the entropy used in statistical mechanics to describe the corresponding thermodynamical quantity. The von Neumann entropy is given by
$$S=-\mathrm{Tr} \hat{\rho} \ln \hat{\rho},$$
where ##\hat{\rho}## is the statistical operator of the system. For a pure state, where ##\hat{\rho}=|\psi \rangle \langle \psi |## you have ##S=0##.

Now if you make a measurement, it depends on what happens to the system. If you make an ideal and complete von Neumann filter measurement and keep only states of a particular outcome, you have prepared the system in a new pure state, and the entropy doesn't change. If you make an incomplete ideal filter measurement, the entropy increases, because then you get a non-pure state.
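As a quick numerical check of the formula above, here is a sketch using NumPy (the helper name `von_neumann_entropy` is my own): a pure state gives ##S=0##, while the maximally mixed qubit state gives ##S=\ln 2##.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # 0 ln 0 -> 0 by convention
    return -np.sum(evals * np.log(evals))

# Pure state |psi><psi| with |psi> = (|0> + |1>)/sqrt(2): S = 0
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(psi, psi.conj())

# Maximally mixed qubit state I/2: S = ln 2
rho_mixed = np.eye(2) / 2

print(von_neumann_entropy(rho_pure))   # ~0
print(von_neumann_entropy(rho_mixed))  # ~0.693 = ln 2
```

Diagonalizing ##\hat{\rho}## first is what makes this basis-independent: the trace only depends on the eigenvalues.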

gptejms said:
A quantum system goes from an uncertain to a certain state upon measurement.This indicates a decrease of entropy--is there a corresponding increase of entropy elsewhere(environment/observer)?Is there any work done on the system in the act of measurement?

Actually, collapse causes an increase in entropy. Quantum mechanics without collapse is deterministic: if you know the input state, you know the output state. With collapse the output is probabilistic: a single input state is mapped to multiple possible output states. The entropy of the output is higher than the entropy of the input.

When you learn the result of a measurement, it doesn't fix this problem. The entropy of the universe has still gone up. It's just that the mutual information between you and the system also went up. It's important to keep those two concepts (mutual information vs entropy of a process) separate in your mind.
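The distinction between mutual information and entropy can be made concrete with classical Shannon entropies. A sketch (my own illustration, entropies in nats): after measuring a qubit prepared in ##(|0\rangle+|1\rangle)/\sqrt{2}##, the observer's record and the system's state are perfectly correlated.

```python
import numpy as np

def H(p):
    """Shannon entropy in nats, with 0 ln 0 = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Joint distribution of (record o, system state s) after measurement:
# p(o=0, s=0) = p(o=1, s=1) = 1/2, perfectly correlated.
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
H_S = H(joint.sum(axis=0))   # marginal entropy of the system: ln 2
H_O = H(joint.sum(axis=1))   # marginal entropy of the record: ln 2
H_SO = H(joint.ravel())      # joint entropy: ln 2
I = H_S + H_O - H_SO         # mutual information: ln 2

print(H_S, I, H_S - I)       # H(S|O) = H_S - I = 0
```

The system's marginal entropy rose from 0 to ##\ln 2##, but the mutual information rose by the same amount, so conditioned on the observer's record the system is certain.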

Strilanc said:
Actually, collapse causes an increase in entropy. Quantum mechanics without collapse is deterministic: if you know the input state, you know the output state. With collapse the output is probabilistic: a single input state is mapped to multiple possible output states. The entropy of the output is higher than the entropy of the input.

When you learn the result of a measurement, it doesn't fix this problem. The entropy of the universe has still gone up. It's just that the mutual information between you and the system also went up. It's important to keep those two concepts (mutual information vs entropy of a process) separate in your mind.
The entropy of the system being measured has gone down--the information content of the qubit has been reduced. By mutual information, do you mean the knowledge of the state of the system gained by the observer?

Demystifier said:
The answer to both questions is no. This is consistent with the 2nd law of thermodynamics, provided that you only compare one measured state with another measured state.

Above you compared a high entropy state before measurement with the final low entropy measured state. Those states belong to different categories (the second is measured while the first is not), so they should not be compared in a verification of the 2nd law.
As long as I am talking about the entropy of the universe, why can't I compare the (total) entropy before measurement to that after measurement?

gptejms said:
As long as I am talking about the entropy of the universe, why can't I compare the (total) entropy before measurement to that after measurement?
The precise answer depends on the choice of interpretation of quantum mechanics. An answer that does not depend much on interpretation would be that the state before measurement represents possibilities, while the measured state represents the actual reality.

In fact, there is a simple classical analogue. Consider a coin flip experiment. Before you look at the coin, you don't know whether it is heads or tails. This uncertainty corresponds to entropy ##\ln 2##. But when you look at it, you know whether it is heads or tails, which reduces the entropy to ##\ln 1 = 0##.
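The coin-flip arithmetic above can be spelled out in a few lines (my own illustration, entropies in nats):

```python
import math

# Before looking: two equally likely outcomes -> S = ln 2
p_before = [0.5, 0.5]
S_before = -sum(p * math.log(p) for p in p_before)

# After looking: one certain outcome -> S = ln 1 = 0
p_after = [1.0]
S_after = -sum(p * math.log(p) for p in p_after)

print(S_before, S_after)  # entropy drops from ln 2 (~0.693) to 0
```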

vanhees71 said:
It depends also on which notion of entropy you consider. From an information-theoretical point of view the von Neumann entropy is a natural concept, and that's the entropy used in statistical mechanics to describe the corresponding thermodynamical quantity. The von Neumann entropy is given by
$$S=-\mathrm{Tr} \hat{\rho} \ln \hat{\rho},$$
where ##\hat{\rho}## is the statistical operator of the system. For a pure state, where ##\hat{\rho}=|\psi \rangle \langle \psi |## you have ##S=0##.

Now if you make a measurement, it depends on what happens to the system. If you make an ideal and complete von Neumann filter measurement and keep only states of a particular outcome, you have prepared the system in a new pure state, and the entropy doesn't change. If you make an incomplete ideal filter measurement, the entropy increases, because then you get a non-pure state.
Just a question--what is the von Neumann entropy of a qubit?

Demystifier said:
The precise answer depends on the choice of interpretation of quantum mechanics. An answer that does not depend much on interpretation would be that the state before measurement represents possibilities, while the measured state represents the actual reality.

In fact, there is a simple classical analogue. Consider a coin flip experiment. Before you look at the coin, you don't know whether it is heads or tails. This uncertainty corresponds to entropy ##\ln 2##. But when you look at it, you know whether it is heads or tails, which reduces the entropy to ##\ln 1 = 0##.
From the moment a coin is tossed to the point it falls on the ground, it obeys well-defined Newtonian laws and is in a definite orientation at all times. So, there is no uncertainty in its state. Besides, when it falls, work is done by the ground to stop its motion and settle its orientation as heads or tails!

gptejms said:
So, there is no uncertainty in its state.
There is no uncertainty in its state, but there is uncertainty in your knowledge of its state. Some interpretations of QM say that quantum uncertainty is also uncertainty of your knowledge. Other interpretations deny it, but as I said, a detailed answer in the quantum case depends on the choice of interpretation.

Demystifier said:
There is no uncertainty in its state, but there is uncertainty in your knowledge of its state. Some interpretations of QM say that quantum uncertainty is also uncertainty of your knowledge. Other interpretations deny it, but as I said, a detailed answer in the quantum case depends on the choice of interpretation.
Two things--1. We can talk about the entropy of a system before measurement and after measurement.
2. The entropy reduces even for a classical system upon measurement--now is there any work done?

gptejms said:
1. We can talk about the entropy of a system before measurement and after measurement.
Sure, but we cannot measure the entropy of a system before measurement. You have probably heard that measurement in QM plays a much more important role than measurement in classical mechanics.

gptejms said:
2. The entropy reduces even for a classical system upon measurement--now is there any work done?
No work is done. Work is associated with thermal entropy, while here we are talking about a reduction of a non-thermal entropy.

Demystifier said:
Sure, but we cannot measure the entropy of a system before measurement. You have probably heard that measurement in QM plays a much more important role than measurement in classical mechanics.
When you said ##\ln 2## before the coin toss, we did quantify entropy before measurement.

Demystifier said:
No work is done. Work is associated with thermal entropy, while here we are talking about a reduction of a non-thermal entropy.
So, non-thermal entropy can be reduced upon measurement. Thermal entropy (associated with phase space only) cannot be reduced without work being done--right?

gptejms said:
When you said ##\ln 2## before the coin toss, we did quantify entropy before measurement.
Quantification is not the same as measurement.

gptejms said:
So, non-thermal entropy can be reduced upon measurement.
That's true for some kinds of non-thermal entropy, not for every non-thermal entropy.

gptejms said:
Thermal entropy (associated with phase space only) cannot be reduced without work being done--right?
Not quite. Are you familiar with the following thermodynamic relations?
$$dU=dQ-PdV$$
$$dS=\frac{dQ}{T}, \;\; dW=PdV$$

Demystifier said:
Not quite. Are you familiar with the following thermodynamic relations?
$$dU=dQ-PdV$$
$$dS=\frac{dQ}{T}, \;\; dW=PdV$$
Please explain further.

gptejms said:
Please explain further.
##dW## is the work, ##dS## is the change of entropy. Taking ##dW=0## gives ##dU=TdS##, meaning that entropy can change without work.
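A concrete instance of entropy changing without work, assuming an ideal monatomic gas heated at constant volume (the numbers are purely illustrative):

```python
import math

# Heating an ideal monatomic gas at constant volume: dW = P dV = 0,
# yet dS = dQ/T > 0.  For n moles, Cv = (3/2) R.
R = 8.314          # gas constant, J/(mol K)
n = 1.0            # moles
Cv = 1.5 * R
T1, T2 = 300.0, 600.0

W = 0.0                          # no volume change -> no work
dS = n * Cv * math.log(T2 / T1)  # integral of dQ/T = n Cv dT / T

print(W, dS)  # work is zero, entropy change ~ +8.64 J/K
```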

Demystifier said:
##dW## is the work, ##dS## is the change of entropy. Taking ##dW=0## gives ##dU=TdS##, meaning that entropy can change without work.
This is just the first law of thermodynamics--it doesn't prohibit unrestrained conversion of internal energy to ##dQ## (which equals ##TdS## only for an isothermal process).

Coming back to my original question, because of ##\delta p## and ##\delta q##, a quantum system's phase space should be bigger than that of a classical system--so does it have a higher entropy?

gptejms said:
Coming back to my original question, because of ##\delta p## and ##\delta q##, a quantum system's phase space should be bigger than that of a classical system--so does it have a higher entropy?
The set of all quantum states is the Hilbert space, not the phase space. So one cannot conclude that it has a higher entropy.

Demystifier said:
The set of all quantum states is the Hilbert space, not the phase space. So one cannot conclude that it has a higher entropy.
We agreed that the thermal entropy is related to the phase space only. A quantum system is not without a phase space--so I am asking: does a quantum system occupy a bigger phase space and have a higher thermal entropy?

gptejms said:
We agreed that the thermal entropy is related to the phase space only.
I didn't say that I agree with that. But I didn't explicitly complain about your statement because it didn't seem essential at that time.

Demystifier said:
I didn't say that I agree with that. But I didn't explicitly complain about your statement because it didn't seem essential at that time.
Logically correct, but it doesn't answer my question.

gptejms said:
Logically correct, but it doesn't answer my question.
As far as I know, there is no simple general answer. Quantum entropy is sometimes bigger and sometimes smaller than its classical counterpart.

gptejms said:
Just a question--what is the von Neumann entropy of a qubit?
The von Neumann entropy refers to the state of a quantum system. So the correct question to ask is what the von Neumann entropy is for a given state of a qubit. A qubit is a system with a two-dimensional Hilbert space of states. The most general state is given by a statistical operator, and we can work in a given basis (e.g., if we consider the spin of a spin-1/2 particle, the standard eigenbasis ##|+1/2 \rangle, |-1/2 \rangle## of the operator ##\hat{\sigma}_3##). The statistical operator is given as a positive semidefinite Hermitian matrix ##\rho_{ij}## with trace 1, and the entropy is defined as
$$S=-k_{\text{B}} \mathrm{Tr} \hat{\rho} \ln \hat{\rho}.$$
Now you can always diagonalize the matrix representing the statistical operator; then it's just given by the corresponding matrix elements ##\rho_{ij}'=\mathrm{diag}(p,1-p)##, and the (basis-independent!) entropy is given by
$$S=-k_{\text{B}} [p \ln p + (1-p) \ln(1-p)].$$
Of course ##p \in [0,1]##, and by definition for ##p=0## or ##p=1## you have to interpret ##x \ln x## for ##x \rightarrow 0## as ##0##.

You can prove that you have maximal information about the system if and only if ##p=0## or ##p=1##, i.e., if the qubit is prepared in a pure state. The entropy becomes maximal for ##p=1/2##, as is expected for thermal equilibrium.
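The binary-entropy formula above is easy to check numerically. A sketch (setting ##k_{\text{B}}=1##; the helper name `qubit_entropy` is mine):

```python
import numpy as np

def qubit_entropy(p):
    """S(p) = -[p ln p + (1-p) ln(1-p)] in units of k_B, with 0 ln 0 = 0."""
    return -sum(x * np.log(x) for x in (p, 1 - p) if x > 0)

ps = np.linspace(0, 1, 101)
S = np.array([qubit_entropy(p) for p in ps])

print(qubit_entropy(0.0), qubit_entropy(1.0))  # pure states: both vanish
print(qubit_entropy(0.5), np.log(2))           # maximum equals ln 2
print(ps[np.argmax(S)])                        # maximized at p = 0.5
```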

vanhees71 said:
The von Neumann entropy refers to the state of a quantum system. So the correct question to ask is what the von Neumann entropy is for a given state of a qubit. A qubit is a system with a two-dimensional Hilbert space of states. The most general state is given by a statistical operator, and we can work in a given basis (e.g., if we consider the spin of a spin-1/2 particle, the standard eigenbasis ##|+1/2 \rangle, |-1/2 \rangle## of the operator ##\hat{\sigma}_3##). The statistical operator is given as a positive semidefinite Hermitian matrix ##\rho_{ij}## with trace 1, and the entropy is defined as
$$S=-k_{\text{B}} \mathrm{Tr} \hat{\rho} \ln \hat{\rho}.$$
Now you can always diagonalize the matrix representing the statistical operator; then it's just given by the corresponding matrix elements ##\rho_{ij}'=\mathrm{diag}(p,1-p)##, and the (basis-independent!) entropy is given by
$$S=-k_{\text{B}} [p \ln p + (1-p) \ln(1-p)].$$
Of course ##p \in [0,1]##, and by definition for ##p=0## or ##p=1## you have to interpret ##x \ln x## for ##x \rightarrow 0## as ##0##.

You can prove that you have maximal information about the system if and only if ##p=0## or ##p=1##, i.e., if the qubit is prepared in a pure state. The entropy becomes maximal for ##p=1/2##, as is expected for thermal equilibrium.
The von Neumann entropy is maximal for ##p=1/2## (and so is its information content)--this entropy decreases upon measurement.

Demystifier said:
Quantum entropy is sometimes bigger and sometimes smaller than its classical counterpart.

What would these cases be?

gptejms said:
The von Neumann entropy is maximal for ##p=1/2## (and so is its information content)--this entropy decreases upon measurement.
No, maximum entropy means you have the least possible information about the system. Entropy is a measure of missing information. Of course, if you prepare the system in any state other than this equilibrium state, its entropy gets smaller, i.e., there's less missing information. For a very good introduction to the information-theoretical approach to statistical physics, see

https://www.amazon.com/dp/0199595062/?tag=pfamazon01-20

vanhees71 said:
No, maximum entropy means you have the least possible information about the system. Entropy is a measure of missing information. Of course, if you prepare the system in any state other than this equilibrium state, its entropy gets smaller, i.e., there's less missing information. For a very good introduction to the information-theoretical approach to statistical physics, see

https://www.amazon.com/dp/0199595062/?tag=pfamazon01-20
I don't have access to the book. I think 'information about the system' and 'information content' are two different things--I was talking about the latter. Interestingly, 'information about the system' involves a conscious and intelligent observer, and I've been toying with the idea/question: is 'information about the system' + entropy of the system = constant? Let us look at this from the standpoint of Maxwell's demon--the more information he is able to gain about the velocities of individual molecules, the more he can reduce the entropy of the system.

Of course, I'm talking about information in the information-theoretical sense, which has nothing to do with consciousness or vague philosophical ideas of this kind.

Maxwell's demon (particularly its quantum realization in cavity QED in recent works) is for me the prime example of the necessity of introducing information-theoretical methods into a full understanding of (quantum) statistical physics. Among other things, these investigations clearly show the correctness of the concept of entropy in the sense of the Shannon-Jaynes-von Neumann entropy of statistical physics. Recently it has been shown that the maximum entropy of a qubit is indeed ##k_{\text{B}} \ln 2##.

http://www.pnas.org/content/114/29/7561
https://arxiv.org/abs/1702.05161

https://www.nature.com/articles/s41567-018-0250-5

vanhees71 said:
Of course, I'm talking about information in the information-theoretical sense, which has nothing to do with consciousness or vague philosophical ideas of this kind.

Maxwell's demon (particularly its quantum realization in cavity QED in recent works) is for me the prime example of the necessity of introducing information-theoretical methods into a full understanding of (quantum) statistical physics. Among other things, these investigations clearly show the correctness of the concept of entropy in the sense of the Shannon-Jaynes-von Neumann entropy of statistical physics. Recently it has been shown that the maximum entropy of a qubit is indeed ##k_{\text{B}} \ln 2##.

http://www.pnas.org/content/114/29/7561
https://arxiv.org/abs/1702.05161

https://www.nature.com/articles/s41567-018-0250-5
From what I have roughly understood, the memory entropy is greater than the decrease of entropy of the system: so 'information about the system/memory entropy' + 'entropy of the system' is constant or increasing--right?