Quantum computation and entropy

In summary, quantum gates must be reversible because information must be conserved, and entropy remains constant only if the process is reversible.
  • #1
antonantal
Quantum gates must be reversible.
The usual justification for this is that in QM the time evolution of a system is a unitary operator which, by linear algebra, is reversible (invertible).
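For concreteness, here is a minimal numerical aside (my own illustration, not part of the argument): the Hadamard gate is unitary and hence invertible; in fact it is its own inverse.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate

print(np.allclose(H.conj().T @ H, np.eye(2)))   # True: unitary, H†H = I
print(np.allclose(H @ H, np.eye(2)))            # True: H is its own inverse
```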

But I am trying to get a better intuition of this, so I came up with the following explanation:
In order to maintain the superposition state, information must be conserved during quantum computation (note that this isn't the case at measurement, when we lose the superposition and we lose information). So a quantum gate must conserve information.
If information is conserved, then entropy must remain constant. The 2nd law of thermodynamics says that entropy remains constant only if the process is reversible. So, the process performed by a quantum gate must be reversible.

Is my reasoning correct?
 
  • Like
Likes vanhees71
  • #2
antonantal said:
So, the process performed by a quantum gate must be reversible.

Is my reasoning correct?
It depends. The physical realization of the process does not need to be reversible. In fact, error correction only works because it is irreversible. But your reasoning is correct in that the logical process performed must be reversible.
 
  • Like
Likes antonantal and vanhees71
  • #3
gentzen said:
The physical realization of the process does not need to be reversible.
Aren't actual physical quantum gates unitary? Unitary == reversible.

gentzen said:
In fact, error correction only works because it is irreversible.
Why is this?
 
  • #4
In theory yes, but isolating them from the environment well enough to completely avoid decoherence is very difficult (if not impossible). Decoherence is a very effective "mechanism"!
 
  • #5
I think it is important to distinguish clearly between the abstract concept of quantum computing and the actual implementation of a quantum computer in reality.

Looking at a quantum circuit in an abstract way, computational steps that use only unitary gates leave the entropy invariant. However, there are many protocols (in other words, algorithms) that make use of measurement gates and classical channels as well, e.g. the BB84 protocol for quantum key exchange. These are irreversible by nature and therefore destroy information, so the entropy increases. The same applies to the stage where initial states are prepared, which is also done with measurement gates.
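As a quick illustration of the first point (a minimal sketch; the mixed state and the random gate are arbitrary choices of mine): conjugating a density matrix by any unitary preserves its eigenvalues, and hence the von Neumann entropy.

```python
import numpy as np
from scipy.stats import unitary_group   # Haar-random unitary, just for the check

def entropy(rho):
    """Von Neumann entropy S = -tr(rho ln rho), via the eigenvalues."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                     # convention: 0 ln 0 = 0
    return -np.sum(p * np.log(p))

rho = np.diag([0.7, 0.3])                # an arbitrary mixed qubit state
U = unitary_group.rvs(2)                 # a random unitary "gate"

print(entropy(rho), entropy(U @ rho @ U.conj().T))   # identical values
```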

Now, on the other hand, in reality the abstract concept of a quantum register, in which single qubits can stay entangled indefinitely while subject only to unitary transformations, is an idealization that is never fulfilled 100%. There are constant perturbations which eventually lead to decoherence ("noise"). This is why entropy in reality goes up nonetheless, and why error-correction protocols must be put in place to reduce the computational error this causes.

All of the above is highly oversimplified of course, because the field of quantum information theory has developed significantly over the last 20 years and goes far beyond what is normally covered in the typical 3–4 pages of a QM textbook.
 
  • Like
Likes vanhees71, gentzen and antonantal
  • #6
antonantal said:
Quantum gates must be reversible.
The usual justification for this is that in QM the time evolution of a system is a unitary operator which, by linear algebra, is reversible (invertible).
This is not necessarily true; there is a whole class of quantum computing that relies on non-reversible gates. These are the "one-way" or "measurement-based" quantum computers.
I'm certainly not an expert, but from my understanding this is quite a popular approach in photonic QC.

Just to complicate things further :wink:
 
  • Like
Likes vanhees71 and antonantal
  • #7
Related to my initial question, I am now trying to explain why, at measurement, we lose information and entropy increases:

Consider the system to be a qubit in state ##\Psi=\alpha|0\rangle+\beta|1\rangle##, with ##\alpha,\beta \in (0,1)## real and ##\alpha^2+\beta^2=1##.
Even though it is a superposition, it is still a single state, and we know it precisely (with probability 1) since it is a pure state. So, the entropy of the system is $$S=-k_B \sum p_i \ln(p_i)=-k_B \cdot 1 \cdot \ln(1) = 0$$
After the measurement, we can find the qubit in state ##|0\rangle## with probability ##\alpha^2## or in state ##|1\rangle## with probability ##\beta^2##. So, we now have a mixed state which is an ensemble of 2 states each with its probability. This means the entropy of the system is now $$S=-k_B \sum p_i \ln(p_i)=-k_B \left[ \alpha^2 \ln(\alpha^2) + \beta^2 \ln(\beta^2) \right] > 0$$ since ##\alpha^2, \beta^2 \in (0,1)##, so their logarithms are negative.
This shows that entropy has increased.
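Numerically (a minimal check; ##\alpha=0.6##, ##\beta=0.8## is just an arbitrary normalized choice):

```python
import numpy as np

alpha, beta = 0.6, 0.8    # arbitrary real amplitudes with alpha**2 + beta**2 == 1

# before measurement: a single pure state, known with probability 1
p_before = np.array([1.0])
S_before = -np.sum(p_before * np.log(p_before))    # = 0

# after a (non-selective) measurement: |0> with prob alpha^2, |1> with prob beta^2
p_after = np.array([alpha**2, beta**2])
S_after = -np.sum(p_after * np.log(p_after))       # > 0

print(S_before, S_after)    # 0.0 and ~0.65 (in units of k_B)
```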

At the same time, information has been lost, because even though we have 2 possible states now (##|0\rangle## or ##|1\rangle##), each state contains less information (##\alpha## and ##\beta## have been lost).

Do you agree?
 
  • Like
Likes vanhees71
  • #8
Yes and no. What you describe are non-selective measurements in the terminology of quantum information theory. If you mean this, you are right.

If you describe a selective measurement by which you make sure that a pure state is prepared, the entropy after measurement is zero (as the state is pure). Having said this, I actually may have been wrong with my remark on the preparation of the initial system in my posting above.

However, I do not dare tread too far into this terrain as I don't consider myself an expert on this (as of today).

However, you have now teased me into reading that book which I had purchased some 20 years ago, and which now should actually be taken off the shelf: "Jürgen Audretsch: Entangled Systems". This seems to me the best starting point to study these questions...
 
  • Like
Likes antonantal and vanhees71
  • #9
gentzen said:
In fact, error correction only works because it is irreversible.
PeterDonis said:
Why is this?
In the circuit model, error correction works by first making measurements, and then applying corrections using different unitary gates based on the outcomes of the measurements. And measurements are irreversible.
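For concreteness, here is a minimal numpy sketch of the 3-qubit bit-flip code (my choice of example; it is the textbook instance of this measure-then-correct structure). After a single X error the state is a joint eigenstate of the stabilizers Z0Z1 and Z1Z2, so the syndrome "measurement" is deterministic here and can be read off from expectation values:

```python
import numpy as np

# single-qubit objects
zero = np.array([1, 0], dtype=complex)
one  = np.array([0, 1], dtype=complex)
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron(*ops):
    """Tensor product of several vectors/matrices."""
    out = ops[0]
    for op in ops[1:]:
        out = np.kron(out, op)
    return out

# encode a|0> + b|1>  ->  a|000> + b|111>  (arbitrary normalized amplitudes)
a, b = 0.6, 0.8
encoded = a * kron(zero, zero, zero) + b * kron(one, one, one)

# a bit-flip error on the first qubit
corrupted = kron(X, I, I) @ encoded

# syndrome "measurement": deterministic outcomes here, equal to the
# expectation values of the stabilizers Z0Z1 and Z1Z2
s1 = int(round(np.real(corrupted.conj() @ kron(Z, Z, I) @ corrupted)))
s2 = int(round(np.real(corrupted.conj() @ kron(I, Z, Z) @ corrupted)))

# correction: a unitary chosen classically from the syndrome
corrections = {(-1, +1): kron(X, I, I),   # flip qubit 0
               (-1, -1): kron(I, X, I),   # flip qubit 1
               (+1, -1): kron(I, I, X),   # flip qubit 2
               (+1, +1): kron(I, I, I)}   # no error detected
recovered = corrections[(s1, s2)] @ corrupted

print(np.allclose(recovered, encoded))    # True
```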

But why should this irreversibility be necessary? Some of the errors caused by decoherence are irreversible and increase the entropy, so another irreversible process is needed to decrease the entropy again.

But is this argument really convincing? To play devil's advocate: the measurements and their results could be treated in MWI style, so that all possible measurement outcomes and their corresponding corrections would exist in a superposition. At the end of the computation, we still have to read out the result by a measurement, but that is fine as long as we only read out the result we are interested in, and not also many ancillary results corresponding in some way to all those intermediary measurements in superposition. I don't know at the moment whether it is necessary to read out many ancillary results (when using this MWI style error correction), but it seems like an answerable question to me.
 
  • Like
Likes vanhees71
  • #10
Well, for a real-world quantum computation, we have to read off the result in our "branch" of the universe. All the parallel worlds of the many-worlds interpretation are completely unobservable and irrelevant. I don't see any merit in this esoteric interpretation at all, least of all for practical applications of QT!
 
  • Like
Likes Lord Jestocost
  • #11
vanhees71 said:
Well, for a real-world quantum computation, we have to read off the result in our "branch" of the universe.
For current superconducting quantum computers, "to read off the result" is a messy operation with high error rates and bad cross-talk, typically worse than any other operation the quantum computer can do. So the minimal number of required measurements and their occurrence in time during a computation is a valid question.

vanhees71 said:
All the parallel worlds of the many-worlds interpretation are completely unobservable and irrelevant. I don't see any merit in this esoteric interpretation at all, least of all for practical applications of QT!
Ignore the esoteric many-worlds. In the context of quantum computation, MWI basically just means pushing all measurements to the end of the computation. (You might interpret classical determinism as pulling all randomness to the beginning of the computation, but of course this doesn't work well in the context of quantum computation.) You certainly cannot get rid completely of the results of intermediary measurements. But if you could trade those for ancillary qubits that are left alone after their interactions (emulating the measurement), then you could still have improved the outcome of your quantum computation.

I now think that this actually works (i.e. that you don't need to measure any ancillary results). Those ancillary qubits simply serve as a reservoir into which the entropy can be pushed, allowing the main computation to get rid of the entropy increase caused by decoherence. Or to put it another way, those ancillary qubits serve as a reservoir of order to be used up during the quantum computation.

In conclusion, error correction only has to get rid of the entropy, and being irreversible is not the only way to achieve this.
gentzen said:
I don't know at the moment whether it is necessary to read out many ancillary results (when using this MWI style error correction), but it seems like an answerable question to me.
In a certain sense, I still don't know, and especially I don't have a reference with the answer yet. But my expectation now is that it is not necessary, and that it should actually be quite easy to prove this, or find a reference which includes results in this direction.
 
  • Like
Likes vanhees71
  • #12
Let's say we start with a qubit in a pure state (0 entropy), and we are on a certain branch of the many-worlds. Then an error occurs due to decoherence. That is because decoherence causes our branch to split into multiple branches, and each new branch has lost some information which is now on the other new branches. The qubit now has higher entropy since it is in a mixed state (of all the new branches).
Then, for error correction, are we actually trying to merge back all (or as many of) the new branches into one low-entropy branch that would contain all (or as much as possible) information from the initial branch?
 
  • #13
gentzen said:
another irreversible process is needed to decrease the entropy again.
Irreversible processes can't decrease entropy, they can only increase it.
 
  • #14
They can decrease entropy of a part of a larger system at the cost of producing more in another part.
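A small numerical illustration of how subsystem entropy and total entropy come apart (a minimal sketch; the Bell state is my choice of example): the two-qubit state below is pure, so its total entropy is zero, yet each qubit on its own is maximally mixed with entropy ##\ln 2##.

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2): the total state is pure, S_total = 0
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(bell, bell.conj())

# reduced state of qubit A: partial trace over qubit B
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

p = np.linalg.eigvalsh(rho_A)                # eigenvalues [0.5, 0.5]
print(-np.sum(p * np.log(p)), np.log(2))     # both ~0.693: S_A = ln 2
```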
 
  • #15
antonantal said:
for error correction, are we actually trying to merge back all (or as many of) the new branches into one low-entropy branch that would contain all (or as much as possible) information from the initial branch?
If decoherence has occurred, this is impossible, because it would amount to reversing decoherence, and decoherence is not reversible.
 
  • Like
Likes gentzen
  • #16
antonantal said:
After the measurement, we can find the qubit in state ##|0\rangle## with probability ##\alpha^2## or in state ##|1\rangle## with probability ##\beta^2##.
But after the measurement, we know which of the two states the qubit is in. The probabilities are things we calculate before the measurement, not things we observe after the measurement. (To check that our probability calculations are correct, we need to make a large number of measurements and do statistics on the results.)

antonantal said:
Do you agree?
No. See above.
 
  • #17
vanhees71 said:
They can decrease entropy of a part of a larger system at the cost of producing more in another part.
I don't see how that would apply in the case of error correction, since no "larger system" is involved, it's just the same set of quantum logic gates.

I think the answer is that error correction does not decrease entropy. As @gentzen said, error correction operations are unitary, and therefore do not change entropy at all.
 
  • Informative
Likes vanhees71
  • #18
antonantal said:
Then, for error correction, are we actually trying to merge back all (or as many of) the new branches into one low-entropy branch that would contain all (or as much as possible) information from the initial branch?
After my last comment, I would have said: "No, you don't try to merge the new branches back, because that would be "too difficult". Instead, you only try to protect some subspace from errors. And you protect it, by "measuring" the errors which actually occurred, and try to correct those errors with the help of some clean ancillary qubits."

But now this also doesn't sound right to me. Still, merging back new branches sounds basically impossible to me. Maybe what actually happens is some sort of entanglement swapping: The entanglement between the "escaped" information and the qubits in the subspace to be protected from error gets swapped to clean ancillary qubits, thereby removing it from the subspace. (But I am simply unsure. In the end, I am simply not a good substitute for a real QC expert like @Strilanc.)

PeterDonis said:
If decoherence has occurred, this is impossible, because it would amount to reversing decoherence, and decoherence is not reversible.
This is also my "feeling". So I decided to post my above answer "now", despite being unsure.
 
  • #19
PeterDonis said:
But after the measurement, we know which of the two states the qubit is in. The probabilities are things we calculate before the measurement, not things we observe after the measurement. (To check that our probability calculations are correct, we need to make a large number of measurements and do statistics on the results.)
Sure, after measurement we can see which of the two states occurred. But to calculate the entropy we need to consider the probability of each state occurring.
Similarly, after you toss a coin you can see if it is heads or tails, but to calculate the entropy you still consider the probability of each side occurring.
I would say entropy is calculated from the perspective of someone who did the measurement but didn't look at the result yet. So, it tells how much information he can find out once he looks.
 
  • #20
antonantal said:
I would say
Don't go by what you "would say". Do you have any actual references? Such as textbooks or peer-reviewed papers on the topic?
 
  • #21
antonantal said:
it tells how much information he can find out once he looks.
No, that's not what entropy (in its information theoretic interpretation, which is what we are talking about here) means. Entropy in this interpretation refers to information that is unavailable, i.e., irretrievably lost. Just as in the ordinary thermodynamic interpretation, entropy refers to energy that is unavailable for conversion into work, i.e., irretrievably lost.
 
  • Like
Likes vanhees71
  • #22
PeterDonis said:
Irreversible processes can't decrease entropy, they can only increase it.
Measurements are typically irreversible processes, and they are part of the "standard" error correction schemes.

PeterDonis said:
I don't see how that would apply in the case of error correction, since no "larger system" is involved, it's just the same set of quantum logic gates.
You always have the ancillas, which are already a larger system. And if you do an actual measurement on an ancilla, then an even larger system gets involved.
(The "set of quantum logical gates" are not the (sub)system, only the qubits are.)

PeterDonis said:
I think the answer is that error correction does not decrease entropy. As @gentzen said, error correction operations are unitary, and therefore do not change entropy at all.
The entropy of the universe is not decreased by error correction, but of course that is not the system of interest here. But error correction decreases the entropy of the subsystem that gets protected.
 
  • Like
Likes vanhees71
  • #23
gentzen said:
You always have the ancillas, which are already a larger system.
Do the error correction operations involve the ancillas? It didn't appear that way from your previous description; it appeared that those operations only operate on the qubits that are intended to store the desired information.
 
  • #24
gentzen said:
Measurements are typically irreversible processes, and they are part of the "standard" error correction schemes.
As you described it, measurements are used to tell which error correction operation to apply, but the actual error correction operations are unitary. If that is the case, the error correction operations can't change entropy, since no unitary operation can.
 
  • #25
antonantal said:
I would say entropy is calculated from the perspective of someone who did the measurement but didn't look at the result yet. So, it tells how much information he can find out once he looks.

As I mentioned in my earlier posting, this is what is called a non-selective measurement in quantum information jargon, as opposed to a selective measurement, which is done e.g. when preparing the system in a pure state (with entropy zero).

As the conceptual part of a quantum circuit, a non-selective measurement changes the system from a pure state (if initially prepared as such) into a mixed state. The entropy is then calculated accordingly.

The main point is that a protocol using a quantum circuit may include classical channels as well, which normally imply measurement gates. With these, the entropy calculated "a priori" uses the Born-rule probability of each actual ("selective") measurement outcome to weight the respective eigenstates of the measured observable in the resulting mixed state.
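To make the selective/non-selective distinction concrete (a minimal sketch; ##\alpha=0.6##, ##\beta=0.8## are arbitrary): summing over the outcomes gives a mixed state with positive entropy, while conditioning on one outcome and renormalizing gives back a pure state with zero entropy.

```python
import numpy as np

def entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                    # convention: 0 ln 0 = 0
    return -np.sum(p * np.log(p))

alpha, beta = 0.6, 0.8
rho = np.outer([alpha, beta], [alpha, beta])        # pure qubit state
P0, P1 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])

rho_nonsel = P0 @ rho @ P0 + P1 @ rho @ P1          # non-selective: mixed
rho_sel = P0 @ rho @ P0 / np.trace(P0 @ rho @ P0)   # selective on "0": pure

print(entropy(rho_nonsel))   # ~0.65 > 0
print(entropy(rho_sel))      # 0.0
```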
 
  • Like
Likes antonantal
  • #26
otennert said:
a protocol using a quantum circuit may include classical channels as well, which normally imply measurement gates. With these, the entropy calculated "a priori" uses the Born-rule probability of each actual ("selective") measurement outcome to weight the respective eigenstates of the measured observable in the resulting mixed state.
Do you have a good reference on quantum computing that goes into all this in more detail?
 
  • #27
PeterDonis said:
No, that's not what entropy (in its information theoretic interpretation, which is what we are talking about here) means. Entropy in this interpretation refers to information that is unavailable, i.e., irretrievably lost. Just as in the ordinary thermodynamic interpretation, entropy refers to energy that is unavailable for conversion into work, i.e., irretrievably lost.
From Wikipedia: "Entropy measures the expected (i.e., average) amount of information conveyed by identifying the outcome of a random trial."

This is equivalent to what I have said before:
antonantal said:
it tells how much information he can find out once he looks.
 
  • #28
PeterDonis said:
As you described it, measurements are used to tell which error correction operation to apply, but the actual error correction operations are unitary.
Indeed, this is how the "standard" error correction schemes work.

PeterDonis said:
If that is the case, the error correction operations can't change entropy, since no unitary operation can.
The measurement operation on the ancilla qubit is part of the "standard" error correction scheme. And this operation is not unitary. And the ancilla qubit gets entangled with the main qubits before the measurement operation, so ignoring it completely doesn't work either.
 
  • #29
gentzen said:
this is how the "standard" error correction schemes work.
So they are unitary? But:

gentzen said:
The measurement operation on the ancilla qubit is part of the "standard" error correction scheme. And this operation is not unitary.
So they are not unitary?

I'm confused.

This is why I keep asking for a reference.
 
  • #30
antonantal said:
From Wikipedia: "Entropy measures the expected (i.e., average) amount of information conveyed by identifying the outcome of a random trial."

This is equivalent to what I have said before:
No, it isn't. This illustrates the pitfalls of trying to learn from Wikipedia.

The "random trial" in the case being described means that we do not know the initial state of the system. The system is a classical system with unknown initial conditions; but its dynamics are deterministic, so once we know the result of the trial, we have learned, by implication, what the initial state of the system was. Actually we haven't learned the fully precise initial state, but we've learned enough about it to know that it was in the set of initial states that could produce the result we observed. The amount of information that is depends on the number of possible results, since that's what determines the number of sets of initial states that we are distinguishing among.

But in the case of the qubit prepared in a pure state and then measured, we do know the initial state of the system. We prepared it. And the system is not classical, it's quantum, so its dynamics are not deterministic, they're probabilistic. The usual view of why quantum measurement increases entropy is that it destroys information: it changes the system's state via a non-unitary, probabilistic process, so we can no longer tell what the original state was--since whichever result we get, ##\ket{0}## or ##\ket{1}##, could have been produced by measurements on many different initial states. So this case cannot be analyzed the same way as a classical random trial.

This is why I keep asking for a reference on quantum computation, so that we can ground this discussion based on how experts in the field actually deal with these kinds of issues.
 
  • #31
PeterDonis said:
So they are unitary? But: [...] So they are not unitary?

I'm confused.

This is why I keep asking for a reference.
My understanding is that error detection is not unitary (involves measurement) but error correction is. Of course one could put both detection and correction under the same umbrella of "error correction".
 
  • #32
PeterDonis said:
Do you have a good reference on quantum computing that goes into all this in more detail?
Well, the whole discussion teased me into restarting my study of "Jürgen Audretsch: Entangled Systems", which, to me, seems an excellent starting point for physicists with a good background in QM to study the basics of quantum information theory. I think at least this one I can recommend. But of course, there are now lots and lots of new books on the subject, the classic still being "Nielsen/ Chuang: Quantum Computation and Quantum Information", which is to quantum computing what Jackson is to classical electrodynamics, as it were.
 
  • #33
PeterDonis said:
Don't go by what you "would say". Do you have any actual references? Such as textbooks or peer-reviewed papers on the topic?
Nielsen / Chuang: Quantum Computation and Quantum Information
chapter 11.3.3 Measurements and entropy
"Suppose, for example, that a projective measurement described by projectors ##P_i## is performed on a quantum system, but we never learn the result of the measurement. If the state of the system before the measurement was ##\rho## then the state after is given by $$\rho^{'}=\sum_{i} P_i \rho P_i$$
The following result shows that the entropy is never decreased by this procedure, and remains constant only if the state is not changed by the measurement"
Then it goes on to calculate the entropy of the state after measurement:
$$S(\rho^{'})=-tr(\rho^{'} \log\rho^{'})$$
I will now show that this is equivalent to the entropy that I have calculated in post #7 (up to Boltzmann's constant).
The state before measurement is ##\Psi=\alpha|0\rangle+\beta|1\rangle##, so: $$\rho = | \Psi \rangle \langle \Psi | =
\begin{bmatrix}
\alpha^2 & \alpha\beta \\
\alpha\beta & \beta^2 \\
\end{bmatrix}
$$
The projectors ##P_i## are: $$
P_0 =
\begin{bmatrix}
1 & 0 \\
0 & 0 \\
\end{bmatrix}
, P_1 =
\begin{bmatrix}
0 & 0 \\
0 & 1 \\
\end{bmatrix}
$$
Then:
$$\rho^{'}=\sum_{i} P_i \rho P_i =
\begin{bmatrix}
\alpha^2 & 0 \\
0 & \beta^2 \\
\end{bmatrix}$$
Since the matrix is diagonal, its logarithm is:
$$\log\rho^{'} =
\begin{bmatrix}
\log(\alpha^2) & 0 \\
0 & \log(\beta^2) \\
\end{bmatrix}$$
So, the entropy is:
$$S(\rho^{'})=-tr(\rho^{'} \log\rho^{'}) = -\left[ \alpha^2 \log(\alpha^2) + \beta^2 \log(\beta^2) \right]$$
which is equivalent to what I have calculated before.
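For anyone who wants to check this numerically, here is a minimal numpy version of the same calculation (##\alpha=0.6##, ##\beta=0.8## chosen arbitrarily):

```python
import numpy as np
from scipy.linalg import logm           # matrix logarithm, for -tr(rho' log rho')

alpha, beta = 0.6, 0.8
psi = np.array([alpha, beta])
rho = np.outer(psi, psi)                # |Psi><Psi|

P0, P1 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])
rho_p = P0 @ rho @ P0 + P1 @ rho @ P1   # rho' = sum_i P_i rho P_i

S = -np.trace(rho_p @ logm(rho_p)).real
check = -(alpha**2 * np.log(alpha**2) + beta**2 * np.log(beta**2))
print(S, check)                         # both ~0.6534
```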
 
  • #34
antonantal said:
Nielsen / Chuang: Quantum Computation and Quantum Information
chapter 11.3.3 Measurements and entropy

To be quite frank: I think the real issue under discussion here is not a calculational one; it is a conceptual one, and one that also has to do with terminology. The calculation above is itself trivial, but that's not the point, I think.

Let me come back to your original question, which seemed to me a rather basic one: do you think it has been answered for the most part?
 
  • #35
otennert said:
To be quite frank: I think the real issue under discussion here is not a calculational one; it is a conceptual one, and one that also has to do with terminology. The calculation above is itself trivial, but that's not the point, I think.

Let me come back to your original question, which seemed to me a rather basic one: do you think it has been answered for the most part?
It is not yet clear which is the correct justification for why entropy increases after measurement.
Is it this:
antonantal said:
After the measurement, we can find the qubit in state ##|0\rangle## with probability ##\alpha^2## or in state ##|1\rangle## with probability ##\beta^2##. So, we now have a mixed state which is an ensemble of 2 states each with its probability. This means the entropy of the system is now $$S=-k_B \sum p_i \ln(p_i)=-k_B \left[ \alpha^2 \ln(\alpha^2) + \beta^2 \ln(\beta^2) \right] > 0$$ since ##\alpha^2, \beta^2 \in (0,1)##, so their logarithms are negative.
This shows that entropy has increased.
or this:
PeterDonis said:
The usual view of why quantum measurement increases entropy is that it destroys information: it changes the system's state via a non-unitary, probabilistic process, so we can no longer tell what the original state was--since whichever result we get, ##\ket{0}## or ##\ket{1}##, could have been produced by measurements on many different initial states.
The point of my previous post was to show that Nielsen / Chuang seem to have followed the same logic as I did.
 
