Is information lost in wavefunction collapse?

Summary
The discussion centers on whether information is lost during wave function collapse in quantum mechanics (QM), drawing parallels to the black hole information paradox. Participants argue that while decoherence is theoretically reversible, wave function collapse appears to result in irreversible information loss, contradicting the principle that information should not be lost in closed quantum systems. The conversation highlights that standard QM does not inherently define measurement, leading to various interpretations, including the Copenhagen interpretation and Many Worlds Interpretation (MWI). Many physicists lean towards the belief that fundamental dynamics are unitary, suggesting that information is not truly lost, which diminishes the relevance of the collapse as an irreversible process. Ultimately, the debate reflects ongoing uncertainties in understanding the implications of measurement and information in quantum mechanics.
  • #91
atyy said:
If you read the OP and his clarifications in subsequent posts, you can see that he is asking for an answer within standard QM.

An answer to what question? I am saying the question he wants an answer to is the title question of this thread. And that question cannot be answered within standard QM, for reasons I've already explained. So if you're right that the OP is only interested in an answer within standard QM, then all we can tell him is that there isn't one.

atyy said:
I am not objecting to the discussion of interpretations as one part of the answer to this thread. I am objecting to interpretations being brought up as a primary answer

I am fine with that. I agree that the MWI is an interpretation, not standard QM, and can't be an answer to any question that asks what standard QM says.
 
  • Like
Likes bhobba
  • #92
Apparently disagreeing with others (perhaps more knowledgeable), I think information is gained after a measurement. We are going from uncertainty to certainty. I don't know how others define "information", but I would take that as an increase in information. I think any form of Shannon's formula would agree, but I will write it out if requested. I view "measurement" as a "filter" that selects some particular future effects; i.e. it determines the future.
 
  • #93
rrogers said:
I think any form of Shannon's formula would agree, but I will write it out if requested.

Within the thread, no one has yet offered a technical definition of information. What I get from the focus on unitary evolution (or a violation of it) is that a physical law specifying how the "state" of a system changes from time t to time t+dt is considered to lose information if that law is a many-to-one mapping. That definition defines "loses information" without specifying a quantitative measure of information. There's nothing wrong with such a definition from a logical point of view, but it would help to know explicitly if that's the definition that most participants have in mind.

The Shannon definition of information applies to a probability distribution, so it raises the question of what random variable you wish to look at. Various properties of a physical system can be measured. Measuring one property may increase the dispersion in a subsequent measurement of a different property. Applying the Shannon definition of information to a physical system is not straightforward.

The Shannon definition is related to the entropy of a probability distribution. Just stringing words together, there is such a thing as the "von Neumann entropy" in quantum statistical mechanics. There are also controversies about whether it is the best way to define entropy. Perhaps someone can comment on a relation between "information" as discussed in this thread and the various definitions of "entropy".
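To make the basis-dependence point concrete, here is a minimal sketch (my own illustration in Python/NumPy, not something from the thread) of the Shannon entropy of an outcome distribution. The same quantum state has 1 bit of outcome entropy for one observable and 0 bits for another, so "the information" depends on which random variable you pick:

```python
import numpy as np

def shannon_bits(p):
    """Shannon entropy in bits of a discrete probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log2(0) = 0
    return float(-np.sum(p * np.log2(p)))

# The state |+> = (|u> + |d>)/sqrt(2), written in the z-basis:
plus = np.array([1.0, 1.0]) / np.sqrt(2)
z_basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
x_basis = [plus, np.array([1.0, -1.0]) / np.sqrt(2)]

# Born-rule outcome probabilities for each choice of measurement:
p_z = [abs(np.vdot(b, plus)) ** 2 for b in z_basis]  # ~[0.5, 0.5]
p_x = [abs(np.vdot(b, plus)) ** 2 for b in x_basis]  # ~[1.0, 0.0]

# One bit of outcome entropy in the z-basis, zero in the x-basis:
h_z = shannon_bits(p_z)
h_x = shannon_bits(p_x)
```

The von Neumann entropy handles this basis-dependence by being a property of the density matrix itself (it vanishes for any pure state), which is one reason it is the usual candidate for "information" in quantum statistical mechanics.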
 
  • Like
Likes Boing3000 and rrogers
  • #94
Stephen Tashi said:
I'm curious why knowing more about something would be called a "loss" of information.

If an experiment is performed involving a probabilistic phenomenon and the experimenter learns the outcome, why isn't this a gain in information?
I totally agree.

Let me state the other (ie, wrong) logic explicitly and as I understand it:
They are saying that before the measurement or collapse, there are many possible outcomes. But once the measurement is made, there is only one. They believe this could indicate a loss of information.

Before I attack that logic, let me say that I do not believe there is a change in the amount of information.

That said: Going from many possibilities to one is an increase in information. If I tell you that the killer has 012 as the first 3 digits of his social security number, that is some information - but there are still hundreds of thousands of possibilities. If I then said the first 5 digits are 012-34, then I have given you more information and thus left you with fewer possibilities.
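The arithmetic behind this example can be made explicit. A quick sketch (my own illustration, assuming all nine-digit SSNs start out equally likely), measuring each clue in bits as log2 of the factor by which it shrinks the possibility set:

```python
import math

total = 10**9           # all possible nine-digit SSNs
after_3_digits = 10**6  # "012" fixed: six free digits remain
after_5_digits = 10**4  # "012-34" fixed: four free digits remain

# Information gained by each clue, in bits:
bits_first_clue = math.log2(total / after_3_digits)            # ~9.97 bits
bits_second_clue = math.log2(after_3_digits / after_5_digits)  # ~6.64 bits

# The clues add: narrowing the possibilities IS the gain in bits.
bits_total = math.log2(total / after_5_digits)
```

The additivity here is the usual Shannon property: successive independent clues contribute bits that sum to the total reduction in uncertainty.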

I suspect that MWI is not an "interpretation" since, in its simplest form, it requires a continuous increase in the amount of information in the universe. Without it, an event can be identified by initial conditions (for example, the Big Bang), three spatial coordinates, and a time coordinate. With it, the event also requires a "which world" parameter.

The way to avoid this increase in information is to presume that, although it was theoretically impossible to know what the outcome would be, it was nevertheless predestined - and that it was entirely determined from information that existed before the measurement was made.

The only alternative I see is to allow the amount of information to increase steadily. Then, to avoid "coin flipping", we would need to invoke either an "external" information source or MWI.
 
  • Like
Likes rrogers
  • #95
Stephen Tashi said:
Within the thread, no one has yet offered a technical definition of information. What I get from the focus on unitary evolution (or a violation of it) is that a physical law specifying how the "state" of a system changes from time t to time t+dt is considered to lose information if that law is a many-to-one mapping. That definition defines "loses information" without specifying a quantitative measure of information. There's nothing wrong with such a definition from a logical point of view, but it would help to know explicitly if that's the definition that most participants have in mind.

The Shannon definition of information applies to a probability distribution, so it raises the question of what random variable you wish to look at. Various properties of a physical system can be measured. Measuring one property may increase the dispersion in a subsequent measurement of a different property. Applying the Shannon definition of information to a physical system is not straightforward.

The Shannon definition is related to the entropy of a probability distribution. Just stringing words together, there is such a thing as the "von Neumann entropy" in quantum statistical mechanics. There are also controversies about whether it is the best way to define entropy. Perhaps someone can comment on a relation between "information" as discussed in this thread and the various definitions of "entropy".
Well, my model is simple: if I use a fluorescent screen and see an electron light up a spot, I can then determine where the electron was at that moment and, with careful measurement, probably the energy. So I have gained information that affects all my future calculations; i.e. I have filtered my future. There may be other "universes" but they don't affect my future. Of course I might not look for a while, and the delayed-choice experiments come into play. But for my future I have greater certainty (and probably increased my entropy in some way), thus more information. A sort of Bayesian attitude, if you will.
 
  • #96
rrogers said:
I have gained information that affects all my future calculations; i.e. I have filtered my future.

But you have lost information about your past; that is, if there are many possible past states that all could have led to your current state, the one you are using for your future calculations, then you have lost information. When physicists talk about non-unitary transformations (such as an actual physical wave function collapse) leading to loss of information in quantum mechanics, that is what they are talking about.
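This many-to-one character is easy to exhibit numerically. A sketch (my own illustration, not from the thread): two distinct pre-measurement states are sent to the same post-measurement state by a projective "collapse", so the past cannot be reconstructed from the result, whereas a unitary can always be undone:

```python
import numpy as np

up = np.array([1.0, 0.0])
down = np.array([0.0, 1.0])

def collapse_to_up(psi):
    """Projective 'collapse' onto |u>: project, then renormalize.
    This map is many-to-one, hence not invertible."""
    proj = up * np.vdot(up, psi)
    return proj / np.linalg.norm(proj)

# Two quite different initial states...
psi1 = (up + down) / np.sqrt(2)
psi2 = (np.sqrt(3) * up + down) / 2
# ...end up in the same state after a spin-up outcome:
out1, out2 = collapse_to_up(psi1), collapse_to_up(psi2)

# A unitary, by contrast, is invertible: apply U, then U^dagger.
theta = 0.7
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
recovered = U.conj().T @ (U @ psi1)
```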
 
  • Like
Likes bhobba and rrogers
  • #97
PeterDonis said:
But you have lost information about your past; that is, if there are many possible past states that all could have led to your current state, the one you are using for your future calculations, then you have lost information. When physicists talk about non-unitary transformations (such as an actual physical wave function collapse) leading to loss of information in quantum mechanics, that is what they are talking about.
Yes, I have always said the past is as uncertain as the future in QM; a radical oversimplification. But taking a Bayesian attitude, information allows future certainty. Otherwise, when we take measurements we are destroying knowledge of the past; sort of a squishy conserved thing that disturbs me. But let's think about this: that implies that my ignorance in the past has more "information" than after I take the measurement. I suppose that's possible, but it seems that "information" now has two different meanings/measures. Which is reasonable if the two are given different names with a conservation law linking them. Like "potential energy" and "kinetic energy", I guess?
 
  • #98
An interesting thread. I have no math skills and am an avid fan. In my opinion many of the posts describing information were wide of the OP's question. The information the OP asks about exists only in the system to be measured. It has nothing to do with knowledge that an experimenter will gain, or probable outcomes, or what state the particle is in. There is an assumption that the system being measured contains information. Although this is reasonable, it is still only an assumption.
 
  • #99
rrogers said:
information allows future certainty

Not in general in QM, since QM only makes probabilistic predictions about the results of measurements. But if you know the result of a measurement you just made, using the state corresponding to that measurement result will give you better predictions about future measurements you can make than using the state before you made the measurement.

rrogers said:
that implies that my ignorance in the past has more "information" than after I take the measurement

Before your current measurement you have more information about the past state, and less information about future measurements.
 
  • Like
Likes bhobba
  • #100
Stephen Tashi said:
Within the thread, no one has yet offered a technical definition of information. What I get from the focus on unitary evolution (or a violation of it) is that a physical law specifying how the "state" of a system changes from time t to time t+dt is considered to lose information if that law is a many-to-one mapping. That definition defines "loses information" without specifying a quantitative measure of information. There's nothing wrong with such a definition from a logical point of view, but it would help to know explicitly if that's the definition that most participants have in mind.

The Shannon definition of information applies to a probability distribution, so it raises the question of what random variable you wish to look at. Various properties of a physical system can be measured. Measuring one property may increase the dispersion in a subsequent measurement of a different property. Applying the Shannon definition of information to a physical system is not straightforward.

Thanks!

I've been reading this thread, wishing people would take your question on. What I would say, having played with quantum computer simulators, is that the information relayed to an *observer* in bits is something like the base 2 logarithm of the reciprocal of the probability of *observing* the event.

So a qubit "in" some prepared state doesn't actually carry the information needed to specify the prepared state, because that can't be observed from the qubit alone. So no information is lost on measurement.

But where does the probability space come from in physical systems? If I give you a full gigabit removable drive, it's a gigabit of information *to the drive* in that it treats all 2 to the billion possible states as equally likely, allocates equal resources to each one. But if you already knew the info on the drive, I've given you personally zero bits of information with the same drive, in that your internal model, unlike the drive, remains unchanged. How can this be mapped to physical systems I wonder? I feel it has something to do with changes.
 
  • #101
.Scott said:
I totally agree.

Let me state the other (ie, wrong) logic explicitly and as I understand it:
They are saying that before the measurement or collapse, there are many possible outcomes. But once the measurement is made, there is only one. They believe this could indicate a loss of information.

No, that is not what people are saying. What they are saying is that there is information in the initial state of a system that is lost when you make a measurement. If you have an electron that is in a superposition ##\alpha |u\rangle + \beta |d\rangle##, there is information in the coefficients ##\alpha## and ##\beta## which is (apparently) lost forever if you measure the spin.

@Stephen Tashi is right, that there can also be a gain of information in a measurement, but it isn't always the case. The paradigm case of gaining information is an entangled electron-positron pair. There is only one way to produce a spin-zero combination, so the information content of the entangled state is zero. But when you measure the spin of the electron (say), you get a bit of information, either spin-up or spin-down. So the information afterwards is more than beforehand.

The talk about unitarity, though, is all about loss of information about the past.
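One way to see how much sits in the coefficients: states whose coefficients differ only by a relative phase (here, a sign) produce identical spin-z statistics, so no record of z-outcomes can recover them. A hedged simulation sketch (Born-rule sampling in NumPy, my own illustration, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)

def up_frequency(alpha, beta, shots=100_000):
    """Simulate repeated spin-z measurements of alpha|u> + beta|d>
    via the Born rule; only |alpha|^2 enters, never the phase."""
    p_up = abs(alpha) ** 2
    return float((rng.random(shots) < p_up).mean())

a = 1 / np.sqrt(2)
# Same magnitudes, opposite relative sign of beta:
f_plus = up_frequency(a, a)    # the |+> state
f_minus = up_frequency(a, -a)  # the |-> state
# Both frequencies estimate 0.5: the sign of beta, let alone a
# continuous phase, leaves no trace in the z-outcome record.
```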
 
  • #102
rrogers said:
Apparently disagreeing with others (perhaps more knowledgeable); I think information is gained after a measurement. We are going from uncertainty to certainty. I don't know how others define "information" but I would take that as an increase in information. I think any form of Shannon's formula would agree, but I will write it out if requested. I view "measurement" as a "filter" that selects some particular future effects; i.e. determines the future.

Yes, information can be gained, but it is not in contradiction to also losing information. See stevendaryl's post #101.
 
  • #103
I thought gamma (high-frequency) waves escape collapse in QM. Would that not be considered lost information?
 
  • #104
I think it depends on what "information" means. Einstein argued that before observation you do not know; after observation you do. Now I recognize the argument that if the electron is in a superposition then information has been lost, but did you ever have it? Think of the cat paradox. You can assert that the cat is in a superposition of quantum states, but you cannot know that. You can believe it, but what is the role of faith in physics? What we actually have is a number of mathematical relationships that give the best description of what will happen in an event, but interpreting why they hold enters awkward ground. In my opinion, the value of an interpretation lies in whether it can take you into new territory regarding prediction of outcomes, and I don't see resorting to arguments about information doing that. But I could be wrong.
 
  • #105
Ian J Miller said:
I think it depends on what "information" means. Einstein argued that before observation you do not know; after observation you do. Now I recognize the argument that if the electron is in a superposition then information has been lost, but did you ever have it?

No, you didn't have it beforehand. In these discussions about loss of information, it's not about what people know. The ideal observer who never forgets what he observes would never lose information. That isn't the issue. The issue is whether the universe has lost information.

I suppose you could take the solipsistic view that the only information that exists is information in the minds of observers, but that's not what is meant in talking about information loss in quantum measurements or black holes.
 
  • #106
Then surely, taking your electron spin as an example: prior to observation (which is argued to determine the spin), the spin is not determined, so the Universe does not know what will be found, and information is created when it is. Alternatively, the Universe could be argued to know that there is a spin that will be determined one way or the other, but not which; after determination it still knows there is spin, but now it knows which. In this, "know" does not imply some sort of God; I just can't think of a better word. The question is what is determined prior to observation, as opposed to after.

Of course if the debate ends up with what does the formalism say, then I agree I am wrong and bow out.
 
  • #107
Ian J Miller said:
Then surely, taking your electron spin as an example: prior to observation (which is argued to determine the spin), the spin is not determined, so the Universe does not know what will be found, and information is created when it is.

Yes, but more information is lost.
 
  • #108
Ian J Miller said:
prior to observation (which is argued to determine the spin), the spin is not determined, so the Universe does not know what will be found, and information is created

This assumes that only one result happens, but that is interpretation dependent. In the many worlds interpretation, all results happen (each result for the measured system is correlated with the corresponding state of the measuring device) and the time evolution is always unitary, so no information is created or destroyed.
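The unitary picture can be sketched numerically. A toy model (my own illustration, assuming a CNOT-style system-apparatus coupling, not a claim about real apparatus): the interaction entangles the measured qubit with the "apparatus" qubit, yet the global state stays pure and the whole step is reversible:

```python
import numpy as np

# CNOT: flips the apparatus qubit when the system qubit is |d>.
# Basis ordering: |u,ready>, |u,fired>, |d,ready>, |d,fired>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

plus = np.array([1, 1]) / np.sqrt(2)   # system in superposition
ready = np.array([1, 0])               # apparatus "ready" state
joint = np.kron(plus, ready)           # product state before interaction

# After the interaction: (|u,u> + |d,d>)/sqrt(2), entangled,
# with each result correlated with an apparatus state.
after = CNOT @ joint

# Purity Tr(rho^2) of the global state is still exactly 1:
rho = np.outer(after, after.conj())
purity = float(np.trace(rho @ rho).real)

# And the "measurement" is trivially reversible:
undone = CNOT.conj().T @ after
```

No information is created or destroyed at the global level; only a local observer, tracing out the rest, sees an effective mixture.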
 
  • #109
Ian J Miller said:
Think of the cat paradox. You can assert that the cat is in a superposition of quantum states, but you cannot know that.

There are a number of ways of resolving the so-called Schrodinger's Cat paradox, but I think the simplest is to realize that because a cat is a macro object interacting with its environment, it has for all practical purposes an exact position. If you consider the cat to be made up of a large number of small parts - not so small that they are quantum, but large enough that they can be considered classical - then they too have exact positions. Now consider the constituent parts of a live and a dead cat: they have entirely different positions, e.g. the live cat has a beating heart, expanding lungs, etc.; the dead cat just sits there - dead. Since those small parts have, for all practical purposes, exact positions, they cannot be in a superposition. In other words, you can't have, e.g., a superposition of a live and a dead cat. There is another argument based on the fact that the cat is entangled with the atomic source. If you chug through the math of entanglement, you find the cat acts as if it, with a certain probability, is alive or dead - what is called a mixed state, without going into exactly what that is. But it is not, and never is, in a superposition.

Thanks
Bill
 
  • #110
stevendaryl said:
What they are saying is that there is information in the initial state of a system that is lost when you make a measurement. If you have an electron that is in a superposition ##\alpha |u\rangle + \beta |d\rangle##, there is information in the coefficients ##\alpha## and ##\beta## which is (apparently) lost forever if you measure the spin.

My interpretation of that example is that the "information" being discussed depends on how complicated the state of a system is. So a state whose description needs two complex numbers has more "information" than a state that needs only one bit to describe it.

So if Nature allows a system to transform from a complicated state to a simpler state, then Nature has lost information - with respect to that particular system.

However, is there some quantitative definition of "information" that implements this concept? Does a state whose description requires four complex numbers have twice the information as a state whose description requires only two complex numbers? Must two distinct descriptions of the state of the same physical system, have the same amount of information?
 
  • #111
Stephen Tashi said:
However, is there some quantitative definition of "information" that implements this concept? Does a state whose description requires four complex numbers have twice the information as a state whose description requires only two complex numbers? Must two distinct descriptions of the state of the same physical system, have the same amount of information?

Well, there's the Shannon definition of information, which is the number of bits necessary to specify a parameter. A real (or complex) number has an infinite amount of information, while spin-up/spin-down has 1 bit of information.
 
  • Like
Likes bhobba
  • #112
H. Dieter Zeh in "Roots and Fruits of Decoherence" (https://arxiv.org/abs/quant-ph/0512078):

"The collapse of the wave function (without observing the outcome) or any other indeterministic process would represent a dynamical information loss, since a pure state is transformed into an ensemble of possible states (described by a proper mixture, for example). The dislocalization of quantum mechanical superpositions, on the other hand, leads to an apparent information loss, since the relevant phase relations merely become irrelevant for all practical purposes of local observers."
 
  • Like
Likes bhobba
  • #113
PeterDonis said:
This assumes that only one result happens, but that is interpretation dependent. In the many worlds interpretation, all results happen (each result for the measured system is correlated with the corresponding state of the measuring device) and the time evolution is always unitary, so no information is created or destroyed.
No.
If you start out knowing that you have a collection of possible states, and then the result is that collection, then the collection as a whole does not represent either an information loss or a gain.
But each individual state has more information than what was started with. Each "world" will have "which world" information.
In my murderer example, we can designate the SSN of the murderer as 012-34-????, which is the same information as {012-34-0000, 012-34-0001, ... 012-34-9999}. Every member of that set taken alone has more information than the whole set. So a cluster of worlds can have less information than any one of its parts. That extra information comes from the designation or selection of that world's necessary uniqueness - which you get when you are in it.

So the problem you have with MWI, and the reason that it is not fully an "interpretation", is that it requires an exponentially large number of unique universes as time progresses. Say we start out with a universe at time zero with only one bit of information, say a "1". And let's say that every "collapse" (or whatever), splits the universe in two. So at the end of the first QM cycle, we have universe 10 and 11. Now let's say that after every cycle, each bit in the universe meets up with another decision and "splits". So at the end of the second cycle "10" has split into "1000", "1001", "1010", and "1011" and "11" has split in a similar fashion. Given the initial "1", we need 3 bits in each universe if they are to be unique universes.
After the third cycle, that jumps to 7 bits; 4th: 15 bits; 5th: 31 bits.

So how long is a cycle? As long as it takes for a collapse. Many per second. How many seconds can go by before a universe of our size is unable to hold the information? In no time, we would have all possible instances of a universe of our size.
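Taking the post's splitting rule at face value (every labeling bit splits each cycle), the arithmetic checks out: the label length beyond the initial "1" obeys b(n) = 2*b(n-1) + 1, which closes to b(n) = 2^n - 1. A quick check, my own illustration:

```python
# Label bits needed per universe after n split cycles, under the
# rule that each cycle doubles the label and appends one new bit.
def label_bits(cycles):
    bits = 0
    for _ in range(cycles):
        bits = 2 * bits + 1
    return bits

growth = [label_bits(n) for n in range(1, 6)]  # [1, 3, 7, 15, 31]
```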
 
  • #114
.Scott said:
If you start out knowing that you have a collection of possible states, and then the result is that collection

This is irrelevant to the MWI since the MWI does not say this is what's happening. The MWI says that the state of the entire universe is a pure state with a unitary time evolution. There is no "collection of possible states"; there is just one state.

The problem with even talking about the MWI is that ordinary words don't have their usual referents, so it's very easy to get confused about what you're actually saying. For example: what is the referent of the word "you" in the sentence quoted above, according to the MWI?
 
  • Like
Likes bhobba
  • #115
stevendaryl said:
Well, there's the Shannon definition of information, which is the number of bits necessary to specify a parameter.

Is that the Shannon definition of "information"? I thought the Shannon definition of information required a probability model. We'd need some concept of a probability distribution for the possible values of ##\alpha,\beta##. Bits can be used as a measure of Shannon information with some assumptions about the behavior of a communication channel.
stevendaryl said:
A real (or complex) number has an infinite amount of information, while spin-up/spin-down has 1 bit of information.

I understand the general idea, but technically the measure of information isn't given by the number of bits required to represent a number unless we are in a specific communication scenario. For example, if the only possible messages are ##\pi, 2\pi, 3\pi## then the message ##\pi## doesn't contain an infinite amount of Shannon information. If we assume a scenario where a message can be any real number in an infinite set of real numbers, then I agree that representing a specific message requires an infinite number of bits.

If we are counting bits to measure information, then how is it that a mixture of states may contain less information than a superposition of states? Don't we end up claiming that one infinity is larger than another infinity?
 
  • #116
Do you think that this experiment could solve the measurement problem: "One of the main difficulties in solving the measurement problem is quantum decoherence. It is a problem because it makes it hard to distinguish whether it was the decoherence effect or the act of measurement that caused the wave nature of particles to disappear, in experiments such as the quantum eraser experiment. However, an experiment has been conceived which distinguishes between the two, in order to determine the cause of wave function collapse. It does so by controlling the process of decoherence (as best as possible), then observing the wave nature of the decohered system by virtue of diffraction, and then carrying out a measurement to see whether the wave nature disappears or not.

The exact experiment is a modification of the Davisson-Germer experiment. At the start, there will be a vacuum chamber containing a single proton, and an electron gun which will fire electrons slowly into the system in order to decohere the proton. It will be bombarded with about 5 electrons; once it has decohered, an anode with a hole in the middle shall be switched on and the whole object fired towards a nickel plate, which leads to scattering in various directions. The nickel target can also be rotated, so that electrons can be deflected towards a detector on a mounted arc which can be rotated in a circular motion. The detector used during the experiment is a Faraday cup. To test whether measurement causes collapse, the location of the proton shall be measured when the particle touches the nickel plate. There will be two groups: the first group will not have the location of the proton measured on contact with the nickel plate, whereas the second group will have its location measured on contact, by virtue of a detector.

Because the location of the proton has been measured in group 2, it could affect the scattering of the decohered particles, because the wave function has collapsed for each measured particle (it would differ from those not measured). The measurement problem could then be addressed by seeing whether the act of measurement has any effect on the scattering of the decohered particles. Since the system has already decohered, and its wave nature is being measured via the diffraction of the particles, any change upon measurement would be down to the act of measurement, not decoherence."
 
  • #117
Stephen Tashi said:
I understand the general idea, but technically the measure of information isn't given by the number of bits required to represent a number unless we are in a specific communication scenario. For example, if the only possible messages are ##\pi, 2\pi, 3\pi## then the message ##\pi## doesn't contain an infinite amount of Shannon information. If we assume a scenario where a message can be any real number in an infinite set of real numbers, then I agree that representing a specific message requires an infinite number of bits.

Yes, you're right. Counting bits only gives an upper bound to the information content. But quantum mechanics certainly doesn't place any restrictions on the values of the coefficients of a superposition.
 
  • Like
Likes bhobba
  • #118
Stephen Tashi said:
If we are counting bits to measure information then how is it that a mixture of states may contain less information that a superposition of states? Don't we end up claiming that one infinity is larger than another infinity?

I'm thinking that exploring the subject of information loss might require a lot of work.

However, I don't think that mixed states count as information in the way you are talking about them.

If I check to see if an electron is spin-up, and then later I forget what the answer was, I can describe things using the mixed state:

##\rho = \frac{1}{2} |u\rangle \langle u| + \frac{1}{2} |d\rangle \langle d|##

However we compute information, there's got to be less information in such a mixed state than there is in the pure state spin-up.

I realize that I'm being a little inconsistent if I consider the amplitude of a pure state to be information, but don't consider the coefficients of a mixed state. I guess that betrays an interpretation bias on my part: I'm assuming that a mixed state reflects subjective uncertainty, while a pure state is objective. I guess you could consider amplitudes to be subjective, although it's harder for me to see how two people could each assign a pure state to the same particle but assign different pure states. Maybe someone could come up with a scenario for that?

In contrast, it's easy to come up with scenarios in which people assign different mixed states to the same particle. So it seems more subjective.
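The "less information in the mixed state" intuition can be quantified with the von Neumann entropy, which measures the observer's missing information in bits - which fits the subjective reading of mixed states above. A sketch (my own illustration, not from the thread):

```python
import numpy as np

def entropy_bits(rho):
    """Von Neumann entropy S = -Tr(rho log2 rho) in bits,
    computed from the eigenvalues of the density matrix."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # convention: 0 * log2(0) = 0
    return float(-np.sum(evals * np.log2(evals)))

u = np.array([1.0, 0.0])
d = np.array([0.0, 1.0])

# Pure spin-up: the outcome is remembered, nothing is missing.
rho_known = np.outer(u, u)

# The forgotten-outcome mixture from the post:
rho_forgot = 0.5 * np.outer(u, u) + 0.5 * np.outer(d, d)

s_known = entropy_bits(rho_known)   # 0 bits missing
s_forgot = entropy_bits(rho_forgot) # 1 bit missing
```

Forgetting the one-bit outcome raises the entropy by exactly one bit, matching the intuition that the mixed state carries less information than the pure state.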
 
  • #119
PeterDonis said:
.Scott said:
No.
If you start out knowing that you have a collection of possible states, and then the result is that collection, then the collection as a whole does not represent either [an] information loss or gain.
This is irrelevant to the MWI since the MWI does not say this is what's happening. The MWI says that the state of the entire universe is a pure state with a unitary time evolution. There is no "collection of possible states"; there is just one state.
I didn't say it was. I was just laying the groundwork for what will and will not result in an increase in information.
As long as you have a wave function that describes the possible measurement results, you have not increased the amount of information. As soon as you allow that the actual (unpredictable) result is arbitrary, then you (the physicist or investigator) have created a model that involves the synthesis of information when "measurements" are made. For MWI, this makes it difficult to avoid a steady increase in the information content of the universe. The only way for information to be reduced would be for most of the "splits" to result in identical results - effectively causing "joins".

The counter to MWI is a model which says that the full result of a measurement is contained in a single universe.
 
  • #120
.Scott said:
As soon as you allow that the actual (unpredictable) result is arbitrary,

Which, in the MWI, is not the case. In the MWI, there is no unpredictability. The time evolution is always unitary. All of the measurement results happen; each one is appropriately correlated with the appropriate state of the measuring device. All of this is unitary and does not create or destroy any information.

.Scott said:
For MWI, this makes it difficult to avoid a steady increase in the information content of the universe.

It does no such thing. See above.

.Scott said:
The only way for information to be reduced would be for most of the "splits" to result in identical results - effectively causing "joins".

There are no "splits" in the MWI in the sense you mean. There is only one wave function and its time evolution is unitary.

.Scott said:
The counter to MWI is a model which says that the full result of a measurement is contained in a single universe.

I'm not sure what you mean by this, but it sounds like you are describing the actual MWI, not any "counter" to it.
 
  • Like
Likes bhobba
