Is there an interpretation independent outcome problem?

Discussion Overview

The discussion revolves around the interpretation of outcomes in quantum mechanics, particularly in relation to decoherence and the measurement problem. Participants explore whether decoherence resolves the preferred basis problem and how it relates to the emergence of classical outcomes from quantum superpositions. The conversation touches on various interpretations of quantum mechanics, including Bayesian and Everett interpretations, and the nature of mixed states.

Discussion Character

  • Debate/contested
  • Conceptual clarification
  • Exploratory

Main Points Raised

  • Some participants argue that decoherence explains the suppression of interference phenomena and the appearance of classical outcomes, yet question why outcomes occur at all.
  • Others challenge the notion that an observer perceives a superposition as a mixed state, citing examples like the double slit experiment.
  • A participant references a quote from 'Quantum Enigma' discussing how decoherence leads to classical-like probabilities, but emphasizes these are not true classical probabilities.
  • There is a discussion about the distinction between proper and improper mixed states, with some participants questioning the implications of measurement and the nature of reality post-measurement.
  • Some participants express uncertainty about the definitions of proper and improper mixed states and how they relate to decoherence and measurement outcomes.
  • The Bayesian interpretation is mentioned as suggesting that quantum mechanics provides statistical predictions without certainty about specific outcomes.
  • The Everett interpretation is presented as proposing that measurement results in entangled states, leading to distinct outcomes for observers, though the overall state remains mixed.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the interpretation of outcomes in quantum mechanics. Multiple competing views are presented regarding the nature of mixed states, the role of decoherence, and the implications of different interpretations of quantum mechanics.

Contextual Notes

Participants express varying definitions of proper and improper mixed states, and there is ambiguity regarding the concept of "purification" of states through measurement. The discussion highlights unresolved questions about the ontological status of quantum states and the implications of decoherence.

Derek Potter
I have it on good authority that decoherence (probably) solves the preferred basis problem and explains the suppression of interference phenomena; however there remains the question: why do we get outcomes at all?

Unitary evolution of coupled systems results in entanglement and thus an improper mixed state. So, I would argue, when an observer looks at a superposition it looks like a mixed state: a probability distribution of different outcomes.

What then is left to explain? Or is there something wrong with this argument?
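The entanglement step in this argument can be checked numerically. Below is a minimal NumPy sketch; the two-qubit Bell state is my own illustrative choice, not anything specific from the thread. Unitary coupling produces a pure joint state, but the reduced state of either subsystem is mixed:

```python
import numpy as np

# Pure Bell state (|00> + |11>)/sqrt(2) of system S and environment E.
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)

rho = np.outer(psi, psi.conj())  # joint density matrix: pure, Tr(rho^2) = 1

# Partial trace over E gives the reduced state of S alone.
# reshape indices: (s, e, s', e'); trace over the e axes.
rho_S = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(np.round(rho_S, 3))            # diag(0.5, 0.5): no off-diagonal terms
print(np.trace(rho @ rho))           # purity of the joint state: 1.0
print(np.trace(rho_S @ rho_S))       # purity of the reduced state: 0.5
```

The reduced density matrix is exactly what one would write down for a 50/50 probability distribution of outcomes, which is the improper mixed state the argument appeals to.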

Please, please, please, don't tell me that QM is required to explain how a truly classical world emerges! QM is notoriously about observations and if decoherence accounts for observations that look like classical ones, then surely the job is done! It is nonsense to move the goalposts when the game is over.

But maybe there is something to explain which is staring me in the face but which I can't see?
 
Last edited:
Derek Potter said:
So, I would argue, when an observer looks at a superposition it looks like a mixed state: a probability distribution of different outcomes.

That's demonstrably false, e.g. the double slit. What the back screen 'observes' is a superposition of the states behind each slit.

Thanks
Bill
 
This is a good extract from the book 'Quantum Enigma' (2nd edition) on decoherence - page 209:
"...that the photon pass through our boxes and then encounter the macroscopic environment. Assuming thermal randomness, one can calculate the extremely short time after which an interference experiment becomes impossible, for all practical purposes. ... Averaging over the decohered wave-functions of the atoms leaves us with an equation for a classical-like probability for each atom actually existing wholly in one or the other of the boxes of its pair. ... Those classical-like probabilities are still probabilities of what will be observed. They are not true classical probabilities of something that actually exists."
 
Derek Potter said:
I have it on good authority that decoherence (probably) solves the preferred basis problem and explains the suppression of interference phenomena; however there remains the question: why do we get outcomes at all?

The only resolutions of this sort of quantum measurement problem (that I understand and am familiar with) come in two interpretations:

In the Bayesian interpretation, we simply don't know why we get one outcome and not another, but quantum mechanics tells us what our best guesses should be without any true certainty one way or the other.

In the Everett (many worlds) interpretation, the action of an observer during measurement is an interaction just like between any other pair of physical systems.
As a result, the quantum state of the observer+system becomes entangled, and the eigenstates of measurement outcomes become correlated to distinct quantum states of the observer. Individually, the state of the system is in a mixed state, and so is the state of the observer, but together their joint state is a pure entangled state (assuming their states were pure to begin with).
In this interpretation, the observer is a physical system with a memory that records the outcomes of its interactions with other physical systems. Upon examining the observer, we only ever see one string of outcomes, even though that observer might be in a mixed state of all possible strings (due to its entanglement with other systems).

That being said, I don't think there's any real scientific consensus on the subject.
 
StevieTNZ said:
This is a good extract from the book 'Quantum Enigma' (2nd edition) on decoherence - page 209:
"...Averaging over the decohered wave-functions of the atoms leaves us with an equation for a classical-like probability for each atom actually existing wholly in one or the other of the boxes of its pair. ... Those classical-like probabilities are still probabilities of what will be observed. They are not true classical probabilities of something that actually exists."


I'm just the idiot in the room, so please have pity on me. But, Stevie's quote from Rosenblum and Kuttner's book seems to address my primary question about this, which is... Does the improper mixture ONLY make a statistical prediction of outcome? Is there an underlying reality that actually EXISTS after the mixed state is "purified" by measurement? Unless that question can be answered, I don't understand how the improper mixed state can be differentiated substantively from the proper mixed state based solely on the formalism of QT. Can anyone explain that to me?
 
Feeble Wonk said:
I'm just the idiot in the room, so please have pity on me. But, Stevie's quote from Rosenblum and Kuttner's book seems to address my primary question about this, which is... Does the improper mixture ONLY make a statistical prediction of outcome? Is there an underlying reality that actually EXISTS after the mixed state is "purified" by measurement? Unless that question can be answered, I don't understand how the improper mixed state can be differentiated substantively from the proper mixed state based solely on the formalism of QT. Can anyone explain that to me?
I think there are two different definitions floating around on this forum of what a proper and an improper mixed state mean.

What do you mean by purified? If you mean, 'make a measurement on a 'mixed state'*', then there would be only one outcome -- but when does that one outcome occur? Apparatus? Apparatus measuring the first apparatus? On the more extreme side, consciousness? (the famous measurement problem). Before then it is in a superposition.

*I use quotation marks around mixed state as it is still in a pure, superposition, state, even after decoherence.

QM only makes statistical predictions of outcomes.
 
StevieTNZ said:
I think there are two different definitions floating around on this forum of what a proper and an improper mixed state mean.

What do you mean by purified? If you mean, 'make a measurement on a 'mixed state'*', then there would be only one outcome -- but when does that one outcome occur? Apparatus? Apparatus measuring the first apparatus? On the more extreme side, consciousness? (the famous measurement problem). Before then it is in a superposition.

*I use quotation marks around mixed state as it is still in a pure, superposition, state, even after decoherence.

QM only makes statistical predictions of outcomes.

I probably used the term "purified" incorrectly. I guess my real question is regarding the concept of an "improper" mixed state.

The way it's been explained to me thus far (or at least my takeaway understanding) is that a "proper" mixed state is an "unknown" quantum state that is "either/or", as opposed to a "pure" state that is unresolved (still "and", so to speak), which is in true superposition. Assuming that I've got at least that much straight, my confusion still applies to an "improper" mixed state, which I've been told is secondary to environmental decoherence. If I understood it correctly, decoherence statistically predicts/defines (presumably by logical limitation) the outcome of environmental interaction with the quantum system, even in the absence of (or prior to) true quantum "collapse" (if such a thing actually occurs).

The argument then goes further, suggesting that because the proper and improper states are mathematically indiscernible, they are, in fact, the same thing.

That would seem reasonable to me IF the information is all that is "real", but not if the wave function is, or represents something, that is ontologically extant. Does that make any sense at all?
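The claim that proper and improper mixtures are mathematically indiscernible can be made concrete. A small NumPy sketch (the 50/50 qubit example is my own, not from the thread): one density matrix is built as a classical ensemble, the other by reducing a pure entangled state, and the two come out identical.

```python
import numpy as np

ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# "Proper" mixture: a classical 50/50 ensemble of |0> and |1>.
rho_proper = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ket1, ket1)

# "Improper" mixture: trace the second subsystem out of a pure Bell state.
psi = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
rho_joint = np.outer(psi, psi)
rho_improper = np.trace(rho_joint.reshape(2, 2, 2, 2), axis1=1, axis2=3)

# Identical matrices: no measurement on the subsystem alone can tell
# the two preparations apart.
print(np.allclose(rho_proper, rho_improper))  # True
```

This is exactly the point of the question: the formalism of the subsystem alone cannot distinguish them; any difference has to come from how the state is interpreted, not from its matrix elements.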
 
jfizzix said:
The only resolutions of this sort of quantum measurement problem (that I understand and am familiar with) come in two interpretations:
In the Bayesian interpretation, we simply don't know why we get one outcome and not another, but quantum mechanics tells us what our best guesses should be without any true certainty one way or the other.
In the Everett (many worlds) interpretation, the action of an observer during measurement is an interaction just like between any other pair of physical systems.
As a result, the quantum state of the observer+system becomes entangled, and the eigenstates of measurement outcomes become correlated to distinct quantum states of the observer. Individually, the state of the system is in a mixed state, and so is the state of the observer, but together their joint state is a pure entangled state (assuming their states were pure to begin with).
In this interpretation, the observer is a physical system with a memory that records the outcomes of its interactions with other physical systems. Upon examining the observer, we only ever see one string of outcomes, even though that observer might be in a mixed state of all possible strings (due to its entanglement with other systems).
That being said, I don't think there's any real scientific consensus on the subject.

I see no difference. QM defines probabilities on a state which is subject to unitary evolution. That's the Bayesian side taken care of. The unitary evolution provides the improper mixed state, which, as you point out, can be nested - observer of observer of observer - though in fact the nesting is a verbal artifact due to giving the observer a special role, whereas with unitary evolution leading to mixed states, each observer is simply one of many subsystems that are entangled.

My question isn't really about any of that. I'm asking why people say there is an unsolved problem of explaining why there are any outcomes at all in a decoherence-based theory of observation.
 
The clear answer to the question in the thread title is "no". I do not agree with how Schlosshauer or bhobba state the measurement problem; however, that is not a big deal here. The answer is clearly "no", because if the interpretation solves the measurement problem, then there is no measurement problem in that interpretation.
 
  • #10
Feeble Wonk said:
I'm just the idiot in the room, so please have pity on me. But, Stevie's quote from Rosenblum and Kuttner's book seems to address my primary question about this, which is... Does the improper mixture ONLY make a statistical prediction of outcome? Is there an underlying reality that actually EXISTS after the mixed state is "purified" by measurement? Unless that question can be answered, I don't understand how the improper mixed state can be differentiated substantively from the proper mixed state based solely on the formalism of QT. Can anyone explain that to me?

If you maintain that QM is a theory about observation then of course you are right, there is no need to worry about what the improper mixture really is. However standard formulations of QM typically mention three things that are reasonably thought of as part of reality: observations, the system itself and the system state. Plus of course the general rules that govern their behaviour. One may reasonably assume that the system exists and most people, unless they have spent too much time on Physics Forums, will assume that the system state is real whether they incline to believe in a separate entity called the wavefunction or not.

So from an observations-only PoV, there is no observational difference between a proper and an improper mixed state. But if, after examining QM with a magnifying glass, you can't see where the formalism forbids you to consider the wavefunction to be ontic then you will suddenly find yourself in fresh air and able to ask questions that some would say are meaningless or metaphysical or even philosophical. Questions like "what is going on when an interference pattern is created?" I'll say this: the mind-set that insists that meaningful questions must be answerable by pure logic and experiment is a deeply inflexible bit of philosophical dogma. Officially it should have no place on this forum.

"The second motivation for an ensemble interpretation is the intuition that because quantum mechanics is inherently probabilistic, it only needs to make sense as a theory of ensembles. Whether or not probabilities can be given a sensible meaning for individual systems, this motivation is not compelling. For a theory ought to be able to describe as well as predict the behavior of the world. The fact that physics cannot make deterministic predictions about individual systems does not excuse us from pursuing the goal of being able to describe them as they currently are." - David Mermin

But I would request that we stay clear of these issues as my question was quite specific and I would like to know the answer.
 
  • #11
StevieTNZ said:
I think there are two different definitions floating around on this forum of what a proper and an improper mixed state mean.
Well they are two different things so I suppose it would be reasonable to have two definitions :) but I suppose you probably mean two definitions for each one? If it is relevant to my question please say what they are as the terms seem pretty unambiguous to me.
 
  • #12
atyy said:
The answer is clearly "no", because if the interpretation solves the measurement problem, then there is no measurement problem in that interpretation.
You appear to be saying that there are interpretations in which the measurement problem is solved. The sources I have say that this is not the case, that the outcome part of the measurement problem is not solved by decoherence. I want to know why it is not as it seems trivial to me, given improper mixed states, but there again, my mind seems to work differently from that of proper physicists.
 
  • #13
Derek Potter said:
Well they are two different things so I suppose it would be reasonable to have two definitions :) but I suppose you probably mean two definitions for each one? If it is relevant to my question please say what they are as the terms seem pretty unambiguous to me.
yes, two definitions for each one.
 
  • #14
Derek Potter said:
Yes. No definitions let alone two per type of mixture.
From the phrases I was hoping you would take away the following points:
  1. Decoherence causes interference to be suppressed, resulting in what looks like classical probabilities about something that exists, e.g. 50% of getting tails or heads when flipping a coin, with tails and heads actually existing on the coin (whether this is called a proper or improper mixture, I await Bernard's email)
  2. However those apparent classical probabilities actually still mean the system is still in a superposition, e.g. 50% of getting tails or heads from flipping a coin where heads and tails don't exist on the coin before measurement.
Regarding the two definitions of proper and improper mixture, consider them the same two definitions for both, yet to be clarified which one belongs to which.
 
  • #15
StevieTNZ said:
I have just emailed Bernard d'Espagnat asking him to clarify the difference between proper and improper mixtures. I am sure when I check my emails tomorrow morning (or even later this evening) there will be an email from him.
Well that will be interesting as I have certainly heard of his idea that there are no proper mixed states under unitary evolution (which seems pretty obvious to me) and I am aware that there is a controversial rebuttal but I haven't the faintest idea what it is about.

Seems mildly amusing that bhobba's Ignorance Interpretation (not quite the same as Ballentine's, I gather) leads to the opposite conclusion: that all mixtures are proper. But why not? Switch the onticity from the state vector to the reduced density matrix and back and it's hardly surprising that the nature of a mixture changes!
 
  • #16
StevieTNZ said:
From the phrases I was hoping you would take away the following points:
  1. Decoherence causes interference to be suppressed, resulting in what looks like classical probabilities about something that exists, e.g. 50% of getting tails or heads when flipping a coin, with tails and heads actually existing on the coin (whether this is called a proper or improper mixture, I await Bernard's email)
  2. However those apparent classical probabilities actually still mean the system is still in a superposition, e.g. 50% of getting tails or heads from flipping a coin where heads and tails don't exist on the coin before measurement.
[Mentor's note: Edited to remove gratuitous personal attacks]

Now, there is a substantial point in #2 above and maybe d'Espagnat will touch upon it. My maths is very dicky so feel free to bin this suggestion, but I believe it is possible to "turn the superposition into a density matrix" in two ways. The first would explicitly include the decohering subsystem (e.g. the environment), making the description "still in a superposition" absolutely correct for the combined system. The second would ignore the state of the decohering subsystem: it is simply unknown and unknowable. Thus the state of the system of interest (e.g. the cat) decoheres and one is left seeing it as a mixed state with no useful distinction between proper and improper.
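The two bookkeeping options described here can be written out explicitly; this is a standard textbook sketch in notation of my own choosing, not taken from the thread:

```latex
\[
  |\Psi\rangle = \sum_i c_i \, |s_i\rangle \otimes |e_i\rangle ,
  \qquad
  \rho_{SE} = |\Psi\rangle\langle\Psi| \quad \text{(still a pure superposition)} .
\]
\[
  \rho_S = \operatorname{Tr}_E \rho_{SE}
         = \sum_{i,j} c_i c_j^* \,\langle e_j | e_i \rangle \, |s_i\rangle\langle s_j|
  \;\longrightarrow\;
  \sum_i |c_i|^2 \, |s_i\rangle\langle s_i|
  \quad \text{as} \quad \langle e_j | e_i \rangle \to \delta_{ij} .
\]
```

Keeping the environment explicitly, the joint state stays pure; tracing it out, the cross terms are suppressed as the environment states become orthogonal, leaving the improper mixture.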
 
Last edited by a moderator:
  • #17
StevieTNZ said:
From the phrases I was hoping you would take away the following points:
  1. Decoherence causes interference to be suppressed, resulting in what looks like classical probabilities about something that exists, e.g. 50% of getting tails or heads when flipping a coin, with tails and heads actually existing on the coin (whether this is called a proper or improper mixture, I await Bernard's email)
  2. However those apparent classical probabilities actually still mean the system is still in a superposition, e.g. 50% of getting tails or heads from flipping a coin where heads and tails don't exist on the coin before measurement.
Regarding the two definitions of proper and improper mixture, consider them the same two definitions for both, yet to be clarified which one belongs to which.

Coin tossing fails to be either because it is classical and can only be quantumized by postulating a particular preparation - this will determine whether the coin inherits superposition from a quantum event such as radioactive decay, or whether it inherits a definite but unknown state from some other proper mixture created by something like wavefunction collapse.
 
  • #18
Derek Potter said:
Coin tossing fails to be either because it is classical and can only be quantumized by postulating a particular preparation - this will determine whether the coin inherits superposition from a quantum event such as radioactive decay, or whether it inherits a definite but unknown state from some other proper mixture created by something like wavefunction collapse.
In principle QM applies to all systems, micro and macro. Therefore the coin is not classical, it is quantum.
 
  • #19
bhobba said:
That's demonstrably false, e.g. the double slit. What the back screen 'observes' is a superposition of the states behind each slit.

I do not think that speaks to Derek Potter's question. Consider the case in which we build up the interference pattern one particle at a time: We end up with a photographic plate with an interesting pattern of exposed and unexposed photosensitive granules on its surface. The granules are small, but they're still pretty clearly classical, so this is a purely classical object, no superposition at all. Sure, the incoming particles were in superposition, but that superposition collapsed when they were effectively subjected to a position measurement by the interaction with a particular granule on the surface of the plate.

So we're still confronted with the measurement problem: How did we get from unitary evolution of the (superimposed) wave function of the incoming particles to this particular combination of exposed and unexposed photosensitive granules on the surface of the plate?
 
  • #20
Derek Potter said:
Unitary evolution of coupled systems results in entanglement and thus an improper mixed state. So, I would argue, when an observer looks at a superposition it looks like a mixed state: a probability distribution of different outcomes.

What then is left to explain? Or is there something wrong with this argument?
...
But maybe there is something to explain which is staring me in the face but which I can't see?

Consider Schrödinger's cat. Decoherence tells us that the wave function of the cat+detector+nucleus will very quickly evolve into a form that has negligible interference between "live cat" and "dead cat". After decoherence we just have a simple probabilistic statement based on incomplete information (cat is alive with probability ##x##, cat is dead with probability ##y##, ##x+y=1##, the only reason we don't know which is that we haven't looked yet) no different than the statement that we'd make about a tossed coin. As with the tossed coin, the density matrix for the two outcomes is diagonal and we have a probability distribution of outcomes, both of which are classical.

But there is still something unexplained here. Why do we get one result, either "live cat" or "dead cat"? How and when is the actual outcome selected? In the case of the tossed coin the mixed state reflects our ignorance of the complete initial state of the coin and its interaction with the floor; with a complete specification of the initial system state classical physics would lead to a deterministic prediction of the coin's final state. The same is not true of the evolution of the quantum mechanical state - there's nothing in the evolution of the wave function through its interactions and entanglements with the apparatus that ever turns the probability distribution into a single sharply defined outcome. That's the problem that decoherence does not address in any interpretation.
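The "negligible interference" step here can be made concrete with a toy calculation. The exponential damping factor below is a standard illustrative assumption for environmental decoherence, not a model of an actual cat: decoherence drives the off-diagonal (interference) terms of the density matrix to zero while leaving the diagonal probabilities untouched.

```python
import numpy as np

# Cat state a|live> + b|dead>, with probabilities x = |a|^2, y = |b|^2.
a, b = np.sqrt(0.3), np.sqrt(0.7)
rho = np.array([[a * a, a * b],
                [a * b, b * b]])  # pure-state density matrix

def decohere(rho, t, tau=1e-3):
    """Damp only the off-diagonal (interference) terms by exp(-t/tau);
    the diagonal probabilities are untouched."""
    d = np.exp(-t / tau)
    return np.array([[rho[0, 0], d * rho[0, 1]],
                     [d * rho[1, 0], rho[1, 1]]])

rho_late = decohere(rho, t=1.0)   # off-diagonals ~ exp(-1000): gone FAPP
print(np.round(rho_late, 6))      # diag(0.3, 0.7), and x + y = 1
```

Note what the sketch does *not* do: at no time step does the evolution pick out a single diagonal entry. That is exactly the residual problem described above: the diagonal density matrix is a probability distribution over outcomes, not a single outcome.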
 
  • #21
A bunch of unnecessary personal argumentation has been removed from this thread. I'm leaving it open, but not without some trepidation... Please don't make me regret this... Please?

(If I accidentally tossed out something good in the cleanup, let me know by PM.)
 
  • #22
Nugatory said:
That's the problem that decoherence does not address in any interpretation.

Indeed. It's the problem of outcomes discussed at length by Schlosshauer:
https://www.amazon.com/dp/3540357734/?tag=pfamazon01-20

That's why I interpret it as a proper mixed state - but how it becomes one - blankout.

Thanks
Bill
 
  • #23
Derek Potter said:
You appear to be saying that there are interpretations in which the measurement problem is solved. The sources I have say that this is not the case, that the outcome part of the measurement problem is not solved by decoherence. I want to know why it is not as it seems trivial to me, given improper mixed states, but there again, my mind seems to work differently from that of proper physicists.

It depends on what one means by "solved". In the realm of non-relativistic quantum mechanics, Bohmian Mechanics is widely thought to solve the measurement problem. However, it remains unclear whether the solution can be extended to all of quantum mechanics.

At any rate, the question itself assumes that there is more than one interpretation. If that is not the case, there is only one interpretation - Copenhagen - and given that Copenhagen has preferred status amongst all interpretations, then one could say that formulating the measurement problem relative to Copenhagen is interpretation-independent.
 
  • #24
Nugatory said:
Consider Schrödinger's cat. Decoherence tells us that the wave function of the cat+detector+nucleus will very quickly evolve into a form that has negligible interference between "live cat" and "dead cat". After decoherence we just have a simple probabilistic statement based on incomplete information (cat is alive with probability ##x##, cat is dead with probability ##y##, ##x+y=1##, the only reason we don't know which is that we haven't looked yet) no different than the statement that we'd make about a tossed coin. As with the tossed coin, the density matrix for the two outcomes is diagonal and we have a probability distribution of outcomes, both of which are classical.

But there is still something unexplained here. Why do we get one result, either "live cat" or "dead cat"? How and when is the actual outcome selected? In the case of the tossed coin the mixed state reflects our ignorance of the complete initial state of the coin and its interaction with the floor; with a complete specification of the initial system state classical physics would lead to a deterministic prediction of the coin's final state. The same is not true of the evolution of the quantum mechanical state - there's nothing in the evolution of the wave function through its interactions and entanglements with the apparatus that ever turns the probability distribution into a single sharply defined outcome. That's the problem that decoherence does not address in any interpretation.

This is very close to Schlosshauer's description of the measurement problem. However, I don't quite agree with it. If decoherence is sufficient to tell us where and when collapse occurs, then one can simply postulate that there is an outcome. That is not a problem. It would be like saying that Bohmian Mechanics does not solve the measurement problem because it does not explain why there are hidden variables.

I prefer a formulation in which a classical or macroscopic "observer" or "measurement apparatus" is not required to be postulated as fundamental, but is instead emergent from the "microscopic" or "quantum" realm.
 
  • #25
atyy said:
then one can simply postulate that there is an outcome. That is not a problem.

That's my personal favorite resolution, as it's the only one that allows me to stop worrying, be happy, and actually get some work done :smile:

It's also consistent with the position (in response to Derek Potter's original question) that decoherence alone does not resolve the problem.

Whether "one can simply postulate that..." is a satisfactory resolution depends, of course, on what one finds satisfactory and whether anyone has a better answer.
 
  • #26
Nugatory said:
I do not think that speaks to Derek Potter's question. Consider the case in which we build up the interference pattern one particle at a time: We end up with a photographic plate with an interesting pattern of exposed and unexposed photosensitive granules on its surface. The granules are small, but they're still pretty clearly classical, so this is a purely classical object, no superposition at all. Sure, the incoming particles were in superposition, but that superposition collapsed when they were effectively subjected to a position measurement by the interaction with a particular granule on the surface of the plate.

So we're still confronted with the measurement problem: How did we get from unitary evolution of the (superimposed) wave function of the incoming particles to this particular combination of exposed and unexposed photosensitive granules on the surface of the plate?
If that is the "outcome problem" then it has a very simple solution as I am sure you know. There is no definite outcome, but the system of screen+observer (or apparatus)+environment enters a superposition of states with the photon absorbed at different places. Decoherence then allows us to regard the screen as being in a mixed state - an improper one of course.
 
  • #27
atyy said:
This is very close to Schlosshauer's description of the measurement problem. However, I don't quite agree with it. If decoherence is sufficient to tell us where and when collapse occurs, then one can simply postulate that there is an outcome. That is not a problem. It would be like saying that Bohmian Mechanics does not solve the measurement problem because it does not explain why there are hidden variables.

I prefer a formulation in which a classical or macroscopic "observer" or "measurement apparatus" is not required to be postulated as fundamental, but is instead emergent from the "microscopic" or "quantum" realm.
Why postulate it at all? If the projection postulate is framed in terms of definite outcomes then it is simply in contradiction to the vector state postulate. Why not postulate in terms of observed outcomes? The calculating recipe remains the same. The predictions remain the same. But an opening is created for interpreting the observations as the appearance of definite outcomes, not actual definite ones.
 
  • #28
Derek Potter said:
Why postulate it at all? If the projection postulate is framed in terms of definite outcomes then it is simply in contradiction to the vector state postulate. Why not postulate in terms of observed outcomes? The calculating recipe remains the same. The predictions remain the same. But an opening is created for interpreting the observations as the appearance of definite outcomes, not actual definite ones.

That remains debatable. Here you are assuming the correctness of Many-Worlds. However, there is no consensus that any form of Many-Worlds is correct, except Bohmian Mechanics.
 
  • #29
I don't understand. Where is the assumption?
 
  • #30
Derek Potter said:
I don't understand. Where is the assumption?

Everett or relative state or whatever you wish to call it.
 
