
Is there an interpretation independent outcome problem?

  1. Jul 31, 2015 #1
    I have it on good authority that decoherence (probably) solves the preferred basis problem and explains the suppression of interference phenomena; however there remains the question: why do we get outcomes at all?

    Unitary evolution of coupled systems results in entanglement and thus an improper mixed state. So, I would argue, when an observer looks at a superposition it looks like a mixed state: a probability distribution of different outcomes.
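A toy two-qubit calculation makes this concrete (a sketch using numpy; the labels "system" and "environment" are illustrative): pure unitary evolution entangles the pair, and the system's reduced density matrix comes out diagonal, i.e. an improper mixture that looks like a classical probability distribution.

```python
import numpy as np

# Hadamard on the system qubit, then a CNOT coupling it to a second
# qubit (the "environment"): pure unitary evolution, no collapse.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

psi0 = np.kron([1, 0], [1, 0])          # |00>
psi = CNOT @ np.kron(H, I2) @ psi0      # (|00> + |11>)/sqrt(2)

rho = np.outer(psi, psi.conj())         # joint state: still pure

# Partial trace over the second qubit gives the system's reduced state.
rho_sys = np.einsum('ikjk->ij', rho.reshape(2, 2, 2, 2))

print(rho_sys)  # ~ diag(0.5, 0.5): an improper mixed state
```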

    What then is left to explain? Or is there something wrong with this argument?

    Please, please, please, don't tell me that QM is required to explain how a truly classical world emerges! QM is notoriously about observations and if decoherence accounts for observations that look like classical ones, then surely the job is done! It is nonsense to move the goalposts when the game is over.

    But maybe there is something to explain which is staring me in the face but which I can't see?
    Last edited: Jul 31, 2015
  3. Jul 31, 2015 #2


    Staff: Mentor

    That's demonstrably false, e.g. the double slit: what the back screen 'observes' is a superposition of the states behind each slit.

    Last edited: Jul 31, 2015
  4. Jul 31, 2015 #3
    This is a good extract from the book 'Quantum Enigma' (2nd edition) on decoherence - page 209:
  5. Jul 31, 2015 #4


    Science Advisor
    Gold Member

    The only resolutions of this sort of quantum measurement problem (that I understand and am familiar with) come in two interpretations:

    In the Bayesian interpretation, we simply don't know why we get one outcome and not another, but quantum mechanics tells us what our best guesses should be without any true certainty one way or the other.

    In the Everett (many worlds) interpretation, the action of an observer during measurement is an interaction just like between any other pair of physical systems.
    As a result, the quantum state of observer+system becomes entangled, and the eigenstates of the measurement outcomes become correlated with distinct quantum states of the observer. Individually, the system is left in a mixed state, and so is the observer, but together their joint state is a pure entangled state (assuming their states were pure to begin with).
    In this interpretation, the observer is a physical system with a memory that records the outcomes of its interactions with other physical systems. Upon examining the observer, we only ever see one string of outcomes, even though that observer might be in a mixed state of all possible strings (due to its entanglement with other systems).

    That being said, I don't think there's any real scientific consensus on the subject.
  6. Jul 31, 2015 #5
    I'm just the idiot in the room, so please have pity on me. But, Stevie's quote from Rosenblum and Kuttner's book seems to address my primary question about this, which is... Does the improper mixture ONLY make a statistical prediction of outcome? Is there an underlying reality that actually EXISTS after the mixed state is "purified" by measurement? Unless that question can be answered, I don't understand how the improper mixed state can be differentiated substantively from the proper mixed state based solely on the formalism of QT. Can anyone explain that to me?
  7. Jul 31, 2015 #6
    I think there are two different definitions floating around this forum of what a proper and an improper mixed state mean.

    What do you mean by purified? If you mean, 'make a measurement on a 'mixed state'*', then there would be only one outcome -- but when does that one outcome occur? Apparatus? Apparatus measuring the first apparatus? On the more extreme side, consciousness? (the famous measurement problem). Before then it is in a superposition.

    *I use quotation marks around mixed state as it is still in a pure, superposition, state, even after decoherence.

    QM only makes statistical predictions of outcomes.
  8. Jul 31, 2015 #7
    I probably used the term "purified" incorrectly. I guess my real question is regarding the concept of an "improper" mixed state.

    The way it's been explained to me thus far (or at least my take-away understanding) is that a "proper" mixed state is an "unknown" quantum state that is "either/or", as opposed to a "pure" state that is unresolved (still "and", so to speak), which is in true superposition. Assuming that I've got at least that much straight, my confusion still applies to an "improper" mixed state, which I've been told arises from environmental decoherence. If I understood it correctly, decoherence statistically predicts/defines (presumably by logical limitation) the outcome of environmental interaction with the quantum system, even in the absence of (or prior to) true quantum "collapse" (if such a thing actually occurs).

    The argument then goes further, suggesting that because the proper and improper states are mathematically indiscernible, they are, in fact, the same thing.
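That mathematical indiscernibility can be checked directly (a minimal numpy sketch; the 50/50 up/down ensemble is just an illustrative choice): a proper mixture built as a classical ensemble and an improper mixture obtained by tracing out the partner of an entangled pair yield the very same density matrix.

```python
import numpy as np

up = np.array([1, 0])
down = np.array([0, 1])

# "Proper" mixture: a classical 50/50 ensemble of definite states.
rho_proper = 0.5 * np.outer(up, up) + 0.5 * np.outer(down, down)

# "Improper" mixture: partial trace of an entangled pure state.
bell = (np.kron(up, up) + np.kron(down, down)) / np.sqrt(2)
rho_joint = np.outer(bell, bell)
rho_improper = np.einsum('ikjk->ij', rho_joint.reshape(2, 2, 2, 2))

# The two density matrices are identical, so no measurement on the
# subsystem alone can tell them apart.
print(np.allclose(rho_proper, rho_improper))  # True
```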

    That would seem reasonable to me IF the information is all that is "real", but not if the wave function is, or represents something, that is ontologically extant. Does that make any sense at all?
    Last edited: Jul 31, 2015
  9. Jul 31, 2015 #8
    I see no difference. QM defines probabilities on a state which is subject to unitary evolution. That's the Bayesian side taken care of. The unitary evolution provides the improper mixed state, which, as you point out, can be nested -- observer of observer of observer -- though in fact the nesting is a verbal artifact of giving the observer a special role; with unitary evolution leading to mixed states, each observer is simply one of many entangled subsystems.

    My question isn't really about any of that. I'm asking why people say there is an unsolved problem of explaining why there are any outcomes at all in a decoherence-based theory of observation.
  10. Jul 31, 2015 #9


    Science Advisor

    The clear answer to the question in the thread title is "no". I do not agree with how Schlosshauer or bhobba state the measurement problem, however, that is not a big deal here. The answer is clearly "no", because if the interpretation solves the measurement problem, then there is no measurement problem in that interpretation.
  11. Jul 31, 2015 #10
    If you maintain that QM is a theory about observation then of course you are right, there is no need to worry about what the improper mixture really is. However standard formulations of QM typically mention three things that are reasonably thought of as part of reality: observations, the system itself and the system state. Plus of course the general rules that govern their behaviour. One may reasonably assume that the system exists and most people, unless they have spent too much time on Physics Forums, will assume that the system state is real whether they incline to believe in a separate entity called the wavefunction or not.

    So from an observations-only PoV, there is no observational difference between a proper and an improper mixed state. But if, after examining QM with a magnifying glass, you can't see where the formalism forbids you to consider the wavefunction to be ontic then you will suddenly find yourself in fresh air and able to ask questions that some would say are meaningless or metaphysical or even philosophical. Questions like "what is going on when an interference pattern is created?" I'll say this: the mind-set that insists that meaningful questions must be answerable by pure logic and experiment is a deeply inflexible bit of philosophical dogma. Officially it should have no place on this forum.

    "The second motivation for an ensemble interpretation is the intuition that because quantum mechanics is inherently probabilistic, it only needs to make sense as a theory of ensembles. Whether or not probabilities can be given a sensible meaning for individual systems, this motivation is not compelling. For a theory ought to be able to describe as well as predict the behavior of the world. The fact that physics cannot make deterministic predictions about individual systems does not excuse us from pursuing the goal of being able to describe them as they currently are." - David Mermin

    But I would request that we stay clear of these issues as my question was quite specific and I would like to know the answer.
  12. Jul 31, 2015 #11
    Well they are two different things so I suppose it would be reasonable to have two definitions :) but I suppose you probably mean two definitions for each one? If it is relevant to my question please say what they are as the terms seem pretty unambiguous to me.
  13. Jul 31, 2015 #12
    You appear to be saying that there are interpretations in which the measurement problem is solved. The sources I have say that this is not the case, that the outcome part of the measurement problem is not solved by decoherence. I want to know why it is not as it seems trivial to me, given improper mixed states, but there again, my mind seems to work differently from that of proper physicists.
  14. Jul 31, 2015 #13
    yes, two definitions for each one.
  15. Jul 31, 2015 #14
    From the phrases I was hoping you would take away the following points:
    1. Decoherence causes interference to be suppressed, resulting in what looks like classical probabilities about something that exists, e.g. a 50% chance of tails or heads when flipping a coin, with tails and heads actually existing on the coin (whether this is called a proper or improper mixture, I await Bernard's email).
    2. However, those apparent classical probabilities actually mean the system is still in a superposition, e.g. a 50% chance of tails or heads from flipping a coin where heads and tails don't exist on the coin before measurement.
    Regarding the two definitions of proper and improper mixture, consider them the same two definitions for both, yet to be clarified which one belongs to which.
  16. Jul 31, 2015 #15
    Well that will be interesting, as I have certainly heard of his idea that there are no proper mixed states under unitary evolution (which seems pretty obvious to me) and I am aware that there is a controversial rebuttal, but I haven't the faintest idea what it is about.

    Seems mildly amusing that bhobba's Ignorance Interpretation (not quite the same as Ballentine's, I gather) leads to the opposite conclusion: that all mixtures are proper. But why not? Switch the onticity from the state vector to the reduced density matrix and back and it's hardly surprising that the nature of a mixture changes!
  17. Jul 31, 2015 #16
    [Mentor's note: Edited to remove gratuitous personal attacks]

    Now, there is a substantial point in #2 above and maybe d'Espagnat will touch upon it. My maths is very dicky so feel free to bin this suggestion, but I believe it is possible to "turn the superposition into a density matrix" in two ways. The first would explicitly include the decohering subsystem (e.g. the environment), making the description "still in a superposition" absolutely correct for the combined system. The second would ignore the state of the decohering subsystem, treating it as simply unknown and unknowable. Thus the state of the system of interest (e.g. the cat) decoheres, and one is left seeing it as a mixed state with no useful distinction between proper and improper.
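Those two descriptions can be put side by side numerically (a sketch with numpy; "cat" and "environment" are illustrative two-level labels). Including the environment, the combined density matrix is still pure, with purity Tr(ρ²) = 1; tracing the environment out leaves the cat alone in a maximally mixed state with purity 0.5.

```python
import numpy as np

alive = np.array([1, 0]); dead = np.array([0, 1])
e0 = np.array([1, 0]); e1 = np.array([0, 1])   # orthogonal environment records

# First way: keep the decohering subsystem. The combined cat+environment
# state is still a superposition and still pure.
psi = (np.kron(alive, e0) + np.kron(dead, e1)) / np.sqrt(2)
rho_full = np.outer(psi, psi)
purity_full = np.trace(rho_full @ rho_full).real   # = 1: pure

# Second way: ignore (trace out) the environment. The cat's reduced
# state is a diagonal, maximally mixed density matrix.
rho_cat = np.einsum('ikjk->ij', rho_full.reshape(2, 2, 2, 2))
purity_cat = np.trace(rho_cat @ rho_cat).real      # = 0.5: mixed
```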
    Last edited by a moderator: Aug 1, 2015
  18. Jul 31, 2015 #17
    Coin tossing fails to be either, because it is classical and can only be treated quantum-mechanically by postulating a particular preparation -- this will determine whether the coin inherits a superposition from a quantum event such as radioactive decay, or inherits a definite but unknown state from some other proper mixture created by something like wavefunction collapse.
  19. Jul 31, 2015 #18
    In principle QM applies to all systems, micro and macro. Therefore the coin is not classical, it is quantum.
  20. Aug 1, 2015 #19



    Staff: Mentor

    I do not think that speaks to Derek Potter's question. Consider the case in which we build up the interference pattern one particle at a time: we end up with a photographic plate with an interesting pattern of exposed and unexposed photosensitive granules on its surface. The granules are small, but they're still pretty clearly classical, so this is a purely classical object, no superposition at all. Sure, the incoming particles were in superposition, but that superposition collapsed when they were effectively subjected to a position measurement by the interaction with a particular granule on the surface of the plate.

    So we're still confronted with the measurement problem: how did we get from unitary evolution of the (superposed) wave function of the incoming particles to this particular combination of exposed and unexposed photosensitive granules on the surface of the plate?
  21. Aug 1, 2015 #20



    Staff: Mentor

    Consider Schrodinger's cat. Decoherence tells us that the wave function of the cat+detector+nucleus will very quickly evolve into a form that has negligible interference between "live cat" and "dead cat". After decoherence we just have a simple probabilistic statement based on incomplete information (cat is alive with probability ##x##, cat is dead with probability ##y##, ##x+y=1##, the only reason we don't know which is that we haven't looked yet) no different than the statement that we'd make about a tossed coin. As with the tossed coin, the density matrix for the two outcomes is diagonal and we have a probability distribution of outcomes, both of which are classical.
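The "negligible interference" claim can be illustrated in a toy model (a numpy sketch; the two-level environment and the overlap parameter are invented for the illustration): the cat's off-diagonal density-matrix element is proportional to the overlap of the two environment states, which decoherence drives toward zero, leaving only the diagonal probabilities ##x## and ##y##.

```python
import numpy as np

def cat_reduced_rho(overlap):
    """Reduced cat density matrix when the two environment states
    have inner product `overlap` (1 = no decoherence, 0 = complete)."""
    alive = np.array([1.0, 0.0]); dead = np.array([0.0, 1.0])
    # Two environment states with the requested overlap.
    e_alive = np.array([1.0, 0.0])
    e_dead = np.array([overlap, np.sqrt(1 - overlap**2)])
    psi = (np.kron(alive, e_alive) + np.kron(dead, e_dead)) / np.sqrt(2)
    rho = np.outer(psi, psi)
    # Trace out the environment, keeping the cat's 2x2 state.
    return np.einsum('ikjk->ij', rho.reshape(2, 2, 2, 2))

# Interference term = 0.5 * overlap: it is ~0.5 before decoherence and
# vanishes afterwards, leaving the diagonal probabilities x = y = 0.5.
print(cat_reduced_rho(1.0)[0, 1])  # ~ 0.5
print(cat_reduced_rho(0.0)[0, 1])  # ~ 0.0
```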

    But there is still something unexplained here. Why do we get one result, either "live cat" or "dead cat"? How and when is the actual outcome selected? In the case of the tossed coin the mixed state reflects our ignorance of the complete initial state of the coin and its interaction with the floor; with a complete specification of the initial system state classical physics would lead to a deterministic prediction of the coin's final state. The same is not true of the evolution of the quantum mechanical state - there's nothing in the evolution of the wave function through its interactions and entanglements with the apparatus that ever turns the probability distribution into a single sharply defined outcome. That's the problem that decoherence does not address in any interpretation.
    Last edited: Aug 1, 2015