
Physicist disentangles 'Schrodinger's cat' debate

  1. Aug 26, 2013 #1
    As per the title, the article is Physicist disentangles 'Schrodinger's cat' debate

    The paper is available here

    To be quite honest, I don't quite understand what he is trying to say. It looks as though he's trying to make calculations on entangled pairs by describing each system of the pair as a separate state, to show it's not in a superposition, but then he goes on to describe the MS as being in a coherent superposition.

    I'm sure once a few people read the article they may be able to clarify what the author is trying to say, and how he derives his conclusion. I'm utterly confused by his approach.
     
    Last edited: Aug 26, 2013
  3. Aug 27, 2013 #2
    I think it is suggesting that it solves the measurement problem....

    :surprised

    .
     
  4. Aug 27, 2013 #3
    Another no-interpretation interpretation: it happens because it happens. At least there is no contradiction.
     
  5. Aug 27, 2013 #4
    There is no measurement problem if you assume entanglement selects an outcome.
     
  6. Aug 27, 2013 #5
    The author of that paper (Art Hobson) also holds the view that decoherence solves the measurement problem. A rebuttal of that paper was given by Ruth Kastner and was published very recently on arXiv.
     
    Last edited: Aug 27, 2013
  7. Aug 27, 2013 #6
    The author says this, but is his conclusion correct based on his premises?
     
  8. Aug 27, 2013 #7
    Of course not.
     
  9. Aug 27, 2013 #8
    StevieTNZ, thanks for posting the paper! (I read the article before and could not find the paper).

    That's also the impression I got of the paper after glancing through it. I haven't digested it more thoroughly, so I've got nothing further to add at this moment.
     
  10. Aug 28, 2013 #9

    naima (Gold Member)

    Why does he say that (4) implies <Q> = 0?
     
  11. Aug 28, 2013 #10
    IMHO, the author demonstrates quite clearly that he has not really understood decoherence theory and its implications. I wonder why this gets so much attention. To be honest, I'm even surprised it got through peer review.

    Cheers,

    Jazz
     
  12. Aug 28, 2013 #11

    bhobba (Science Advisor, Gold Member)

    Indeed there isn't - but that assumption entails the core issue and is a massive can of worms. The key issue is: can an improper mixed state be considered a proper one? Observationally they are equivalent - but are they really equivalent? That's the can of worms.

    Basically he is holding to the decoherence ensemble interpretation, as do I. Rather than going through its pros and cons myself, here is a good paper on it:
    http://philsci-archive.pitt.edu/5439/1/Decoherence_Essay_arXiv_version.pdf
    'Postulating that although the system-apparatus is in an improper mixed state, we can interpret it as a proper mixed state superficially solves the problem of outcomes, but does not explain why this happens, how or when. This kind of interpretation is sometimes called the ensemble, or ignorance interpretation. Although the state is supposed to describe an individual quantum system, one claims that since we can only infer probabilities from multiple measurements, the reduced density operator [itex]\rho_{SA}[/itex] is supposed to describe an ensemble of quantum systems, of which each member is in a definite state.'

    The bottom line is the conclusion:
    'Decoherence theorists have generally come to accept the criticisms above, and accept that decoherence alone does not solve the problems of outcomes, and therefore leaves the most essential question untouched.'

    I, however, make the assumption that observationally equivalent systems are equivalent, so the problem is solved. Whether you are happy with that or not, only you can decide.

    Thanks
    Bill
     
  13. Aug 28, 2013 #12
    Reading the paper today in order to avoid doing actual work, it seems to me that all Hobson is really doing is attempting to handwave away the distinction between proper and improper mixtures, and then attaching an ignorance interpretation to the latter. Basically, he starts with an entangled state like

    [tex]|\Psi_{MO}\rangle=c_1|M_1\rangle|O_1\rangle + c_2|M_2\rangle|O_2\rangle,[/tex]

    which can be interpreted as the post-measurement state of a measurement system M and an object system O that started out in a superposition of its two accessible states [itex]|O_1\rangle[/itex] and [itex]|O_2\rangle[/itex], with the index of M indicating which state was detected.

    This can be described equivalently by the density matrix

    [tex]\rho_{MO}=|\Psi\rangle \langle\Psi| = |c_1|^2|M_1O_1\rangle\langle O_1 M_1| + |c_2|^2|M_2 O_2\rangle\langle O_2 M_2|[/tex][tex] + c_1c_2^\star|M_1 O_1\rangle\langle O_2 M_2| + c_2c_1^\star|M_2 O_2\rangle\langle O_1 M_1|[/tex]

    In order to now get a mathematical object that describes either of the systems on its own, one uses the procedure known as 'tracing out' the other subsystem, that is, one performs the partial trace over, say, the object system to get the description of the measurement system:

    [tex]\rho_M=\mathrm{tr}_O\rho_{MO}=|c_1|^2|M_1\rangle\langle M_1| + |c_2|^2|M_2\rangle\langle M_2|[/tex]

    Mathematically, this is the same object that one would use to describe a system that is prepared either in the state [itex]|M_1\rangle[/itex] or [itex]|M_2\rangle[/itex] with a respective probability of [itex]|c_1|^2[/itex] or [itex]|c_2|^2[/itex]. However---and this is where the argument goes wrong, I believe---in case this object is arrived at by tracing out the degrees of freedom of another subsystem, one can't interpret it in the way that the system is in fact in either of the states [itex]|M_1\rangle[/itex] or [itex]|M_2\rangle[/itex], and we just don't know which. This is the basis of the distinction between 'improper' (arrived at by tracing) and 'proper' (arrived at by epistemic uncertainty) mixtures, due originally to Bernard d'Espagnat, I believe.
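
    Just to make the partial-trace step concrete, here is a small numpy sketch (my own illustration, not anything from Hobson's paper; the amplitudes are arbitrary placeholders) that builds [itex]\rho_{MO}[/itex] and traces out O, giving exactly the diagonal, i.e. improper, mixture [itex]\rho_M[/itex] above:

[code]
import numpy as np

# Single-qubit basis states standing in for |M_1>, |M_2> and |O_1>, |O_2>.
ket1, ket2 = np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)

# Entangled state |Psi_MO> = c1|M_1>|O_1> + c2|M_2>|O_2> with placeholder amplitudes.
c1, c2 = np.sqrt(0.3), np.sqrt(0.7)
psi_MO = c1 * np.kron(ket1, ket1) + c2 * np.kron(ket2, ket2)

# Full density matrix rho_MO = |Psi><Psi| (contains the off-diagonal c1 c2* terms).
rho_MO = np.outer(psi_MO, psi_MO.conj())

# Partial trace over O: reshape into (M, O, M', O') and sum over O = O'.
rho_M = np.trace(rho_MO.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(np.round(rho_M.real, 3))
# [[0.3 0. ]
#  [0.  0.7]]   i.e. |c1|^2 |M_1><M_1| + |c2|^2 |M_2><M_2|, no interference terms
[/code]

    The coherences of [itex]\rho_{MO}[/itex] simply never show up in [itex]\rho_M[/itex], which is why the local description looks like classical ignorance even though it isn't.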

    Now, Hobson is well aware of the distinction, but---as best I can gather---he believes it doesn't matter, as locally, we can't tell a difference between the two. This much is true. But nevertheless, there is an important and simple distinction (one that has direct experimental consequences). For consider now the state [itex]|\Psi\rangle[/itex] to describe an EPR pair, say, electrons entangled regarding their spin properties, in the possession of Martha and Oliver (sorry for introducing nonstandard nomenclature here).

    Then, both parties would quite reasonably consider the part of the state in their possession to be described by a mixture such as [itex]\rho_M[/itex] above. But interpreting their state as a proper mixture of this form---that is, as being actually in a definite state they just don't happen to know---has important consequences, not for the outcomes obtained in their local experiments, but for the correlations between these outcomes. For they would judge that the total state must be just the tensor product of their local states, which looks like this:

    [tex]\rho_{MO}^\prime=\rho_M\otimes\rho_O=(|c_1|^2|M_1\rangle\langle M_1| + |c_2|^2|M_2\rangle\langle M_2|)\otimes(|c_1|^2|O_1\rangle\langle O_1| + |c_2|^2|O_2\rangle\langle O_2|),[/tex]

    which is a state corresponding to an epistemic mixture of the possibilities [itex]|M_1 O_1\rangle[/itex], [itex]|M_2 O_2\rangle[/itex], [itex]|M_1 O_2\rangle[/itex] and [itex]|M_2 O_1\rangle[/itex]---four possibilities, while there were only two in the original state [itex]\rho_{MO}[/itex], corresponding to the fact that entanglement means that whenever Martha detects the state [itex]|M_1\rangle[/itex], Oliver detects [itex]|O_1\rangle[/itex], and whenever she detects [itex]|M_2\rangle[/itex], Oliver detects [itex]|O_2\rangle[/itex]. In assuming that their local states are epistemic mixtures of two distinct possibilities, that is, that they really are in one of two possible states---the analogue of obtaining a definite measurement outcome---Martha and Oliver can no longer account for the correlations between their measurements.
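
    If it helps, here is a quick numerical check of that point (same placeholder amplitudes as above, again purely my own illustration): the joint outcome probabilities from the entangled [itex]\rho_{MO}[/itex] show only the two correlated outcomes, while the product of the marginals spreads the probability over all four:

[code]
import numpy as np

ket1, ket2 = np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)
c1, c2 = np.sqrt(0.3), np.sqrt(0.7)

psi = c1 * np.kron(ket1, ket1) + c2 * np.kron(ket2, ket2)
rho_MO = np.outer(psi, psi.conj())

# Marginals via partial trace, then the (wrong) product-state reconstruction.
rho_M = np.trace(rho_MO.reshape(2, 2, 2, 2), axis1=1, axis2=3)
rho_O = np.trace(rho_MO.reshape(2, 2, 2, 2), axis1=0, axis2=2)
rho_prod = np.kron(rho_M, rho_O)

# Joint probabilities P(M_i, O_j) = <M_i O_j|rho|M_i O_j>, in the order 11, 12, 21, 22.
basis = [np.kron(m, o) for m in (ket1, ket2) for o in (ket1, ket2)]
for label, rho in [("entangled rho_MO     ", rho_MO), ("product rho_M x rho_O", rho_prod)]:
    print(label, np.round([np.real(v.conj() @ rho @ v) for v in basis], 2))

# entangled rho_MO      [0.3  0.   0.   0.7 ]  -> two perfectly correlated outcomes
# product rho_M x rho_O [0.09 0.21 0.21 0.49]  -> four outcomes, correlations gone
[/code]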

    Now, in response to this, Hobson appears to put up some handwaving about how it's the correlations that are in a superposition, not the states, but it's unclear to me what that's supposed to mean. And regardless, the problem lies in his assertion that in an entangled state such as [itex]|\Psi_{MO}\rangle[/itex], one can always say that (in his example) 'one photon measures the other', leading thus to definite states; but for any pair of entangled photons, I can do either of two things: I can measure the photons individually, obtaining definite outcomes; or, I can measure the entanglement of the state, which is incompatible with the assertion that the photons are in definite states (at least in standard QM). The fact that I can make that choice and must use different prescriptions to account for my observations is exactly what the measurement problem is all about; in Hobson's proposal, once two photons are entangled, and one has 'measured' the other, I would never observe any of the phenomena that make quantum mechanics so interesting---superposition, entanglement, interference, etc.
     
    Last edited: Aug 28, 2013
  14. Aug 28, 2013 #13

    bhobba (Science Advisor, Gold Member)

    I don't think it's so much that he doesn't understand it as that he is making a key assumption and is not up-front about it. Decoherence does not solve the measurement problem without further assumptions - claims otherwise are wrong.

    Thanks
    Bill
     
  15. Aug 28, 2013 #14

    bhobba (Science Advisor, Gold Member)

    Bingo - you got it in one.

    However, he may have something slightly different in mind but is not spelling it out. He states:
    'That phenomenon must be taken into account to resolve the measurement problem, he said. That means with Schrodinger's cat, the cat is no longer predicted to be both dead and alive. It is instead dead if the nucleus decays, and alive if the nucleus does not decay, just as one would expect.'

    There are some interpretations, like the transactional interpretation and its variants, that suggest it's an influence travelling back in time from the observing apparatus.

    Thanks
    Bill
     
  16. Aug 28, 2013 #15

    naima (Gold Member)

    I think the proof of the theorem must be correct.
    I understand the [itex]2\mathrm{Re}(\beta \gamma^*)[/itex] and [itex]2\mathrm{Im}(\beta \gamma^*)[/itex] terms,
    but not what follows. Why do the expectation values of the Qs and Ps have to be 0?
    I suppose it is something basic.
     
  17. Aug 28, 2013 #16
    This was one of Kastner's criticisms of Hobson's argument:
    Measurement: still a problem in standard quantum theory
    http://arxiv.org/ftp/arxiv/papers/1308/1308.4272.pdf
     
  18. Aug 28, 2013 #17

    bhobba (Science Advisor, Gold Member)

    From the above paper:
    'rather than by invoking observation to try to explain the observation that we see collapsed states when we perform measurements.'

    I want to add that the above doesn't disprove it either. It's basically philosophical mumbo jumbo (by which I mean a play on words). To make decoherence work as a resolution of the measurement problem, the simplest and easiest assumption is that observationally equivalent systems are equivalent. You are not using observation to explain observation; you are saying that because observation can't tell a difference, there is no difference.

    And the claim that it's wrong because you are contradicting the initial assumption that it's in a pure state is incorrect. The system, environment, and observational apparatus start out in a pure state and by unitary evolution must remain in a pure state, but because they have become entangled, tracing over the environment leaves the observational apparatus and system in an improper mixed state. This is the key point - no contradiction.
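
    To illustrate the point with a toy example (my own sketch, with a CNOT standing in for the measurement interaction - nothing taken from the papers): the global state stays pure under the unitary, but the reduced state of the apparatus comes out as an improper mixed state:

[code]
import numpy as np

ket0, ket1 = np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)

# System in a superposition, apparatus in its 'ready' state |0>.
system = (ket0 + ket1) / np.sqrt(2)
psi_in = np.kron(system, ket0)

# Toy measurement interaction: a CNOT copying the system's basis state
# into the apparatus. Purely unitary - no collapse is put in by hand.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
psi_out = CNOT @ psi_in
rho_out = np.outer(psi_out, psi_out.conj())

# Purity Tr(rho^2): 1 for a pure state, < 1 for a mixed one.
purity_global = np.trace(rho_out @ rho_out).real
rho_apparatus = np.trace(rho_out.reshape(2, 2, 2, 2), axis1=0, axis2=2)
purity_apparatus = np.trace(rho_apparatus @ rho_apparatus).real

print(round(purity_global, 3), round(purity_apparatus, 3))   # 1.0 0.5
[/code]

    Purity of the whole is preserved while the part ends up mixed - no contradiction anywhere.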

    In fact my favorite approach to QM makes that assumption right from the outset:
    http://arxiv.org/pdf/0911.0695v1.pdf
    Axiom 1. (Information capacity) An elementary system has the information carrying capacity of at most one bit. All systems of the same information carrying capacity are equivalent.

    It's not that such a position is logically incorrect or anything like that; it's simply that, to be correct, you should be upfront about it rather than tacitly assume it.

    Thanks
    Bill
     
    Last edited: Aug 28, 2013
  19. Aug 28, 2013 #18
    After discussing this article with Bruce Rosenblum, one of the authors of Quantum Enigma, he replies

    So much for peer review.
     
  20. Aug 28, 2013 #19
    That seems to be a trend in these physics journals.
    I guess not enough real science is being conducted, so they have to grab anything that might get someone to buy their articles.

    Capitalism101
     
  21. Aug 29, 2013 #20
    Hm, taking the following at face value: "[the cat is] dead if the nucleus decays, and alive if the nucleus does not decay", this seems rather like what Everett started out from, the realization that the quantum state has a propositional content that is only definite relative to some reference; i.e. the cat's state is only definite relative to the nucleus having some definite state. The problem is, of course, that the quantum state then does not contain any information about which one of the alternatives is definite.

    Today, most people believe that Everett had something like the 'many worlds' view in mind, i.e. that each outcome in some sense 'occurs' in a different world, or that the world splits in two after such a measurement. But some, like Tim Maudlin, believe he was trying to do something more subtle, instead considering the nature of facts to be fundamentally relational, like for instance tensed facts---'it is raining today'---depend on the specification of a value of 'today' for determining their truth.

    Besides, if he were trying to do something transactional, I should think that Ruth Kastner would have been far more happy with his article than she seems to be!

    Well, I guess most people would hold that the contradiction comes in at a later point. Say you have a Bell state distributed to two parties, Alice and Bob, i.e.
    [tex]|\Phi^+\rangle_{AB}=\frac{1}{\sqrt{2}}(|0_A0_B\rangle+|1_A1_B\rangle),[/tex]
    then both would justifiably consider their 'local state' to be the improper mixture
    [tex]\rho_A=\rho_B=\frac{1}{2}|0\rangle\langle 0|+\frac{1}{2}|1\rangle\langle 1|.[/tex]
    Attaching now an epistemic interpretation to these states, both would believe that their system really is in either the state [itex]|0\rangle[/itex] or [itex]|1\rangle[/itex]. But this straightforwardly entails the belief that the global state really is one of [itex]|00\rangle[/itex], [itex]|11\rangle[/itex], [itex]|01\rangle[/itex] or [itex]|10\rangle[/itex]. But this would be a state that produces different experimental results from the original Bell state, i.e. if they were simply to combine their two photons and perform measurements on the combination, they would invariably observe results that can't be explained by the state actually being in any of the four possible combinations; and from this, they must conclude that their local states couldn't possibly have been in a definite state, either.
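
    Here's a short numerical illustration of that (my own sketch, not from Hobson or Kastner): a projection onto [itex]|\Phi^+\rangle[/itex] succeeds with certainty on the Bell state itself, but with probability at most 1/2 on any of the four definite alternatives, and hence also on any ignorance mixture of them:

[code]
import numpy as np

ket0, ket1 = np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)

phi_plus = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
P_bell = np.outer(phi_plus, phi_plus.conj())        # projector onto |Phi+>

def bell_prob(rho):
    """Probability Tr(P rho) of passing the |Phi+> test."""
    return np.trace(P_bell @ rho).real

# The actual Bell state passes with certainty.
print(round(bell_prob(np.outer(phi_plus, phi_plus.conj())), 3))   # 1.0

# The four definite alternatives |00>, |01>, |10>, |11> pass with at most 0.5,
# so any epistemic mixture of them passes with at most 0.5 as well.
for a in (ket0, ket1):
    for b in (ket0, ket1):
        prod = np.kron(a, b)
        print(round(bell_prob(np.outer(prod, prod.conj())), 3))   # 0.5 0.0 0.0 0.5
[/code]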

    You've really got to explain these two things in a consistent manner in order to claim a solution to the measurement problem: 1) local measurements on the Bell state always produce definite outcomes, and 2) 'global' measurements (which can of course be done perfectly locally if one just takes the whole Bell state as a specific state of a four-level quantum system) produce results incompatible with the idea that the system is in some definite state. Hobson's approach really only attacks 1), and thus, just falls short (as far as I can see, at least).

    Now, he also makes some noises in the direction of so-called modal interpretations, alleging that they're the same kind of thing that he has in mind. But of course, modal interpretations are in fact very different beasts: there, you suppose that the quantum state really only gives you an overview of possibilities ('modalities'), not the full description of physical reality. The state then has to be augmented by what is actually the case, effectively attaching a certain definite system state to the quantum mechanical state in the manner of a hidden variable.

    So your total inventory includes 1) the quantum state, [itex]|\psi\rangle=\sum_i c_i |i\rangle[/itex], and 2) the 'value state', some concrete state [itex]|i\rangle[/itex], which represents the actual 'ontic' content of the theory. What the precise value state is depends on the quantum state, hopefully in such a manner as to not be vulnerable to the argument given above; how this works explicitly differs among modal theories, ranging from 'hand-picking' the value state to giving explicit dynamics for it, similar to the velocity field equation in Bohmian mechanics (which in fact can be regarded as a certain kind of modal interpretation in which it is always the value of the position observable that is definite). Different modal interpretations also have different problems: a few have fallen prey to Kochen-Specker type contradictions, while others have the somewhat disheartening feature that the observable definite for the total system may be completely different from the observable definite for some subsystems (which I think may be regarded as a remnant of the improper-mixture problem).
     