Entanglement: spooky action at a distance

  • Thread starter: Dragonfall
  • Tags: Entanglement
  • #51


DrChinese said:
Yes, it is confusing. The first thing to do is to go back to traditional QM. Don't try to escape it by positing that a classical explanation will be discovered that saves us. According to Bell's Theorem, that won't happen.

That leaves us with such "paradoxes" as: the Heisenberg Uncertainty Principle (which denies reality to non-commuting operators); wavefunction collapse (which appears to be non-local); virtual particles (where do they come from, and where do they go); and conservation laws (which apply to "real" particles, even space-like separated entangled ones).

Clearly, trying to get a common sense picture of these is essentially impossible as we are no closer after 80 years of trying. So we must be content, for now, with the mathematical apparatus. And that remains a solid victory for physical science.
Thanks DrChinese -- I don't view the uncertainty relations, or wavefunction collapse, or virtual particles, or the application of the law of conservation of angular momentum in certain Bell tests as paradoxical.

I think I should reread what's been written in these forums, Bell's papers, lots of other papers I've been putting off, your page, etc. and then get my thoughts in order. By the way, I'm still hoping for some sort of classically analogous way of understanding quantum entanglement and the EPR-Bell correlations. :smile:
 
  • #52


ThomasT said:
Thanks DrChinese -- I don't view the uncertainty relations, or wavefunction collapse, or virtual particles, or the application of the law of conservation of angular momentum in certain Bell tests as paradoxical.

I think I should reread what's been written in these forums, Bell's papers, lots of other papers I've been putting off, your page, etc. and then get my thoughts in order. By the way, I'm still hoping for some sort of classically analogous way of understanding quantum entanglement and the EPR-Bell correlations. :smile:

Thomas,

There are plenty of misleading accounts of Bell's theorem and the current state of affairs out there. I have spent several years going through them and finding the diamonds in the rough. So, from my experience, in addition to those specific Bell papers, I also strongly recommend these two books on QM nonlocality - they are by far the best around:

"Quantum Nonlocality and Relativity"
Tim Maudlin
https://www.amazon.com/dp/0631232214/?tag=pfamazon01-20

"Time's Arrow and Archimedes Point"
Huw Price
https://www.amazon.com/dp/0195117980/?tag=pfamazon01-20

~M
 
  • #53


ThomasT said:
Thanks for the input. I paraphrased Pagels incorrectly I think. Here's what he actually concluded:

We conclude that even if we accept the objectivity [realism, etc.] of the microworld then Bell's experiment does not imply actual nonlocal influences. It does imply that one can instantaneously change the cross-correlation of two random sequences of events on other sides of the galaxy. But the cross-correlation of two sets of widely separated events is not a local object and the information it may contain cannot be used to violate the principle of local causality.

So, is an understanding of the entangled data (correlations) produced in EPR-Bell tests in terms of a common cause produced at emission possible? Also, what do you think of the analogy with the simplest optical Bell tests with a polariscope? Of course, if the deep physical origin of Malus' Law is a mystery, then quantum entanglement is still a mystery, but at least we'd have a classical analog.


I have a hard time understanding how Pagels could possibly have reached that conclusion. Indeed, it even contradicts Bell's own conclusions. It looks confused. But can you give me the reference?

<< So, is an understanding of the entangled data (correlations) produced in EPR-Bell tests in terms of a common cause produced at emission possible? >>

No. But, as I said earlier, there is the common past hypothesis (that the emission and detection events share a common past) that is logically possible, although extremely implausible. Bell talks about this in his paper "Free Variables and Local Causality". More plausible and successful have been the nonlocal explanations, as well as the causally symmetric explanations.

If you would like a classical analogue of Bell's inequality and theorem, read the first chapter of Tim Maudlin's book. He gives a perfectly clear and accurate classical analogue.
 
  • #54


Maaneli said:
Notice that "realism" is not at all the issue in Bell's theorem, despite the common claim that it is.

I claim it is. When Bell says that there is a simultaneous A, B and C (circa his [14] in the original), he is invoking realism. He says "It follows that c is another unit vector...". His meaning is that if there are an a, b and c simultaneously, then there must be internal consistency: there must be an outcome table that yields non-negative probabilities for every permutation of outcomes at a, b and c.

Bell's [14] is a reference to Einstein's realism condition, which Einstein claimed was a reasonable assumption. Bell saw this would not work: there could not be internal consistency if there were pre-determined outcomes at all possible measurement settings. Of course, that would violate the HUP anyway, but Einstein believed the HUP was not a description of reality. He said so in EPR. He assumed that, at most, the HUP was a limitation on our observational powers but not representative of reality. He said that the moon was there even when it was not being observed...
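To see concretely how this realism assumption does the work, here is a minimal sketch (my own illustration, not anything from Bell's paper). It enumerates every possible outcome table for three settings, with the partner particle perfectly anticorrelated, and checks Bell's bound:

```python
from itertools import product

# Realism assumption: each particle carries definite answers A, B, C
# (each +1 or -1) for the three analyzer settings a, b, c, and its
# entangled partner gives the opposite answer at the same setting.
# The pair correlation for a given table is then E(x, y) = -X*Y.
for A, B, C in product([+1, -1], repeat=3):
    E_ab, E_ac, E_bc = -A * B, -A * C, -B * C
    # Bell's inequality holds for every one of the 8 conceivable
    # outcome tables, hence for any statistical mixture of them:
    assert abs(E_ab - E_ac) <= 1 + E_bc

# QM for the singlet state predicts E(x, y) = -cos(angle between x, y),
# which violates the bound at, e.g., a = 0, b = 90, c = 45 degrees:
# |E_ab - E_ac| = 0.707 > 1 + E_bc = 0.293.
print("all 8 outcome tables satisfy Bell's inequality")
```

Deny that A, B and C exist simultaneously and there are no tables to enumerate - which is the sense in which the inequality cannot even be written down.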
 
  • #55


DrChinese said:
I claim it is. When Bell says that there is a simultaneous A, B and C (circa his [14] in the original), he is invoking realism. He says "It follows that c is another unit vector...". His meaning is that if there are an a, b and c simultaneously, then there must be internal consistency: there must be an outcome table that yields non-negative probabilities for every permutation of outcomes at a, b and c.

Bell's [14] is a reference to Einstein's realism condition, which Einstein claimed was a reasonable assumption. Bell saw this would not work: there could not be internal consistency if there were pre-determined outcomes at all possible measurement settings. Of course, that would violate the HUP anyway, but Einstein believed the HUP was not a description of reality. He said so in EPR. He assumed that, at most, the HUP was a limitation on our observational powers but not representative of reality. He said that the moon was there even when it was not being observed...


I know you claim it is, but it contradicts Bell's understanding of his own theorem (which should give you pause). Let me challenge you to come up with a logically coherent prediction, in terms of an inequality, without the realism assumption. My claim is that the whole theorem falls apart into an incoherent mess if you remove realism. Whereas if you remove locality or causality, or modify the Kolmogorov probability axioms, you can still construct a well-defined inequality that can be empirically tested. Let me also recommend having a look at Bell's paper "La Nouvelle Cuisine" in Speakable and Unspeakable in QM and ttn's paper "Against Realism":

Against `Realism'
Authors: Travis Norsen
Foundations of Physics, Vol. 37 No. 3, 311-340 (March 2007)
http://arxiv.org/abs/quant-ph/0607057
 
  • #56


<< Bell's [14] is a reference to Einstein's realism condition, which Einstein claimed was a reasonable assumption. Bell saw this would not work: there could not be internal consistency if there were pre-determined outcomes at all possible measurement settings. >>

No that's completely incorrect (if I correctly understand what you're trying to say). Realism is just fine even if you give up locality or causality or Kolmogorov axioms of probability. Seriously, have a look at La Nouvelle Cuisine and Travis' paper.

<< Of course, that would violate the HUP anyway, but Einstein believed the HUP was not a description of reality. He said so in EPR. He assumed that, at most, the HUP was a limitation on our observational powers but not representative of reality. >>

Dude that's the point. Einstein's generic notion of realism was tested against Heisenberg's (quite frankly incoherent) positivist interpretation of the UP in QM, and was shown to be perfectly OK so long as you gave up either locality or causality. By the way, the UP was actually discovered first by Fourier in relation to classical waves, so I would prefer to call it the FUP (Fourier Uncertainty Principle).
 
  • #57


Maaneli said:
I know you claim it is, but it contradicts Bell's understanding of his own theorem (which should give you pause). Let me challenge you to come up with a logically coherent prediction, in terms of an inequality, without the realism assumption. My claim is that the whole theorem falls apart into an incoherent mess if you remove realism. Whereas if you remove locality or causality, or modify the Kolmogorov probability axioms, you can still construct a well-defined inequality that can be empirically tested. Let me also recommend having a look at Bell's paper "La Nouvelle Cuisine" in Speakable and Unspeakable in QM and ttn's paper "Against Realism":

Against `Realism'
Authors: Travis Norsen
Foundations of Physics, Vol. 37 No. 3, 311-340 (March 2007)
http://arxiv.org/abs/quant-ph/0607057

Well, Travis and I have had a long-standing disagreement on this subject in these forums - and I am well aware of his paper (and the others like it). Norsen bends the history of EPR and Bell to suit his objective, which is obviously to push non-locality as the only viable possibility. He also bends semantics, as far as I am concerned.

You do not need Bell's additional editorial comment either (he said a lot of things afterwards), when his original paper stands fine as is. So no, it does not give me pause. Einstein was not always right, either, and if he were alive today I think he would acknowledge Bell's insight for what it was.

The situation is quite simple really:

a) If particles have no simultaneous A, B and C polarizations independent of the act of observation (as is implied, but not required, by the HUP), then there is no Bell's Theorem (per Bell's [14]). This is the realism requirement as I mentioned, and this is NECESSARY to construct the inequality. Without it, there is nothing - so your challenge is impossible as far as I am concerned.

b) Separately from Bell, the GHZ Theorem comes to an anti-realistic conclusion which does not require the locality condition. As I see it, this is fully consistent with Bell while non-local explanations are not. However, many reject GHZ and other anti-realism proofs (I'm sure you know the ones) for philosophical reasons.

c) Bell's paper was a brilliant answer to EPR's "conclusion" (completely unjustified) that realism was reasonable as an assumption. Bell showed that either Einstein's realism or his beloved locality (or both) would need to be rejected. Bell was obviously aware of Bohmian Mechanics at the time (since he mentions it), but I would hardly call that part of Bell's paper's conclusion itself.

I happen to believe that there is a causality condition implied in the Bell proof. In other words: if the future can influence the past, then that should allow a mechanism for Bell test results to be explained without resorting to a non-local or a non-realistic solution. If time is symmetric (as theory seems to suggest), then this should be possible. On the other hand, a lot of people would probably equate such a possibility to either a non-local or non-realistic solution anyway.

At any rate, failure to explicitly acknowledge the anti-realism viewpoint does a great disservice to the readers of this board. My viewpoint is mainstream opinion and Norsen's is not. As best I recall, most of the influential researchers in the area - Zeilinger, Aspect, etc. - all adopt this position: namely, that realism and locality assumptions are embedded in the Bell paper, and (given experimental results) at least one must be rejected.
 
  • #58


Maaneli said:
By the way, the UP was actually discovered first by Fourier in relation to classical waves, so I would prefer to call it the FUP (Fourier Uncertainty Principle).

That is a fairly strange way of thinking, and certainly puts you in a very small group. Even Fourier would have been surprised to find that he was the true discoverer of the HUP a hundred years before Heisenberg (and long before the existence of atoms was consensus). Do you just not like Heisenberg for some reason?
 
  • #59


Agree with everything Dr. Chinese said. I'd also add that the interpretive difficulties always arise when we stray from the math and delve into philosophy using cushy terms like realism, determinism, superdeterminism, and the like. Of course I'm guilty of it too. :)

We have to see Bell's inequality for what it is: the consequence of an assumption which Aspect and others have proven wrong. While we all agree on what that assumption is mathematically, we can't agree on what it means physically. But at the very least, we should be focusing on the assumption, and not any author's (including Bell's own) editorial comments or beliefs regarding it.
 
  • #60
DrChinese said:
Well, Travis and I have had a long-standing disagreement on this subject in these forums - and I am well aware of his paper (and the others like it). Norsen bends the history of EPR and Bell to suit his objective, which is obviously to push non-locality as the only viable possibility. He also bends semantics, as far as I am concerned.


Well, I disagree with your assessment of his work. Travis is quite accurate in his characterization of Bell's theorem, even though I have some disagreements with him about what conclusions we can draw from it today. Also, he doesn't bend semantics - he's just very meticulous and insists on philosophical and logical rigor, which is something everyone should strive for in discussing QM foundations.




DrChinese said:
You do not need Bell's additional editorial comment either (he said a lot of things afterwards), when his original paper stands fine as is. So no, it does not give me pause.


Yes, you do need Bell's additional commentaries from his other papers. There are lots of subtle and implicit assumptions in his original paper that he made much more explicit and tried to justify in other papers like "La Nouvelle Cuisine", where he clarifies his definition of local causality, and "Free Variables and Local Causality", where he justifies his assumption of causality but also emphasizes the additional possibilities involved in giving up the causality assumption.




DrChinese said:
Einstein was not always right, either, and if he were alive today I think he would acknowledge Bell's insight for what it was.


I agree Einstein was not always right and that he would probably acknowledge Bell's theorem; but I suspect we have different opinions about what exactly Bell's insight is.



DrChinese said:
The situation is quite simple really:

a) If particles have no simultaneous A, B and C polarizations independent of the act of observation (as is implied, but not required, by the HUP), then there is no Bell's Theorem (per Bell's [14]). This is the realism requirement as I mentioned, and this is NECESSARY to construct the inequality. Without it, there is nothing - so your challenge is impossible as far as I am concerned.


Yes, this was exactly my point. I think you misunderstood me before. Indeed the form of realism you generally suggest is an absolutely necessary pin in the logic of the theorem (or any physics theorem for that matter; in fact, that realism assumption is no different from the realism assumptions in, say, the fluctuation-dissipation theorem or Earnshaw's theorem, both of which are theorems in classical physics). But it is completely false to say that realism is necessarily falsified by a violation of the Bell inequalities. There are other assumptions in Bell's theorem, if you recall, which can be varied without making the general mathematical logic of the inequality derivation inconsistent. They are, once again:

1) The Kolmogorov axioms of classical probability are valid.
2) Locality is valid (the propagation speed of causal influences between two events is bounded by the speed of light, c).
3) Causality is valid ("future" or final measurement settings are "free" or random variables).

One can drop any one of these assumptions without falsifying realism. Well, if you drop 3) and replace it with a common past hypothesis or a form of backwards causation, as Huw Price and others have suggested, then you just have to modify your notion of realism in a particular way (there is a literature on this, you know). That's not the same, however, as saying that realism gets falsified.
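For concreteness, here is the standard way assumption 2) enters the derivation - a conventional textbook formulation, not a quotation from Bell:

```latex
% Local causality: given the complete state \lambda of the pair, each
% outcome depends only on its own local setting, so the joint
% probability factorizes:
P(A, B \mid a, b, \lambda) = P(A \mid a, \lambda)\, P(B \mid b, \lambda)
% Observed correlations are averages over the distribution of \lambda:
E(a,b) = \int \rho(\lambda) \sum_{A,B = \pm 1} A\,B\, P(A, B \mid a, b, \lambda)\, d\lambda
% Inequalities such as |E(a,b) - E(a,c)| \le 1 + E(b,c) follow from this
% factorization together with assumptions 1) and 3); dropping any one of
% the three blocks the derivation while leaving "realism" about \lambda
% itself untouched.
```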




DrChinese said:
b) Separately from Bell, the GHZ Theorem comes to an anti-realistic conclusion which does not require the locality condition. As I see it, this is fully consistent with Bell while non-local explanations are not. However, many reject GHZ and other anti-realism proofs (I'm sure you know the ones) for philosophical reasons.


What are you talking about? Of course the GHZ theorem assumes a locality condition, just as Bell does. And no it doesn't come to any anti-realistic conclusion whatsoever. That's a very serious error. If you don't understand any of that, then you have to return to some basics. In particular, have a read of this recent article by Zeilinger and Aspelmeyer.

http://physicsworld.com/cws/article/print/34774;jsessionid=B55E9395A8ED10334930389C70494F9B

So far, all tests of both Bell’s inequalities and on three entangled particles (known as GHZ experiments) (see “GHZ experiments”) confirm the predictions of quantum theory, and hence are in conflict with the joint assumption of locality and realism as underlying working hypotheses for any physical theory that wants to explain the features of entangled particles.

Yes, they do talk about GHZ as if it puts constraints on "local realism"; but, again, I have shown that realism is a complete red herring in the context of Bell or GHZ. And of course I am not the only person with this view. It is quite well understood by the top philosophers of physics and physicists in QM foundations like David Albert, Tim Maudlin, Huw Price, Sheldon Goldstein, Guido Bacciagaluppi, Jeff Bub, David Wallace, Harvey Brown, Simon Saunders, etc. Zeilinger and Aspelmeyer are quite in the minority in that understanding among QM foundations specialists, and that should give you pause on that particular issue. But to make this even clearer: the deBB theory (a nonlocal realist contextual HV theory) perfectly explains the results of GHZ, which Zeilinger himself acknowledges (because he understands deBB involves a joint assumption of realism and nonlocality). So there is no refutation of realism on its own at all in GHZ.

Also, it just occurred to me that you might be confusing the Leggett inequality (which that article also discusses) with the GHZ inequality. I highly recommend getting clear on those differences.



DrChinese said:
c) Bell's paper was a brilliant answer to EPR's "conclusion" (completely unjustified) that realism was reasonable as an assumption. Bell showed that either Einstein's realism or his beloved locality (or both) would need to be rejected. Bell was obviously aware of Bohmian Mechanics at the time (since he mentions it), but I would hardly call that part of Bell's paper's conclusion itself.


That's a total mischaracterization of the EPRB conclusion and of Bell's theorem. Bell showed that either locality or causality would need to be rejected. By the way, even though deBB was not a part of Bell's original paper, in his other papers he mentions it as a counterexample to the flawed misunderstanding physicists had (and still have) that his theorem refutes the possibility of Einsteinian realism in QM.




DrChinese said:
I happen to believe that there is a causality condition implied in the Bell proof. In other words: if the future can influence the past, then that should allow a mechanism for Bell test results to be explained without resorting to a non-local or a non-realistic solution. If time is symmetric (as theory seems to suggest), then this should be possible. On the other hand, a lot of people would probably equate such a possibility to either a non-local or non-realistic solution anyway.


Yes of course the causality condition is in Bell's theorem. That's not controversial or new. He discusses it in more detail in "La Nouvelle Cuisine" and "Free Variables and Local Causality" (see why it's a good idea to read his other papers?) and leaves open the possibility of some form of "superdeterminism", even though he himself regards it as very implausible. Later people like O. Costa de Beauregard, Huw Price, and others since have advanced the idea of using backwards causation to save locality and show how Bell and GHZ inequalities could be violated. Price discusses this at length in his book

"Time's Arrow and Archimedes Point"
http://www.usyd.edu.au/time/price/TAAP.html

and his papers:

Backward causation, hidden variables, and the meaning of completeness. PRAMANA - Journal of Physics (Indian Academy of Sciences), 56(2001) 199—209.
http://www.usyd.edu.au/time/price/preprints/QT7.pdf

Time symmetry in microphysics. Philosophy of Science 64(1997) S235-244.
http://www.usyd.edu.au/time/price/preprints/PSA96.html

Toy models for retrocausality. Forthcoming in Studies in History and Philosophy of Modern Physics, 39(2008).
http://arxiv.org/abs/0802.3230

You may also be interested to know that there exists a deBB model developed by Sutherland that implements backwards causation, is completely local, and reproduces the empirical predictions of standard QM:

Causally Symmetric Bohm Model
Authors: Rod Sutherland
http://arxiv.org/abs/quant-ph/0601095
http://www.usyd.edu.au/time/conferences/qm2005.htm#sutherland
http://www.usyd.edu.au/time/people/sutherland.htm

and his older work:

Sutherland R.I., 'A Corollary to Bell's Theorem', Il Nuovo Cimento B 88, 114-18 (1985).

Sutherland R.I., 'Bell's Theorem and Backwards-in-Time Causality', International Journal of Theoretical Physics 22, 377-84 (1983).

And just to emphasize, all these backwards causation models involve some form of realism.



DrChinese said:
At any rate, failure to explicitly acknowledge the anti-realism viewpoint does a great disservice to the readers of this board. My viewpoint is mainstream opinion and Norsen's is not. As best I recall, most of the influential researchers in the area - Zeilinger, Aspect, etc. - all adopt this position: namely, that realism and locality assumptions are embedded in the Bell paper, and (given experimental results) at least one must be rejected.


Whether your viewpoint is "mainstream" (and you still have to define what "mainstream" means to make it meaningful) or not is completely irrelevant. All that is relevant is the logical validity and factual accuracy of your understanding of these issues. But I can tell you that among QM foundations specialists, such as the people who participate in the annual APS conference on foundations of physics (which I have attended for the past 3 consecutive years):

New Directions in the Foundations of Physics
American Center for Physics, College Park, April 25 - 27, 2008
http://carnap.umd.edu/philphysics/conference.html

your opinion is quite the minority. Furthermore, I didn't imply that locality isn't embedded in Bell's theorem or that realism isn't embedded in Bell's theorem. I just said that the crucial conclusion of Bell's theorem (and Bell's own explicitly stated conclusion) is that QM is not a locally causal theory, not that it is not a locally real theory, whatever that would mean.

Let me also emphasize that, unlike what you seem to be doing in characterizing Bell's theorem as a refutation of realism, Zeilinger acknowledges that nonlocal hidden variable theories like deBB are compatible with experiments, even if he himself is an 'anti-realist'. By the way, anti-realists such as yourself or Zeilinger still have the challenge of coming up with a solution to the measurement problem and deriving the quantum-classical limit. Please don't try to invoke decoherence, since the major developers and proponents of decoherence theory like Zurek, Zeh, Joos, etc., are actually realists themselves - and even they admit that decoherence theory has not solved the measurement problem or accounted for the quantum-classical limit on its own, and probably never will. On the other hand, it is well acknowledged that nonlocal realist theories like deBB plus decoherence do already solve the problem of measurement and already accurately (even if not yet perfectly) describe the quantum-classical limit. So by my assessment, it is the anti-realist crowd that is in the minority and has much to prove.
 
  • #61


In order to reject a theory based on Bell's theorem alone, that theory should have the property that the events in one part of the experimental setup (source, detector 1, detector 2) should not depend on the other parts (the statistical independence assumption).

The only theories that satisfy this assumption (and are therefore unable to reproduce QM's predictions) are the "billiard ball"-type ones (no long-range forces, interactions only at direct collisions). Incidentally, Maxwell's electromagnetism, Newtonian gravity, and Einstein's GR all have long-range forces, so for them the statistical independence assumption is denied. Therefore a modification of Maxwell's theory, while remaining local and realistic, could in principle reproduce QM's predictions.

So the claim that local realism is excluded by Bell's theorem is patently false.
 
  • #62


Maaneli said:
1. Yes, this was exactly my point. I think you misunderstood me before. Indeed the form of realism you generally suggest is an absolutely necessary pin in the logic of the theorem

2. Later people like O. Costa de Beauregard, Huw Price, and others since have advanced the idea of using backwards causation to save locality and show how Bell and GHZ inequalities could be violated. Price discusses this at length in his book

"Time's Arrow and Archimedes Point"
http://www.usyd.edu.au/time/price/TAAP.html

and his papers:

Backward causation, hidden variables, and the meaning of completeness. PRAMANA - Journal of Physics (Indian Academy of Sciences), 56(2001) 199—209.
http://www.usyd.edu.au/time/price/preprints/QT7.pdf

Time symmetry in microphysics. Philosophy of Science 64(1997) S235-244.
http://www.usyd.edu.au/time/price/preprints/PSA96.html

Toy models for retrocausality. Forthcoming in Studies in History and Philosophy of Modern Physics, 39(2008).
http://arxiv.org/abs/0802.3230

You may also be interested to know that there exists a deBB model developed by Sutherland that implements backwards causation, is completely local, and reproduces the empirical predictions of standard QM:

Causally Symmetric Bohm Model
Authors: Rod Sutherland
http://arxiv.org/abs/quant-ph/0601095
http://www.usyd.edu.au/time/conferences/qm2005.htm#sutherland
http://www.usyd.edu.au/time/people/sutherland.htm

and his older work:

Sutherland R.I., 'A Corollary to Bell's Theorem', Il Nuovo Cimento B 88, 114-18 (1985).

Sutherland R.I., 'Bell's Theorem and Backwards-in-Time Causality', International Journal of Theoretical Physics 22, 377-84 (1983).

And just to emphasize, all these backwards causation models involve some form of realism.

3. Whether your viewpoint is "mainstream" (and you still have to define what "mainstream" means to make it meaningful) or not is completely irrelevant. All that is relevant is the logical validity and factual accuracy of your understanding of these issues. But I can tell you that among QM foundations specialists, such as the people who participate in the annual APS conference on foundations of physics (which I have attended for the past 3 consecutive years):

New Directions in the Foundations of Physics
American Center for Physics, College Park, April 25 - 27, 2008
http://carnap.umd.edu/philphysics/conference.html

your opinion is quite the minority. Furthermore, I didn't imply that locality isn't embedded in Bell's theorem or that realism isn't embedded in Bell's theorem. I just said that the crucial conclusion of Bell's theorem (and Bell's own explicitly stated conclusion) is that QM is not a locally causal theory, not that it is not a locally real theory, whatever that would mean.

4. Let me also emphasize that, unlike what you seem to be doing in characterizing Bell's theorem as a refutation of realism, Zeilinger acknowledges that nonlocal hidden variable theories like deBB are compatible with experiments, even if he himself is an 'anti-realist'. By the way, anti-realists such as yourself or Zeilinger still have the challenge of coming up with a solution to the measurement problem and deriving the quantum-classical limit. Please don't try to invoke decoherence, since the major developers and proponents of decoherence theory like Zurek, Zeh, Joos, etc., are actually realists themselves - and even they admit that decoherence theory has not solved the measurement problem or accounted for the quantum-classical limit on its own, and probably never will. On the other hand, it is well acknowledged that nonlocal realist theories like deBB plus decoherence do already solve the problem of measurement and already accurately (even if not yet perfectly) describe the quantum-classical limit. So by my assessment, it is the anti-realist crowd that is in the minority and has much to prove.

1. We agree on this point, and that was my issue.

2. Thank you for these references, there are a couple I am not familiar with and would like to study.

3. The issue with "mainstream" is that mainstream theory can be wrong - of course - but I think it is helpful for most folks to learn the mainstream before they reject it.

I see your point that there is a more diverse group out there, and so maybe the idea of "mainstream" is too broad to be so easily characterized. At any rate, I was not trying to say that the "anti-realism" view was the mainstream. I was trying to say that the mainstream view is that local realistic theory is not viable.

4. Repeating that I was not trying to advance the cause of "non-realism" other than showing it is one possibility. I agree that non-local solutions should be viable. In a lot of ways, they make more intuitive sense than non-realism anyway.

BTW, my point about GHZ was not that it proved non-realism over non-locality. It is another of the no-go proofs - of which there are several - which focus on the realism assumption. These proofs are taken in different ways by the community. Since we don't disagree on the main point, we can drop this particular sidebar.
 
  • #63


ueit said:
In order to reject a theory based on Bell's theorem alone, that theory should have the property that the events in one part of the experimental setup (source, detector 1, detector 2) should not depend on the other parts (the statistical independence assumption).

The only theories that satisfy this assumption (and are therefore unable to reproduce QM's predictions) are the "billiard ball"-type ones (no long-range forces, interactions only at direct collisions). Incidentally, Maxwell's electromagnetism, Newtonian gravity, and Einstein's GR all have long-range forces, so for them the statistical independence assumption is denied. Therefore a modification of Maxwell's theory, while remaining local and realistic, could in principle reproduce QM's predictions.

So the claim that local realism is excluded by Bell's theorem is patently false.

I seem to be stuck in the middle again... :)

There is no viable local realistic theory on the table to discuss at this point. You don't have one to offer, and "heroic" efforts by Santos and others (with variations on stochastic ideas) have so far fallen well short of convincing much of anyone. Bell's Theorem shows us how to dissect and attack such attempts. So I strongly disagree.
 
  • #64


Maaneli said:
Against `Realism'
Authors: Travis Norsen
Foundations of Physics, Vol. 37 No. 3, 311-340 (March 2007)
http://arxiv.org/abs/quant-ph/0607057

By the way, it might surprise you (and Travis for that matter) to learn that I have had a link to another of his earlier papers - somewhat similar to your citation - on my website for nearly 3 years:

Travis Norsen: EPR and Bell Locality, arXiv (2005)

...So please don't think that I limit my viewpoints. I respect differences of opinion and think they are healthy. But I also think that on this board, opinions should be distinguished from mainstream thought for the sake of those who don't follow things to the Nth degree.
 
  • #65


Thought experiment. Suppose we simulate, on a classical computer (the bits are manipulated using local deterministic rules), a world described by quantum mechanics. In this world live observers who can do experiments and verify that Bell's inequality is violated in exactly the way predicted by QM. Nevertheless, the world they live in is ultimately described by the rules according to which the bits in the computer are manipulated.
 
  • #66


Maaneli said:
I have a hard time understanding how Pagels could possibly have reached that conclusion. Indeed, it even contradicts Bell's own conclusions. It looks confused. But can you give me the reference?
"The Cosmic Code: quantum physics as the language of nature"

Maaneli said:
<< So, is an understanding of the entangled data (correlations) produced in EPR-Bell tests in terms of a common cause produced at emission possible?>>
No. But, as I said earlier, there is the common past hypothesis (that the emission and detection events share a common past) that is logically possible, although extremely implausible. Bell talks about this in his paper "Free Variables and Local Causality". More plausible and successful have been the nonlocal explanations, as well as the causally symmetric explanations.
Since the experimental designs seem to have (in my view anyway) common emission cause written all over them (and since the whole topic is open to speculation) I would rank that as more plausible than any of the other more exotic explanations for the correlations.

Maaneli said:
If you would like a classical analogue of Bell's inequality and theorem, read the first chapter of Tim Maudlin's book. He gives a perfectly clear and accurate classical analogue.
I take it you didn't like my polariscope analogy? (I really thought I had something there. :smile:)
I read what I could of Maudlin's first chapter at Google books. Nothing new or especially insightful there. I've read Price's book -- didn't like it. But thanks for the references and nice discussion with DrChinese et al.
I don't like the nonlocal explanations. Too easy. I'll continue, for the time being, working under the assumption that something (or things) about the physical meaning of Bell's theorem and Bell inequalities is being misinterpreted or missed.
 
  • #67


Interesting idea Count. I did something like this (I was a computer science guy in a previous life).

It is impossible, using a standard - non-quantum - computer, to simulate the results of EPRB experiments without utilizing both polarizer settings in calculating the odds of any particular photon passing.

The "Does Photon Pass Polarizer x" function simply cannot be written without reference to the other polarizer while still obtaining the quantum results.

If you try to do something elaborate - say, in the "generate entangled photons" function, you pre-program both of them for every conceivable polarizer angle - you come close to the quantum results, but not perfectly.

In order to reproduce the quantum results, you have to either:
1) allow the two photons to "know" the polarizer settings before they've reached them (some kind of superdeterminism) and agree ahead of time on how they're going to behave; or
2) check to see whether the twin has reached its polarizer yet; if not, just go with 50/50; if it has, behave in a complementary way (non-locality).

The third option would be some kind of many-worlds simulation where we let objects continue to evolve in superposition until someone observes both, but I thought that a little too complicated to code.
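For anyone who wants to try this at home, here is a stripped-down sketch of the comparison (my own toy code with made-up function names, not the original program): a local model whose per-photon answer uses only its own setting plus a common cause fixed at emission, versus the quantum coincidence rule, which needs both settings at once.

```python
import random, math

def local_pair():
    """One simple local recipe: a shared polarization angle fixed at
    emission, answered at each wing by an independent Malus-law coin
    flip that sees ONLY the local polarizer setting."""
    lam = random.uniform(0, math.pi)      # common cause, set at the source
    def answer(setting):
        return random.random() < math.cos(setting - lam) ** 2
    return answer, answer

def quantum_pair(a, b):
    """Quantum coincidence sampling: the joint outcome probabilities
    depend on BOTH settings - the step no local function can perform."""
    same = random.random() < math.cos(a - b) ** 2   # P(outcomes agree)
    x = random.random() < 0.5
    return x, x if same else (not x)

def agreement(trials, a, b, model):
    hits = 0
    for _ in range(trials):
        if model == "local":
            left, right = local_pair()
            hits += left(a) == right(b)
        else:
            x, y = quantum_pair(a, b)
            hits += x == y
    return hits / trials

a, b = 0.0, math.radians(22.5)
print(agreement(100_000, a, b, "local"))    # ~0.68: falls short of QM
print(agreement(100_000, a, b, "quantum"))  # ~0.85: cos^2(22.5 deg)
```

Tinkering with the hidden variable and the response rule moves the local number around, but no local recipe matches cos^2 at every pair of settings, which is just Bell's theorem restated.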
 
  • #68
Bell's Inequality

peter0302 said:
Each time Alice eats a tomato, Bob is more likely to eat a cucumber. Each time Alice can't finish her broccoli, Bob eats his carrots more often.

I have read this thread with great interest and marvelled at the logic above - why should Bob eat a cucumber when Alice eats a tomato?

So I have done a 'plastic balls' version of Bell's Inequality in what I hope is the simplest possible depiction.
I need to add a jpg of the quantum violation of Bell's Inequality, but cannot see quite how to do it.

Can someone offer advice? How could I devise a plastic-balls jpg of the QM version of events?
http://www.ronsit.co.uk/weird_at_Heart.asp
 
  • #69


Maaneli said:
I have a hard time understanding how Pagels could possibly have reached that conclusion.
In case you haven't had a chance to check out Pagels' book, I can summarize his argument.

Nonlocality has to do with the spatially separated setups producing changes in each other via spacelike separated events.

Pagels' argument against nonlocality (wrt EPR-Bell tests at least) hinges on the randomness of the individual results. Quantitatively, we know that A is not producing changes in B and vice versa. Qualitatively, there's no way to know. The individual probabilities at one end remain the same no matter what happens at the other end. Speaking of the conditional probability at B given a detection at A is meaningless: the probabilities only have physical meaning wrt the accumulated results of large numbers of individual trials. Because of the randomness of the individual data sequences, nonlocality in EPR-Bell tests can't be conclusively established.
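In conventional notation (mine, not Pagels'), that is the no-signaling property:

```latex
% No-signaling: the single-detector statistics at A are independent of
% the distant setting b (and symmetrically for B):
P(A \mid a, b) \;=\; \sum_{B = \pm 1} P(A, B \mid a, b) \;=\; P(A \mid a)
% Only the joint coincidence distribution P(A, B \mid a, b), compiled
% after the two data sequences are brought together and matched,
% exhibits the Bell-violating correlations.
```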

If the sequences from A and B are matched appropriately, then information about changes in the settings of the separated polarizers is in the cross-correlation. In effect, the global experimental design yields the quantum correlations -- which, in my view, is what would be expected if the deep cause of the entanglement is via common emission, or interaction, or transmission of a common torque, etc. (My polariscope analogy comes in handy here I think.)

Apparently, the only thing preventing a consensus wrt the common-origin explanation for the correlations is that Bell inequalities are interpreted to exclude the possibility that the filters at A and B might have been filtering identical incident disturbances for any given pair of detection attributes.
 
  • #70


peter0302 said:
We have to see Bell's inequality for what it is: the consequence of an assumption which Aspect and others have proven wrong. While we all agree on what that assumption is mathematically, we can't agree on what it means physically. But at the very least, we should be focusing on the assumption, and not any author's (including Bell's own) editorial comments or beliefs regarding it.
This makes sense to me. Nevertheless, it would be nice to know if experimental violations of Bell inequalities have any physical meaning -- and, if so, what. Might it be that there's no way to ascertain what the physical meaning of an EPR-Bell experiment is? I think this is possible, maybe even likely, and, if so, it would seem to reinforce the Copenhagen approach to interpreting the formalism and application of the quantum theory (i.e., we can't possibly know the truth of a deep quantum reality, so there's no scientific point in talking about it).
 
  • #71


ThomasT said:
This makes sense to me. Nevertheless, it would be nice to know if experimental violations of Bell inequalities have any physical meaning -- and, if so, what. Might it be that there's no way to ascertain what the physical meaning of an EPR-Bell experiment is? I think this is possible, maybe even likely, and, if so, it would seem to reinforce the Copenhagen approach to interpreting the formalism and application of the quantum theory (i.e., we can't possibly know the truth of a deep quantum reality, so there's no scientific point in talking about it).
Oh absolutely. Don't get me wrong, I'd love a physical explanation as well, or proof that none is possible. But the starting point has to be the math, not the philosophy. A lot of these interpretive endeavors tend to drift a long way from science.

At the risk of opening another flame war, this is the reason I prefer MWI: it throws out assumptions that are necessitated only by our subjective perceptions (wavefunction collapse) rather than by objective evidence. That should be the starting point. Then let's find where it leads.
 
  • #72


Originally Posted by peter0302:

We have to see Bell's inequality for what it is: the consequence of an assumption which Aspect and others have proven wrong. While we all agree on what that assumption is mathematically, we can't agree on what it means physically. But at the very least, we should be focusing on the assumption, and not any author's (including Bell's own) editorial comments or beliefs regarding it.

--------------------------------------------------------

I agree with the above. And we should all recall that EPR started the debate with their terms and definitions, especially that there should be a "more complete" specification of the system possible (or else the reality of one particle would be dependent on the nature of the measurement done on another). It does not appear that a more complete specification of the system is possible regardless of which assumption you end up rejecting.
 
  • #73


Chinese,

It does not appear that a more complete specification of the system is possible regardless of which assumption you end up rejecting.


That's blatantly false (if I understand you correctly), as deBB and GRW and stochastic mechanical theories have proven. You may not like these more complete specifications of the system for various philosophical reasons, but it is dishonest to deny that they exist and are empirically equivalent to the standard formalism.
 
  • #74


DrChinese said:
Originally Posted by peter0302:

We have to see Bell's inequality for what it is: the consequence of an assumption which Aspect and others have proven wrong. While we all agree on what that assumption is mathematically, we can't agree on what it means physically. But at the very least, we should be focusing on the assumption, and not any author's (including Bell's own) editorial comments or beliefs regarding it.

In fairness, the boldface words are actually not true either. Aspect's original experiments were heavily flawed with various loopholes, and it was quite easy to account for those results with locally causal hidden variable models. Also, not even Zeilinger or Kwiat would claim the Bell tests are conclusive today, because they acknowledge that no experiment has yet been done that simultaneously closes the detection-efficiency loophole AND the separability loophole AND cannot be equally well explained by LCHV models like Santos-Marshall stochastic optics and the Fine-Maudlin prism models of GHZ correlations.
 
  • #75


peter0302 said:
I'd love a physical explanation as well, or proof that none is possible. But the starting point has to be the math, not the philosophy. A lot of these interpretive endeavors tend to drift a long way from science.
Agreed. But I think it's worth the effort to sort out the semantics of the interpretations.
peter0302 said:
At the risk of opening another flame war, this is the reason I prefer MWI: it throws out assumptions that are necessitated only by our subjective perceptions (wavefunction collapse) rather than by objective evidence. That should be the starting point. Then let's find where it leads.
Wavefunction collapse or reduction is the objective dropping of terms that don't correspond to a recorded experimental result. That is, once a qualitative result is recorded, then the wavefunction that defined the experimental situation prior to that is reduced to the specification of the recorded result.

If one reifies the wavefunction, then one is saddled with all sorts of (in my view) unnecessary baggage -- including, possibly, adherence to MWI, or MMI, or some such interpretation. :smile:
 
  • #76


DrChinese said:
I agree with the above. And we should all recall that EPR started the debate with their terms and definitions, especially that there should be a "more complete" specification of the system possible (or else the reality of one particle would be dependent on the nature of the measurement done on another). It does not appear that a more complete specification of the system is possible regardless of which assumption you end up rejecting.
Exactly. And so instead, what we have are interpretations which are not "more complete" so much as they are "more complicated" - since they (as of yet) make no different or more accurate predictions than the orthodox model EPR criticized.

Not that I don't think the research is worthwhile. I certainly do. I'm still hopeful there is something more complete, but I doubt any of the interpretations we have now (at least in their current forms) are going to wind up winning in the end.

Wavefunction collapse or reduction is the objective dropping of terms that don't correspond to a recorded experimental result. That is, once a qualitative result is recorded, then the wavefunction that defined the experimental situation prior to that is reduced to the specification of the recorded result.
Yes, but what's your physical (objective, real) justification for doing so? Plus, it's not defined objectively in Bohr's QM. It's done differently from experiment to experiment, and no one really agrees whether a cat can do it or not, let alone how.

Were you not the one who wanted a physical explanation? :)
 
  • #77


DrChinese said:
1. We agree on this point, and that was my issue.

2. Thank you for these references, there are a couple I am not familiar with and would like to study.

4. Repeating that I was not trying to advance the cause of "non-realism" other than showing it is one possibility. I agree that non-local solutions should be viable. In a lot of ways, they make more intuitive sense than non-realism anyway.

BTW, my point about GHZ was not that it proved non-realism over non-locality. It is another of the no-go proofs - of which there are several - which focus on the realism assumption. These proofs are taken in different ways by the community. Since we don't disagree on the main point, we can drop this particular sidebar.



1. OK.

2. You're welcome.

4. I have not yet seen any evidence that local non-realism is a viable explanation of Bell inequality violations. I challenge you to come up with a mathematical definition of non-realist locality. And I challenge you to come up with a measurement theory based on solipsism that solves the measurement problem, and allows you to completely derive the quantum-classical limit.

About GHZ, you said it is another no-go proof that focuses on the realism assumption. That's just not true. It focuses just as much on locality and causality as Bell's theorem does.
 
  • #78


ThomasT said:
"The Cosmic Code: quantum physics as the language of nature"


Since the experimental designs seem to have (in my view anyway) common emission cause written all over them (and since the whole topic is open to speculation) I would rank that as more plausible than any of the other more exotic explanations for the correlations.


I take it you didn't like my polariscope analogy? (I really thought I had something there. :smile:)
I read what I could of Maudlin's first chapter at Google books. Nothing new or especially insightful there. I've read Price's book -- didn't like it. But thanks for the references and nice discussion with DrChinese et al.
I don't like the nonlocal explanations. Too easy. I'll continue, for the time being, working under the assumption that something (or things) about the physical meaning of Bell's theorem and Bell inequalities is being misinterpreted or missed.


Thanks for the references.

<< Since the experimental designs seem to have (in my view anyway) common emission cause written all over them (and since the whole topic is open to speculation) I would rank that as more plausible than any of the other more exotic explanations for the correlations. >>

At first I thought you meant the experimental designs lend themselves to detection loopholes and such. But then you say

<< This makes sense to me. Nevertheless, it would be nice to know if experimental violations of Bell inequalities have any physical meaning -- and, if so, what. Might it be that there's no way to ascertain what the physical meaning of an EPR-Bell experiment is? >>

So you clearly believe that Aspect and others have confirmed Bell inequality violations. You also say that

<< I don't like the nonlocal explanations. Too easy. I'll continue, for the time being, working under the assumption that something (or things) about the physical meaning of Bell's theorem and Bell inequalities is being misinterpreted or missed. >>

So, honestly, it just sounds to me like you have refused to understand Bell's theorem for what it is, or have been told what it is, and are just in denial about it. In that case, no one can help you, other than to say that you seem to be letting your subjective, intuitive biases prevent you from learning about this subject. And that won't get you anywhere.
 
  • #79


peter0302 said:
Interesting idea Count. I did something like this (I was a computer science guy in a previous life).

It is impossible, using a standard - non-quantum - computer, to simulate the results of EPRB experiments without utilizing both polarizer settings in calculating the odds of any particular photon passing.

The "Does Photon Pass Polarizer x" function simply cannot be written without reference to the other polarizer while still obtaining the quantum results.

If you try to do something elaborate - say, in the "generate entangled photons" function, you pre-program both of them for every conceivable polarizer angle - you come close to the quantum results, but not perfectly.

In order to reproduce the quantum results, you have to either:
1) allow the two photons to "know" the polarizer settings before they've reached them (some kind of superdeterminism) and agree ahead of time on how they're going to behave; or
2) check to see whether the twin has reached its polarizer yet; if not, just go with 50/50; if it has, behave in a complementary way (non-locality).

The third option would be some kind of many-worlds simulation where we let objects continue to evolve in superposition until someone observes both, but I thought that a little too complicated to code.

Yes - the code, when interpreted as describing a classical world, will describe that world as non-local. But when you run the computer, the internal state of the computer will evolve in a local, deterministic way.
 
  • #80


Well, sure. Is your point that we could be living in a computer simulation?

Even if so, locality is a supposed rule of the simulation. If the simulation is comparing both polarizer angles, that's cheating. :)
 
  • #81


I saw this rather late, sorry for my late response...

ThomasT said:
vanesch says that violations of Bell inequalities mean that the incident disturbances associated with paired detection attributes cannot have a common origin. This would seem to mean that being emitted from the same atom at the same time does not impart to the opposite-moving disturbances identical properties.

What I said was that the correlations as seen in an Aspect-like experiment, and as predicted by quantum theory, cannot be obtained by "looking simply at a common cause". That is, you cannot set up a classical situation where you look at a common property of something, and obtain the same correlations as those from quantum mechanics. This is because, classically, we only know of two ways to have statistical correlations C(A,B): A causes B (or B causes A) is one way, and A and B having a common cause C is the other. Example of the first:
- A is "setting of the switch" and B is "the light is on"
clearly, because setting the switch causes the light to be on or off, we will find a correlation between both.
Example of the second:
- you drive a Ferrari and you have a Rolex.
It is not true that driving Ferraris makes you have a Rolex, or that putting on a Rolex makes you drive a Ferrari. So there's no "A causes B" or "B causes A". However, being extremely rich can cause you to buy a Ferrari as well as a Rolex. So there was a common cause: "being rich".

Well, Bell's theorem is a property that holds for the second kind of correlations.

So the violation of that theorem by observed or quantum-mechanically predicted correlations means that it cannot be a correlation that can be explained entirely by "common cause".

And yet, in the hallmark 1982 Aspect experiment using time-varying analyzers, experimenters were very careful to ensure that they were pairing detection attributes associated with photons emitted simultaneously by the same atom.

Yes, of course. That's because we have to look for *entangled* photons, which NEED to have a common source. But the violation of Bell's theorem simply means that it is not a "common cause of the normal kind", like in "being rich".

I was remembering last night something written by the late Heinz Pagels about Bell's theorem where he concludes that nonlocality (ie. FTL transmissions) can't be what is producing the correlations.

That's wrong: Bohmian mechanics explicitly shows how action-at-a-distance can solve the issue. In fact, that's not surprising: if you have action at a distance (if A can cause B or vice versa), then all possible correlations are allowed, and there's no "Bell theorem" being contradicted. The problem with the EPR kind of setups is that the "causes" (the choices of measurement one makes) are space-like separated events, so action-at-a-distance would screw up relativity. So people (like me) who stick to relativity refuse to consider that option. But it is a genuine option: it is actually by far the most "common sense" one.
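A toy illustration of the second, "common cause" kind of correlation (my own made-up numbers): conditioning on the cause removes the correlation entirely - the screening-off property that, by Bell's theorem, no candidate common cause can provide for the quantum correlations.

```python
import random

def sample():
    rich = random.random() < 0.1        # hidden common cause
    p = 0.9 if rich else 0.05
    ferrari = random.random() < p       # depends only on 'rich'
    rolex = random.random() < p         # depends only on 'rich'
    return rich, ferrari, rolex

trials = [sample() for _ in range(200_000)]
p_both = sum(f and r for _, f, r in trials) / len(trials)
p_f = sum(f for _, f, _ in trials) / len(trials)
p_r = sum(r for _, _, r in trials) / len(trials)
print(p_both, p_f * p_r)            # P(F and R) > P(F)P(R): correlated

rich_only = [(f, r) for rich, f, r in trials if rich]
p_both_rich = sum(f and r for f, r in rich_only) / len(rich_only)
p_f_rich = sum(f for f, _ in rich_only) / len(rich_only)
print(p_both_rich, p_f_rich ** 2)   # equal within noise: screened off
```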
 
  • #82


Maaneli said:
Thanks for the references.

<< Since the experimental designs seem to have (in my view anyway) common emission cause written all over them (and since the whole topic is open to speculation) I would rank that as more plausible than any of the other more exotic explanations for the correlations. >>

At first I thought you meant the experimental designs lend themselves to detection loopholes and such.
No, I had the emission preparations and the data matching mechanisms in mind. So, even if all the loopholes were conclusively closed and the inequalities were conclusively violated experimentally, I'd still think that the experimental designs have common emission cause written all over them.

Maaneli said:
But then you say

<< This makes sense to me. Nevertheless, it would be nice to know if experimental violations of Bell inequalities have any physical meaning -- and, if so, what. Might it be that there's no way to ascertain what the physical meaning of an EPR-Bell experiment is? >>

So you clearly believe that Aspect and others have confirmed Bell inequality violations.

Well, I did believe that, but now I suppose I'll have to look a bit closer at the loopholes you've referred to.

But it won't matter if the loopholes are all closed and the violations are conclusive. One can talk all one wants about nonlocality - say, a seamless, nonparticulate medium that can't possibly be detected - but what would be the point?

Maaneli said:
You also say that

<< I don't like the nonlocal explanations. Too easy. I'll continue, for the time being, working under the assumption that something (or things) about the physical meaning of Bell's theorem and Bell inequalities is being misinterpreted or missed. >>

So, honestly, it just sounds to me like you have refused to understand Bell's theorem for what it is, or have been told what it is, and are just in denial about it.

I think that those who understand Bell's theorem as leading to the conclusion that there must be FTL physical propagations in nature might have missed some important subtleties regarding its application and ultimate meaning.

The locality assumption is that events at A cannot be directly causally affecting events at B during any given coincidence interval. Quantitatively at least, we know that this assumption is affirmed experimentally. Since there never will be a way to affirm or deny it qualitatively, I conclude that the assumption of locality is the best bet -- regardless what anyone thinks that Bell has shown.

And, because of the way these experiments must be set up and run, I conclude that the assumption of common cause is also a best bet regarding the deep cause(s) of the quantum experimental phenomena that, collectively, conform to the technical requirements for quantum entanglement.

So, yes, I'm in denial about what I think you (and lots of others) think the meaning of Bell's theorem is. But thanks for the good discussions and references, and I'll continue to read and think and keep an open mind about this (think of my denial as a sort of working assumption), and when I get another flash of insight (like the polariscope analogy), then I'll let you know. :smile:

When is a locality condition not, strictly speaking, a locality condition?
 
  • #83


Maaneli said:
4. I have not yet seen any evidence that local non-realism is a viable explanation of Bell inequality violations. I challenge you to come up with a mathematical definition of non-realist locality.

You are a challenging kind of guy... :)

All you need to do to explain the Bell Inequality violation is to say that particles do NOT have well-defined non-commuting attributes when not being observed. You deny then that there is an A, B and C in Bell's [14], as previously mentioned. (Denying this assumption is completely in keeping with the HUP anyway, even if not strictly required by it.)

And there is experimental evidence as well, but you likely don't think it applies. There is the work of Groeblacher et al which I am sure you know. I accept these experiments as valid evidence but think more work needs to be done before it is considered "iron clad".

So, there is a mathematical apparatus already: QM and relativity. So dumping non-locality as an option does not require anything new. The reason I think non-realism is appealing is because it - in effect - elevates the HUP but does not require new forces, mechanisms, etc. We can also keep c as a speed limit, and don't need to explain why most effects respect c but entanglement does not.

By the way, you might have been a bit harsh on ThomasT. We each have our own (biased?) viewpoints on some of these issues, including you. Clearly, you reject evidence against non-locality and reject non-realism as a viable alternative.
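To make the A/B/C point above concrete, here is a small sketch (my own; the angles 0, 22.5 and 45 degrees are just the usual illustrative choices) of the three-setting form of the argument. If every pair carried definite answers for all three angles, the mismatch rates would have to satisfy a triangle-type inequality, and the QM prediction breaks it:

```python
import numpy as np

# If definite values existed at all three angles, then for any ensemble
#   P_mismatch(A, C) <= P_mismatch(A, B) + P_mismatch(B, C).
# QM predicts P_mismatch(x, y) = sin^2(x - y) for these photon pairs.
def p_mismatch(x, y):
    return np.sin(x - y) ** 2

A, B, C = 0.0, np.deg2rad(22.5), np.deg2rad(45.0)
lhs = p_mismatch(A, C)
rhs = p_mismatch(A, B) + p_mismatch(B, C)
print(f"{lhs:.4f} vs {rhs:.4f}")  # 0.5000 vs 0.2929 -- inequality violated
```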
 
  • #84


vanesch said:
What I said was that the correlations as seen in an Aspect-like experiment, and as predicted by quantum theory, cannot be obtained by "looking simply at a common cause". That is, you cannot set up a classical situation where you look at a common property of something, and obtain the same correlations as those from quantum mechanics.
Sorry if I misparaphrased you, because you've helped a lot in elucidating these issues.

There is a classical situation which I think analogizes what is happening in optical Bell tests -- the polariscope. The measurement of intensity by the detector behind the analyzing polarizer in a polariscopic setup is analogous to the measurement of rate of coincidental detection in simple optical Bell setups. Extending between the two polarizers in a polariscopic setup is a singular sort of optical disturbance. That is, the disturbance that is transmitted by the first polarizer is identical to the disturbance that's incident on the analyzing polarizer. In an optical Bell setup, it's assumed that for a given emitted pair the disturbance incident on the polarizer at A is identical to the disturbance that's incident on the polarizer at B. Interestingly enough, both these setups produce a cos^2 functional relationship between changes in the angular difference of the crossed polarizers and changes in the intensity (polariscope) or rate of coincidence (Bell test) of the detected light.
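A quick numerical rendering of the analogy (my own sketch; the 1/2 normalization for the coincidence rate is the usual ideal-detector QM value and is assumed here):

```python
import numpy as np

# Both curves are cos^2 in the analyzer offset delta:
#  - polariscope: Malus' law, transmitted intensity I0 * cos^2(delta)
#  - optical Bell test: coincidence rate (1/2) * cos^2(delta), ideal detectors
deltas = np.deg2rad(np.arange(0, 91, 15))
intensity = 1.0 * np.cos(deltas) ** 2
coincidence = 0.5 * np.cos(deltas) ** 2

for d, i, c in zip(np.rad2deg(deltas), intensity, coincidence):
    print(f"offset {d:4.0f} deg: intensity {i:.3f}, coincidence rate {c:.3f}")
```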

vanesch said:
This is because classically, we only know of two ways to have statistical correlations C(A,B): one is that A causes B (or B causes A), and the other is that A and B have a common cause C. Example of the first:
- A is "setting of the switch" and B is "the light is on"
clearly, because setting the switch causes the light to be on or off, we will find a correlation between the two.
Example of the second:
- you drive a Ferrari and you have a Rolex.
It is not true that driving Ferraris makes you have a Rolex, or that putting on a Rolex makes you drive a Ferrari. So it's neither "A causes B" nor "B causes A". However, being extremely rich can cause you to buy a Ferrari as well as a Rolex. So there was a common cause: "being rich".

Well, Bell's theorem is a property that holds for the second kind of correlations.

So the violation of that theorem by observed or quantum-mechanically predicted correlations means that it is not a correlation that can be explained entirely by a "common cause".
And yet it's a "common cause" (certainly not of the ordinary kind though) assumption that underlies the construction and application of the quantum mechanical models that pertain to the Bell tests, as well as the preparation and administration of the actual experiments.
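To see how an "ordinary" common cause behaves here, consider a toy model (entirely my own sketch; the deterministic response rule is an arbitrary illustrative choice): each pair carries a shared hidden polarization angle fixed at the source, and each side computes its +1/-1 outcome locally from that angle and its own analyzer setting. Whatever rule one picks, the CHSH combination from the earlier sketch stays within the bound of 2:

```python
import numpy as np

rng = np.random.default_rng(0)

def outcome(setting, lam):
    # Local deterministic rule: +1 if the hidden angle lam lies within
    # 45 degrees of the analyzer axis (mod 180 degrees), else -1.
    d = np.abs((lam - setting + np.pi / 2) % np.pi - np.pi / 2)
    return np.where(d < np.pi / 4, 1, -1)

def E(a, b, n=200_000):
    lam = rng.uniform(0, np.pi, n)  # the common cause, shared by both photons
    return np.mean(outcome(a, lam) * outcome(b, lam))

a, a_p, b, b_p = 0.0, np.pi / 4, np.pi / 8, 3 * np.pi / 8
S = E(a, b) - E(a, b_p) + E(a_p, b) + E(a_p, b_p)
print(f"S = {S:.3f}")  # about 2.0 at best -- never the quantum 2.83
```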


vanesch said:
Yes, of course. That's because we have to look for *entangled* photons, which NEED to have a common source. But the violation of Bell's theorem simply means that it is not a "common cause of the normal kind", like in "being rich".
OK, the correlations are due to unusual sorts of common causes, then. This is actually easier to almost visualize in the experiments where they impart a similar torque to relatively large groups of atoms. The entire, separate groups are then entangled with respect to their common zapping. :smile: Or, isn't this the way you'd view these sorts of experiments?


vanesch said:
Bohmian mechanics explicitly shows how action-at-a-distance can solve the issue.
The problem with instantaneous-action-at-a-distance is that it's physically meaningless. An all-powerful invisible elf would solve the problem too. Just like instantaneous-actions-at-a-distance, the existence of all-powerful invisible elves is pretty hard to disprove. :smile:

vanesch said:
In fact, that's not surprising either. If you have action at a distance (if A can cause B or vice versa), then all possible correlations are allowed, and there's no "Bell theorem" being contradicted. The problem with EPR-type setups is that the "causes" (the choices of measurement one makes) are space-like separated events, so action-at-a-distance would conflict with relativity. So people (like me) sticking to relativity refuse to consider that option. But it is a genuine option: it is actually by far the "most common sense" one.
I disagree. The most common sense option is common cause(s). Call it a working hypothesis -- one that has the advantage of not being at odds with relativity. You've already acknowledged that common cause is an option, just not normal common causes. Well, the submicroscopic behavior of light is a pretty mysterious subject, don't you think? Maybe the classical models of light are (necessarily?) incomplete enough so that a general and conclusive LHV (local hidden variable) explanation of Bell tests isn't (and maybe never will be) forthcoming.
 
  • #85


A shift of angle on this maybe in case folks are getting a little over heated...

If the entangled particles are modeled in the tensor product of two Hilbert spaces, then the result is a combined wavefunction for the two particles that behaves as one wavefunction.

But then we ask questions about the large Euclidean separation between the particles and how one particle 'knows' about the other's state (e.g. hidden variables). This seems an inconsistent question, because there is only one wavefunction (albeit for two particles), and when something happens to the wave it would be instantaneous everywhere at once.

An additional helping analogy would be to consider a single wave packet for an electron or photon - Young's slits or similar.
We don't ask questions about the 'speed of probabilities' between one end of the wave packet and the other - it's all instant. There is no 'time separation' about where the particle statistically reveals itself. Similarly with entangled particles. Or am I barking up a wrong tree?
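For what it's worth, the "one wavefunction for two particles" point can be made concrete in a few lines (my own sketch, using the spin singlet as the example state):

```python
import numpy as np

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# |psi> = (|+-> - |-+>) / sqrt(2), a single vector in the 4-dimensional
# tensor product of the two one-particle Hilbert spaces
psi = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)
print(psi)  # [0.  0.7071  -0.7071  0.]

# No factorization psi = kron(u, v) exists: the 2x2 coefficient matrix
# has Schmidt rank 2, the signature of entanglement.
print(np.linalg.matrix_rank(psi.reshape(2, 2)))  # 2
```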
 
  • #86


DrChinese said:
You are a challenging kind of guy... :)

All you need to do to explain the Bell Inequality violation is to say that particles do NOT have well-defined non-commuting attributes when not being observed. You deny then that there is an A, B and C in Bell's [14], as previously mentioned. (Denying this assumption is completely in keeping with the HUP anyway, even if not strictly required by it.)

And there is experimental evidence as well, but you likely don't think it applies. There is the work of Groeblacher et al which I am sure you know. I accept these experiments as valid evidence but think more work needs to be done before it is considered "iron clad".

So, there is a mathematical apparatus already: QM and relativity. So dumping non-locality as an option does not require anything new. The reason I think non-realism is appealing is because it - in effect - elevates the HUP but does not require new forces, mechanisms, etc. We can also keep c as a speed limit, and don't need to explain why most effects respect c but entanglement does not.

By the way, you might have been a bit harsh on ThomasT. We each have our own (biased?) viewpoints on some of these issues, including you. Clearly, you reject evidence against non-locality and reject non-realism as a viable alternative.


Haha good one about my being challenging.

About your non-realist locality definition, what do you do about collapse of the measurement settings to definite values? What causes it, and when does it happen? What is your mathematical description of that process?

I don't think it was harsh. There is a big difference between rejecting what is unambiguously wrong from the POV of Bell's theorem, and being very skeptical about another possibility (non-realist locality) which you also admit still has to be worked out. Also, as I said, I don't actually think the nonlocality explanation is necessarily the best one. In fact I am much more inclined to think the causality assumption is the more unphysical assumption that must be given up, rather than locality. But that is clearly implied by Bell's theorem.
 
  • #87


LaserMind said:
We don't ask questions about the 'speed of probabilities' between one end of the wave packet and the other - it's all instant. There is no 'time separation' about where the particle statistically reveals itself. Similarly with entangled particles. Or am I barking up a wrong tree?

This is the collapse of the wavefunction, and I think this manifests itself identically whether we are talking about 1 particle or a pair of entangled particles.

Any single photon (say emitted from an electron) has a chance of going anywhere and being absorbed. The odds of it being absorbed at Alice's detector are A, and the odds of it being absorbed at Bob's detector are B. And so on for any number of possible targets, some of which could be light years away. When we observe it at Alice, that means it is NOT at Bob or any of the other targets. Yet clearly there was a wave packet moving through space - that's what experiments like the Double Slit show, because there is interference from the various possible paths. And yet there we are at the end, the photon is detected in one and only one spot. And the odds collapse to zero everywhere else. And that collapse would be instantaneous as best I can tell.

So this is analogous to the mysterious nature of entanglement, yet I don't think it is really any different. Except that entanglement involves an ensemble of particles.
 
  • #88


Maaneli said:
Haha good one about my being challenging.

About your non-realist locality definition, what do you do about collapse of the measurement settings to definite values? What causes it, and when does it happen? What is your mathematical description of that process?

I don't have a physical explanation for instantaneous collapse. I think this is a weak point in QM. But I think the mathematical apparatus is already there in the standard model.

By the way, challenging is good as far as I am concerned. Not sure I am up to too many though. :)
 
  • #89


MWI is a perfect example of locality and non-realism. There is no objective state that other parts of your world are in until you interact with them and go through decoherence.
 
  • #90


DrChinese said:
I don't have a physical explanation for instantaneous collapse. I think this is a weak point in QM. But I think the mathematical apparatus is already there in the standard model.

By the way, challenging is good as far as I am concerned. Not sure I am up to too many though. :)


<< I don't have a physical explanation for instantaneous collapse. I think this is a weak point in QM. >>

Well, let's distinguish textbook QM mathematics and measurement postulates from the interpretation of it all as anti-realist. Certainly the ad-hoc, mathematically and physically vague measurement postulates are a weak point of textbook QM. But if your anti-realist interpretation has the same basic problem, then it cannot be a true physical theory of QM measurement processes, or a potentially fundamental physical interpretation of QM. That's why I still have not seen any coherent physical/mathematical definition of "anti-realist locality".


<< But I think the mathematical apparatus is already there in the standard model. >>

Ah but that's the thing. Standard QM has only postulates - no mathematical apparatus for treating measurement processes! Not even adding decoherence theory does the job fully! And anyway, decoherence theory implies realism. That's why I think that if you don't want to invoke additional (hidden) variables to QM, and want to keep only the wavefunction and the HUP, and not try to analyze measurement processes, the only self-consistent interpretation is Ballentine's statistical interpretation of QM - but even that can only be a temporary filler for a more complete description of QM and, inevitably, a beable theory of QM.


<< By the way, challenging is good as far as I am concerned. Not sure I am up to too many though. :) >>

Glad you think so!
 
Last edited:
  • #91


BTW, contrary to what some people say, MWI is not an example of locality and nonrealism. That's a bad misconception that MWI supporters like Vanesch (I suspect), or even Tegmark, Wallace, Saunders, Brown, etc., would object to.
 
  • #92


I don't understand the fuss about "instantaneous collapse". If you consider an entangled state like the two-spin singlet state:

|+-> - |-+>

Then when you measure the spin of one particle, your wavefunction gets entangled with the spin. So, if that's what we call collapse, and that's when information is transferred to us (in each of our branches), then information about spin 1 was already present in spin 2 and vice versa when the entangled 2-spin state was created.
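Continuing that singlet example numerically (again just a sketch of my own): project particle 1 onto "up" and look at what remains for particle 2. The anticorrelation was indeed written into the state when the pair was created:

```python
import numpy as np

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
psi = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)  # normalized |+-> - |-+>

# Projector: |up><up| on particle 1, identity on particle 2
P_up1 = np.kron(np.outer(up, up), np.eye(2))

post = P_up1 @ psi
prob = post @ post              # Born probability of finding particle 1 "up"
post = post / np.sqrt(prob)     # conditional (post-measurement) state
print(prob)   # 0.5
print(post)   # [0. 1. 0. 0.] = kron(up, down): particle 2 is definitely "down"
```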
 
  • #93


Count Iblis said:
I don't understand the fuss about "instantaneous collapse". If you consider an entangled state like the two-spin singlet state:

|+-> - |-+>

Then when you measure the spin of one particle, your wavefunction gets entangled with the spin. So, if that's what we call collapse, and that's when information is transferred to us (in each of our branches), then information about spin 1 was already present in spin 2 and vice versa when the entangled 2-spin state was created.



<< Then when you measure the spin of one particle, your wavefunction gets entangled with the spin. >>

This sentence makes no sense. The wavefunctions of the two "particles" (if you're just talking textbook QM) are spinor-valued, and therefore already contain spin, and when they are in the singlet state, they are already entangled in configuration space (by definition!). When you "measure" the spin of one particle, you "collapse" the entangled spin states of the two "particles" to a definite spin outcome, and they are therefore no longer entangled.
 
  • #94


Maaneli said:
When you "measure" the spin of one particle, you "collapse" the entangled spin states of the two "particles" to a definite spin outcome, and they are therefore no longer entangled.
Here's an intuitive view to explain the quantum postulates that we are using (IMHO): when something or someone demands an answer from a state vector (as an observable) by collapsing the wavefunction and observing it (say on a screen), then the Universe is forced to give an answer whether it has one or not. The Universe cannot reply, 'Sorry, I don't know where the particle is -- actually, I haven't got one -- but you are demanding it, so I'll have to make a guess for you; I've no other choice, because your clunky apparatus and strange question are forcing me to answer.' It must answer our strange question. The only sensible answer it can give is a statistical one, because any other answer would be wrong.
 
  • #95


ThomasT said:
Sorry if I misparaphrased you, because you've helped a lot in elucidating these issues.

There is a classical situation which I think analogizes what is happening in optical Bell tests -- the polariscope. The measurement of intensity by the detector behind the analyzing polarizer in a polariscopic setup is analogous to the measurement of rate of coincidental detection in simple optical Bell setups. Extending between the two polarizers in a polariscopic setup is a singular sort of optical disturbance. That is, the disturbance that is transmitted by the first polarizer is identical to the disturbance that's incident on the analyzing polarizer. In an optical Bell setup, it's assumed that for a given emitted pair the disturbance incident on the polarizer at A is identical to the disturbance that's incident on the polarizer at B. Interestingly enough, both these setups produce a cos^2 functional relationship between changes in the angular difference of the crossed polarizers and changes in the intensity (polariscope) or rate of coincidence (Bell test) of the detected light.

Yes, but there's a world of difference. The light disturbance that reaches the second polarizer has undergone the measurement process of the first, and in fact has been altered by the first. As such, it is in a way not surprising that the result of the second polarizer is dependent on the *choice of measurement* (and hence on the specific alteration) of the first. The correlation is indeed given by the same formula, cos^2(angular difference), but that shouldn't be surprising in this case. The result of the second polarizer is in fact ONLY dependent on the state of the first polarizer: you can almost see the first polarizer as a SOURCE for the second one. So there is the evident possibility of a causal relation between "choice of angle of first polarizer" and "result of second polarizer".

What is much more surprising - in fact it is the whole mystery - in an EPR setup, is that two different particles (which may or may not have identical or correlated properties) are sent off to two remote experimental sites. As such there can of course be a correlation in the results of the two measurements, but these results shouldn't depend on the explicit choice made by one or other experimenter if we exclude action-at-a-distance. In other words, the two measurements done by the two experimenters "should" be just statistical measurements on a "set of common properties" which are shared by the two particles (because of course they have a common source). And it is THIS kind of correlation which should obey Bell's theorem (statistical correlations of measurements of common properties) and it doesn't.

And yet it's a "common cause" (certainly not of the ordinary kind though) assumption that underlies the construction and application of the quantum mechanical models that pertain to the Bell tests, as well as the preparation and administration of the actual experiments.

Yes, but now it is up to you to say what you understand by a common cause "not of the ordinary kind". Because the "ordinary kind" includes all kinds of "common properties" (identical copies of datasets). So whatever is not the ordinary kind, it's going to be "very not ordinary".

OK, the correlations are due to unusual sorts of common causes, then. This is actually easier to almost visualize in the experiments where they impart a similar torque to relatively large groups of atoms. The entire, separate groups are then entangled with respect to their common zapping. :smile: Or, isn't this the way you'd view these sorts of experiments?

Well, how do you visualize these "non-ordinary" common causes? Just about every mental picture you can think of falls into the class of "ordinary" common causes, which should respect Bell's theorem.

The problem with instantaneous-action-at-a-distance is that it's physically meaningless. An all-powerful invisible elf would solve the problem too. Just like instantaneous-actions-at-a-distance, the existence of all-powerful invisible elves is pretty hard to disprove. :smile:

I agree. Nevertheless, Newtonian gravity is "action at a distance", and indeed it opens the gate to arbitrary explanations, of the astrology kind, for just about any phenomenon. Still, it's less of a problem than superdeterminism, which would mean the end of science.

Nevertheless, I agree with you, and it is the fundamental difficulty I have with Bohmian mechanics, which would otherwise have been the best explanation for quantum phenomena. But from the moment, indeed, that the motion of an arbitrarily distant particle can induce an arbitrarily large force on a local particle here, "all bets are off".

I disagree. The most common sense option is common cause(s). Call it a working hypothesis -- one that has the advantage of not being at odds with relativity. You've already acknowledged that common cause is an option, just not normal common causes. Well, the submicroscopic behavior of light is a pretty mysterious subject, don't you think? Maybe the classical models of light are (necessarily?) incomplete enough so that a general and conclusive LHV (local hidden variable) explanation of Bell tests isn't (and maybe never will be) forthcoming.

No, it won't do. All "common sense" common causes are of the "ordinary" kind. So saying that it must be a common sense, but "non-ordinary" common cause is not going to help us.

I will tell you how *I* picture this (but I won't do this for too long, as I have done this at least a dozen times already on this forum). After all, we're not confronted with an *unexpected* phenomenon. We're verifying predictions of quantum theory! So what's the best way to at least *picture* what happens? Answer: look at quantum theory itself, which predicts this! You can obtain the results of an Aspect-like experiment using quantum theory and purely local interactions (the ones we use normally, such as electrodynamics). You just let the wavefunction evolve! And then you see that you get different observer states, which have seen different things, but *when they come together* they separate into the right branches with the right probabilities - which are nothing else but the observed correlations. That's nothing else but "many worlds". It solves the dilemma of the "correlations-at-a-distance" simply by stating that those correlations didn't happen "at the moment of measurement" - which simply created both possible outcomes - but when the observers came together to compare their outcomes. In fact, all different versions of the observers came together to compare all their different possible sets of outcomes, and those that are most probable (those with the largest Hilbert norm) are simply those with the right correlations from the QM predictions.

Of course, now you have the weirdness of multiple worlds, but at least, you have a clear picture of how the theory that correctly predicts the "incomprehensible outcomes" comes itself to those outcomes.

I've worked this out several times here, I won't type all that stuff again.
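For completeness, here is a minimal numerical rendering of that picture (my own sketch; a spin-1/2 version, with measurement directions confined to one plane for simplicity). Nothing is ever collapsed; the four joint branches (each side saw + or -) are simply listed with their Hilbert-norm weights, and those weights already carry the QM correlation E(a, b) = -cos(a - b):

```python
import numpy as np

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
singlet = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)

def eigvec(theta, sign):
    """Spin eigenvector along direction theta in the x-z plane."""
    if sign > 0:
        return np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return np.array([-np.sin(theta / 2), np.cos(theta / 2)])

def correlation(a, b):
    # Sum over the four branches; each weight is the squared Hilbert norm
    # of the corresponding (outcome_a, outcome_b) component of the state.
    E = 0.0
    for sa in (+1, -1):
        for sb in (+1, -1):
            branch = np.kron(eigvec(a, sa), eigvec(b, sb))
            E += sa * sb * np.abs(branch @ singlet) ** 2
    return E

a, b = 0.0, np.pi / 3
print(correlation(a, b), -np.cos(a - b))  # both -0.5 -- no collapse invoked
```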
 
  • #96


Maaneli said:
BTW, contrary to what some people say, MWI is not an example of locality and nonrealism. That's a bad misconception that MWI supporters like Vanesch (I suspect), or even Tegmark, Wallace, Saunders, Brown, etc., would object to.

The problem lies in the word "non-realism" and then the right definition of "local". There are some papers out there that show that you can see unitary wavefunction evolution as a local process (as long as the implemented dynamics - the interactions - are local of course), although that's better seen in the Heisenberg picture. I'm too lazy to look up the arxiv articles.
So MWI can be seen as respecting locality in a way. That's not surprising given that unitary evolution respects Lorentz invariance (if the dynamics does so).

As to "realism", instead of calling it "non-realist", I'd rather call it "multi-realist". But that's semantics. The way MWI can get away with Bell is simply that at the moment of "measurement" at each side, there's no "single outcome", but rather both outcomes appear. It is only later, when the correlations are established, and hence when there is a local interaction between the observers that came together, that the actual correlations show up.
 
  • #97


It really is *relative* state, just like the first paper called it. There's no objective state of any particle before observation that everyone will agree on, so there's no one "true" reality.

Are there any other interpretations that preserve locality?
 
  • #98


Maaneli said:
<< Then when you measure the spin of one particle, your wavefunction gets entangled with the spin. >>

This sentence makes no sense. The wavefunctions of the two "particles" (if you're just talking textbook QM) are spinor-valued, and therefore already contain spin, and when they are in the singlet state, they are already entangled in configuration space (by definition!). When you "measure" the spin of one particle, you "collapse" the entangled spin states of the two "particles" to a definite spin outcome, and they are therefore no longer entangled.

In the MWI, there is no collapse; the wavefunction of the observer gets entangled with the two-spin state. I think that the "paradox" implied by instantaneous collapse is just an artifact of assuming that the observer collapses the wavefunction, while in reality this is an effective description.
 
  • #99


If wavefunction collapse really happens, then that should be confirmed by experiments testing for violations of unitarity. Unitarity could perhaps be spontaneously broken as has been suggested in some recent publications...
 
  • #100


vanesch said:
Nevertheless, Newtonian gravity is "action at a distance", and indeed it opens the gate to arbitrary explanations, of the astrology kind, for just about any phenomenon. Still, it's less of a problem than superdeterminism, which would mean the end of science.

I think you should read ’t Hooft's paper:

http://arxiv.org/PS_cache/quant-ph/pdf/0701/0701097v1.pdf

He replaces the poorly defined, if not logically absurd, notion of "free will" with the "unconstrained initial state" assumption. This way, all those (IMHO very weak, anyway) arguments against superdeterminism should be dropped.
 
Last edited by a moderator: