Entanglement: spooky action at a distance

  • Thread starter: Dragonfall
  • Tags: Entanglement

Summary
Entanglement, often referred to as "spooky action at a distance," is analyzed through Bell's theorem, which shows that the outcomes of measurements on entangled particles are correlated in a way that no model of independent local randomness can reproduce. The correlation follows a specific formula supported by experimental evidence, ruling out simpler models in which the outcomes are independently random. The design of EPR-Bell tests involves synchronized detection events that create interdependencies between measurements, but these do not imply faster-than-light (FTL) communication. Discussions also highlight the view that the correlations arise from shared properties at the quantum level rather than from any FTL influence or random pairing. Overall, the consensus is that entanglement does not permit instantaneous information transfer, as no physical evidence supports FTL transmission.
  • #61


To reject a theory on the basis of Bell's theorem alone, that theory must have the property that events in one part of the experimental setup (source, detector 1, detector 2) do not depend on the other parts (the statistical-independence assumption).

The only theories that satisfy this assumption (and are therefore unable to reproduce QM's predictions) are the "billiard ball" type (no long-range forces, interactions only at direct collisions). Maxwell's electromagnetism, Newtonian gravity, and Einstein's GR all have long-range forces, so for them the statistical-independence assumption fails. A modification of Maxwell's theory could therefore, while remaining local and realistic, in principle reproduce QM's predictions.

So the claim that local realism is excluded by Bell's theorem is patently false.
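
For reference, the two assumptions at issue here have a standard formal statement (notation mine, not ueit's): a local hidden-variable model factorizes over the hidden state λ, and statistical independence says the distribution of λ does not depend on the detector settings a, b:

$$P(A,B \mid a,b) = \int \! d\lambda \, \rho(\lambda) \, P(A \mid a,\lambda) \, P(B \mid b,\lambda), \qquad \rho(\lambda \mid a,b) = \rho(\lambda).$$

Bell's inequality needs both conditions; a model that keeps the factorization but drops the independence condition (as superdeterministic proposals do) is not constrained by the theorem.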
 
  • #62


Maaneli said:
1. Yes, this was exactly my point. I think you misunderstood me before. Indeed, the form of realism you generally suggest is an absolutely necessary pin in the logic of the theorem.

2. People like O. Costa de Beauregard, Huw Price, and others have since advanced the idea of using backwards causation to save locality and show how Bell and GHZ inequalities could be violated. Price discusses this at length in his book:

"Time's Arrow and Archimedes Point"
http://www.usyd.edu.au/time/price/TAAP.html

and his papers:

Backward causation, hidden variables, and the meaning of completeness. Pramana – Journal of Physics (Indian Academy of Sciences) 56 (2001) 199–209.
http://www.usyd.edu.au/time/price/preprints/QT7.pdf

Time symmetry in microphysics. Philosophy of Science 64 (1997) S235–S244.
http://www.usyd.edu.au/time/price/preprints/PSA96.html

Toy models for retrocausality. Studies in History and Philosophy of Modern Physics 39 (2008), forthcoming.
http://arxiv.org/abs/0802.3230

You may also be interested to know that there exists a deBB model developed by Sutherland that implements backwards causation, is completely local, and reproduces the empirical predictions of standard QM:

Causally Symmetric Bohm Model
Authors: Rod Sutherland
http://arxiv.org/abs/quant-ph/0601095
http://www.usyd.edu.au/time/conferences/qm2005.htm#sutherland
http://www.usyd.edu.au/time/people/sutherland.htm

and his older work:

Sutherland R.I., 'A Corollary to Bell's Theorem', Il Nuovo Cimento B 88, 114-18 (1985).

Sutherland R.I., 'Bell's Theorem and Backwards-in-Time Causality', International Journal of Theoretical Physics 22, 377-84 (1983).

And just to emphasize, all these backwards causation models involve some form of realism.

3. Whether your viewpoint is "mainstream" (and you still have to define what "mainstream" means to make it meaningful) or not is completely irrelevant. All that is relevant is the logical validity and factual accuracy of your understanding of these issues. But I can tell you that among QM foundations specialists, such as people who participate in the annual APS conference on foundations of physics (which I have done for the past 3 consecutive years):

New Directions in the Foundations of Physics
American Center for Physics, College Park, April 25 - 27, 2008
http://carnap.umd.edu/philphysics/conference.html

your opinion is quite in the minority. Furthermore, I didn't imply that locality isn't embedded in Bell's theorem or that realism isn't embedded in Bell's theorem. I just said that the crucial conclusion of Bell's theorem (and Bell's own explicitly stated conclusion) is that QM is not a locally causal theory, not that it is not a locally real theory, whatever that would mean.

4. Let me also emphasize that, unlike what you seem to be doing in characterizing Bell's theorem as a refutation of realism, Zeilinger acknowledges that nonlocal hidden variable theories like deBB are compatible with experiments, even if he himself is an 'anti-realist'. By the way, anti-realists such as yourself or Zeilinger still face the challenge of coming up with a solution to the measurement problem and deriving the quantum-classical limit. Please don't try to invoke decoherence, since the major developers and proponents of decoherence theory like Zurek, Zeh, Joos, etc., are actually realists themselves - and even they admit that decoherence theory has not solved, and probably never will on its own solve, the measurement problem or account for the quantum-classical limit. On the other hand, it is well acknowledged that nonlocal realist theories like deBB plus decoherence do already solve the measurement problem and already accurately (even if not yet perfectly) describe the quantum-classical limit. So by my assessment, it is the anti-realist crowd that is in the minority and has much to prove.

1. We agree on this point, and that was my issue.

2. Thank you for these references, there are a couple I am not familiar with and would like to study.

3. The issue with "mainstream" is that mainstream theory can be wrong - of course - but I think it is helpful for most folks to learn the mainstream before they reject it.

I see your point that there is a more diverse group out there, and so maybe the idea of "mainstream" is too broad to be so easily characterized. At any rate, I was not trying to say that the "anti-realism" view was the mainstream. I was trying to say that the mainstream view is that local realistic theory is not viable.

4. Repeating that I was not trying to advance the cause of "non-realism" other than showing it is one possibility. I agree that non-local solutions should be viable. In a lot of ways, they make more intuitive sense than non-realism anyway.

BTW, my point about GHZ was not that it proved non-realism over non-locality. It is another of the no-go proofs - of which there are several - which focus on the realism assumption. These proofs are taken in different ways by the community. Since we don't disagree on the main point, we can drop this particular sidebar.
 
  • #63


ueit said:
To reject a theory on the basis of Bell's theorem alone, that theory must have the property that events in one part of the experimental setup (source, detector 1, detector 2) do not depend on the other parts (the statistical-independence assumption).

The only theories that satisfy this assumption (and are therefore unable to reproduce QM's predictions) are the "billiard ball" type (no long-range forces, interactions only at direct collisions). Maxwell's electromagnetism, Newtonian gravity, and Einstein's GR all have long-range forces, so for them the statistical-independence assumption fails. A modification of Maxwell's theory could therefore, while remaining local and realistic, in principle reproduce QM's predictions.

So the claim that local realism is excluded by Bell's theorem is patently false.

I seem to be stuck in the middle again... :)

There is no viable local realistic theory on the table to discuss at this point. You don't have one to offer, and "heroic" efforts by Santos and others (with variations on stochastic ideas) have so far fallen well short of convincing much of anyone. Bell's Theorem shows us how to dissect and attack such attempts. So I strongly disagree.
 
  • #64


Maaneli said:
Against 'Realism'
Authors: Travis Norsen
Foundations of Physics, Vol. 37 No. 3, 311-340 (March 2007)
http://arxiv.org/abs/quant-ph/0607057

By the way, it might surprise you (and Travis for that matter) to learn that I have had a link to another of his earlier papers - somewhat similar to your citation - on my website for nearly 3 years:

Travis Norsen: EPR and Bell Locality, arXiv (2005)

...So please don't think that I limit my viewpoints. I respect differences of opinion and think they are healthy. But I also think that on this board, opinions should be distinguished from mainstream thought for the sake of those who don't follow things to the Nth degree.
 
  • #65


Thought experiment. Suppose we simulate, on a classical computer (the bits are manipulated using local deterministic rules), a world described by quantum mechanics. In this world live observers who can do experiments and verify that Bell's inequality is violated exactly as QM predicts. Nevertheless, the world they live in is ultimately described by the rules according to which the bits in the computer are manipulated.
 
  • #66


Maaneli said:
I have a hard time understanding how Pagels could possibly have reached that conclusion. Indeed, it even contradicts Bell's own conclusions. It looks confused. But can you give me the reference?
"The Cosmic Code: Quantum Physics as the Language of Nature"

Maaneli said:
<< So, is an understanding of the entangled data (correlations) produced in EPR-Bell tests in terms of a common cause produced at emission possible?>>
No. But, as I said earlier, there is the common past hypothesis (that the emission and detection events share a common past) that is logically possible, although extremely implausible. Bell talks about this in his paper "Free Variables and Local Causality". More plausible and successful have been the nonlocal explanations, as well as the causally symmetric explanations.
Since the experimental designs seem to have (in my view anyway) common emission cause written all over them (and since the whole topic is open to speculation) I would rank that as more plausible than any of the other more exotic explanations for the correlations.

Maaneli said:
If you would like a classical analogue of Bell's inequality and theorem, read the first chapter of Tim Maudlin's book. He gives a perfectly clear and accurate classical analogue.
I take it you didn't like my polariscope analogy? (I really thought I had something there. :smile:)
I read what I could of Maudlin's first chapter on Google Books. Nothing new or especially insightful there. I've read Price's book -- didn't like it. But thanks for the references and the nice discussion with DrChinese et al.
I don't like the nonlocal explanations. Too easy. I'll continue, for the time being, working under the assumption that something (or things) about the physical meaning of Bell's theorem and Bell inequalities is being misinterpreted or missed.
 
  • #67


Interesting idea Count. I did something like this (I was a computer science guy in a previous life).

It is impossible, using a standard - non-quantum - computer, to simulate the results of EPRB experiments without utilizing both polarizer settings in calculating the odds of any particular photon passing.

The "Does Photon Pass Polarizer x" function simply cannot be written without reference to the other polarizer while still obtaining the quantum results.

If you try to do something elaborate - say, in the "generate entangled photons" function, you pre-program both of them for every conceivable polarizer angle - you come close to the quantum results, but not perfectly.

In order to reproduce the quantum results, you have to either:
1) allow the two photons to "know" the polarizer settings before they've reached them (some kind of superdeterminism) and agree ahead of time on how they're going to behave; or
2) check whether the twin has reached its polarizer yet; if not, just go with 50/50. If it has, behave in a complementary way (non-locality).

The third option would be some kind of many-worlds simulation where we let objects continue to evolve in superposition until someone observes both, but I thought that a little too complicated to code.
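
Here is a minimal sketch of the kind of simulation described above (Python; the function names and the specific local rule are mine, for illustration). The local model gives each pair a shared polarization angle at emission and lets each photon answer from its own setting alone; it yields a linear correlation that falls short of the quantum cos^2 match rate, which the second model reproduces only by consulting both settings at once:

import math
import random

def local_model(theta_a, theta_b, trials=100_000):
    # Local hidden-variable sketch: each pair shares a polarization
    # angle lam fixed at emission; each photon decides to pass using
    # only its own setting and lam (pass if within 45 degrees).
    matches = 0
    for _ in range(trials):
        lam = random.uniform(0.0, math.pi)
        a_pass = abs(math.cos(theta_a - lam)) > math.sqrt(0.5)
        b_pass = abs(math.cos(theta_b - lam)) > math.sqrt(0.5)
        matches += (a_pass == b_pass)
    return matches / trials

def both_settings_model(theta_a, theta_b, trials=100_000):
    # The "cheating" model: the joint outcome is drawn using BOTH
    # polarizer settings, reproducing the QM match rate cos^2(a - b).
    p_match = math.cos(theta_a - theta_b) ** 2
    return sum(random.random() < p_match for _ in range(trials)) / trials

# At a 30-degree setting difference QM predicts a 75% match rate;
# this local model yields about 67% (1 - delta/90 for delta <= 90).
print(local_model(0.0, math.radians(30)))          # ~0.67
print(both_settings_model(0.0, math.radians(30)))  # ~0.75

No choice of pre-programmed answer rule closes the gap at every angle pair; that is exactly the content of Bell's inequality.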
 
  • #68
Bell's Inequality

peter0302 said:
Each time Alice eats a tomato, Bob is more likely to eat a cucumber. Each time Alice can't finish her broccoli, Bob eats his carrots more often.

I have read this thread with great interest and marvelled at the logic above - why should Bob eat a cucumber when Alice eats a tomato?

So I have done a 'plastic balls' version of Bell's Inequality in what I hope is the simplest possible depiction.
I need to add a jpg of the quantum violation of Bell's Inequality, but cannot quite see how to do it.

Can someone offer advice? How could I devise a plastic-balls jpg of the QM version of events?
http://www.ronsit.co.uk/weird_at_Heart.asp
 
  • #69


Maaneli said:
I have a hard time understanding how Pagels could possibly have reached that conclusion.
In case you haven't had a chance to check out Pagels' book, I can summarize his argument.

Nonlocality has to do with the spatially separated setups producing changes in each other via spacelike separated events.

Pagels' argument against nonlocality (wrt EPR-Bell tests at least) hinges on the randomness of the individual results. Quantitatively, we know that A is not producing changes in B and vice versa. Qualitatively, there's no way to know. The individual probabilities at one end remain the same no matter what happens at the other end. Speaking of the conditional probability at B given a detection at A is meaningless. The probabilities only have physical meaning wrt the accumulation of results over large numbers of individual trials. Because of the randomness of the individual data sequences, nonlocality in EPR-Bell tests can't be conclusively established.
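
Pagels' quantitative point is the no-signaling property of the quantum predictions (my gloss, in standard notation): the marginal statistics at A are independent of the setting chosen at B,

$$P(A \mid a,b) = \sum_{B} P(A,B \mid a,b) = P(A \mid a) = \tfrac{1}{2},$$

so nothing done at B is visible in A's local record; the setting dependence appears only in the joint correlations recovered by matching the two sequences.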

If the sequences from A and B are matched appropriately, then information about changes in the settings of the separated polarizers is in the cross-correlation. In effect, the global experimental design yields the quantum correlations -- which, in my view, is what would be expected if the deep cause of the entanglement is via common emission, or interaction, or transmission of a common torque, etc. (My polariscope analogy comes in handy here I think.)

Apparently, the only thing preventing a consensus wrt the common-origin explanation for the correlations is that Bell inequalities are interpreted to exclude the possibility that the filters at A and B might have been filtering identical incident disturbances for any given pair of detection attributes.
 
  • #70


peter0302 said:
We have to see Bell's inequality for what it is: the consequence of an assumption which Aspect and others have proven wrong. While we all agree on what that assumption is mathematically, we can't agree on what it means physically. But at the very least, we should be focusing on the assumption, and not any author's (including Bell's own) editorial comments or beliefs regarding it.
This makes sense to me. Nevertheless, it would be nice to know whether experimental violations of Bell inequalities have any physical meaning -- and, if so, what. Might it be that there's no way to ascertain what the physical meaning of an EPR-Bell experiment is? I think this is possible, maybe even likely, and, if so, it would seem to reinforce the Copenhagen approach to interpreting the formalism and application of quantum theory. (i.e., we can't possibly know the truth of a deep quantum reality, so there's no scientific point in talking about it)
 
  • #71


ThomasT said:
This makes sense to me. Nevertheless, it would be nice to know whether experimental violations of Bell inequalities have any physical meaning -- and, if so, what. Might it be that there's no way to ascertain what the physical meaning of an EPR-Bell experiment is? I think this is possible, maybe even likely, and, if so, it would seem to reinforce the Copenhagen approach to interpreting the formalism and application of quantum theory. (i.e., we can't possibly know the truth of a deep quantum reality, so there's no scientific point in talking about it)
Oh absolutely. Don't get me wrong, I'd love a physical explanation as well, or proof that none is possible. But the starting point has to be the math, not the philosophy. A lot of these interpretive endeavors tend to drift a long way from science.

At the risk of opening another flame war, this is the reason I prefer MWI: it throws out assumptions that are necessitated only by our subjective perceptions (wavefunction collapse) rather than by objective evidence. That should be the starting point. Then let's find where it leads.
 
  • #72


Originally Posted by peter0302:

We have to see Bell's inequality for what it is: the consequence of an assumption which Aspect and others have proven wrong. While we all agree on what that assumption is mathematically, we can't agree on what it means physically. But at the very least, we should be focusing on the assumption, and not any author's (including Bell's own) editorial comments or beliefs regarding it.

--------------------------------------------------------

I agree with the above. And we should all recall that EPR started the debate with their terms and definitions, especially that a "more complete" specification of the system should be possible (or else the reality of one particle would be dependent on the nature of the measurement done on another). It does not appear that a more complete specification of the system is possible, regardless of which assumption you end up rejecting.
 
  • #73


Chinese,

It does not appear that a more complete specification of the system is possible, regardless of which assumption you end up rejecting.


That's blatantly false (if I understand you correctly), as deBB and GRW and stochastic mechanical theories have proven. You may not like these more complete specifications of the system for various philosophical reasons, but it is dishonest to deny that they exist and are empirically equivalent to the standard formalism.
 
  • #74


DrChinese said:
Originally Posted by peter0302:

We have to see Bell's inequality for what it is: the consequence of an assumption which Aspect and others have proven wrong. While we all agree on what that assumption is mathematically, we can't agree on what it means physically. But at the very least, we should be focusing on the assumption, and not any author's (including Bell's own) editorial comments or beliefs regarding it.

In fairness, the boldface words are actually not true either. Aspect's original experiments were heavily flawed with various loopholes, and it was quite easy to account for those results with locally causal hidden variable models. Also, not even Zeilinger or Kwiat would claim the Bell tests are conclusive today, because they acknowledge that no experiment has yet been done that simultaneously closes the detection-efficiency loophole AND the separability loophole AND cannot be equally well explained by LCHV models like Santos-Marshall stochastic optics and Fine-Maudlin prism models of GHZ correlations.
 
  • #75


peter0302 said:
I'd love a physical explanation as well, or proof that none is possible. But the starting point has to be the math, not the philosophy. A lot of these interpretive endeavors tend to drift a long way from science.
Agreed. But I think it's worth the effort to sort out the semantics of the interpretations.
peter0302 said:
At the risk of opening another flame war, this is the reason I prefer MWI because it throws out assumptions that are necessitated only by our subjective perceptions (wavefunction collapse) rather than by objective evidence. That should be the starting point. Then let's find whee it leads.
Wavefunction collapse or reduction is the objective dropping of terms that don't correspond to a recorded experimental result. That is, once a qualitative result is recorded, the wavefunction that defined the experimental situation prior to that is reduced to the specification of the recorded result.

If one reifies the wavefunction, then one is saddled with all sorts of (in my view) unnecessary baggage -- including, possibly, adherence to MWI, or MMI, or some such interpretation. :smile:
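
For concreteness, the textbook form of that reduction is the projection postulate (a standard sketch, not ThomasT's own notation): recording outcome k replaces the state by its matching component,

$$|\psi\rangle \;\to\; \frac{P_k |\psi\rangle}{\lVert P_k |\psi\rangle \rVert},$$

and whether this step is mere bookkeeping on our information or a physical process is precisely where the interpretations part ways.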
 
  • #76


DrChinese said:
I agree with the above. And we should all recall that EPR started the debate with their terms and definitions, especially that there should be a "more complete" specification of the system possible (or else the reality of one particle would be dependent of the nature of the measurement done on another). It does not appear that a more complete specification of the system is possible regardless of what assumption you end up rejecting.
Exactly. And so instead, what we have are interpretations which are not "more complete" so much as they are "more complicated" - since they (as yet) make no different or more accurate predictions than the orthodox model EPR criticized.

Not that I don't think the research is worthwhile. I certainly do. I'm still hopeful there is something more complete, but I doubt any of the interpretations we have now (at least in their current forms) are going to wind up winning in the end.

Wavefunction collapse or reduction is the objective dropping of terms that don't correspond to a recorded experimental result. That is, once a qualitative result is recorded, the wavefunction that defined the experimental situation prior to that is reduced to the specification of the recorded result.
Yes, but what's your physical (objective, real) justification for doing so? Plus, it's not defined objectively in Bohr's QM. It's done differently from experiment to experiment, and no one really agrees on whether a cat can do it or not, let alone how.

Were you not the one who wanted a physical explanation? :)
 
  • #77


DrChinese said:
1. We agree on this point, and that was my issue.

2. Thank you for these references, there are a couple I am not familiar with and would like to study.

4. Repeating that I was not trying to advance the cause of "non-realism" other than showing it is one possibility. I agree that non-local solutions should be viable. In a lot of ways, they make more intuitive sense than non-realism anyway.

BTW, my point about GHZ was not that it proved non-realism over non-locality. It is another of the no-go proofs - of which there are several - which focus on the realism assumption. These proofs are taken in different ways by the community. Since we don't disagree on the main point, we can drop this particular sidebar.



1. OK.

2. You're welcome.

4. I have not yet seen any evidence that local non-realism is a viable explanation of Bell inequality violations. I challenge you to come up with a mathematical definition of non-realist locality. And I challenge you to come up with a measurement theory based on solipsism that solves the measurement problem, and allows you to completely derive the quantum-classical limit.

About GHZ, you said it is another no-go proof that focuses on the realism assumption. That's just not true. It focuses just as much on locality and causality as Bell's theorem does.
 
  • #78


ThomasT said:
"The Cosmic Code: quantum physics as the language of nature"


Since the experimental designs seem to have (in my view anyway) common emission cause written all over them (and since the whole topic is open to speculation) I would rank that as more plausible than any of the other more exotic explanations for the correlations.


I take it you didn't like my polariscope analogy? (I really thought I had something there. :smile:)
I read what I could of Maudlin's first chapter at Google books. Nothing new or especially insightful there. I've read Price's book -- didn't like it. But thanks for the references and nice discussion with DrChinese et al.
I don't like the nonlocal explanations. Too easy. I'll continue, for the time being, working under the assumption that something (or things) about the physical meaning of Bell's theorem and Bell inequalities is being misinterpreted or missed.


Thanks for the references.

<< Since the experimental designs seem to have (in my view anyway) common emission cause written all over them (and since the whole topic is open to speculation) I would rank that as more plausible than any of the other more exotic explanations for the correlations. >>

At first I thought you meant the experimental designs lend themselves to detection loopholes and such. But then you say

<< This makes sense to me. Nevertheless, it would be nice to know if experimental violations of Bell inequalities have any physical meaning -- and, if so, what. Might it be that there's no way to ascertain what the physical meaning of an EPR-Bell experiment is? >>

So you clearly believe that Aspect and others have confirmed Bell inequality violations. You also say that

<< I don't like the nonlocal explanations. Too easy. I'll continue, for the time being, working under the assumption that something (or things) about the physical meaning of Bell's theorem and Bell inequalities is being misinterpreted or missed. >>

So, honestly, it just sounds to me like you have refused to understand Bell's theorem for what it is, even after having been told what it is, and are just in denial about it. In that case, no one can help you, other than to say that you seem to be letting your subjective, intuitive biases prevent you from learning about this subject. And that won't get you anywhere.
 
  • #79


peter0302 said:
Interesting idea Count. I did something like this (I was a computer science guy in a previous life).

It is impossible, using a standard - non-quantum - computer, to simulate the results of EPRB experiments without utilizing both polarizer settings in calculating the odds of any particular photon passing.

The "Does Photon Pass Polarizer x" function simply cannot be written without reference to the other polarizer while still obtaining the quantum results.

If you try to do something elaborate - say, in the "generate entangled photons" function, you pre-program both of them for every conceivable polarizer angle - you come close to the quantum results, but not perfectly.

In order to reproduce the quantum results, you have to either:
1) allow the two photons to "know" the polarizer settings before they've reached them (some kind of superdeterminism) and agree ahead of time on how they're going to behave; or
2) check whether the twin has reached its polarizer yet; if not, just go with 50/50. If it has, behave in a complementary way (non-locality).

The third option would be some kind of many-worlds simulation where we let objects continue to evolve in superposition until someone observes both, but I thought that a little too complicated to code.

Yes, the code, when interpreted as describing a classical world, will describe that world as non-local. But when you run the computer, the internal state of the computer will evolve in a local, deterministic way.
 
  • #80


Well, sure. Is your point that we could be living in a computer simulation?

Even if so, locality is a supposed rule of the simulation. If the simulation is comparing both polarizer angles, that's cheating. :)
 
  • #81


I saw this rather late, sorry for my late response...

ThomasT said:
vanesch says that violations of Bell inequalities mean that the incident disturbances associated with paired detection attributes cannot have a common origin. This would seem to mean that being emitted from the same atom at the same time does not impart to the opposite-moving disturbances identical properties.

What I said was that the correlations as seen in an Aspect-like experiment, and as predicted by quantum theory, cannot be obtained by "looking simply at a common cause". That is, you cannot set up a classical situation where you look at a common property of something and obtain the same correlations as those from quantum mechanics. This is because classically, we only know of two ways to have statistical correlations C(A,B): A causes B (or B causes A) is one way, and A and B have a common cause C is the other. Example of the first:
- A is "setting of the switch" and B is "the light is on"
clearly, because setting the switch causes the light to be on or off, we will find a correlation between both.
Example of the second:
- you drive a Ferrari and you have a Rolex.
It is not true that driving Ferraris makes you have a Rolex, or that putting on a Rolex makes you drive a Ferrari. So there's no "A causes B" or "B causes A". However, being extremely rich can cause you to buy a Ferrari as well as a Rolex. So there was a common cause: "being rich".

Well, Bell's theorem is a property that holds for the second kind of correlations.

So the violation of that theorem by observed or quantum-mechanically predicted correlations means that it cannot be a correlation that can be explained entirely by "common cause".

And yet, in the hallmark 1982 Aspect experiment using time-varying analyzers, experimenters were very careful to ensure that they were pairing detection attributes associated with photons emitted simultaneously by the same atom.

Yes, of course. That's because we have to look for *entangled* photons, which NEED to have a common source. But the violation of Bell's theorem simply means that it is not a "common cause of the normal kind", like in "being rich".

I was remembering last night something written by the late Heinz Pagels about Bell's theorem where he concludes that nonlocality (i.e., FTL transmissions) can't be what is producing the correlations.

That's wrong: Bohmian mechanics explicitly shows how action-at-a-distance can solve the issue. In fact, that's not surprising either. If you have action at a distance (if A can cause B or vice versa), then all possible correlations are allowed, and there's no "Bell theorem" being contradicted. The problem with the EPR kind of setups is that the "causes" (the choices of measurement one makes) are space-like separated events, so action-at-a-distance would clash with relativity. So people (like me) sticking to relativity refuse to consider that option. But it is a genuine option: it is actually by far the "most common sense" one.
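
The "common cause of the normal kind" claim has a sharp quantitative form (a standard sketch, not vanesch's exact wording): any correlation screened off by a common cause λ obeys the CHSH inequality

$$|E(a,b) - E(a,b') + E(a',b) + E(a',b')| \le 2,$$

while the quantum prediction for polarization-entangled pairs, $E(a,b) = \cos 2(a-b)$, reaches $2\sqrt{2} \approx 2.83$ at the angles a = 0°, a' = 45°, b = 22.5°, b' = 67.5°.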
 
  • #82


Maaneli said:
Thanks for the references.

<< Since the experimental designs seem to have (in my view anyway) common emission cause written all over them (and since the whole topic is open to speculation) I would rank that as more plausible than any of the other more exotic explanations for the correlations. >>

At first I thought you meant the experimental designs lend themselves to detection loopholes and such.
No, I had the emission preparations and the data matching mechanisms in mind. So, even if all the loopholes were conclusively closed and the inequalities were conclusively violated experimentally, I'd still think that the experimental designs have common emission cause written all over them.

Maaneli said:
But then you say

<< This makes sense to me. Nevertheless, it would be nice to know if experimental violations of Bell inequalities have any physical meaning -- and, if so, what. Might it be that there's no way to ascertain what the physical meaning of an EPR-Bell experiment is? >>

So you clearly believe that Aspect and others have confirmed Bell inequality violations.

Well, I did believe that, but now I suppose I'll have to look a bit closer at the loopholes you've referred to.

But it won't matter if the loopholes are all closed and the violations are conclusive. One can talk all one wants about nonlocality -- say, a seamless, nonparticulate medium that can't possibly be detected -- but what would be the point?

Maaneli said:
You also say that

<< I don't like the nonlocal explanations. Too easy. I'll continue, for the time being, working under the assumption that something (or things) about the physical meaning of Bell's theorem and Bell inequalities is being misinterpreted or missed. >>

So, honestly, it just sounds to me like you have refused to understand Bell's theorem for what it is, or have been told what it is, and are just in denial about it.

I think that those who understand Bell's theorem as leading to the conclusion that there must be FTL physical propagations in nature might have missed some important subtleties regarding its application and ultimate meaning.

The locality assumption is that events at A cannot be directly causally affecting events at B during any given coincidence interval. Quantitatively at least, we know that this assumption is affirmed experimentally. Since there will never be a way to affirm or deny it qualitatively, I conclude that the assumption of locality is the best bet -- regardless of what anyone thinks Bell has shown.

And, because of the way these experiments must be set up and run, I conclude that the assumption of common cause is also a best bet regarding the deep cause(s) of the quantum experimental phenomena that, collectively, conform to the technical requirements for quantum entanglement.

So, yes, I'm in denial about what I think you (and lots of others) think the meaning of Bell's theorem is. But thanks for the good discussions and references, and I'll continue to read and think and keep an open mind about this (think of my denial as a sort of working assumption), and when I get another flash of insight (like the polariscope analogy), then I'll let you know. :smile:

When is a locality condition not, strictly speaking, a locality condition?
 
  • #83


Maaneli said:
4. I have not yet seen any evidence that local non-realism is a viable explanation of Bell inequality violations. I challenge you to come up with a mathematical definition of non-realist locality.

You are a challenging kind of guy... :)

All you need to do to explain the Bell Inequality violation is to say that particles do NOT have well-defined non-commuting attributes when not being observed. You then deny that there is an A, B and C in Bell's [14], as previously mentioned. (Denying this assumption is completely in keeping with the HUP anyway, even if not strictly required by it.)
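
A worked version of that A, B, C point, in the spirit of Bell's [14] (the numbers are the standard three-angle example, not DrChinese's own): if every pair carried predetermined answers for the settings 0°, 120° and 240°, then among three binary answers at least two must agree, so when the two sides choose different settings the match rate satisfies

$$P_{\text{match}} \ge \tfrac{1}{3},$$

whereas QM predicts $\cos^2(120^\circ) = \tfrac{1}{4}$ for analyzers 120° apart. Denying the predetermined A, B and C removes the 1/3 bound.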

And there is experimental evidence as well, but you likely don't think it applies. There is the work of Groeblacher et al., which I am sure you know. I accept these experiments as valid evidence but think more work needs to be done before it is considered "iron clad".

So there is a mathematical apparatus already: QM and relativity. So dumping non-locality as an option does not require anything new. The reason I think non-realism is appealing is that it - in effect - elevates the HUP but does not require new forces, mechanisms, etc. We can also keep c as a speed limit, and don't need to explain why most effects respect c but entanglement does not.

By the way, you might have been a bit harsh on ThomasT. We each have our own (biased?) viewpoints on some of these issues, including you. Clearly, you reject evidence against non-locality and reject non-realism as a viable alternative.
 
  • #84


vanesch said:
What I said was that the correlations as seen in an Aspect-like experiment, and as predicted by quantum theory, cannot be obtained by "looking simply at a common cause". That is, you cannot set up a classical situation where you look at a common property of something, and obtain the same correlations as those from quantum mechanics.
Sorry if I misparaphrased you, because you've helped a lot in elucidating these issues.

There is a classical situation which I think analogizes what is happening in optical Bell tests -- the polariscope. The measurement of intensity by the detector behind the analyzing polarizer in a polariscopic setup is analogous to the measurement of rate of coincidental detection in simple optical Bell setups. Extending between the two polarizers in a polariscopic setup is a singular sort of optical disturbance. That is, the disturbance that is transmitted by the first polarizer is identical to the disturbance that's incident on the analyzing polarizer. In an optical Bell setup, it's assumed that for a given emitted pair the disturbance incident on the polarizer at A is identical to the disturbance that's incident on the polarizer at B. Interestingly enough, both these setups produce a cos^2 functional relationship between changes in the angular difference of the crossed polarizers and changes in the intensity (polariscope) or rate of coincidence (Bell test) of the detected light.
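
Making the analogy explicit (my summary): the polariscope obeys Malus's law, and the Bell-test coincidence rate has the same functional form,

$$I = I_0 \cos^2(\theta_1 - \theta_2), \qquad R_{\text{coinc}} \propto \cos^2(a - b),$$

which is what motivates the common-cause intuition. What Bell's theorem adds is that no local pre-assignment of polarization at the source reproduces the second relation at every angle pair.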

vanesch said:
This is because classically, we only know of two ways to have statistical correlations C(A,B): A causes B (or B causes A) is one way, and A and B have a common cause C is the other. Example of the first:
- A is "setting of the switch" and B is "the light is on"
clearly, because setting the switch causes the light to be on or off, we will find a correlation between both.
Example of the second:
- you drive a Ferrari and you have a Rolex.
It is not true that driving Ferraris makes you have a Rolex, or that putting on a Rolex makes you drive a Ferrari. So there's no "A causes B" or "B causes A". However, being extremely rich can cause you to buy a Ferrari as well as a Rolex. So there was a common cause: "being rich".

Well, Bell's theorem is a property that holds for the second kind of correlations.

So the violation of that theorem by observed or quantum-mechanically predicted correlations means that it cannot be a correlation that can be explained entirely by "common cause".
And yet it's a "common cause" (certainly not of the ordinary kind though) assumption that underlies the construction and application of the quantum mechanical models that pertain to the Bell tests, as well as the preparation and administration of the actual experiments.


vanesch said:
Yes, of course. That's because we have to look for *entangled* photons, which NEED to have a common source. But the violation of Bell's theorem simply means that it is not a "common cause of the normal kind", like in "being rich".
OK, the correlations are due to unusual sorts of common causes then. This is actually easier to almost visualize in the experiments where they impart a similar torque to relatively large groups of atoms. The entire, separate groups are then entangled with respect to their common zapping. :smile: Or isn't this the way you'd view these sorts of experiments?


vanesch said:
Bohmian mechanics explicitly shows how action-at-a-distance can solve the issue.
The problem with instantaneous-action-at-a-distance is that it's physically meaningless. An all-powerful invisible elf would solve the problem too. Just like instantaneous-actions-at-a-distance, the existence of all-powerful invisible elves is pretty hard to disprove. :smile:

vanesch said:
In fact, that's not surprising either. If you have action at a distance (if A can cause B or vice versa), then all possible correlations are allowed, and there's no "Bell theorem" being contradicted. The problem with the EPR kind of setups is that the "causes" (the choices of measurement one makes) are space-like separated events, so action-at-a-distance would clash with relativity. So people (like me) sticking to relativity refuse to consider that option. But it is a genuine option: it is actually by far the "most common sense" one.
I disagree. The most common-sense option is common cause(s). Call it a working hypothesis -- one that has the advantage of not being at odds with relativity. You've already acknowledged that common cause is an option, just not normal common causes. Well, the submicroscopic behavior of light is a pretty mysterious subject, don't you think? Maybe the classical models of light are (necessarily?) incomplete enough that a general and conclusive LHV explanation of Bell tests isn't (and maybe never will be) forthcoming.
 
  • #85


A shift of angle on this, maybe, in case folks are getting a little overheated...

If the entangled particles are modeled in the tensor product of two Hilbert spaces, the result is a combined wavefunction: two particles that behave as one wavefunction.

But then we ask questions about the large Euclidean separation between the particles and how one particle 'knows' about the other's state (e.g. a hidden variable). This seems an inconsistent question, because there is only one wavefunction (albeit for two particles), and when something happens to the wave it is instantaneous everywhere at once.

An additional helpful analogy is to consider a single wave packet for an electron or photon - Young's slits or similar. We don't ask questions about the 'speed of probabilities' between one end of the wave packet and the other - it's all instant. There is no 'time separation' in where the particle statistically reveals itself. Similarly with entangled particles. Or am I barking up the wrong tree?
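
LaserMind's tensor-product point, written out in standard notation: the entangled pair lives in $H_A \otimes H_B$ but is not a product state, e.g.

$$|\Psi\rangle = \tfrac{1}{\sqrt{2}}\bigl(|H\rangle_A |V\rangle_B - |V\rangle_A |H\rangle_B\bigr) \ne |\psi\rangle_A \otimes |\phi\rangle_B,$$

so there is one wavefunction for the two particles, and "collapse" applies to that single object as a whole rather than propagating from one wing to the other.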
 
  • #86


DrChinese said:
You are a challenging kind of guy... :)

All you need to do to explain the Bell Inequality violation is to say that particles do NOT have well-defined non-commuting attributes when not being observed. You then deny that there is an A, B and C in Bell's [14], as previously mentioned. (Denying this assumption is completely in keeping with the HUP anyway, even if not strictly required by it.)

And there is experimental evidence as well, but you likely don't think it applies. There is the work of Groeblacher et al., which I am sure you know. I accept these experiments as valid evidence but think more work needs to be done before it is considered "iron clad".

So there is a mathematical apparatus already: QM and relativity. So dumping non-locality as an option does not require anything new. The reason I think non-realism is appealing is that it - in effect - elevates the HUP but does not require new forces, mechanisms, etc. We can also keep c as a speed limit, and don't need to explain why most effects respect c but entanglement does not.

By the way, you might have been a bit harsh on ThomasT. We each have our own (biased?) viewpoints on some of these issues, including you. Clearly, you reject evidence against non-locality and reject non-realism as a viable alternative.


Haha good one about my being challenging.

About your non-realist locality definition, what do you do about collapse of the measurement settings to definite values? What causes it, and when does it happen? What is your mathematical description of that process?

I don't think it was harsh. There is a big difference between rejecting what is unambiguously wrong from the POV of Bell's theorem, and being very skeptical about another possibility (non-realist locality) which you also admit still has to be worked out. Also, as I said, I don't actually think the nonlocality explanation is necessarily the best one. In fact, I am much more inclined to think the causality assumption is the more unphysical assumption that must be given up, rather than locality. But that is clearly implied by Bell's theorem.
 
  • #87


LaserMind said:
We don't ask questions about the 'speed of probabilities' between one end of the wave packet and the other - it's all instant. There is no 'time separation' in where the particle statistically reveals itself. Similarly with entangled particles. Or am I barking up the wrong tree?

This is the collapse of the wavefunction, and I think this manifests itself identically whether we are talking about 1 particle or a pair of entangled particles.

Any single photon (say, emitted from an electron) has a chance of going anywhere and being absorbed. The odds of it being absorbed at Alice's detector are A, and the odds of it being absorbed at Bob's detector are B. And so on for any number of possible targets, some of which could be light years away. When we observe it at Alice, that means it is NOT at Bob or any of the other targets. Yet clearly there was a wave packet moving through space - that's what experiments like the Double Slit show, because there is interference from the various possible paths. And yet there we are at the end: the photon is detected in one and only one spot, and the odds collapse to zero everywhere else. And that collapse would be instantaneous, as best I can tell.

So this is analogous to the mysterious nature of entanglement, yet I don't think it is really any different. Except that entanglement involves an ensemble of particles.
 
  • #88


Maaneli said:
Haha good one about my being challenging.

About your non-realist locality definition, what do you do about collapse of the measurement settings to definite values? What causes it, and when does it happen? What is your mathematical description of that process?

I don't have a physical explanation for instantaneous collapse. I think this is a weak point in QM. But I think the mathematical apparatus is already there in the standard model.

By the way, challenging is good as far as I am concerned. Not sure I am up to too many though. :)
 
  • #89


MWI is a perfect example of locality and non-realism. There is no objective state that other parts of your world are in until you interact with them and go through decoherence.
 
  • #90


DrChinese said:
I don't have a physical explanation for instantaneous collapse. I think this is a weak point in QM. But I think the mathematical apparatus is already there in the standard model.

By the way, challenging is good as far as I am concerned. Not sure I am up to too many though. :)


<< I don't have a physical explanation for instantaneous collapse. I think this is a weak point in QM. >>

Well, let's distinguish textbook QM mathematics and measurement postulates from the interpretation of it all as anti-realist. Certainly the ad hoc, mathematically and physically vague measurement postulates are a weak point of textbook QM. But if your anti-realist interpretation has the same basic problem, then it cannot be a true physical theory of QM measurement processes, or a potentially fundamental physical interpretation of QM. That's why I have still not seen any coherent physical/mathematical definition of "anti-realist locality".


<< But I think the mathematical apparatus is already there in the standard model. >>

Ah, but that's the thing. Standard QM has only postulates - no mathematical apparatus for treating measurement processes! Not even adding decoherence theory does the job fully! And anyway, decoherence theory implies realism. That's why I think that if you don't want to invoke additional (hidden) variables in QM, and want to keep only the wavefunction and the HUP, and not try to analyze measurement processes, the only self-consistent interpretation is Ballentine's statistical interpretation of QM - but even that can only be a temporary stand-in for a more complete description of QM and, inevitably, a beable theory of QM.


<< By the way, challenging is good as far as I am concerned. Not sure I am up to too many though. :) >>

Glad you think so!
 
