First loophole-free Bell test?

  • Thread starter: bohm2
  • Tags: Bell Test
Summary:
The discussion centers on a recent Bell test that claims to address both the detection and locality loopholes, raising questions about whether it can be deemed truly loophole-free. While the experiment eliminates various experimental loopholes by adhering to a rigorous protocol, it does not escape metaphysical loopholes, such as super-determinism. The test utilizes entangled electron spins separated by 1.3 km, confirming quantum nonlocality without the need for additional assumptions. Experts in the field have praised the experiment for its ingenuity and potential significance in quantum physics. Overall, while it represents a significant advancement, some argue that it still cannot be classified as entirely loophole-free.
  • #31
stevendaryl said:
I would only call one of them crackpot. I'm not going to say which, but Richard Gill certainly knows which one.

OK I am provoked and will give my reactions to the list:

- Copenhagen - nature is local, but objective reality does not exist (Bohr, Mermin, Rovelli-relational, Zeilinger, ...)

I think this is an incorrect view of the Copenhagen interpretation. Measurement outcomes are objectively real. So there is an objective reality. In fact, I buy this one.
- many worlds - objective reality exists and is "local", but not in the 3-space (Everett, Deutsch, Tegmark, ...)

I think this is many words - a smoke screen of words which act as a comfort blanket.
- superdeterminism - objective reality exists, it is local and deterministic, but initial conditions are fine tuned ('t Hooft)

Yes, sure, initial conditions are so fine-tuned that the diamond at Alice's place knows all about the pseudo-random number generator at Bob's place.
- backward causation - objective reality exists and is local, but there are signals backwards in time (transactional interpretation)

If you want to call that an interpretation...

- noncommutative hidden variables - objective reality exists and is local, but is not represented by commutative numbers (Joy Christian)

Pity about the math errors and the new definition of correlation and general lack of any connection to physics in this so-called theory.

- solipsistic hidden variables - objective reality exists and is local, but objective reality describes only the observers, not the observed objects (H. Nikolic, http://xxx.lanl.gov/abs/1112.2034 )

Sounds like a word game to me.

- consistent histories - objective reality exists and is local, but classical propositional logic is replaced with a different logic (Griffiths, http://lanl.arxiv.org/abs/1110.0974, http://lanl.arxiv.org/abs/1105.3932 )

I have never understood how this succeeds in explaining anything. Basically: let's assume that reality is weird, then QM is no longer weird.
Sorry for being a bit "abrasive". I have thought about all this a great deal over the last 20 years and I'm getting old and dogmatic ...
 
  • Like
Likes Ilja
  • #32
gill1109 said:
- solipsistic hidden variables - objective reality exists and is local, but objective reality describes only the observers, not the observed objects (H. Nikolic, http://xxx.lanl.gov/abs/1112.2034 )

Sounds like a word game to me.

It's more than that. Usually one does some handwaving with the "brain in a vat". But what are the actual equations governing the brain and its stimulation? Here he provides the equations.

gill1109 said:
- Copenhagen - nature is local, but objective reality does not exist (Bohr, Mermin, Rovelli-relational, Zeilinger, ...)

I think this is an incorrect view of the Copenhagen interpretation. Measurement outcomes are objectively real. So there is an objective reality. In fact, I buy this one.

I don't think this is the standard Copenhagen either. The standard Copenhagen has objective reality, e.g. in Landau and Lifshitz, and in Weinberg. I too buy standard Copenhagen.
 
  • #33
atyy said:
It's more than that. Usually one does some handwaving with the "brain in a vat". But what are the actual equations governing the brain and its stimulation? Here he provides the equations.
Thanks. I will take a look.
 
  • #34
gill1109 said:
Thanks. I will take a look.

Also, just in case you care about insulting people to their face, Nikolic is a regular, and well-respected, participant in this forum (where he uses a pseudonym).
 
  • #35
gill1109 said:
- noncommutative hidden variables - objective reality exists and is local, but is not represented by commutative numbers (Joy Christian)

Pity about the math errors and the new definition of correlation and general lack of any connection to physics in this so-called theory.

You were good to give such a serious criticism. But did you read Scott Aaronson's hilarious commentary on your criticism?

"Now, as Gill shows, Joy actually makes an algebra mistake while computing his nonsensical “correlation function.” The answer should be -a.b-a×b, not -a.b. But that’s truthfully beside the point. It’s as if someone announced his revolutionary discovery that P=NP implies N=1, and then critics soberly replied that, no, the equation P=NP can also be solved by P=0." http://www.scottaaronson.com/blog/?p=1028
 
  • Like
Likes jerromyjon and stevendaryl
  • #36
Superdeterminism and retrocausality first strike me as ridiculous interpretations. But I'm not 100% ready to say that they are nonsense. The reason we view these as ridiculous is because of our intuitions about the asymmetry between past and future. But physics doesn't really have a good explanation for that asymmetry that isn't ad hoc.
 
  • #37
stevendaryl said:
Also, just in case you care about insulting people to their face, Nikolic is a regular, and well-respected, participant in this forum (where he uses a pseudonym).
I do care about that. But I am not trying to insult any people. I'm just telling you my personal reaction to certain ideas. Maybe this just shows that I haven't yet worked hard enough to understand those ideas. (And BTW I am just a mathematician / statistician, not a physicist, so perhaps not qualified to say much here at all).
 
  • #38
atyy said:
You were good to give such a serious criticism. But did you read Scott Aaronson's hilarious commentary on your criticism?

"Now, as Gill shows, Joy actually makes an algebra mistake while computing his nonsensical “correlation function.” The answer should be -a.b-a×b, not -a.b. But that’s truthfully beside the point. It’s as if someone announced his revolutionary discovery that P=NP implies N=1, and then critics soberly replied that, no, the equation P=NP can also be solved by P=0." http://www.scottaaronson.com/blog/?p=1028

Yes. Well, that's the interesting thing about Christian's "theory". He brings up some really interesting topics, such as quaternions and how they relate to the 3-sphere, and so forth. It's easy to get lost in those topics (I personally spent a lot of time getting up to speed on what Joy Christian was talking about). But the bottom line is that no matter how interesting his model is, what he's claiming to do is ridiculous, and that can be shown in one line: Bell proved that there can be no functions ##A(\lambda, \alpha)##, ##B(\lambda, \beta)## that take values in ##\{ +1, -1 \}## satisfying blah, blah, blah, and Christian is constructing quaternion-valued functions. No matter how interesting his functions, they can't possibly refute Bell.
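
As a quick numerical illustration of that one-line argument (this sketch is mine, not from Bell, Gill, or Christian; the four-valued hidden variable and the random search are arbitrary choices), one can check that no assignment of ##\pm 1## outcomes to local deterministic functions ever pushes the CHSH combination past 2:

[CODE=python]
import random

def chsh(A, B, lambdas, weights):
    """CHSH value S = E(0,0) + E(0,1) + E(1,0) - E(1,1) for a local model.

    A[(lam, a)] and B[(lam, b)] are the +/-1 outcomes for hidden variable lam
    and settings a, b; weights are the probabilities of the lam values.
    """
    def E(a, b):
        return sum(w * A[(lam, a)] * B[(lam, b)] for lam, w in zip(lambdas, weights))
    return E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)

random.seed(0)
worst = 0.0
for _ in range(100_000):
    lambdas = list(range(4))                      # four hidden-variable values (arbitrary)
    weights = [random.random() for _ in lambdas]
    total = sum(weights)
    weights = [w / total for w in weights]        # normalise to a probability distribution
    A = {(lam, a): random.choice([+1, -1]) for lam in lambdas for a in (0, 1)}
    B = {(lam, b): random.choice([+1, -1]) for lam in lambdas for b in (0, 1)}
    worst = max(worst, abs(chsh(A, B, lambdas, weights)))

print(worst)   # never exceeds 2; quantum mechanics reaches 2*sqrt(2) ~ 2.83
[/CODE]

Quantum mechanics predicts (and the Delft experiment reports) a value above 2, which is exactly what no quaternion-valued redefinition of the outcomes can change.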
 
  • Like
Likes Ilja and gill1109
  • #39
gill1109 said:
(And BTW I am just a mathematician / statistician, not a physicist, so perhaps not qualified to say much here at all).

Bell wasn't qualified to do statistics either :)

BTW, do statisticians consider Pearl's work statistics, or something else? (Sorry, I know many others worked on it, but for biologists, that's maybe the most famous name.)
 
  • #40
atyy said:
Bell wasn't qualified to do statistics either :)

BTW, do statisticians consider Pearl's work statistics, or something else? (Sorry, I know many others worked on it, but for biologists, that's maybe the most famous name.)
Well, Pearl comes from computer science, but his work has had a big impact in statistics.

I think Bell's *statistical* insights and understanding were really good. Way above those of most of his colleagues. There were so many misunderstandings of what he'd done in the early years, due to a lack of statistical sophistication on the part of most of the physicists discussing his results.
 
  • Like
Likes atyy
  • #41
gill1109 said:
- solipsistic hidden variables - objective reality exists and is local, but objective reality describes only the observers, not the observed objects (H. Nikolic, http://xxx.lanl.gov/abs/1112.2034 )

Sounds like a word game to me.

FYI: This paper was written by Demystifier. Despite the interpretation, he is actually a Bohmian. But one of the few that actually considers other interpretations. So you can talk to him. :smile:
 
  • Like
Likes Demystifier
  • #42
It's impossible to discuss the significance of this experiment, and especially what questions are still open, without reference to various interpretations.

However, it would be a virtuous and good thing (yes, I know, my daughters have repeatedly explained to me that "virtuous and good" is parent-speak for "boring") if we could keep this thread from turning into a debate on the merits of the various interpretations. That's not a question that can be settled here.
 
  • #43
stevendaryl said:
Superdeterminism and retrocausality first strike me as ridiculous interpretations. But I'm not 100% ready to say that they are nonsense. The reason we view these as ridiculous is because of our intuitions about the asymmetry between past and future. But physics doesn't really have a good explanation for that asymmetry that isn't ad hoc.

Superdeterminism and Retrocausality really should not be grouped together. Out of respect for Nugatory's comment about discussing interpretations in this thread, I will leave it at that.
 
  • #44
DrChinese said:
Superdeterminism and Retrocausality really should not be grouped together. Out of respect for Nugatory's comment about discussing interpretations in this thread, I will leave it at that.

They seem very similar to me. It seems to me that a retrocausal theory, with back-and-forth influences traveling through time, can be reinterpreted as a superdeterministic theory, where the initial conditions are fine-tuned to get certain future results. In the end, you have a fine-tuned correlation between initial conditions and future events, and retrocausality would be a mechanism for achieving that fine-tuning.
 
  • #45
DrChinese said:
FYI: This paper was written by Demystifier. Despite the interpretation, he is actually a Bohmian. But one of the few that actually considers other interpretations. So you can talk to him. :smile:

Some of my best friends are Bohmians!
 
  • Like
Likes Ilja
  • #46
As I see it, there are two kinds of loopholes:

The proper loopholes, which specify some physical mechanism, like the detection loophole, the coincidence loophole, or the memory loophole. As far as I can see, they will all be closed now (I hope Hensen et al. are still running the experiment in order to increase the sample size and reduce the p-value).

Then you have the metaphysical loopholes, that cannot even in principle be falsified by experiments. I have to side with Popper on this one: It's not science.
 
Last edited:
  • Like
Likes zonde
  • #47
stevendaryl said:
Superdeterminism and retrocausality first strike me as ridiculous interpretations. But I'm not 100% ready to say that they are nonsense. The reason we view these as ridiculous is because of our intuitions about the asymmetry between past and future. But physics doesn't really have a good explanation for that asymmetry that isn't ad hoc.

DrChinese said:
Superdeterminism and Retrocausality really should not be grouped together. Out of respect for Nugatory's comment about discussing interpretations in this thread, I will leave it at that.

Hello DrChinese, if you could please elaborate more on this in the thread below I would appreciate it.
https://www.physicsforums.com/threads/is-retrocausality-inherently-deterministic.829758/

Heinera said:
As I see it, there are two kinds of loopholes:

The proper loopholes, which specify some physical mechanism, like the detection loophole, the coincidence loophole, or the memory loophole. As far as I can see, they will all be closed now (I hope Hensen et al. are still running the experiment in order to increase the sample size and reduce the p-value).

Then you have the metaphysical loopholes, that cannot even in principle be falsified by experiments. I have to side with Popper on this one: It's not science.

I couldn't agree more with this statement, as I feel that contemplating any sort of local hidden variable "superdeterministic" conspiracy theory is a complete waste of time and energy.
 
  • #48
stevendaryl said:
They seem very similar to me. It seems to me that a retrocausal theory, with back-and-forth influences traveling through time, can be reinterpreted as a superdeterministic theory, where the initial conditions are fine-tuned to get certain future results. In the end, you have a fine-tuned correlation between initial conditions and future events, and retrocausality would be a mechanism for achieving that fine-tuning.

There is no need for fine-tuning in time-symmetric/retrocausal interpretations, any more than there is fine-tuning in Bohmian or MW interpretations. All predict that the full universe of events follows the statistics of QM. Superdeterminism posits that there is a subset of events which matches the stats of QM, but the full universe does not.
 
  • #49
stevendaryl said:
Are you saying that every electron produced is eventually detected? (Or a sizable enough fraction of them?)

Yes, there is no loss on that side to speak of. As I see it, essentially the same pair of distant electrons (I guess really from the same pair of atoms) are being entangled over and over again, 245 times in this experiment. The entanglement itself occurs after* the random selection of the measurement basis for the Bell test is made, and too late to affect the outcome of the measurements (by propagation at c or less).

*Using a delayed-choice entanglement swapping mechanism; see some of the PF threads on that for more information. Or read http://arxiv.org/abs/quant-ph/0201134 and check out figure 1 on page 8. Photons 0 and 3 are replaced in our loophole-free test by electrons. Other than that, it is quite similar in space-time layout. Of course, in the loophole-free version, there are some additional pretty cool things going on (literally).
 
  • #50
Let me see if I understand this right. Alice and Bob pick their settings, perform their measurements. During the process a photon is emitted from their respective electrons. Both photons are sent to station C. At station C, "entanglement swapping" (aka post-processing) is performed to decide if "state-preparation" was successful. They successfully "prepare the state" with a success probability of 6.4e-9! Only 245 successful "preparations" out of many millions of trials.

Maybe it's the wine I drank before reading the paper but, it looks to me like a detection loophole experiment done in reverse, then misinterpreted. I'll have to read it again in the morning. Has this thing even been peer-reviewed? Have any of you read it carefully?
 
  • #51
billschnieder said:
Let me see if I understand this right. Alice and Bob pick their settings, perform their measurements. During the process a photon is emitted from their respective electrons. Both photons are sent to station C. At station C, "entanglement swapping" (aka post-processing) is performed to decide if "state-preparation" was successful.
You have misunderstood the process. Look at figure 2a in the paper. First a photon is emitted by the NV center and sent to station C, and only a moment later is the basis selected.
 
  • #52
billschnieder said:
Let me see if I understand this right. Alice and Bob pick their settings, perform their measurements. During the process a photon is emitted from their respective electrons. Both photons are sent to station C. At station C, "entanglement swapping" (aka post-processing) is performed to decide if "state-preparation" was successful. They successfully "prepare the state" with a success probability of 6.4e-9! Only 245 successful "preparations" out of many millions of trials.

Maybe it's the wine I drank before reading the paper but, it looks to me like a detection loophole experiment done in reverse, then misinterpreted. I'll have to read it again in the morning. Has this thing even been peer-reviewed? Have any of you read it carefully?
I have read it very carefully. The experiment has been under preparation for two years, and a stream of peer-reviewed publications has established all the components of the experiment one by one: http://hansonlab.tudelft.nl/publications/ . The design of the present experiment was announced half a year ago. Two years ago, I believe, they already did this with a 1.5 metre separation.

Please take a look at Bell's (1981) "Bertlmann's socks", the discussion of an experiment around figure 7, with the three locations A, B, and C. This is exactly the experiment which they did in Delft.

The idea of having so-called "event-ready detectors" through entanglement swapping has been known since 1993 http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.71.4287

‘‘Event-ready-detectors’’ Bell experiment via entanglement swapping
M. Żukowski, A. Zeilinger, M. A. Horne, and A. K. Ekert
Phys. Rev. Lett. 71, 4287 – Published 27 December 1993

It's true that Alice and Bob are doing those measurements again and again and all but a tiny proportion of their attempts are wasted. They don't know in advance which measurements are the good ones and which ones aren't (because by the time a message arrived from the central location saying that this time it's the real thing, they would already be half finished with the measurement they are doing at that moment).

So there is a "post-selection" of measurement results. But the space-time arrangement is such that it cannot influence the settings being used for the measurements. Your computer does the selection retrospectively but effectively it was done in advance.
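
To see concretely why that timing matters, here is a toy Monte-Carlo sketch (entirely my own illustration; the local model and the numbers are invented and have nothing to do with the actual Delft analysis): a herald whose decision is independent of the settings leaves a local hidden-variable model inside the CHSH bound ##|S| \le 2## even after post-selection, whereas a herald that is allowed to condition on both settings can fake ##S \approx 4##.

[CODE=python]
import random

random.seed(1)

def outcome_A(lam, a):
    # Toy local model: Alice's +/-1 outcome is the first component of the
    # shared hidden variable and (in this toy) ignores her setting a.
    return lam[0]

def outcome_B(lam, b):
    return lam[1]

def chsh(trials, keep):
    # Estimate S = E(0,0) + E(0,1) + E(1,0) - E(1,1) on the kept trials only.
    sums = {(a, b): 0 for a in (0, 1) for b in (0, 1)}
    counts = {(a, b): 0 for a in (0, 1) for b in (0, 1)}
    for _ in range(trials):
        lam = (random.choice([+1, -1]), random.choice([+1, -1]))  # shared hidden variable
        a, b = random.randrange(2), random.randrange(2)           # freely chosen settings
        if not keep(lam, a, b):
            continue                                              # trial not heralded
        sums[(a, b)] += outcome_A(lam, a) * outcome_B(lam, b)
        counts[(a, b)] += 1
    E = {k: sums[k] / counts[k] for k in sums}
    return E[(0, 0)] + E[(0, 1)] + E[(1, 0)] - E[(1, 1)]

# Herald that cannot see the settings: the local model stays within |S| <= 2
# (here S comes out near 0), however rarely the herald fires.
print(chsh(400_000, lambda lam, a, b: random.random() < 0.01))

# Herald allowed to condition on both settings: post-selection fakes S close to 4.
want = {(0, 0): +1, (0, 1): +1, (1, 0): +1, (1, 1): -1}
print(chsh(400_000, lambda lam, a, b: lam[0] * lam[1] == want[(a, b)]))
[/CODE]

In the thread's description (posts #51 and #52), the heralding photons leave for station C before the random basis is selected, so the selection at C plays the role of the first, settings-independent herald.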
 
Last edited:
  • #53
gill1109 said:
Some of my best friends are Bohmians!
They believe in you even when they don't see you. :biggrin:
 
  • Like
Likes jerromyjon, Nugatory and atyy
  • #54
zonde said:
You have misunderstood the process. Look at figure 2a in the paper. First a photon is emitted by the NV center and sent to station C, and only a moment later is the basis selected.
Then why are they randomly switching between two different microwave pulses in order to generate the photons entangled with the electrons? Why not just a single pulse? It seems to me the settings are the microwave pulses, and those are set before the photons are emitted. The readout happens later, but the emitted photons already know the settings. How do they avoid setting-dependent post-selection at C?
 
  • #55
gill1109 said:
Please take a look at Bell's (1981) "Bertlmann's socks", the discussion of an experiment around figure 7, with the three locations A, B, and C. This is exactly the experiment which they did in Delft.
That's a stretch. Bell's event-ready setup involves a third signal to Alice and Bob that an entangled pair was emitted. In this case, Alice and Bob's particles are not entangled to begin with. But photons from A and B are used at station C to select the sub-ensemble of results that correspond to entanglement. No "event-ready" signal is ever sent to A and B.
gill1109 said:
So there is a "post-selection" of measurement results. But the space-time arrangement is such that it cannot influence the settings being used for the measurements. Your computer does the selection retrospectively but effectively it was done in advance.
Maybe, that's what is not so clear to me. Are the "settings" the microwave pulses P0 and P1, driven by RNGs?
 
  • #56
gill1109 said:
Please take a look at Bell's (1981) "Bertlmann's socks", the discussion of an experiment around figure 7, with the three locations A, B, and C. This is exactly the experiment which they did in Delft.

You had mentioned "Bertlmann's socks" before. I'm familiar with that essay by Bell, but I always took his device, with the boxes and settings, to be an intuitive abstraction of the EPR experiment. I never thought of it as a serious proposal for an actual experiment.
 
  • #57
Their p-value is 0.04. Why is that loophole-free?
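
For orientation on where a number of that size comes from with only 245 event-ready trials, here is a deliberately simplified sketch (my own illustration; the paper's actual statistical analysis is more careful than this): treat each trial as one round of the CHSH game, which any local hidden-variable strategy wins with probability at most 3/4, and compute a binomial tail probability. The win count of 196 used below is hypothetical, not a number taken from the paper.

[CODE=python]
from math import comb

def chsh_game_pvalue(n, k, p_local=0.75):
    # Upper tail P[Bin(n, p_local) >= k]: the chance that a local
    # hidden-variable strategy, which wins each CHSH-game round with
    # probability at most 3/4, wins at least k of n rounds.
    return sum(comb(n, i) * p_local**i * (1 - p_local)**(n - i)
               for i in range(k, n + 1))

# 245 event-ready trials as in the Delft experiment; the win count of 196
# is a hypothetical illustration, NOT the number reported in the paper.
print(chsh_game_pvalue(245, 196))   # comes out at a few percent
[/CODE]

The modest number of heralded trials is the main reason the p-value is only of order a few percent; more trials (as Heinera suggests above) would drive it down.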
 
  • #58
Heinera said:
Then you have the metaphysical loopholes, that cannot even in principle be falsified by experiments. I have to side with Popper on this one: It's not science.
I disagree. A loophole is a loophole. One has to close it.

In the case of a metaphysical loophole, it is closed by accepting, as a sort of axiom or fundamental principle, some postulate which prevents it. This postulate, taken alone, cannot be tested by observation.

But this does not make such a postulate unphysical, not even from Popper's point of view. Popper recognized from the start that not every particular statement of a physical theory can be tested; one needs the whole theory to get predictions about real experiments. Then, in answering Quine's holism - which claims that a single theory is not enough and that the whole of physics is necessary to make experimental predictions - he recognized even more, namely that often even a whole theory taken alone is not sufficient to derive any nontrivial, falsifiable prediction. Each real experiment depends on a lot of different theories - in particular, theories about the accuracy of all the measurement instruments involved.

Just as an example that even famous theories taken alone do not give anything, take GR. Whatever the observed distribution of matter, and whatever the gravitational field, by defining dark matter as ##T_{mn}^{\rm dark} = G_{mn} - T_{mn}^{\rm obs}## the Einstein equations of GR can be forced to hold exactly. One needs additional assumptions about the properties of dark matter to derive anything from the Einstein equations. Otherwise, all that is predicted by GR is nothing more than what is predicted by all metric theories of gravity - namely, that what clocks measure may be described by a metric.
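
Spelling out the trivialisation with the coupling constant restored (this restatement is mine, not Ilja's, written in units where ##c = 1##): the field equation

$$G_{mn} = 8\pi G \left( T_{mn}^{\rm obs} + T_{mn}^{\rm dark} \right), \qquad T_{mn}^{\rm dark} \equiv \frac{1}{8\pi G} G_{mn} - T_{mn}^{\rm obs},$$

holds identically for any metric whatsoever, so without further assumptions about ##T_{mn}^{\rm dark}## it has no empirical content.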

The point which makes it unnecessary to accept Quine's holism is that one can test the several theories involved in each actual experiment in other, independent experiments. This is, in particular, how one solves the problem of theories about measurement devices: you can test the measurement devices in completely different experiments, and this is what is done with real experimental devices. Say, their accuracy can be tested by comparison with other devices, or (for the most accurate ones) by comparing several devices of the same type with each other.

But even if we can reject Quine's holism, the other extreme - that single principles, taken alone, should be falsifiable - is nonsensical too.

But if we cannot test them taken alone, why should we accept them? There are some good reasons for accepting them.

For example, compatibility: even if we have no TOE, a principle may be compatible with all the best available theories. Another point is the consequence of rejection: it could be that, once a principle is rejected, one would have to give up doing science, because, if the rejection were taken seriously, no experiment could tell us anything nontrivial. Superdeterminism would be of this type. Similarly for a rejection of Reichenbach's principle of common cause: once it is rejected, there would no longer be any justification for asking for a realistic explanation of observed correlations. The tobacco lobby would be happy - no need to explain correlations between smoking and cancer - and astrologers too, because the discussion about astrology would be reduced to statistical facts about correlations (are correlations between the positions of planets and various things in our lives significant or not?), and the major point, that there is no causal explanation for such influences, would disappear.

So, there are possibilities for strong arguments in favour of physical principles, even if they, taken alone, cannot be tested.
 
  • #59
billschnieder said:
That's a stretch. Bell's event-ready setup involves a third signal to Alice and Bob that an entangled pair was emitted. In this case, Alice and Bob's particles are not entangled to begin with. But photons from A and B are used at station C to select the sub-ensemble of results that correspond to entanglement. No "event-ready" signal is ever sent to A and B.
Maybe, that's what is not so clear to me. Are the "settings" the microwave pulses P0 and P1, driven by RNGs?
Sure, Bell was thinking of signals going from C to A and B. Now we have the opposite. But the end result is the same. There is a signal at C which says that at a certain time later it is worth doing a measurement at A and at B. We use the "go" signals at C to select which of the A and B measurements go into the statistics. The end result is effectively the same.
 
  • #60
stevendaryl said:
You had mentioned "Bertlmann's socks" before. I'm familiar with that essay by Bell, but I always took his device, with the boxes and settings, to be an intuitive abstraction of the EPR experiment. I never thought of it as a serious proposal for an actual experiment.
If you look at several other papers by Bell around the same time you will see that he was very very serious about finding a three-particle atomic decay so that one of the particles could be used to signal that the other two were on their way. Remember that Pearle's detection loophole paper was 10 years earlier. Bell well understood the problem with the experiments (like Aspect's) which were starting to be done at that time, where there was no control at all of when particles got emitted / measured.
 
