# Bell's Theorem and Reality

#### DrChinese

Gold Member
Yes, the proposed model is superdeterministic. I believe this is the principal point that makes it uninteresting.

I don't think that this is a problem, because QM states do evolve deterministically. Bell also explains, in his famous 1964 paper, that hidden variables may be considered as initial values that evolve dynamically with time.
They might, true, but they don't. No variation in time is discernible in the results - and it easily would be in any entanglement test. The superposition follows a conservation rule. So there is no local time evolution, which is what the OP imagines.

#### DrChinese

Gold Member
1. Is your comment that “they never evolved from a common set of local hidden variables” misleading?

2. My understanding of the experiment so far is that it selects the results that A and B had in common, after learning at C whether the photons from A and B had some kind of correlation.

Also, in this experiment are the ZPL and PSB photons considered entangled?
1. No, definitely not.

2. An examination at C is performed, yes. However, A and B have no common light cone (in the sense that their prepared state never existed in a common light cone).

3. I don't think they are entangled, but I am not really sure about that.

#### stevendaryl

Staff Emeritus
They might, true, but they don't. No variation in time is discernible in the results - and it easily would be in any entanglement test. The superposition follows a conservation rule. So there is no local time evolution, which is what the OP imagines.
As I said, I don't really understand what the authors are saying, but a loophole in Bell's theorem that they may be exploiting is differential detection. I assume that this loophole has been closed by experimentalists, but I'm not that familiar with the research:

Suppose that associated with every twin pair is a pair $a,b$ such that:
• If Alice chooses her detector orientation to be $\pm a$ (within a certain level of accuracy), then she will deterministically get $\pm 1$ (direction $a$ produces +1 and direction $-a$ produces -1).
• If she chooses any other orientation, she will not detect a particle at all.
• If Bob chooses orientation $\pm b$, he will get $\pm 1$ deterministically.
• For any other orientation, he will fail to measure anything at all.
If you ignore mismatches where only one or the other experimenter detects a particle, then the remainder could very well violate Bell's inequality.

Such an explanation would imply an upper bound on the detection efficiency, which probably has been ruled out by experiment, but I don't know.
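The post-selection effect described above can be checked numerically. Below is a minimal sketch (my own toy construction for illustration, not taken from the paper under discussion): each pair carries a hidden "allowed" axis per side plus predetermined outcomes drawn from singlet statistics, each side detects only when its chosen setting matches its allowed axis, and the CHSH value computed on coincidences alone comes out near $2\sqrt{2}$ even though the model is fully local and deterministic:

```python
import numpy as np

rng = np.random.default_rng(0)

# CHSH settings (radians): Alice uses A[0]/A[1], Bob uses B[0]/B[1].
A = [0.0, np.pi / 2]
B = [np.pi / 4, 3 * np.pi / 4]

N = 400_000
counts = {}   # (i, j) -> (n_coincidences, sum of outcome products)

for _ in range(N):
    # Hidden variable carried by the pair: one "allowed" axis per side,
    # plus predetermined outcomes drawn from the singlet distribution
    # P(r, s | a, b) = (1 - r*s*cos(a - b)) / 4.
    i_hid, j_hid = rng.integers(2), rng.integers(2)
    c = np.cos(A[i_hid] - B[j_hid])
    outcomes = [(r, s) for r in (+1, -1) for s in (+1, -1)]
    probs = [(1 - r * s * c) / 4 for (r, s) in outcomes]
    r, s = outcomes[rng.choice(4, p=probs)]

    # Alice and Bob choose settings freely; a side only *detects*
    # when its setting happens to match the pair's allowed axis.
    i, j = rng.integers(2), rng.integers(2)
    if i == i_hid and j == j_hid:               # coincidence
        n, tot = counts.get((i, j), (0, 0))
        counts[(i, j)] = (n + 1, tot + r * s)

E = {k: tot / n for k, (n, tot) in counts.items()}
S = E[(0, 0)] - E[(0, 1)] + E[(1, 0)] + E[(1, 1)]
print(abs(S))   # ~2.83 on the post-selected sample
```

Note that the coincidence rate in this toy model is only 25%, which is exactly why such models imply an upper bound on detection efficiency: experiments with high enough efficiency (no fair-sampling assumption) rule them out.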

#### DrChinese

Gold Member
1. As I said, I don't really understand what the authors are saying, but a loophole in Bell's theorem that they may be exploiting is differential detection. I assume that this loophole has been closed by experimentalists, but I'm not that familiar with the research:

Suppose that associated with every twin pair is a pair $a,b$ such that:
• If Alice chooses her detector orientation to be $\pm a$ (within a certain level of accuracy), then she will deterministically get $\pm 1$ (direction $a$ produces +1 and direction $-a$ produces -1).
• If she chooses any other orientation, she will not detect a particle at all.
• If Bob chooses orientation $\pm b$, he will get $\pm 1$ deterministically.
• For any other orientation, he will fail to measure anything at all.
2. If you ignore mismatches where only one or the other experimenter detects a particle, then the remainder could very well violate Bell's inequality.
1. I interpret this to mean that there is no rotational symmetry in the detection efficiency. Although I have not seen that specifically mentioned, it would be pretty obvious if it occurred. Certainly the raw counts I have seen in many experiments do not suggest differences in detection related in any way to the angle settings.

2. Sure, a Bell inequality violation could occur if there was some bias in the sample of events. There have been a number of experiments (such as the one I cited) in which *all* events are counted, and fair sampling need not be assumed. That's the equivalent of saying a non-detection is counted on the side of "local realism".

#### wle

The authors themselves say explicitly in the conclusion that they are exploiting the superdeterminism loophole:
Our factuality assumption implies a common cause on the detectors and particle creation process, which is encoded in the hidden variables. This exploits the so-called freedom of choice loophole, which appears when questioning independence of the detector settings, from the hidden variables that emerge at the creation of the entangled states [19].
In the context of Bell's theorem, "freedom of choice" means the same thing as "no superdeterminism". Earlier (page 5 of the ArXiv version) they say explicitly that not only the spins of particles but also the orientations of the detectors are determined by the hidden variable $\lambda$:
So, given $\lambda \in \Lambda$ and the time of measurement, $t_{1} \in \mathbf{R}$, $$\mathcal{F}(\lambda, t_{1}) = (\mathbf{o}_{A}, \mathbf{o}_{B}) \,,$$ where $\mathbf{o}_{A}$ and $\mathbf{o}_{B}$ are the spin projection orientations of each component of the pair, and they themselves are functions of $\lambda$ and $t_{1}$, $\mathbf{o}_{A}(\lambda, t_{1})$ and $\mathbf{o}_{B}(\lambda, t_{1})$. Note that these functions are absolutely deterministic, and a direct consequence of this is the fact that the orientation of the detectors is also encoded in $\lambda$. There is no what would have happened if the detector had not been in such and such orientation? The detector will have only one true orientation, determined by all the previous conditions accessible to it. This is what a truly deterministic scenario entails. A detector in a different orientation will have different values of $\lambda$ at all earlier times.
Of course, whether it is mathematically possible to violate Bell inequalities by exploiting this loophole is a different question from whether it is a good way to understand the violations of Bell inequalities seen in experiments in practice. I didn't see any exploration of the latter in the paper.

#### morrobay

Gold Member
In the above superdeterministic model, what would the expression be for all possible permutations, given 360 settings at A and B that are encoded in λ? That would be 360² combinations for the detectors. Then how would the encoded spins in λ at A and B be incorporated?

#### wle

In the above superdeterministic model, what would the expression be for all possible permutations, given 360 settings at A and B that are encoded in λ? That would be 360² combinations for the detectors. Then how would the encoded spins in λ at A and B be incorporated?
It's not difficult to show that you can potentially "explain" any correlations by exploiting the superdeterminism loophole. For example, take $\lambda$ to be of the form $\lambda = (r, s, \boldsymbol{a}, \boldsymbol{b})$ with $r, s \in \{-1, +1\}$ and where $\boldsymbol{a}$, $\boldsymbol{b}$ are unit vectors in $\mathbb{R}^{3}$. Then set $$\begin{eqnarray} \boldsymbol{o}_{\mathrm{A}}(t_{1}, \lambda) &=& r \cdot \boldsymbol{a} \,, \\ \boldsymbol{o}_{\mathrm{B}}(t_{1}, \lambda) &=& s \cdot \boldsymbol{b} \,, \\ \rho(\lambda) &=& p(r, s | \boldsymbol{a}, \boldsymbol{b}) q(\boldsymbol{a}, \boldsymbol{b}) \,. \end{eqnarray}$$ Then you can make $q(\boldsymbol{a}, \boldsymbol{b})$ be whatever probability (or probability density) you want that measurements along the axes $\boldsymbol{a}$ and $\boldsymbol{b}$ will be chosen, and make $p(r, s | \boldsymbol{a}, \boldsymbol{b})$ whatever probability you want that the result of measuring along axes $\boldsymbol{a}$ and $\boldsymbol{b}$ will be $(r, s) \in \{++, +-, -+, --\}$.
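
As a sanity check on this construction, here is a small simulation (a sketch under my own illustrative choices: $q$ uniform over the four standard CHSH setting pairs, and $p(r, s | \boldsymbol{a}, \boldsymbol{b})$ set to the singlet probabilities). Every event is detected, the settings and the outcomes are both functions of $\lambda$, and the statistics reproduce the full quantum CHSH violation:

```python
import numpy as np

rng = np.random.default_rng(1)

# The four CHSH setting pairs (a, b), weighted uniformly by q(a, b).
A = [0.0, np.pi / 2]
B = [np.pi / 4, 3 * np.pi / 4]

N = 400_000
E = np.zeros((2, 2))
n = np.zeros((2, 2))

for _ in range(N):
    # Draw lambda = (r, s, a, b) from rho(lambda) = p(r,s|a,b) q(a,b).
    i, j = rng.integers(2), rng.integers(2)          # q: uniform over settings
    c = np.cos(A[i] - B[j])
    outcomes = [(r, s) for r in (+1, -1) for s in (+1, -1)]
    probs = [(1 - r * s * c) / 4 for (r, s) in outcomes]   # singlet p(r,s|a,b)
    r, s = outcomes[rng.choice(4, p=probs)]

    # Both the detector orientations and the outcomes are now deterministic
    # functions of lambda, and every single event is detected.
    E[i, j] += r * s
    n[i, j] += 1

E /= n
S = E[0, 0] - E[0, 1] + E[1, 0] + E[1, 1]
print(abs(S))   # ~2.83 with no undetected events
```

Of course this only shows the bookkeeping works; it says nothing about why nature's $\rho(\lambda)$ would be correlated with the experimenters' choices in just this way, which is the usual objection to superdeterminism.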

They don't have a model in any really useful sense, though, nor do they discuss the magnitude of the task that building one would be. For example, suppose I planned to do a Bell experiment tomorrow in which two humans will (rapidly) choose the measurements to be done in the course of the experiment. In order for the authors to say in advance what their function $\mathcal{F}(t_{1}, \lambda)$ was, they would need a physical theory comprehensive and detailed enough to predict not only what the spins are going to be but also human behaviour -- what measurements the humans are going to choose to do -- so that they can explain why the humans decide to do the right measurements at the right times for the experiment to show a Bell violation.

#### facenian

2. An examination at C is performed, yes. However, A and B have no common light cone (in the sense that their prepared state never existed in a common light cone).
I don't think this is correct, since A and B must share a common past light cone. That is what locality means, and that is why they share the same value of lambda.

#### DrChinese

Gold Member
I don't think this is correct, since A and B must share a common past light cone. That is what locality means, and that is why they share the same value of lambda.
Other than the fact that all experimental setups on this planet could be considered to exist in a common light cone: No, you are incorrect.

My point exactly is that the cited experiment does not have the possibility that the measurement settings could be part of any local set of hidden variables. There was never any contact between Alice and Bob (A & B). And there was no common source for the entangled particles. The entanglement occurs afterward, at C. Not before - as in many Bell tests.

Now wle is saying that the OP paper exploits the superdeterminism loophole. I'm not entirely sure about that, but there are plenty of indicators that wle is correct. The problem is that a paper touting superdeterministic ideas should be so labeled in the abstract. That it is not borders on intentional misdirection.

In fairness: I consider superdeterminism to be up there with witchcraft and conspiracy theories when it comes to considering it as science. But that is stuff for a different thread.

#### facenian

My point exactly is that the cited experiment does not have the possibility that the measurement settings could be part of any local set of hidden variables. There was never any contact between Alice and Bob (A & B).
That's right, the measurements are spacelike separated.

And there was no common source for the entangled particles. The entanglement occurs afterward, at C. Not before - as in many Bell tests.
This is not correct. You may want to check Bell's "The Theory of Local Beables". There he explains the concept of local causality and draws a nice picture (fig. 3) where the common causes responsible for the correlations are seen to lie in the overlap region of the past light cones. This is how the correlations are supposed to be locally explained by the hidden variables.

Now wle is saying that the OP paper exploits the superdeterminism loophole. I'm not entirely sure about that, but there are plenty of indicators that wle is correct.
There's no doubt @wle is correct. It is explicitly stated in the paper.

Last edited:

#### facenian

After this discussion here are two conclusions:
1. The paper is not very interesting because it relies on superdeterminism.
2. The authors make a wrong statement when saying that double-slit experiments can be interpreted as a statistical effect.

Thank you guys!

#### DrChinese

Gold Member
This is not correct. You may want to check Bell's "The Theory of Local Beables". There he explains the concept of local causality and draws a nice picture (fig. 3) where the common causes responsible for the correlations are seen to lie in the overlap region of the past light cones. This is how the correlations are supposed to be locally explained by the hidden variables.
Sorry, you are mistaken (excepting of course that all experiments on Earth can be said to exist in a common light cone).

The reason is that the experiment I cited is NOT causally constructed like most Bell tests. The cited experiment had not been performed while Bell was alive; I'm not sure it had even been conceived at that time. There have now been many experiments in which entanglement occurs without any causal contact between the entangled particles. So it makes no sense to construct a paper around an explanation of entanglement that ignores experiments such as the one I cited. The authors have basically ignored decades of experimental work.

#### DrChinese

Gold Member
After this discussion here are two conclusions:
1. The paper is not very interesting because it relies on superdeterminism.
2. The authors make a wrong statement when saying that double-slit experiments can be interpreted as a statistical effect.

Thank you guys!
1. Assuming everyone agrees the OP article is intended as a superdeterministic explanation of Bell, I would call such a paper a scientific sham. IMHO there's no science hiding there.

#### morrobay

Gold Member
Well, from this complete paper by the authors: https://arxiv.org/abs/1605.0849T3
They are aiming at a deterministic explanation for the correlations: "In our view non-local correlations emerge from a deterministic evolution of a shared hidden variable between two components of the ontological pair." And this also agrees with @A. Neumaier and his definition of extended causality:*
"Joint properties of an extended object depend only on the union of the closed past light cones of their constituent parts and can influence only the union of the closed future cones of their constituent parts."
* Can this definition of extended causality apply to spacelike separated experiments?

#### facenian

The reason is that the experiment I cited is NOT causally constructed like most Bell tests. The cited experiment had not been performed while Bell was alive; I'm not sure it had even been conceived at that time. There have now been many experiments in which entanglement occurs without any causal contact between the entangled particles. So it makes no sense to construct a paper around an explanation of entanglement that ignores experiments such as the one I cited. The authors have basically ignored decades of experimental work.
Sorry, but I'm suspicious about your assertion. I agree that the precision of tests of Bell's theorem has been increasing with time; however, you are saying that the basic concept of entanglement that Einstein and Bell were talking about has become obsolete, rendering Bell's proofs inapplicable.
Since I'm not an expert I cannot tell, but I would like to hear some other opinion on this point.

#### DrChinese

Gold Member
Sorry, but I'm suspicious about your assertion. I agree that the precision of tests of Bell's theorem has been increasing with time; however, you are saying that the basic concept of entanglement that Einstein and Bell were talking about has become obsolete, rendering Bell's proofs inapplicable.
Nothing about Bell's Theorem has changed or diminished in the slightest. If you think that's what I am saying, please reconsider. What you were quoting is something else written by Bell about a specific type of Bell test. Bell tests have advanced greatly since then to eliminate even more "loopholes". Some of those loopholes are detection/fair sampling and locality, which are simultaneously eliminated in this test. Also, and relevant to the OP's claims:

"Our experiment already excludes all models that predict that the random inputs are determined a maximum of 690 ns before we record them..."

That is in addition to the fact that Alice and Bob are not in causal contact. So there are no shared local variables related to the particles being measured. And the measurement settings themselves are determined independently by Alice and Bob a very short time before their measurements, far too short to have been communicated from one to the other.

That's quite a lot of physics for even a superdeterministic hypothesis to overcome.

#### stevendaryl

Staff Emeritus
Here's a question for Dr. Chinese, who is more familiar with these tests of entanglement. The usual way that a Bell-theorem-violating twin pair is produced is something like the figure below:

There is some kind of decay process that produces an entangled particle/antiparticle pair. (I drew it as a photon decaying into an electron/positron pair, which actually isn't possible, but maybe you could substitute a $\pi_0$ meson to make it plausible. How the particles are produced is not important for this discussion.)

Now, let's vary the experiment by producing 4 particles, as shown in this figure:

In this figure, we produce two sets of correlated pairs. The question then is: Is there something that can be done to the two particles in the box--some kind of measurement maybe--that forces the remaining two particles to be entangled? Maybe if you measure the total spin of the two particles in the box, and find them to have total spin zero, then that will force the other particles to be anti-correlated?

If an experiment like this were possible, then it would be a way to produce entangled particles that have no common past. That would seem to be an even simpler argument against local hidden variables than Bell's theorem--the correlations can't possibly be explained in terms of local hidden variables if they never met in the past to share that hidden variable.


#### DrChinese

Gold Member
And this also agrees with @A. Neumaier and his definition of Extended causality: *
"Joint properties of an extended object depend only on the union of the closed past light cones of their constituent parts and can influence only the union of the closed future cones of their constituent parts."
Entangled quantum objects need not have a common past light cone, nor a common future light cone. This can be seen by the existence of entangled pairs where the sub-components have never co-existed at any point in time.

The point to recognize here is that technology today has advanced so far that the old ways of looking at entangled pairs are outmoded. Formerly, entangled pairs arose from a common source. That provided an easy-to-picture model of entanglement which is intuitive (to some degree). Everything apparently moves forward in time:

(Future)
Alice: V :Bob
(Past)

However, entangled pairs can be created from fully independent sources as well - e.g., two lasers in different spacetime regions. Those objects are then entangled via entanglement swapping at a third spacetime point. This is done in such a way that the light cones of the swapping components are also independent. In fact, this can be done after the other particles no longer exist, just to drive the point home.

Now, if you choose to define the extended object as being the region defined by ALL of its components at all times they exist, then the definition above is fine. However, the resulting causal diagram has arrows in BOTH directions of time. It looks something like (Chris would be in the middle, not labeled below):

(Future)
Alice: W :Bob
(Past)

Having those arrows going both ways in time is a problem for most people. So define it as you see fit.

#### DrChinese

Gold Member
Here's a question for Dr. Chinese, who is more familiar with these tests of entanglement. The usual way that a Bell-theorem-violating twin pair is produced is something like the figure below:


There is some kind of decay process that produces an entangled particle/antiparticle pair. (I drew it as a photon decaying into an electron/positron pair, which actually isn't possible, but maybe you could substitute a $\pi_0$ meson to make it plausible. How the particles are produced is not important for this discussion.)

Now, let's vary the experiment by producing 4 particles, as shown in this figure:


In this figure, we produce two sets of correlated pairs. The question then is: Is there something that can be done to the two particles in the box--some kind of measurement maybe--that forces the remaining two particles to be entangled? Maybe if you measure the total spin of the two particles in the box, and find them to have total spin zero, then that will force the other particles to be anti-correlated?

If an experiment like this were possible, then it would be a way to produce entangled particles that have no common past. That would seem to be an even simpler argument against local hidden variables than Bell's theorem--the correlations can't possibly be explained in terms of local hidden variables if they never met in the past to share that hidden variable.
Ha, I just posted something like your diagram, except yours is much better...

#### DrChinese

Gold Member
Here's a question for Dr. Chinese, who is more familiar with these tests of entanglement. The usual way that a Bell-theorem-violating twin pair is produced is something like the figure below:
...

In this figure, we produce two sets of correlated pairs. The question then is: Is there something that can be done to the two particles in the box--some kind of measurement maybe--that forces the remaining two particles to be entangled? Maybe if you measure the total spin of the two particles in the box, and find them to have total spin zero, then that will force the other particles to be anti-correlated?

If an experiment like this were possible, then it would be a way to produce entangled particles that have no common past. That would seem to be an even simpler argument against local hidden variables than Bell's theorem--the correlations can't possibly be explained in terms of local hidden variables if they never met in the past to share that hidden variable.
And the answer is a big YES.

Inside the box, you perform what is referred to as a Bell State Measurement (BSM). Some of these project Alice and Bob into an entangled state. When that happens, Alice and Bob are perfectly correlated* AS IF they were the pair originally entangled. That is called entanglement swapping.

https://arxiv.org/abs/quant-ph/0201134
https://arxiv.org/abs/0809.3991

So this is what I have been referring to in some of my prior posts. Of course, "conspiracy" theories will evolve even under these scientific challenges. However, the resulting (super)deterministic quantum theories become (laughably) untenable in these scenarios. If someone were to postulate a specific way they worked, it could easily be falsified.

*Or anti-correlated, depending on the result of the BSM. Note that the BSM results are themselves random.

Last edited:

#### stevendaryl

Staff Emeritus
And the answer is a big YES.

Inside the box, you perform what is referred to as a Bell State Measurement (BSM). Some of these project Alice and Bob into an entangled state. When that happens, Alice and Bob are perfectly correlated* AS IF they were the pair originally entangled. That is called entanglement swapping.

https://arxiv.org/abs/quant-ph/0201134
https://arxiv.org/abs/0809.3991

*Or anti-correlated, depending on the result of the BSM. Note that the BSM results are themselves random.
Good. Then as I said, in some sense, it seems that Bell's inequality is almost unnecessary to demonstrate the impossibility of a local hidden-variables theory that could explain the correlations, since there is no way for the entangled particles to acquire a common hidden variable.

#### rubi

The question then is: Is there something that can be done to the two particles in the box--some kind of measurement maybe--that forces the remaining two particles to be entangled?
The Hilbert space of the full system is of the form $\mathcal H = \mathcal H_1 \otimes \mathcal H_2 \otimes \mathcal H_3 \otimes \mathcal H_4$. It can be proven mathematically that if the observables of each particle commute in spacelike separated regions (which is true in the standard model), then the reduced density matrix on $\mathcal H_1 \otimes \mathcal H_4$ is completely independent of what you do to the particles in the box ($\mathcal H_2 \otimes \mathcal H_3$). It's an easy undergrad exercise. So no physical interaction in the box can physically entangle particles 1 & 4. However, what you can do is produce entangled statistics by post-selecting events based on information obtained from particles 2 & 3. This is what's called entanglement swapping. It's not a physical process, but rather an operation applied to measurement data after it has been recorded.
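
Both halves of this statement can be verified in a few lines of linear algebra. Here is a minimal numpy sketch (my own illustrative setup: qubit ordering 1-2-3-4, with pairs (1,2) and (3,4) each prepared in $|\Phi^+\rangle$): (a) an arbitrary unitary on qubits 2 & 3 leaves the reduced state of 1 & 4 maximally mixed, while (b) post-selecting on a $|\Phi^+\rangle$ outcome for 2 & 3 leaves 1 & 4 in $|\Phi^+\rangle$ with probability 1/4:

```python
import numpy as np

bell = np.array([[1, 0], [0, 1]]) / np.sqrt(2)   # |Phi+> as a 2x2 amplitude array

# Two independent pairs: qubits (1,2) and (3,4); psi[i1, i2, i3, i4].
psi = np.einsum('am,nb->amnb', bell, bell)

def rho_14(state):
    """Reduced density matrix of qubits 1 & 4 (trace out 2 & 3)."""
    return np.einsum('amni,bmnj->aibj', state, state.conj()).reshape(4, 4)

# (a) Any unitary applied inside the "box" (qubits 2 & 3) leaves rho_14 unchanged.
rng = np.random.default_rng(2)
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(M)                            # random 4x4 unitary
psi_u = np.einsum('MNmn,amnb->aMNb', U.reshape(2, 2, 2, 2), psi)
print(np.allclose(rho_14(psi_u), np.eye(4) / 4))  # True: still maximally mixed

# (b) But post-selecting on a Bell-state outcome for 2 & 3 leaves 1 & 4
#     in a Bell state: entanglement swapping as a conditional statement.
amp = np.einsum('mn,amnb->ab', bell.conj(), psi)  # project 2,3 onto |Phi+>
prob = np.sum(np.abs(amp) ** 2)                   # success probability = 1/4
psi_14 = amp / np.sqrt(prob)
fidelity = np.abs(np.einsum('ab,ab->', bell.conj(), psi_14)) ** 2
print(prob, fidelity)                             # ~0.25, ~1.0
```

The unconditional state of 1 & 4 never changes, no matter what happens in the box; only the subensemble selected on the 2 & 3 outcome is entangled.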

#### DrChinese

Gold Member
So no physical interaction in the box can physically entangle particles 1 & 4. However, what you can do is produce entangled statistics by post-selecting events based on information obtained from particles 2 & 3. This is what's called entanglement swapping. It's not a physical process, but rather an operation applied to measurement data after it has been recorded.
I would dispute such a characterization. No one can be sure that the Bell State Measurement of 2 & 3 does not cause a change to 1 & 4. The experiment must be viewed as a total context, and identifying the causal agents when there is quantum nonlocality is not really possible. This is the same kind of issue as you have with a quantum eraser. Does the eraser cause a nonlocal change?

My point is not to start an argument about this, but to say that causality cannot be identified clearly unless you start with a specific interpretation. And we know where interpretations will take us...

#### stevendaryl

Staff Emeritus
I would absolutely dispute such a characterization. No one can be sure that the Bell State Measurement of 2 & 3 does not cause a change to 1 & 4.
But depending on the spacetime separation, the measurement of 2 & 3 can be spacelike separated from the measurements of 1 & 4. So I can't see how there could be a local hidden-variables explanation.

The experiment must be viewed as a total context, and identifying the causal agents when there is quantum nonlocality is not really possible. This is same kind of issue as you have with a quantum eraser. Does the eraser cause a non-local change?
I'm only saying that the experiment seems incompatible with a local realistic explanation.

#### DrChinese

Gold Member
I'm only saying that the experiment seems incompatible with a local realistic explanation.
Oh, I agree completely. These experiments are even stronger arguments against local realistic explanations than the original Bell types.

And just to be clear: when I say that the Bell State Measurement of 2 & 3 projects 1 & 4 into an entangled state, I don't mean that it is doing so in local realistic terms. This is strictly a quantum nonlocal description, and it is the usual description given by the authors of these papers. Note: you could also say that the measurement of 1 & 4 casts 2 & 3 into a Bell State.

Last edited:

"Bell's Theorem and Reality"

### Physics Forums Values

We Value Quality
• Topics based on mainstream science
• Proper English grammar and spelling
We Value Civility
• Positive and compassionate attitudes
• Patience while debating
We Value Productivity
• Disciplined to remain on-topic
• Recognition of own weaknesses
• Solo and co-op problem solving