Photon entanglement and fair sampling assumption

Summary:
The discussion centers on the validity of the fair sampling assumption in photon entanglement experiments, which is crucial for applying Bell's inequalities. It argues that since only a portion of emitted photons is detected, the assumption that the detected sample is representative may not hold, raising doubts about claims of nonlocality and Bell's inequality violations. Three proposed experiments aim to challenge this assumption: one examines correlations in three-photon entanglement, another tests the superposition of wavefunctions post-polarization, and the third investigates how detection efficiency impacts coincidence rates. The conversation highlights the need for rigorous testing against systematic errors and questions why such discussions are infrequent in the field. Overall, the validity of the fair sampling assumption remains a contentious issue that warrants further exploration.
  • #61
zonde said:
1. No. Otherwise we are not discussing unfair sampling.


2. So in the case of Type I PDC, if we talk about polarization entanglement, there have to be some more details about how the setup turns the produced (polarization non-entangled) photons into polarization-entangled photons.

1. You can have an unfair sample (of the universe of photon pairs), but it still must be realistic! There must be 3 simultaneous values for Alice at 0, 45, and 67.5 degrees. Otherwise you are just saying it is a realistic model when it isn't. That is the point of Bell (see the sketch below).

2. Yes, it is difficult to model "realistically". (Pilot wave theorists don't think so, but it is.)
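
To make the realism requirement in point 1 concrete, here is a minimal sketch in C# (my own illustration with hypothetical names, not code from any model in this thread): a realistic model must assign a definite outcome at each of Alice's three angles to every pair; unfair sampling may hide a pair from detection, but it cannot un-define those outcomes.

Code:
    // Hypothetical illustration of the realism requirement: every pair carries
    // predetermined outcomes at all three of Alice's angles (0, 45, 67.5 deg),
    // whether or not a given angle is actually measured on a given run.
    struct RealisticPair
    {
        public bool AliceAt0;     // outcome Alice would get at 0 deg
        public bool AliceAt45;    // outcome Alice would get at 45 deg
        public bool AliceAt67_5;  // outcome Alice would get at 67.5 deg
        public bool Detected;     // unfair sampling may set this to false,
                                  // but the three outcomes above must still exist
    }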
 
  • #62
DrChinese said:
My point is that local realists struggle to prove Bell/Aspect wrong, failing to realize that their hypothesis is elsewhere contradicted. That is why the Bell Theorem states that no LHV theory can reproduce ALL of the predictions of quantum theory.

I tend to agree with this wording. However, I am not sure this is bad "news" for LHV and local realists. Without taking sides with local realists or against them here, I tend to think this is actually great "news" for them. The reasoning is as follows (a part of it was offered by nightlight).

1. Predictions of quantum theory include both unitary evolution and the projection postulate.

2. To prove the Bell theorem, one needs both unitary evolution and the projection postulate.

3. Strictly speaking, unitary evolution and the projection postulate directly contradict each other.

4. Inability of LHV theories to reproduce contradictory results is good for local realists.

As some of these points are not obvious, let me explain.

1. This statement seems obvious as far as unitary evolution is concerned. If you disagree that the projection postulate is also a prediction of quantum theory, please advise (I admit that this is not an obvious statement, as it depends on the interpretation of quantum theory. What is important for me, however, is that this postulate or something similar is required to prove the Bell theorem - see below).

2. One needs unitary evolution when one assumes that the spin projection on any axis is conserved. One needs the projection postulate to prove that quantum theory violates the Bell inequalities (it is used to compute the correlations in quantum theory; a worked version of this computation follows the list below).

3. Indeed, the projection postulate necessitates irreversibility, and, strictly speaking, unitary evolution does not allow any irreversibility (let me mention, e.g., the quantum recurrence theorem, Phys. Rev. 107, 337-338 (1957)), so a particle, strictly speaking, does not stay in the eigenstate after measurement (if it was in a superposition before the measurement).

4. This seems obvious.
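
For reference, here is the standard textbook computation behind point 2, written out for a polarization-entangled pair (this is my summary of the usual calculation, not anything specific to this thread); the projection postulate is what licenses the step from the joint state to the joint detection probabilities:

\[
|\psi\rangle = \tfrac{1}{\sqrt{2}}\big(|HH\rangle + |VV\rangle\big), \qquad
P_{++}(a,b) = \big|\langle a,b\,|\,\psi\rangle\big|^2 = \tfrac{1}{2}\cos^2(a-b),
\]
\[
E(a,b) = P_{++} + P_{--} - P_{+-} - P_{-+} = \cos\big(2(a-b)\big),
\]

which is the familiar cos^2(theta) rule for coincidence rates, and which exceeds the CHSH bound |S| <= 2 at suitable angle settings.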
DrChinese said:
Once you understand the full implications of the requirement, it becomes a much larger issue to overcome. That is why Santos, Hess and others have failed, because they have stumbled in postulating a full and consistent LHV hypothesis that actually leads to the predictions of QM.

As I said, maybe it’s good for them that they failed. Interestingly, in a recent article (http://arxiv.org/PS_cache/arxiv/pdf/0912/0912.4098v1.pdf) Santos argues that “the usual postulates of quantum are too strong”. Again, I am not taking sides with Santos or against him here. I believe, however, that, on the one hand, the proof of the Bell theorem uses mutually contradictory assumptions, while, on the other hand, so far no experiment has demonstrated violations of the Bell inequalities without some dubious additional assumptions, such as “fair sampling”. So I am not sure there are sufficient theoretical or experimental arguments proving that “local hidden variable theories are not tenable.”
 
  • #63
akhmeteli said:
As I said, maybe it’s good for them that they failed. Interestingly, in a recent article (http://arxiv.org/PS_cache/arxiv/pdf/0912/0912.4098v1.pdf) Santos argues that “the usual postulates of quantum are too strong”. Again, I am not taking sides with Santos or against him here. I believe, however, that, on the one hand, the proof of the Bell theorem uses mutually contradictory assumptions, while, on the other hand, so far no experiment has demonstrated violations of the Bell inequalities without some dubious additional assumptions, such as “fair sampling”. So I am not sure there are sufficient theoretical or experimental arguments proving that “local hidden variable theories are not tenable.”

That is an "interesting" perspective, since you are basically saying failure is good. :smile:

The problem with the LR perspective is that they do not work against the opposition's strongest arguments, they seek the weakest to challenge. I consider fair sampling to be one of the worst possible attacks as the hypothesis is born out of LR anger and frustration and little else. As I have said before, virtually every scientific experiment relies on the fair sampling assumption and there is nothing special about it with respect to a Bell test.

On the other hand, the opposition (which is of course the mainstream) consistently challenges itself at the highest level. For example, there are new and improved Bell tests every year. Entanglement is being sought - and discovered - in new and unusual places. Meanwhile, LRists basically deny the existence of entanglement (since they say coincidences are predetermined and not a result of an ongoing state).

So while the LR camp is grasping at straws (that's how it appears to me), I have read papers finding entanglement under every conceivable rock - including entanglement of particles that are outside of each other's light cones! And as predicted by QM.

As to Bell using mutually contradictory assumptions: all Bell is saying is that LR predictions can never match QM. If you think QM itself is based on mutually contradictory assumptions (which some claim is the case), that is NOT equivalent to saying Bell itself is based on those assumptions. If QM is shown to be experimentally wrong tomorrow, then so be it. But the predictions of QM are still the predictions of QM, and I don't know anyone who sees any confusion (or contradiction) in the cos^2(theta) rule.

But you are missing a truly important point of Bell: At the time it was introduced, it was widely believed that a local realistic version (a la Einstein's perspective) was tenable. Entanglement had never been witnessed! So maybe QM was wrong. But 45 years later, the story has not played out as Einstein might have imagined.

As to Santos suggesting that QM should be relaxed: yes, I saw that paper and laughed. I mean, who is he kidding? Hey, let's change the rules so Santos can convince himself LR is viable and he is right in the end. It's science, where's the beef Santos? I would love to see Santos stick with a theory for an entire year and use it to discover new sources of entanglement that were previously unknown. That would impress me.

In the meantime, there are numerous NEW theorems that are fully independent of Bell but which ALSO rule out the LR position. Examples are GHZ, Leggett, etc. and guess what: they don't rely on the "unfair" sampling assumption. So the LR position is being left in the dust as science advances. So I guess I am disagreeing with your assessment. LR is not tenable and the evidence is getting stronger, not weaker.
 
  • #64
DrChinese.
Thank you very much for a prompt and detailed reply. Let me try to comment.
DrChinese said:
That is an "interesting" perspective, since you are basically saying failure is good. :smile:
I am not just saying that failure is good in this case, I am also saying why: because “success” would be fatal for the potential “successful” theory. Indeed, if your theory has two contradictory conclusions, or assumptions, that means the theory is, strictly speaking, wrong. By the way, for this very reason quantum theory, in the specific form used to prove the Bell theorem, is, strictly speaking, wrong. Mathematically wrong. It does contain two contradictory assumptions. One of these assumptions must be wrong – logic does not allow any other conclusion. Specifically, I believe that unitary evolution (UE) is right, and the projection postulate (PP) is, strictly speaking, wrong. This is just my opinion, so you may agree or disagree, but you just cannot have both UE and PP, for the simple reason that they contradict each other, and you don’t seem to dispute that. If you do, please advise. In the following I won’t repeat this caveat and will assume that it is PP that is wrong. PP may be a good approximation, it may be a very good approximation, it may be an excellent approximation, it may be an amazingly great approximation, but the bottom line is it’s just an approximation. It just cannot be precise, because if it is, then UE has its share of problems.
DrChinese said:
The problem with the LR perspective is that they do not work against the opposition's strongest arguments, they seek the weakest to challenge.
Maybe I don’t quite understand you, or my English fails me, but I don’t quite see what is wrong about going against the weakest argument of the opponent. I would think in any contest the opponent’s weakest point is fair game. Furthermore, we are not in a court room, I think we both are just trying to understand something better, so I would think we should just agree with each other’s strongest argument, rather than waste time refusing to concede what we believe is actually correct in the opponent’s arguments.
DrChinese said:
I consider fair sampling to be one of the worst possible attacks as the hypothesis is born out of LR anger and frustration and little else. As I have said before, virtually every scientific experiment relies on the fair sampling assumption and there is nothing special about it with respect to a Bell test.
I don’t quite get it. Such people as Shimony and Zeilinger, who are no fans of LR, admit that the “detection loophole” (and, consequently, the fair sampling assumption) presents a serious problem (see the relevant quotes at https://www.physicsforums.com/showpost.php?p=1702189&postcount=13 and https://www.physicsforums.com/showpost.php?p=1705826&postcount=65 ). Do you really believe we should accept the fair sampling assumption without discussion? You yourself gave an example where this assumption may be less than obvious – “An example would be celestial objects used as "standard candles".” I guess the following reasoning by Santos makes some sense: “In the context of LHV theories the fair sampling assumption is, simply, absurd. In fact, the starting point of any hidden variables theory is the hypothesis that quantum mechanics is not complete, which essentially means that states which are considered identical in quantum theory may not be really identical. For instance if two atoms, whose excited states are represented by the same wave-function, decay at different times, in quantum mechanics this fact may be attributed to an “essential indeterminacy”, meaning that identical causes (identical atoms) may produce different effects (different decay times). In contrast, the aim of introducing hidden variables would be to explain the different effects as due to the atomic states not being really identical, only our information (encapsuled in the wave-function) being the same for both atoms. That is, the essential purpose of hidden variables is to attribute differences to states which quantum mechanics may consider identical. Therefore it is absurd to use the fair sampling assumption -which rests upon the identity of all photon pairs- in the test of LHV theories, because that assumption excludes hidden variables a priori.”

DrChinese said:
On the other hand, the opposition (which is of course the mainstream) consistently challenge themselves at the highest level. For example, there are new and improved Bell tests every year.
I agree, there are “new and improved Bell tests every year”. However, so far the result is always the same: no violation of the genuine Bell inequalities. For some reason there is always something: either the detection loophole or the locality loophole, you name it. 45 years and counting – no violations. That reminds me of the following words from Heller’s “Catch-22”:
"I've got just the twelve-year-old virgin you're looking for," he announced jubilantly. "This twelve-year-old virgin is really only thirty-four, but she was brought up on a low-protein diet by very strict parents and didn't start sleeping with men until"

This is the same stuff that we hear about the Bell inequalities violations (BIV): “Yeah, we demonstrated violations, they are as good as genuine ones, even better. Detection loophole? Oh, come on, you’re nit-picking. Locality loophole? Oh, come on, you’re hair-splitting”.

You believe that BIV have been demonstrated to your satisfaction? I fail to see any such demonstrations, sorry.
DrChinese said:
Entanglement is being sought - and discovered - in new and usual places. On the other hand, LRists basically deny the existence of entanglement (since they say coincidences are predetermined and not a result of an ongoing state).

So while the LR camp is grasping at straws (that's how it appears to me), I have read papers finding entanglement under every conceivable rock - including entanglement of particles that are outside of each other's light cones! And as predicted by QM.
I don’t know, I fail to see how entanglement can eliminate LR, as existence of entanglement is not enough to prove the Bell theorem. You need the projection postulate. You are a knowledgeable person, so I am sure you appreciate that “entanglement of particles that are outside of each other's light cones” per se does not eliminate LR. In general, the only thing that could be fatal to LR is genuine BIV (that is, if we forget about superdeterminism). So far genuine BIV have not been demonstrated, and I don’t hold my breath.
DrChinese said:
As to Bell using mutually contradictory assumptions: all Bell is saying is that LR predictions can never match QM. If you think QM itself is based on mutually contradictory assumptions (which some claim is the case), that is NOT equivalent to saying Bell itself is based on those assumptions. If QM is shown to be experimentally wrong tomorrow, then so be it. But the predictions of QM are still the predictions of QM, and I don't know anyone who sees any confusion (or contradiction) in the cos^2(theta) rule.
I don’t get it. I specifically indicated the two mutually contradictory assumptions that are both predictions of QM and necessary to prove the Bell theorem. So while I could agree that “If you think QM itself is based on mutually contradictory assumptions (which some claim is the case), that is NOT equivalent to saying Bell itself is based on those assumptions.”, this is not relevant, because the proof of the Bell theorem is indeed based on two mutually contradictory assumptions, and I specifically indicated that, showing where the proof uses UE and PP. As for the cos^2(theta) rule, when you use it for both particles of the singlet, I believe you need the projection postulate (to count the QM correlations), and PP directly contradicts UE.
DrChinese said:
But you are missing a truly important point of Bell: At the time it was introduced, it was widely believed that a local realistic version (a la Einstein's perspective) was tenable.
I don’t know. My impression was that the Copenhagen interpretation’s grip on physics was much stronger then than now. But I may be mistaken.
DrChinese said:
Entanglement had never been witnessed! So maybe QM was wrong. But 45 years later, the story has not played out as Einstein might have imagined.
Again, entanglement does not eliminate LR. And Einstein is no relative of mine. It is my understanding he opposed the uncertainty principle. So he was wrong on this issue (at least I believe so). But the uncertainty principle per se does not eliminate LR either. On the other hand, Einstein’s EPR paper led to significant progress.
DrChinese said:
As to Santos suggesting that QM should be relaxed: yes, I saw that paper and laughed. I mean, who is he kidding? Hey, let's change the rules so Santos can convince himself LR is viable and he is right in the end. It's science, where's the beef Santos? I would love to see Santos stick with a theory for an entire year and use it to discover new sources of entanglement that were previously unknown. That would impress me.
Neither is Santos any relative of mine:-) I just mentioned his paper as an example where a local realist appreciates that he cannot and does not need to emulate all predictions of QM.
DrChinese said:
In the meantime, there are numerous NEW theorems that are fully independent of Bell but which ALSO rule out the LR position.
Are they independent of such things as PP?
DrChinese said:
Examples are GHZ, Leggett, etc. and guess what: they don't rely on the "unfair" sampling assumption.
I don’t quite get it. Neither does the standard Bell theorem rely on the “fair” or “unfair” sampling assumption. FS is used to interpret experimental results as violating the Bell inequalities. I readily admit that I don’t know much about GHZ, Leggett etc., but I suspect they basically have the same problems as the Bell theorem. For example, I have not heard anybody state that they were successfully used to conduct loophole-free experiments eliminating LR.
DrChinese said:
So the LR position is being left in the dust as science advances. So I guess I am disagreeing with your assessment. LR is not tenable and the evidence is getting stronger, not weaker.
My assessment is there are neither no-go theorems nor experimental data eliminating LR. But I certainly respect your point of view.
 
  • #65
Y'know, this is a complaint that I've never understood, because we have arrived at solid conclusions based on flimsier evidence than this. Let's examine 2 classes of the Bell-type experiments.

1. Bell-violating experiments using light.

Now, everyone agrees that there has been a plethora of experiments (and let's be honest here, there have been PLENTY OF THEM, with NOT ONE SINGLE piece of evidence pointing to the contrary) showing violation of the Bell-type inequalities, even for multipartite systems. ALL of these experiments closed the locality loophole. However, the biggest complaint here is the detection loophole, and somehow there are still people not convinced by the results of the experiments (even though ALL of them give the IDENTICAL conclusion).

2. Bell-violating experiments using matter.

These can be done using charge carriers, or even qubits (see, for example, M. Ansmann et al., Nature v.461, p.504 (2009)). There has been ZERO question that ALL of these experiments closed the detection loophole - you can detect the particles one at a time without any need for a fair-sampling treatment. The loophole that these experiments can't close right now is the locality loophole, since these are experiments done on a very small scale, although there are indications that, using the technique of Ansmann et al., the system might be robust enough to extend to a large length scale and close this loophole as well.

So what do we have here? We have a set of tests of a single principle, in which the tests are conducted in various different manners, coming from very different angles, and testing different aspects of it. It is an AMAZING FACT that ALL of them produce a consistent result! This fact seems to be severely overlooked! I mean, think about it for a second! It is astounding that each of these experiments that close each of the different loopholes produces the SAME, IDENTICAL result, and not only that, but the result has such HIGH CONFIDENCE (the Ansmann et al. experiment, for example, produced a result that exceeded 244 standard deviations)! It's not even funny!

I can understand if there were some indications from some experiment somewhere that a test has produced something to the contrary. The FACT that even this doesn't exist, and yet there are people here who are somehow CONVINCED, for some odd reason, that this whole thing is "wrong" (which is a very strong word) - now THAT is utterly baffling.

Zz.
 
  • #66
ZapperZ said:
Y'know, this is a complaint that I've never understood, because we have arrived at solid conclusions based on flimsier evidence than this. Let's examine 2 classes of the Bell-type experiments.

1. Bell-violating experiments using light.

Now, everyone agrees that there has been a plethora of experiments (and let's be honest here, there have been PLENTY OF THEM, with NOT ONE SINGLE piece of evidence pointing to the contrary) showing violation of the Bell-type inequalities, even for multipartite systems. ALL of these experiments closed the locality loophole. However, the biggest complaint here is the detection loophole, and somehow there are still people not convinced by the results of the experiments (even though ALL of them give the IDENTICAL conclusion).

2. Bell-violating experiments using matter.

These can be done using charge carriers, or even qubits (see, for example, M. Ansmann et al., Nature v.461, p.504 (2009)). There has been ZERO question that ALL of these experiments closed the detection loophole - you can detect the particles one at a time without any need for a fair-sampling treatment. The loophole that these experiments can't close right now is the locality loophole, since these are experiments done on a very small scale, although there are indications that, using the technique of Ansmann et al., the system might be robust enough to extend to a large length scale and close this loophole as well.

So what do we have here? We have a set of tests of a single principle, in which the tests are conducted in various different manners, coming from very different angles, and testing different aspects of it. It is an AMAZING FACT that ALL of them produce a consistent result! This fact seems to be severely overlooked! I mean, think about it for a second! It is astounding that each of these experiments that close each of the different loopholes produces the SAME, IDENTICAL result, and not only that, but the result has such HIGH CONFIDENCE (the Ansmann et al. experiment, for example, produced a result that exceeded 244 standard deviations)! It's not even funny!
I am trying hard to understand how your reasoning is better than the following:
Euclidean geometry on a plane is wrong because it proves that the sum of the angles of a triangle is 180 degrees. Experiment shows, however, that this is wrong 1) for quadrangles on a plane and 2) for triangles on a sphere.
Sorry, I just cannot understand how this is different from what you want me to accept. The Bell theorem states that LHV theories cannot violate some inequalities under some assumptions. All you’re telling me is that experiment demonstrates violations when these assumptions are not satisfied. ZapperZ, I do sincerely respect you for your knowledge and patience, so it is with great regret that I have to say that I’m less than impressed.
ZapperZ said:
I can understand if there were some indications from some experiment somewhere that a test has produced something to the contrary. The FACT that even this doesn't exist, and yet there are people here who are somehow CONVINCED, for some odd reason, that this whole thing is "wrong" (which is a very strong word) - now THAT is utterly baffling.

Zz.
I tried to explain why unitary evolution contradicts the projection postulate. I used purely mathematical arguments. For some reason, you don’t challenge the specific reasoning I used. If you do challenge it, please advise. So on a purely mathematical level these predictions of standard quantum mechanics contradict each other. Therefore, strictly speaking, one of them is wrong. Yes, this is a strong word, but I am afraid you’re trying to kill the messenger again. I did not invent unitary evolution. I did not invent the projection postulate. It’s not my fault that they contradict each other. Even if I die of West Nile fever tomorrow :-), they won’t stop contradicting each other.
 
  • #67
akhmeteli said:
I am trying hard to understand how your reasoning is better than the following:
Euclidean geometry on a plane is wrong because it proves that the sum of the angles of a triangle is 180 degrees. Experiment shows, however, that this is wrong 1) for quadrangles on a plane and 2) for triangles on a sphere.
Sorry, I just cannot understand how this is different from what you want me to accept. The Bell theorem states that LHV theories cannot violate some inequalities under some assumptions. All you’re telling me is that experiment demonstrates violations when these assumptions are not satisfied. ZapperZ, I do sincerely respect you for your knowledge and patience, so it is with great regret that I have to say that I’m less than impressed.

I tried to explain why unitary evolution contradicts the projection postulate. I used purely mathematical arguments. For some reason, you don’t challenge the specific reasoning I used. If you do challenge it, please advise. So on a purely mathematical level these predictions of standard quantum mechanics contradict each other. Therefore, strictly speaking, one of them is wrong. Yes, this is a strong word, but I am afraid you’re trying to kill the messenger again. I did not invent unitary evolution. I did not invent the projection postulate. It’s not my fault that they contradict each other. Even if I die of West Nile fever tomorrow :-), they won’t stop contradicting each other.

I was addressing your complaint regarding the loopholes, as in the detection loopholes.

If you think there are logical inconsistencies in the Bell theorem itself, then I would love to see you stick your neck out and publish that. Complaining about it on here does no one any good, does it?

Zz.
 
  • #68
ZapperZ said:
I was addressing your complaint regarding the loopholes, as in the detection loopholes.
Yes, but you also did something else. You reproached me for the strong word “wrong”. I used this word for the assumptions of the Bell theorem only, so I assumed you challenged that part of my post as well.
ZapperZ said:
If you think there are logical inconsistencies in the Bell theorem itself, then I would love to see you stick your neck out and publish that. Complaining about it on here does no one any good, does it?

Zz.
I am not sure I quite understand that. I don’t see what I can publish – I am not sure I said anything original. The assumptions of the Bell theorem are well-known. The problem of measurement in QM is well-known. The results of the experiments on the Bell inequalities are well-known and are not a matter of dispute – only their interpretation may be controversial. I did not present any independent research, just summarized some pretty well-known results. You don’t seem to dispute the factual aspects of my posts, only my interpretation.
As for my posts doing or not doing any good… I don’t know. I can imagine they do not do you any good, as you know all of this without me. However, we are not the only people on this forum, and I hope some of them may find my posts more useful than you do. You see, people keep saying in this forum that the Bell theorem and the relevant experiments rule out local realism. I present some arguments trying to explain that the situation is somewhat more complex. I am not sure that is just an unwanted distraction for participants of the forum. If, however, you, as a mentor, are telling me to keep my opinions to myself… Well, it’s certainly your right, you are the boss.
 
  • #69
DrChinese said:
1. You can have an unfair sample (of the universe of photon pairs), but it still must be realistic! There must be 3 simultaneous values for Alice at 0, 45, and 67.5 degrees. Otherwise you are just saying it is a realistic model when it isn't. That is the point of Bell.
There are of course 3 simultaneous values for Alice at 0, 45, 67.5 - they are calculated independently for Alice and Bob. But it does not mean that all pairs are detected at 0 deg.

Let me illustrate this. We have a photon pair that has the same POL value, but it is off by 45 deg from the polarizers of Alice and Bob. Depending on the PH value, the photons are detected or not. But the PH value for the photons in a pair is different (according to the model), so depending on the PH values of the photons, both of them can be detected, or only one photon from the pair can be detected (no coincidence), or both photons can be undetected (this case cannot result in a detected coincidence if we manipulate only Bob's polarizer or only Alice's polarizer).
Let's say we detected Bob's photon but not Alice's. Now we turn Alice's polarizer by 45 deg, and sure enough, now we detect Alice's photon, and we have a coincidence that didn't show up in the 0 deg measurement.

So, according to the model, you don't detect all the relevant pairs (for possible 45 and 67.5 coincidences) at 0 deg.
 
  • #70
ZapperZ said:
1. Bell-violating experiments using light.

Now, everyone agrees that there has been a plethora of experiments (and let's be honest here, there have been PLENTY OF THEM, with NOT ONE SINGLE piece of evidence pointing to the contrary) showing violation of the Bell-type inequalities, even for multipartite systems. ALL of these experiments closed the locality loophole. However, the biggest complaint here is the detection loophole, and somehow there are still people not convinced by the results of the experiments (even though ALL of them give the IDENTICAL conclusion).
Yes, that is something that puzzles me about it. So I was looking for what is common to all these experiments.
And you know, I think I know one thing that is common to them: you have to keep the coincidence detection rate as low as reasonably possible for the minimum-correlation settings.
That is reasonable, because this is an indicator of how pure the entanglement is. Isn't it so?
So the question is whether LHV models can be constructed in which local realism is restored when quasi-decoherence takes place but is taken away by unintentionally biased settings. And I just gave one such model.
 
  • #71
zonde said:
There are of course 3 simultaneous values for Alice at 0, 45, 67.5 - they are calculated independently for Alice and Bob. But it does not mean that all pairs are detected at 0 deg.

Let me illustrate this. We have a photon pair that has the same POL value, but it is off by 45 deg from the polarizers of Alice and Bob. Depending on the PH value, the photons are detected or not. But the PH value for the photons in a pair is different (according to the model), so depending on the PH values of the photons, both of them can be detected, or only one photon from the pair can be detected (no coincidence), or both photons can be undetected (this case cannot result in a detected coincidence if we manipulate only Bob's polarizer or only Alice's polarizer).
Let's say we detected Bob's photon but not Alice's. Now we turn Alice's polarizer by 45 deg, and sure enough, now we detect Alice's photon, and we have a coincidence that didn't show up in the 0 deg measurement.

So, according to the model, you don't detect all the relevant pairs (for possible 45 and 67.5 coincidences) at 0 deg.

I am OK with you not detecting all of the relevant pairs (because you have a subset). But for the subset of the ones you DO detect, you should be able to see the values for all 3 angles. That is the essence of realism.
 
  • #72
akhmeteli said:
DrChinese.
Thank you very much for a prompt and detailed reply. Let me try to comment.

1. Indeed, if your theory has two contradictory conclusions, or assumptions, that means the theory is, strictly speaking, wrong. By the way, for this very reason quantum theory, in the specific form used to prove the Bell theorem, is, strictly speaking, wrong. Mathematically wrong. It does contain two contradictory assumptions. One of these assumptions must be wrong – logic does not allow any other conclusion. Specifically, I believe that unitary evolution (UE) is right, and the projection postulate (PP) is, strictly speaking, wrong. This is just my opinion, so you may agree or disagree, but you just cannot have both UE and PP, for the simple reason that they contradict each other, and you don’t seem to dispute that. If you do, please advise. In the following I won’t repeat this caveat and will assume that it is PP that is wrong. PP may be a good approximation, it may be a very good approximation, it may be an excellent approximation, it may be an amazingly great approximation, but the bottom line is it’s just an approximation. It just cannot be precise, because if it is, then UE has its share of problems.

2. Maybe I don’t quite understand you, or my English fails me, but I don’t quite see what is wrong about going against the weakest argument of the opponent. I would think in any contest the opponent’s weakest point is fair game. Furthermore, we are not in a court room, I think we both are just trying to understand something better, so I would think we should just agree with each other’s strongest argument, rather than waste time refusing to concede what we believe is actually correct in the opponent’s arguments.

3. I don’t quite get it. Such people as Shimony and Zeilinger, who are no fans of LR, admit that the “detection loophole” (and, consequently, the fair sampling assumption) presents a serious problem (see the relevant quotes at https://www.physicsforums.com/showpost.php?p=1702189&postcount=13 and https://www.physicsforums.com/showpost.php?p=1705826&postcount=65 ). Do you really believe we should accept the fair sampling assumption without discussion? You yourself gave an example where this assumption may be less than obvious – “An example would be celestial objects used as "standard candles".” I guess the following reasoning by Santos makes some sense: “In the context of LHV theories the fair sampling assumption is, simply, absurd. In fact, the starting point of any hidden variables theory is the hypothesis that quantum mechanics is not complete, which essentially means that states which are considered identical in quantum theory may not be really identical. For instance if two atoms, whose excited states are represented by the same wave-function, decay at different times, in quantum mechanics this fact may be attributed to an “essential indeterminacy”, meaning that identical causes (identical atoms) may produce different effects (different decay times). In contrast, the aim of introducing hidden variables would be to explain the different effects as due to the atomic states not being really identical, only our information (encapsuled in the wave-function) being the same for both atoms. That is, the essential purpose of hidden variables is to attribute differences to states which quantum mechanics may consider identical. Therefore it is absurd to use the fair sampling assumption -which rests upon the identity of all photon pairs- in the test of LHV theories, because that assumption excludes hidden variables a priori.”

4. I agree, there are “new and improved Bell tests every year”. However, so far the result is always the same: no violation of the genuine Bell inequalities. For some reason there is always something: either the detection loophole or the locality loophole, you name it. 45 years and counting – no violations. That reminds me of the following words from Heller’s “Catch-22”:
"I've got just the twelve-year-old virgin you're looking for," he announced jubilantly. "This twelve-year-old virgin is really only thirty-four, but she was brought up on a low-protein diet by very strict parents and didn't start sleeping with men until"

5. I don’t know, I fail to see how entanglement can eliminate LR, as existence of entanglement is not enough to prove the Bell theorem. You need the projection postulate. You are a knowledgeable person, so I am sure you appreciate that “entanglement of particles that are outside of each other's light cones” per se does not eliminate LR. In general, the only thing that could be fatal to LR is genuine BIV (that is, if we forget about superdeterminism). So far genuine BIV have not been demonstrated, and I don’t hold my breath.

I don’t get it. I specifically indicated the two mutually contradictory assumptions that are both predictions of QM and necessary to prove the Bell theorem. So while I could agree that “If you think QM itself is based on mutually contradictory assumptions (which some claim is the case), that is NOT equivalent to saying Bell itself is based on those assumptions.”, this is not relevant, because the proof of the Bell theorem is indeed based on two mutually contradictory assumptions, and I specifically indicated that, showing where the proof uses UE and PP. As for the cos^2(theta) rule, when you use it for both particles of the singlet, I believe you need the projection postulate (to count the QM correlations), and PP directly contradicts UE.

6. I don’t know. My impression was that the Copenhagen interpretation’s grip on physics was much stronger then than now. But I may be mistaken.

7. Again, entanglement does not eliminate LR. And Einstein is no relative of mine. It is my understanding he opposed the uncertainty principle. So he was wrong on this issue (at least I believe so). But the uncertainty principle per se does not eliminate LR either. On the other hand, Einstein’s EPR paper led to significant progress.

8. I readily admit that I don’t know much about GHZ, Leggett etc., but I suspect they basically have the same problems as the Bell theorem. For example, I have not heard anybody state that they were successfully used to conduct loophole-free experiments eliminating LR.

My assessment is there are neither no-go theorems nor experimental data eliminating LR. But I certainly respect your point of view.

1. QM is not considered self-contradictory, although a lot of folks don't like the collapse rules. But that is 100% irrelevant to Bell's Theorem, which merely points out that the predictions of QM and LR are different in specific areas. One has nothing to do with the other, and it is plain wrong to say "Bell is inconsistent because QM is inconsistent".

2. The answer is that it doesn't convince anyone. Which explains why the LR position is completely ignored professionally except by Santos and a few others.

3. True, they have elevated the detection loophole to a higher status. They even published a paper with Santos on the subject. For the reasons ZapperZ explained about loopholes above, I respectfully disagree with their assessment; but I understand their position as being for the sake of bringing a final and complete end to the "loopholes" discussion. I think Santos' statement you quote is ridiculous, I have seen it before and it always makes me mad. No one is a priori ignoring hidden variables. If they existed, context free, they should be noticeable and yet they never are. There is absolutely NOTHING about the setups that can be said to select a subset which is biased in any way. If such bias occurs, it must be natural and subtle (like my standard candles example). The problem with that approach is that even then, there is NO known way to get the Bell results from a biased LR sample... as we see with Santos' repeated failures. And as detection efficiency improves: the Bell result simply gets stronger in complete violation of LR predictions. And finally, there is substantial independent corroboration from other experiments.

4. You are completely wrong again, the violations are there every time. The thing you ignore is called the scientific method. There is no requirement in the method - EVER - that all loopholes be closed simultaneously to accept the results of an experiment. I would say in fact that this almost NEVER occurs in any scientific experiment. The normal technique is to vary one variable at a time and chart relationships. That is why science accepts the Bell test results. If everyone stuck their heads in the ground until "perfect" experiments were done (as you seem to suggest), we would have no science at all.

5. Now you are just trying to be contradictory. You say that correlations outside of Alice and Bob's light cones are within the scope of LR? As far as I know, there has not been any attempt by a local realist to address that one. Once again, your argument circles back to "I ignore all evidence in contradiction to my viewpoint" even though this one completely contradicts every possible LR perspective.

6. The local realistic school, of which Einstein was a member, is virtually non-existent now. So you are wrong again. QM has more interpretations now, but they are all either non-local or non-realistic.

7. Of course entanglement refutes LR. That is by definition! Or more precisely, LR flatly predicts that entanglement does not exist (correlations are spurious).

8. As with Bell, the other no-gos compare the predictions of LR with the predictions of QM. They use different techniques, and they are generally not statistical. They are instead considered "all-or-nothing" and soundly support QM. I guess you will next tell us that is even more support for LR because QM is contradictory and should not be supported.

You see, your starting premise - that QM is contradictory - flies in the face of the science of the last 100 years. While you see problems, everyone else is using the theory to make new predictions and new advances. That is because QM is useful. Now, is it also true? That is not a scientific question, it is a philosophical one. QM is a model, and should not be confused with reality. See my tag line below.
 
  • #73
akhmeteli said:
The problem of measurement in QM is well-known.

Apparently not as well known as you seem to think. I probably saw 10 papers last year on that subject (measurement problems), compared to perhaps 1000 on entanglement. So I would say the problem you identify is much less of a problem for the practicing physicist than you suggest.

Why don't you start a separate thread on the subject? Then we could discuss the evidence for your perspective.
 
  • #74
zonde said:
Well, it took some time
...
Probability that there is a photon with the given values of the hidden variables:
abs(sin(2*ph))

Polarization of the photon, i.e. will it pass the polarizer or not (+ sign means it will pass):
sign(sin(alpha + pol)^2 - cos(ph)^2)
This function actually determines whether the polarizer angle falls in the "passing" interval or in the "absorbing" interval of the photon, so it can be described with intervals without using sine and cosine functions.

Detection (+ sign means it will be detected):
sign(cos(ph)^2-K) where K=sin(Pi/8)^2
Again, this determines whether "ph" falls in a certain interval, and so can be described without the cosine function.
...
I have tried to incorporate your model in a simulation program (basically I have used the one from De Raedt mentioned earlier and made it more object-oriented). Would you say the code below represents your proposal for the effect of the filter on a particle (I hope it is clear enough)?

Code:
       private void ParticleHitfromZonde(Particle Particle)
        {
            bool Pass = true;
            double HvProbability = Math.Abs(h.Sin(Particle.StaticPhaseDifference)); //Calculate HvProbability

            if (HvProbability < h.GetRandom())                                      //Get a random value between 0 and 1 
                                                                                    // and check whether HvProbability is lower
            {
                Pass=false;
            }
            if (Pass)
            {
                //use other proposed formulas:
                int WillItPass = Math.Sign(h.SinSquare(this.Angle + Particle.Polarization) - h.CosSquare(Particle.StaticPhaseDifference));
                int Detection = Math.Sign(h.CosSquare(Particle.StaticPhaseDifference) - h.SinSquare(h.PiOver8));
                if (WillItPass < 0 || Detection < 0)
                {
                    Pass = false;
                }
            }
            Particle.Absorbed = !Pass;                                              //Absorbed is opposite of pass

        }
(this.Angle is the angle of the polarization filter)
 
  • #75
ajw1 said:
I have tried to incorporate your model in a simulation program (basically I have used the one from De Raedt mentioned earlier and made it more object-oriented). Would you say the code below represents your proposal for the effect of the filter on a particle (I hope it is clear enough)?

Code:
       private void ParticleHitfromZonde(Particle Particle)
        {
            bool Pass = true;
            double HvProbability = Math.Abs(h.Sin(Particle.StaticPhaseDifference)); //Calculate HvProbability

            if (HvProbability < h.GetRandom())                                      //Get a random value between 0 and 1 
                                                                                    // and check whether HvProbability is lower
            {
                Pass=false;
            }
            if (Pass)
            {
                //use other proposed formulas:
                int WillItPass = Math.Sign(h.SinSquare(this.Angle + Particle.Polarization) - h.CosSquare(Particle.StaticPhaseDifference));
                int Detection = Math.Sign(h.CosSquare(Particle.StaticPhaseDifference) - h.SinSquare(h.PiOver8));
                if (WillItPass < 0 || Detection < 0)
                {
                    Pass = false;
                }
            }
            Particle.Absorbed = !Pass;                                              //Absorbed is opposite of pass

        }
(this.Angle is the angle of the polarization filter)

A couple of questions:

1. Each particle has properties .StaticPhaseDifference and .Polarization - are there any others?
2. Also, is .Polarization randomly assigned or similar?
3. What about .StaticPhaseDifference? How is its value assigned?

I want to follow the analogy myself because I am concerned about sleight of hand that subtly puts in a non-local factor.
 
  • #76
DrChinese said:
A couple of questions:

1. Each particle has properties .StaticPhaseDifference and .Polarization - are there any others?
2. Also, is .Polarization randomly assigned or similar?
3. What about .StaticPhaseDifference? How is its value assigned?

I want to follow the analogy myself because I am concerned about sleight of hand that subtly puts in a non-local factor.

1. Currently I have no other properties for the particles except .Absorbed and .DelayTime (the delay time is used for the De Raedt model)
2 and 3. On creation, both properties get a random value between 0 and 2pi. The properties of the second particle are then related to those of the first one, as specified by zonde

Code:
    public class Particle
    {
        public Particle()
        {
            this.Polarization = h.GetRandomTwoPiAngle();
            this.StaticPhaseDifference = h.GetRandomTwoPiAngle();
        }
    }


Code:
                //Initiate entangled particles
                Particle Particle1 = new Particle();
                Particle Particle2 = new Particle();

                //polarization relation
                Particle2.Polarization = Particle1.Polarization + h.PiOver2; // polarization of particle 2

                //Zonde
                Particle2.StaticPhaseDifference = Particle1.StaticPhaseDifference + h.PiOver4; // phase difference of particle 2
I can attach all the classes or, if you have access to a Visual Studio environment, the complete project
 
  • #77
You have Particle2.StaticPhaseDifference = Particle1.StaticPhaseDifference + h.PiOver4;

Is the PiOver4 correct? Or is it supposed to be PiOver2, as with polarization? That may be OK, I want to make sure though. It seems strange not to make them identical, when that is the premise.
 
  • #78
DrChinese said:
You have Particle2.StaticPhaseDifference = Particle1.StaticPhaseDifference + h.PiOver4;

Is the PiOver4 correct? Or is it supposed to be PiOver2, as with polarization? That may be OK, I want to make sure though. It seems strange not to make them identical, when that is the premise.
Yes, you're probably right:
zonde said:
... Here I just hypothesize that "phase" vectors are orthogonal for entangled pair...
The spreadsheet code however seems to be using PiOver4, or maybe I am missing something...
 
  • #79
I tried to follow the De Raedt example from the web site, but they hide their algorithm and dataset. I see some formulas here and there but how do you know what to do unless you can see the code? It should be very simple/straightforward - like yours - but I cannot find it. I do see a downloadable app but it is an EXE so I probably won't see how the data is generated. Oh well, I guess I will check it out.
 
  • #80
DrChinese said:
I tried to follow the De Raedt example from the web site, but they hide their algorithm and dataset. I see some formulas here and there but how do you know what to do unless you can see the code? It should be very simple/straightforward - like yours - but I cannot find it. I do see a downloadable app but it is an EXE so I probably won't see how the data is generated. Oh well, I guess I will check it out.

Their (Fortran) code is at the end of http://rugth30.phys.rug.nl/pdf/COMPHY3339.pdf
 
  • #81
ajw1 said:
Their (Fortran) code is at the end of http://rugth30.phys.rug.nl/pdf/COMPHY3339.pdf

Excellent, thanks. This should allow me to understand what they are doing. I am working on understanding the code and should be able to provide an explanation of how it works. I should also be able to verify if realism is respected by the model, which is of course a requirement.
 
  • #82
ajw1 said:
Their (Fortran) code is at the end of http://rugth30.phys.rug.nl/pdf/COMPHY3339.pdf

Can you help me decipher this statement:

k2=ceiling(abs(1-c2*c2)**(d/2)*r0/tau) ! delay time

this looks to me like:

k2=ceiling(abs(1-(c2*c2))**((d/2)*(r0/tau))) ! delay time

and since d=2 and is static, this reduces to:

k2=ceiling( abs(1-(c2*c2))**(r0/tau) ) ! delay time

------------------------------------------------------------------------

After examining this statement, I believe I can find an explanation of how the computer algorithm manages to produce its results. It helps to know exactly how the bias must work. :smile: The De Raedt et al model uses the time window as a method of varying which events are detected (because that is how their fair sampling algorithm works). That means the time-delay function must be - on average - such that events at some angle settings are more likely to be included, and events at other angle settings are on average less likely to be included. It actually does not matter what physical model they propose, because eventually they must all accomplish the same thing. And that is: the bias function must account for the difference between the graphs of the QM and LR correlation functions.

That is, we want the difference between the LR correlation function and the QM correlation function to be zero at 0, 45, 90, and 135 degrees, because there is no difference in the graphs at those angles. But there is a difference at other angles. That same difference must be positive and maximal at angles like 22.5, 157.5, etc., and negative and minimal at angles like 67.5 and 112.5, etc. (Or maybe vice versa :smile: )

So we need an embedded bias function that has those parameters, and if their computer program is to work, we will be able to find it. Once we find it, we can then assess whether it truly models the actual experimental data. If we see it does, they win. Otherwise, they lose. Of course, my job is to challenge their model. First, I must find out how they do it.

So we know that their function must: i) alternate between positive and negative bias, ii) have zero crossings every 45 degrees (pi/4), and iii) have a period of 90 degrees (pi/2). It does not need to be perfect, because the underlying data isn't going to be perfect anyway. Any of this starting to look familiar? Why yes, that is just the kind of thing we saw in zonde's model.
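
As a sanity check on those three constraints, here is a toy function (my own illustration - NOT the De Raedt model's actual bias function) that satisfies all of them:

Code:
    // Toy illustration only, not taken from the De Raedt code: sin(4*theta)
    // alternates sign, has zero crossings every 45 deg (pi/4), and has a
    // period of 90 deg (pi/2), matching constraints i)-iii) above.
    double ToyBias(double thetaRadians)
    {
        return Math.Sin(4.0 * thetaRadians);
    }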
 
  • #83
So now, per my prior post on the De Raedt model:

Let's assume I can demonstrate how the bias function uses the delay to do its work (by affecting which events are within the time window and therefore counted). The next question is: does it model all of the data of relevant Bell tests? Well, yes and no. Obviously they claim to produce QM-like data as far as was reported - YES in this regard. But most likely we will see that the traditional Bell test experimenters did not consider this clever twist - so perhaps NO in some respects. It should be possible to extend the actual experiments to show whether the De Raedt model is accurate or not. In fact, I believe I can show this without performing an experiment once I run their algorithm myself.

I think I can safely give the De Raedts an A for coming up with a simulation that works as it does. As I have said previously, a simulation which produces a QM-like result is NOT the same as a local realistic theory. So such a simulation - ALONE and BY ITSELF - is NOT a disproof of the Bell Theorem. Because there are additional consequences of any local realistic theory, and if those are not considered then it cannot be a candidate. Again, this is why Santos has failed with stochastic models.
 
  • #84
DrChinese said:
Can you help me decipher this statement:

k2=ceiling(abs(1-c2*c2)**(d/2)*r0/tau) ! delay time

this looks to me like:

k2=ceiling(abs(1-(c2*c2))**((d/2)*(r0/tau))) ! delay time

and since d=2 and is static, this reduces to:

k2=ceiling( abs(1-(c2*c2))**(r0/tau) ) ! delay time


The model produces the expected results when you use only d/2 for the power (exponent):
(1 - c2^2)^(d/2)
and then multiply by r0/tau. (I had the same problem, so I used a Fortran debugger to check the calculation. Attached is a graph of my simulation. Green is when one only assumes Malus for the photons (without the time tag).)
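
With the precedence resolved, a C# transcription of the delay-time rule would read as follows (my sketch; I am assuming r0 is the random draw and tau the time-window unit, as in the paper):

Code:
    // Transcription of: k2 = ceiling(abs(1-c2*c2)**(d/2)*r0/tau)
    // Only (d/2) is the exponent; the result is then multiplied by r0/tau.
    int DelayTimeSteps(double c2, double d, double r0, double tau)
    {
        return (int)Math.Ceiling(Math.Pow(Math.Abs(1.0 - c2 * c2), d / 2.0) * r0 / tau);
    }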
 

Attachment: eprb.jpg (graph of the simulation results)
  • #85
ajw1 said:
The model produces the expected results when you use only d/2 for the power:
(1 - c2^2)^(d/2)
and then multiply by r0/tau (I had the same problem, so I used a Fortran debugger to check the calculation. Attached is a graph of my simulation. Green is when one only assumes Malus for the photons (without the time tag))

So verifying that we: DO multiply the entire result by r0/tau, and do NOT multiply the exponent d/2 by r0/tau?

Graph looks great by the way.
 
  • #86
DrChinese said:
So verifying that we: DO multiply the entire result by r0/tau, and do NOT multiply the exponent d/2 by r0/tau?
That is correct
 
  • #87
ajw1 said:
That is correct

Thanks. I should have some more soon.
 
  • #88
DrChinese said:
I am OK with you not detecting all of the relevant pairs (because you have a subset). But for the subset of the ones you DO detect, you should be able to see the values for all 3 angles. That is the essence of realism.
Yes, of course. What else can I say?
Detection values are calculated separately for Alice and Bob, and coincidences are calculated row by row, taking one value from Bob's data and one value from Alice's data. The final result is the sum of the coincidence values over all rows. Every row produces some result, either 0 or some positive value.
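
For what it's worth, here is a minimal sketch of that row-by-row coincidence count (hypothetical arrays, one detection flag per emitted pair per side; not code from the actual simulation):

Code:
    // Row-by-row coincidence count: a coincidence in row i requires a
    // detection on both Alice's side and Bob's side for the same pair i.
    int CountCoincidences(bool[] aliceDetected, bool[] bobDetected)
    {
        int coincidences = 0;
        for (int i = 0; i < aliceDetected.Length; i++)
        {
            if (aliceDetected[i] && bobDetected[i])
                coincidences++;
        }
        return coincidences;
    }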

I suppose the only satisfactory answer for you is to test the model yourself.
 
  • #89
ajw1 said:
I have tried to incorporate your model in a simulation program (basically I have used the one from De Raedt mentioned earlier and made it more object-oriented). Would you say the code below represents your proposal for the effect of the filter on a particle (I hope it is clear enough)?

Code:
            double HvProbability = Math.Abs(h.Sin(Particle.StaticPhaseDifference));
It seems to me that there is a missing *2 in that row (it should be Particle.StaticPhaseDifference*2; not sure about the syntax). Everything else seems OK.
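
If I read that correctly, the corrected line would be (assuming h.Sin simply wraps Math.Sin):

Code:
    double HvProbability = Math.Abs(h.Sin(Particle.StaticPhaseDifference * 2)); // |sin(2*ph)|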
 
  • #90
DrChinese said:
4. You are completely wrong again, the violations are there every time. The thing you ignore is called the scientific method. There is no requirement in the method - EVER - that all loopholes be closed simultaneously to accept the results of an experiment. I would say in fact that this almost NEVER occurs in any scientific experiment. The normal technique is to vary one variable at a time and chart relationships. That is why science accepts the Bell test results. If everyone stuck their heads in the ground until "perfect" experiments were done (as you seem to suggest), we would have no science at all.
Good argument about addressing loopholes separately. But for that to work, the experiments should be basically the same. That is not the case with violations of Bell inequalities.
Another good method is to vary the parameter in question and analyze how the results depend on this parameter.
So in this case it would be good to see photon experiments where the detection efficiency is varied and the coincidence rate (along with the correlations) is analyzed. And for experiments with efficient detection, the distance between the two entities would be the varying parameter.
But of course experiments like that would be quite challenging because of the additional errors that would have to be taken into account when the parameter in question is varied (in the case of photons, and even more so for efficient detection), so we might not see them soon, if ever.
 
