Photon entanglement and fair sampling assumption

  • #51
ajw1 said:
The code seems different from the formulas mentioned earlier: I don't see the Cos() functions, or am I overlooking something?
Well it's statistically the same as the one mentioned earlier.
ajw1 said:
Is the first SIGN(..) function in the C column, and the second in E?
Well, I copied one of the functions into multiple columns with different polarizer angles (at the top of each column) so that multiple angles can be calculated at once.
Just adjust column indexes accordingly.
 
  • #52
zonde said:
1. If for you QM starts and ends at non-locality then yes, I am challenging QM.

2. About testing with different angles. I am drawing a graph using 32 different angles - is that enough?

3. What is the second critical test? Type II PDC? Simple thing - just make a difference of 90 deg between the POL hidden variables of Alice and Bob, with PH keeping the same difference as before.

4. But it seems to me that you are not catching the meaning of this fair sampling assumption thing. ... But if the fair sampling assumption does not hold, there are plenty of possible ways to construct models that violate Bell's inequalities, and it does not matter whether experimental results justify them or not.
Bell's inequalities do not prove anything if the fair sampling assumption does not hold. And sorry, but this IS the mainstream view.

1. I only advocate the position that local hidden variable theories are not tenable. I do not advocate non-locality in particular.

2. 32 is NOT enough. Unless of course you are talking about SIMULTANEOUS values. I want to see 3 *simultaneous* values for angle settings I choose. Preferably either 0/120/240 or 0/45/67.5 as these are the most often used examples. If you can only show 2 at a time, you don't have an LHV theory. Sorta like taking the magician at his word when he says, "nothing up my sleeve"...

3. You haven't explained Type II PDC by saying the crystal is rotated in your formula sheet. For a local realist to explain Type II PDC it will take a lot. The reason is that there is NO 360 degree polarization entanglement from a single crystal. The output of either alone lacks that characteristic! Only when the outputs are superimposed does this feature emerge! If the photons emerged from one or the other but not both (as a local realist would insist), then the entanglement is not explainable.

4. Yes, indeed it is the mainstream view that IF the fair sampling assumption were discovered NOT to hold, then Bell test results might be invalidated. So what? If next week the speed of light were discovered to be 4 kilometers per second then a lot of other science might be invalidated too. Fair sampling is the cornerstone of science, my friend, and has absolutely nothing to do with Bell tests in particular. Obviously, there are some cases in which the data points are relatively few and far between and there may in fact be a natural bias. An example would be celestial objects used as "standard candles". But you can't model a theory (such as a LHV) that runs counter to the data and explain it as "supported" and "consistent".
Do you have some data to share? You saw my requested angle settings. Just show me data for 3 simultaneously and we can get on with the main show here.
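
To make the request concrete, here is a minimal sketch (in the C# style used later in this thread) of the kind of table I am asking for. The hidden variable and the detection rule here are toy placeholders, not anyone's actual model; the only point is that every pair must carry predetermined outcomes at ALL three settings at once:

Code:
    // Toy sketch only: "realism" means each pair carries predetermined
    // +/- outcomes for all three requested settings, fixed before any
    // measurement choice is made.
    var rng = new Random();
    double[] settings = { 0.0, 45.0, 67.5 };          // requested angle settings (degrees)
    for (int trial = 0; trial < 10; trial++)
    {
        double lambda = rng.NextDouble() * 360.0;     // hypothetical hidden variable
        var outcomes = new int[3];
        for (int i = 0; i < settings.Length; i++)
        {
            double delta = (lambda - settings[i]) * Math.PI / 180.0;
            outcomes[i] = Math.Cos(2 * delta) >= 0 ? +1 : -1;  // illustrative LHV rule only
        }
        Console.WriteLine($"{outcomes[0],3} {outcomes[1],3} {outcomes[2],3}");
    }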
 
  • #53
DrChinese said:
1. I only advocate the position that local hidden variable theories are not tenable. I do not advocate non-locality in particular.
But you do not distinguish whether LHV theories are viewed as an interpretation of QM or as entirely different things?

DrChinese said:
2. 32 is NOT enough. Unless of course you are talking about SIMULTANEOUS values. I want to see 3 *simultaneous* values for angle settings I choose. Preferably either 0/120/240 or 0/45/67.5 as these are the most often used examples. If you can only show 2 at a time, you don't have an LHV theory. Sorta like taking the magician at his word when he says, "nothing up my sleeve"...
I suppose you mean that different angles are calculated with the same set of randomly generated HV. Right?
For relative angles 0/45/67.5 and three different angles for Alice (0, 45, 90), with the same set of HV (65534 rows); one row per Alice angle, one column per relative angle:
1701/813/227
1654/851/266
1646/805/242

DrChinese said:
3. You haven't explained Type II PDC by saying the crystal is rotated in your formula sheet. For a local realist to explain Type II PDC it will take a lot. The reason is that there is NO 360 degree polarization entanglement from a single crystal. The output of either alone lacks that characteristic! Only when the outputs are superimposed does this feature emerge! If the photons emerged from one or the other but not both (as a local realist would insist), then the entanglement is not explainable.
Just to understand the question - are you saying that the model should explain not only the measurement process but also the generation process at the source for you to consider it? If so, it will surely take some time.

DrChinese said:
4. Yes, indeed it is the mainstream view that IF the fair sampling assumption were discovered NOT to hold, then Bell test results might be invalidated. So what? If next week the speed of light were discovered to be 4 kilometers per second then a lot of other science might be invalidated too. Fair sampling is the cornerstone of science, my friend, and has absolutely nothing to do with Bell tests in particular. Obviously, there are some cases in which the data points are relatively few and far between and there may in fact be a natural bias. An example would be celestial objects used as "standard candles". But you can't model a theory (such as a LHV) that runs counter to the data and explain it as "supported" and "consistent".
Yes, but there are reasons for that particular cornerstone. You can never test everything, so you have to make some assumptions to move forward. You have to disregard some information in favor of other information you want to know.
But that does not mean that assumptions cannot be revisited later, including the fair sampling assumption in some particular context.
 
  • #54
zonde said:
1. But you do not distinguish whether LHV theories are viewed as an interpretation of QM or as entirely different things?


2. I suppose you mean that different angles are calculated with the same set of randomly generated HV. Right?
For relative angles 0/45/67.5 and three different angles for Alice (0, 45, 90), with the same set of HV (65534 rows); one row per Alice angle, one column per relative angle:
1701/813/227
1654/851/266
1646/805/242


3. Just to understand the question - are you saying that the model should explain not only the measurement process but also the generation process at the source for you to consider it? If so, it will surely take some time.

1. I think an LHV will not be an interpretation, it will be a different theory.

2. This needs to be discussed separately because we are getting close to the crux. I believe I understand your numbers as follows:

1701/813/227 means coincidences for 0, 45 and 67.5 degrees (relative to Bob), where Alice is oriented at 0 degrees. Or?

3. My point is that local realists struggle to prove Bell/Aspect wrong, failing to realize that their hypothesis is elsewhere contradicted. That is why the Bell Theorem states that no LHV theory can reproduce ALL of the predictions of quantum theory. Once you understand the full implications of the requirement, it becomes a much larger issue to overcome. That is why Santos, Hess and others have failed: they have stumbled in trying to postulate a full and consistent LHV hypothesis that actually leads to the predictions of QM.

Explaining Type II PDC is not simple for an LHV theory, so we should take it off the table for now. But that doesn't mean it isn't there.
 
  • #55
DrChinese said:
1. I think an LHV will not be an interpretation, it will be a different theory.
Even completely different from the pilot-wave interpretation?
I will disagree.

DrChinese said:
2. This needs to be discussed separately because we are getting close to the crux. I believe I understand your numbers as follows:

1701/813/227 means coincidences for 0, 45 and 67.5 degrees (relative to Bob), where Alice is oriented at 0 degrees. Or?
Yes

DrChinese said:
3. My point is that local realists struggle to prove Bell/Aspect wrong, failing to realize that their hypothesis is elsewhere contradicted. That is why the Bell Theorem states that no LHV theory can reproduce ALL of the predictions of quantum theory. Once you understand the full implications of the requirement, it becomes a much larger issue to overcome. That is why Santos, Hess and others have failed: they have stumbled in trying to postulate a full and consistent LHV hypothesis that actually leads to the predictions of QM.
But the Bell Theorem does not cover ALL of the predictions of quantum theory. So it is just a declarative statement without much behind it.

DrChinese said:
Explaining Type II PDC is not simple for an LHV theory, so we should take it off the table for now. But that doesn't mean it isn't there.
It seems that you get polarization entanglement when the contexts (pilot waves) of two photons overlap in a certain way. I found this experiment to be a very nice and simple demonstration of polarization entanglement creation: http://arxiv.org/abs/0912.1275
 
  • #56
zonde said:
Even completely different from the pilot-wave interpretation?
I will disagree.

Pilot wave is not an LHV! L = Local, and pilot wave is non-local. So I am not sure what you mean. If you are trying to say that a non-local hidden variable interpretation is possible: I would agree, and there are lots of supporters of that perspective. That perspective is also considered contextual.
 
  • #57
zonde said:
But the Bell Theorem does not cover ALL of the predictions of quantum theory. So it is just a declarative statement without much behind it.

This too is strange. If there was not much behind it, why is it so important? There are over 1000 papers published annually on the subject. In fact, there is enough behind it to be accepted as proven.
 
  • #58
zonde said:
2. This needs to be discussed separately because we are getting close to the crux. I believe I understand your numbers as follows:

1701/813/227 means coincidences for 0, 45 and 67.5 degrees (relative to Bob), where Alice is oriented at 0 degrees.

Reply: Yes

Okay. So out of the SAME 1701 trials mentioned above, there were ALSO coincidences of 813 for 45 degrees and 227 for 67.5 degrees. Correct? (I am not interested in separate trials for the 3 angles because we are not testing the realism requirement in such a case.)
 
  • #59
zonde said:
3. You haven't explained Type II PDC by saying the crystal is rotated in your formula sheet. For a local realist to explain Type II PDC it will take a lot. The reason is that there is NO 360 degree polarization entanglement from a single crystal. The output of either alone lacks that characteristic! Only when the outputs are superimposed does this feature emerge! If the photons emerged from one or the other but not both (as a local realist would insist), then the entanglement is not explainable.

OOPS! :redface:

I wrote Type II PDC and I meant Type I. Sorry for any confusion this caused.
 
  • #60
DrChinese said:
Pilot wave is not an LHV! L = Local, and pilot wave is non-local. So I am not sure what you mean. If you are trying to say that a non-local hidden variable interpretation is possible: I would agree, and there are lots of supporters of that perspective. That perspective is also considered contextual.
If the pilot wave were to turn from non-local into local, I assume the pilot-wave interpretation would not suffer much. That is what I mean.

DrChinese said:
This too is strange. If there was not much behind it, why is it so important? There are over 1000 papers published annually on the subject. In fact, there is enough behind it to be accepted as proven.
I thought about this a bit, and it seems to me that the meaning of that particular statement is that LHV theories might explain all predictions of QM except entanglement. And in that case it's only about entanglement after all.

DrChinese said:
Okay. So out of the SAME 1701 trials mentioned above, there were ALSO coincidences of 813 for 45 degrees and 227 for 67.5 degrees. Correct? (I am not interested in separate trials for the 3 angles because we are not testing the realism requirement in such a case.)
No. Otherwise we are not discussing unfair sampling.

DrChinese said:
OOPS! :redface:

I wrote Type II PDC and I meant Type I. Sorry for any confusion this caused.
Well, it turned out that as a result I cleared up a misunderstanding of mine. I found out that I had the wrong picture of Type I PDC as a direct source of polarization entangled photon pairs.
So in the case of Type I PDC, if we talk about polarization entanglement, there have to be some more details about how the setup turns the produced (polarization non-entangled) photons into polarization entangled photons.
 
  • #61
zonde said:
1. No. Otherwise we are not discussing unfair sampling.


2. So in the case of Type I PDC, if we talk about polarization entanglement, there have to be some more details about how the setup turns the produced (polarization non-entangled) photons into polarization entangled photons.

1. You can have an unfair sample (of the universe of photon pairs), but it still must be realistic! There must be 3 simultaneous values for Alice at 0, 45, 67.5. Otherwise you are just saying it is a realistic model when it isn't. That is the point of Bell.

2. Yes, it is difficult to model "realistically". (Pilot wave theorists don't think so, but it is.)
 
  • #62
DrChinese said:
My point is that local realists struggle to prove Bell/Aspect wrong, failing to realize that their hypothesis is elsewhere contradicted. That is why the Bell Theorem states that no LHV theory can reproduce ALL of the predictions of quantum theory.

I tend to agree with this wording. However, I am not sure this is bad "news" for LHV and local realists. Without taking sides with local realists or against them here, I tend to think this is actually great "news" for them. The reasoning is as follows (a part of it was offered by nightlight).

1. Predictions of quantum theory include both unitary evolution and the projection postulate.

2. To prove the Bell theorem, one needs both unitary evolution and the projection postulate.

3. Strictly speaking, unitary evolution and the projection postulate directly contradict each other.

4. Inability of LHV theories to reproduce contradictory results is good for local realists.

As some of these points are not obvious, let me explain.

1. This statement seems obvious as far as unitary evolution is concerned. If you disagree that the projection postulate is also a prediction of quantum theory, please advise (I admit that this is not an obvious statement, as it depends on the interpretation of quantum theory. What is important for me, however, is that this postulate or something similar is required to prove the Bell theorem - see below).

2. One needs unitary evolution when one assumes that spin projection on any axis is conserved. One needs the projection postulate to prove that quantum theory violates the Bell inequalities (it is used to compute the correlations in quantum theory).

3. Indeed, the projection postulate necessitates irreversibility, and, strictly speaking, unitary evolution does not allow any irreversibility (let me mention, e.g., the quantum recurrence theorem (Phys. Rev. V.107 #2, pp.337-338, 1957)), so a particle, strictly speaking, does not stay in the eigenstate after measurement (if it was in a superposition before the measurement).

4. Seems obvious
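
To make point 3 concrete with a standard textbook argument (a sketch, nothing specific to the models discussed in this thread): unitary maps preserve inner products, while projection does not. Take

$$|\psi_1\rangle = |0\rangle, \qquad |\psi_2\rangle = \tfrac{1}{\sqrt{2}}\left(|0\rangle + |1\rangle\right), \qquad \langle\psi_1|\psi_2\rangle = \tfrac{1}{\sqrt{2}}.$$

A projective measurement of both systems in the $\{|0\rangle, |1\rangle\}$ basis can leave both in $|0\rangle$, so the overlap jumps to 1. But any unitary $U$ satisfies $\langle U\psi_1|U\psi_2\rangle = \langle\psi_1|\psi_2\rangle$, so no unitary evolution can reproduce that jump. This is the precise sense in which the two postulates are in tension.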
DrChinese said:
Once you understand the full implications of the requirement, it becomes a much larger issue to overcome. That is why Santos, Hess and others have failed: they have stumbled in trying to postulate a full and consistent LHV hypothesis that actually leads to the predictions of QM.

As I said, maybe it’s good for them that they failed. Interestingly, in a recent article (http://arxiv.org/PS_cache/arxiv/pdf/0912/0912.4098v1.pdf) Santos argues that “the usual postulates of quantum are too strong”. Again, I am not taking sides with Santos or against him here. I believe, however, that, on the one hand, the proof of the Bell theorem uses mutually contradictory assumptions, on the other hand, so far no experiment has demonstrated violations of the Bell inequalities without some dubious additional assumptions, such as “fair sampling”. So I am not sure there are sufficient theoretical or experimental arguments proving that “local hidden variable theories are not tenable.”
 
  • #63
akhmeteli said:
As I said, maybe it’s good for them that they failed. Interestingly, in a recent article (http://arxiv.org/PS_cache/arxiv/pdf/0912/0912.4098v1.pdf) Santos argues that “the usual postulates of quantum are too strong”. Again, I am not taking sides with Santos or against him here. I believe, however, that, on the one hand, the proof of the Bell theorem uses mutually contradictory assumptions, on the other hand, so far no experiment has demonstrated violations of the Bell inequalities without some dubious additional assumptions, such as “fair sampling”. So I am not sure there are sufficient theoretical or experimental arguments proving that “local hidden variable theories are not tenable.”

That is an "interesting" perspective, since you are basically saying failure is good. :smile:

The problem with the LR perspective is that they do not work against the opposition's strongest arguments; they seek the weakest to challenge. I consider fair sampling to be one of the worst possible attacks as the hypothesis is born out of LR anger and frustration and little else. As I have said before, virtually every scientific experiment relies on the fair sampling assumption and there is nothing special about it with respect to a Bell test.

On the other hand, the opposition (which is of course the mainstream) consistently challenges itself at the highest level. For example, there are new and improved Bell tests every year. Entanglement is being sought - and discovered - in new and unusual places. On the other hand, LRists basically deny the existence of entanglement (since they say coincidences are predetermined and not a result of an ongoing state).

So while the LR camp is grasping at straws (that's how it appears to me), I have read papers finding entanglement under every conceivable rock - including entanglement of particles that are outside of each other's light cones! And as predicted by QM.

As to Bell using mutually contradictory assumptions: all Bell is saying is that LR predictions can never match QM. If you think QM itself is based on mutually contradictory assumptions (which some claim is the case), that is NOT equivalent to saying Bell itself is based on those assumptions. If QM is shown to be experimentally wrong tomorrow, then so be it. But the predictions of QM are still the predictions of QM, and I don't know anyone who sees any confusion (or contradiction) in the cos^2(theta) rule.

But you are missing a truly important point of Bell: At the time it was introduced, it was widely believed that a local realistic version (a la Einstein's perspective) was tenable. Entanglement had never been witnessed! So maybe QM was wrong. But 45 years later, the story has not played out as Einstein might have imagined.

As to Santos suggesting that QM should be relaxed: yes, I saw that paper and laughed. I mean, who is he kidding? Hey, let's change the rules so Santos can convince himself LR is viable and he is right in the end. It's science, where's the beef Santos? I would love to see Santos stick with a theory for an entire year and use it to discover new sources of entanglement that were previously unknown. That would impress me.

In the meantime, there are numerous NEW theorems that are fully independent of Bell but which ALSO rule out the LR position. Examples are GHZ, Leggett, etc. and guess what: they don't rely on the "unfair" sampling assumption. So the LR position is being left in the dust as science advances. So I guess I am disagreeing with your assessment. LR is not tenable and the evidence is getting stronger, not weaker.
 
  • #64
DrChinese.
Thank you very much for a prompt and detailed reply. Let me try to comment.
DrChinese said:
That is an "interesting" perspective, since you are basically saying failure is good. :smile:
I am not just saying that failure is good in this case, I am also saying why: because “success” would be fatal for the potential “successful” theory. Indeed, if your theory has two contradictory conclusions, or assumptions, that means the theory is, strictly speaking, wrong. By the way, for this very reason quantum theory, in the specific form used to prove the Bell theorem, is, strictly speaking, wrong. Mathematically wrong. It does contain two contradictory assumptions. One of these assumptions must be wrong – logic does not allow any other conclusion. Specifically, I believe that unitary evolution (UE) is right, and the projection postulate (PP) is, strictly speaking, wrong. This is just my opinion, so you may agree or disagree, but you just cannot have both UE and PP, for the simple reason that they contradict each other, and you don’t seem to dispute that. If you do, please advise. In the following I won’t repeat this caveat and will assume that it is PP that is wrong. PP may be a good approximation, it may be a very good approximation, it may be an excellent approximation, it may be an amazingly great approximation, but the bottom line is it’s just an approximation. It just cannot be precise, because if it is, then UE has its share of problems.
DrChinese said:
The problem with the LR perspective is that they do not work against the opposition's strongest arguments; they seek the weakest to challenge.
Maybe I don’t quite understand you, or my English fails me, but I don’t quite see what is wrong about going against the weakest argument of the opponent. I would think in any contest the opponent’s weakest point is fair game. Furthermore, we are not in a court room, I think we both are just trying to understand something better, so I would think we should just agree with each other’s strongest argument, rather than waste time refusing to concede what we believe is actually correct in the opponent’s arguments.
DrChinese said:
I consider fair sampling to be one of the worst possible attacks as the hypothesis is born out of LR anger and frustration and little else. As I have said before, virtually every scientific experiment relies on the fair sampling assumption and there is nothing special about it with respect to a Bell test.
I don’t quite get it. Such people as Shimony and Zeilinger, who are no fans of LR, admit that the “detection loophole” (and, consequently, the fair sampling assumption) presents a serious problem (see the relevant quotes at https://www.physicsforums.com/showpost.php?p=1702189&postcount=13 and https://www.physicsforums.com/showpost.php?p=1705826&postcount=65 ). Do you really believe we should accept the fair sampling assumption without discussion? You yourself gave an example where this assumption may be less than obvious – “An example would be celestial objects used as "standard candles".” I guess the following reasoning by Santos makes some sense: “In the context of LHV theories the fair sampling assumption is, simply, absurd. In fact, the starting point of any hidden variables theory is the hypothesis that quantum mechanics is not complete, which essentially means that states which are considered identical in quantum theory may not be really identical. For instance if two atoms, whose excited states are represented by the same wave-function, decay at different times, in quantum mechanics this fact may be attributed to an ”essential indeterminacy”, meaning that identical causes (identical atoms) may produce different effects (different decay times). In contrast, the aim of introducing hidden variables would be to explain the different effects as due to the atomic states not being really identical, only our information (encapsuled in the wave-function) being the same for both atoms. That is, the essential purpose of hidden variables is to attribute differences to states which quantum mechanics may consider identical. Therefore it is absurd to use the fair sampling assumption -which rests upon the identity of all photon pairs- in the test of LHV theories, because that assumption excludes hidden variables a priori.”

DrChinese said:
On the other hand, the opposition (which is of course the mainstream) consistently challenges itself at the highest level. For example, there are new and improved Bell tests every year.
I agree, there are “new and improved Bell tests every year”. However, so far the result is always the same: no violation of the genuine Bell inequalities. For some reason there is always something: either the detection loophole, or the locality loophole, you name it. 45 years and counting – no violations. That reminds me of the following words from Heller’s “Catch-22”:
"I've got just the twelve-year-old virgin you're looking for," he announced jubilantly. "This twelve-year-old virgin is really only thirty-four, but she was brought up on a low-protein diet by very strict parents and didn't start sleeping with men until"

This is the same stuff that we hear about the Bell inequalities violations (BIV): “Yeah, we demonstrated violations, they are as good as genuine ones, even better. Detection loophole? Oh, come on, you’re nit-picking. Locality loophole? Oh, come on, you’re hair-splitting”.

You believe that BIV have been demonstrated to your satisfaction? I fail to see any such demonstrations, sorry.
DrChinese said:
Entanglement is being sought - and discovered - in new and unusual places. On the other hand, LRists basically deny the existence of entanglement (since they say coincidences are predetermined and not a result of an ongoing state).

So while the LR camp is grasping at straws (that's how it appears to me), I have read papers finding entanglement under every conceivable rock - including entanglement of particles that are outside of each other's light cones! And as predicted by QM.
I don’t know, I fail to see how entanglement can eliminate LR, as existence of entanglement is not enough to prove the Bell theorem. You need the projection postulate. You are a knowledgeable person, so I am sure you appreciate that “entanglement of particles that are outside of each other's light cones” per se does not eliminate LR. In general, the only thing that could be fatal to LR is genuine BIV (that is, if we forget about superdeterminism). So far genuine BIV have not been demonstrated, and I don’t hold my breath.
DrChinese said:
As to Bell using mutually contradictory assumptions: all Bell is saying is that LR predictions can never match QM. If you think QM itself is based on mutually contradictory assumptions (which some claim is the case), that is NOT equivalent to saying Bell itself is based on those assumptions. If QM is shown to be experimentally wrong tomorrow, then so be it. But the predictions of QM are still the predictions of QM, and I don't know anyone who sees any confusion (or contradiction) in the cos^2(theta) rule.
I don’t get it. I specifically indicated the two mutually contradictory assumptions that are both predictions of QM and necessary to prove the Bell theorem. So while I could agree that “If you think QM itself is based on mutually contradictory assumptions (which some claim is the case), that is NOT equivalent to saying Bell itself is based on those assumptions.”, this is not relevant, because the proof of the Bell theorem is indeed based on two mutually contradictory assumptions, and I specifically indicated that, showing where the proof uses UE and PP. As for the cos^2(theta) rule, when you use it for both particles of the singlet, I believe you need the projection postulate (to count the QM correlations), and PP directly contradicts UE.
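
For reference, here is the computation I mean, with the projection step explicit (standard textbook QM, written for the photon analogue of the singlet):

$$|\psi\rangle = \tfrac{1}{\sqrt{2}}\left(|H\rangle_A|H\rangle_B + |V\rangle_A|V\rangle_B\right), \qquad P_{++}(\alpha, \beta) = \left|\langle\alpha|_A \langle\beta|_B\, |\psi\rangle\right|^2 = \tfrac{1}{2}\cos^2(\alpha - \beta),$$

using $\langle\alpha|H\rangle = \cos\alpha$ and $\langle\alpha|V\rangle = \sin\alpha$. The projection onto the joint outcome $|\alpha\rangle_A|\beta\rangle_B$ is exactly where PP enters the two-particle correlation.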
DrChinese said:
But you are missing a truly important point of Bell: At the time it was introduced, it was widely believed that a local realistic version (a la Einstein's perspective) was tenable.
I don’t know. My impression was that the Copenhagen interpretation’s grip on physics was much stronger then than now. But I may be mistaken.
DrChinese said:
Entanglement had never been witnessed! So maybe QM was wrong. But 45 years later, the story has not played out as Einstein might have imagined.
Again, entanglement does not eliminate LR. And Einstein is no relative of mine. It is my understanding he opposed the uncertainty principle. So he was wrong on this issue (at least I believe so). But the uncertainty principle per se does not eliminate LR either. On the other hand, Einstein’s EPR paper led to significant progress.
DrChinese said:
As to Santos suggesting that QM should be relaxed: yes, I saw that paper and laughed. I mean, who is he kidding? Hey, let's change the rules so Santos can convince himself LR is viable and he is right in the end. It's science, where's the beef Santos? I would love to see Santos stick with a theory for an entire year and use it to discover new sources of entanglement that were previously unknown. That would impress me.
Neither is Santos any relative of mine:-) I just mentioned his paper as an example where a local realist appreciates that he cannot and does not need to emulate all predictions of QM.
DrChinese said:
In the meantime, there are numerous NEW theorems that are fully independent of Bell but which ALSO rule out the LR position.
Are they independent of such things as PP?
DrChinese said:
Examples are GHZ, Leggett, etc. and guess what: they don't rely on the "unfair" sampling assumption.
I don’t quite get it. Neither does the standard Bell theorem rely on the “fair” or “unfair” sampling assumption. FS is used to interpret experimental results as violating the Bell inequalities. I readily admit that I don’t know much about GHZ, Leggett etc., but I suspect they basically have the same problems as the Bell theorem. For example, I have not heard anybody state that they were successfully used to conduct loophole-free experiments eliminating LR.
DrChinese said:
So the LR position is being left in the dust as science advances. So I guess I am disagreeing with your assessment. LR is not tenable and the evidence is getting stronger, not weaker.
My assessment is that there are neither no-go theorems nor experimental data eliminating LR. But I certainly respect your point of view.
 
  • #65
Y'know, this is a complaint that I've never understood, because we have arrived at solid conclusions based on flimsier evidence than this. Let's examine 2 classes of the Bell-type experiments.

1. Bell-violating experiments using light.

Now, everyone agrees that there is a plethora of experiments (and let's be honest here, there have been PLENTY OF THEM, with NOT ONE SINGLE piece of evidence pointing to the contrary) showing violation of the Bell and related inequalities, even for multipartite systems. ALL of these experiments closed the locality loophole. However, the biggest complaint here is the detection loophole, and somehow, there are still people not convinced of the results of the experiments (even though ALL of them give the IDENTICAL conclusion).

2. Bell-violating experiments using matter.

These can be done using charge carriers, or even qubits (see, for example, M. Ansmann et al., Nature v.461, p.504 (2009)). There has been ZERO question that ALL of these experiments closed the detection loophole - you can detect them one at a time without any need for a fair-sampling treatment. The loophole that these experiments can't close right now is the locality loophole, since these are experiments done on a very small scale, although there are indications that, using the technique of Ansmann et al., the system might be robust enough to extend to a large length scale and close this loophole as well.

So what do we have here? We have a set of tests of a single principle, in which the tests are conducted in various different manners, coming from very different angles and testing different aspects of it. It is an AMAZING FACT that ALL of them produce a consistent result! This fact seems to be severely overlooked! I mean, think about it for a second! It is astounding that each of these experiments that close each of the different loopholes produces the SAME, IDENTICAL result, and not only that, the result has such HIGH CONFIDENCE (the Ansmann et al. experiment, for example, produced a result that exceeded 244 standard deviations!). It's not even funny!

I can understand if there are some indications from some experiment somewhere that a test has produced something to the contrary. The FACT that even this doesn't exist, and yet, there are people here who are somehow CONVINCED, for some odd reason, that this whole thing is "wrong" (which is a very strong word), now THAT is utterly baffling.

Zz.
 
  • #66
ZapperZ said:
Y'know, this is a complaint that I've never understood, because we have arrived at solid conclusions based on flimsier evidence than this. Let's examine 2 classes of the Bell-type experiments.

1. Bell-violating experiments using light.

Now, everyone agrees that there is a plethora of experiments (and let's be honest here, there have been PLENTY OF THEM, with NOT ONE SINGLE piece of evidence pointing to the contrary) showing violation of the Bell and related inequalities, even for multipartite systems. ALL of these experiments closed the locality loophole. However, the biggest complaint here is the detection loophole, and somehow, there are still people not convinced of the results of the experiments (even though ALL of them give the IDENTICAL conclusion).

2. Bell-violating experiments using matter.

These can be done using charge carriers, or even qubits (see, for example, M. Ansmann et al., Nature v.461, p.504 (2009)). There has been ZERO question that ALL of these experiments closed the detection loophole - you can detect them one at a time without any need for a fair-sampling treatment. The loophole that these experiments can't close right now is the locality loophole, since these are experiments done on a very small scale, although there are indications that, using the technique of Ansmann et al., the system might be robust enough to extend to a large length scale and close this loophole as well.

So what do we have here? We have a set of tests of a single principle, in which the tests are conducted in various different manners, coming from very different angles and testing different aspects of it. It is an AMAZING FACT that ALL of them produce a consistent result! This fact seems to be severely overlooked! I mean, think about it for a second! It is astounding that each of these experiments that close each of the different loopholes produces the SAME, IDENTICAL result, and not only that, the result has such HIGH CONFIDENCE (the Ansmann et al. experiment, for example, produced a result that exceeded 244 standard deviations!). It's not even funny!
I am trying hard to understand how your reasoning is better than the following:
Euclidean geometry on a plane is wrong because it proves that the sum of the angles is 180 degrees. Experiment shows, however, that this is wrong 1) for quadrangles on a plane and 2) for triangles on a sphere.
Sorry, I just cannot understand how this is different from what you want me to accept. The Bell theorem states that LHV theories cannot violate some inequalities under some assumptions. All you’re telling me is that experiment demonstrates violations when these assumptions are not satisfied. ZapperZ, I do sincerely respect you for your knowledge and patience, so it is with great regret that I have to say that I’m less than impressed.
ZapperZ said:
I can understand if there are some indications from some experiment somewhere that a test has produced something to the contrary. The FACT that even this doesn't exist, and yet, there are people here who are somehow CONVINCED, for some odd reason, that this whole thing is "wrong" (which is a very strong word), now THAT is utterly baffling.

Zz.
I tried to explain why unitary evolution contradicts the projection postulate. I used purely mathematical arguments. For some reason, you don’t challenge the specific reasoning I used. If you do challenge it, please advise. So on a purely mathematical level these predictions of standard quantum mechanics contradict each other. Therefore, strictly speaking, one of them is wrong. Yes, this is a strong word, but I am afraid you’re trying to kill the messenger again. I did not invent unitary evolution. I did not invent the projection postulate. It’s not my fault that they contradict each other. Even if I die of West Nile fever tomorrow :-), they won’t stop contradicting each other.
 
  • #67
akhmeteli said:
I am trying hard to understand how your reasoning is better than the following:
Euclidean geometry on a plane is wrong because it proves that the sum of the angles is 180 degrees. Experiment shows, however, that this is wrong 1) for quadrangles on a plane and 2) for triangles on a sphere.
Sorry, I just cannot understand how this is different from what you want me to accept. The Bell theorem states that LHV theories cannot violate some inequalities under some assumptions. All you’re telling me is that experiment demonstrates violations when these assumptions are not satisfied. ZapperZ, I do sincerely respect you for your knowledge and patience, so it is with great regret that I have to say that I’m less than impressed.

I tried to explain why unitary evolution contradicts the projection postulate. I used purely mathematical arguments. For some reason, you don’t challenge the specific reasoning I used. If you do challenge it, please advise. So on a purely mathematical level these predictions of standard quantum mechanics contradict each other. Therefore, strictly speaking, one of them is wrong. Yes, this is a strong word, but I am afraid you’re trying to kill the messenger again. I did not invent unitary evolution. I did not invent the projection postulate. It’s not my fault that they contradict each other. Even if I die of West Nile fever tomorrow :-), they won’t stop contradicting each other.

I was addressing your complaint regarding the loopholes, as in the detection loopholes.

If you think there is a logical inconsistency in the Bell theorem itself, then I would love to see you stick your neck out and publish it. Complaining about it on here does no one any good, does it?

Zz.
 
  • #68
ZapperZ said:
I was addressing your complaint regarding the loopholes, as in the detection loopholes.
Yes, but you also did something else. You reproached me for the strong word “wrong”. I used this word for the assumptions of the Bell theorem only, so I assumed you challenged that part of my post as well.
ZapperZ said:
If you think there is a logical inconsistency in the Bell theorem itself, then I would love to see you stick your neck out and publish it. Complaining about it on here does no one any good, does it?

Zz.
I am not sure I quite understand that. I don’t see what I can publish – I am not sure I said anything original. The assumptions of the Bell theorem are well-known. The problem of measurement in QM is well-known. The results of the experiments on the Bell inequalities are well-known and are not a matter of dispute – only their interpretation may be controversial. I did not present any independent research, just summarized some pretty well-known results. You don’t seem to dispute the factual aspects of my posts, only my interpretation.
As for my posts doing or not doing any good… I don’t know. I can imagine they do not do you any good, as you know all of this without me. However, we are not the only people on this forum, and I hope some of them may find my posts more useful than you do. You see, people keep saying in this forum that the Bell theorem and the relevant experiments rule out local realism. I present some arguments trying to explain that the situation is somewhat more complex. I am not sure that is just an unwanted distraction for participants of the forum. If, however, you, as a mentor, are telling me to keep my opinions to myself… Well, it’s certainly your right, you are the boss.
 
  • #69
DrChinese said:
1. You can have an unfair sample (of the universe of photon pairs), but it still must be realistic! There must be 3 simultaneous values for Alice at 0, 45, 67.5. Otherwise you are just saying it is a realistic model when it isn't. That is the point of Bell.
There are of course 3 simultaneous values for Alice at 0, 45, 67.5 - they are calculated independently for Alice and Bob. But it does not mean that all pairs are detected at 0 deg.

Let me illustrate this. We have a photon pair whose photons have the same POL value, but it is off by 45 deg from the polarizers of Alice and Bob. Depending on the PH value, a photon is detected or not. But the PH value differs between the photons in a pair (according to the model), so depending on the PH values of the photons, both of them could be detected, or only one photon from the pair could be detected (no coincidence), or both photons could go undetected (this last case cannot result in a detected coincidence if we manipulate only Bob's polarizer or only Alice's polarizer).
Let's say we detected Bob's photon but not Alice's. Now we turn Alice's polarizer by 45 deg, and sure enough we now detect Alice's photon, and we have a coincidence that didn't show up in the 0 deg measurement.

So according to the model, you don't detect all relevant pairs (for possible 45 and 67.5 coincidences) at 0 deg.
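
For readers following along, here is a small numerical sketch of that scenario, using the pass/detection rules of my model (they are quoted in full later in the thread). The particular PH values below are arbitrary picks chosen so that this case occurs; they are not outputs of the model:

Code:
    // Same POL for both photons, 45 deg off the 0-deg polarizers; PH differs
    // within the pair (here by Pi/4, matching the offset in ajw1's code below).
    double pol = Math.PI / 4;
    double phAlice = 0.2;                             // arbitrary PH for Alice's photon
    double phBob = phAlice + Math.PI / 4;             // Bob's PH, offset from Alice's
    double K = Math.Pow(Math.Sin(Math.PI / 8), 2);

    // pass rule: sin(alpha + pol)^2 > cos(ph)^2; detection rule: cos(ph)^2 > K
    bool Fires(double alpha, double ph) =>
        Math.Pow(Math.Sin(alpha + pol), 2) > Math.Pow(Math.Cos(ph), 2)
        && Math.Pow(Math.Cos(ph), 2) > K;

    Console.WriteLine(Fires(0, phBob));               // True:  Bob's photon detected at 0 deg
    Console.WriteLine(Fires(0, phAlice));             // False: Alice's photon missed at 0 deg
    Console.WriteLine(Fires(Math.PI / 4, phAlice));   // True:  detected once Alice rotates by 45 deg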
 
  • #70
ZapperZ said:
1. Bell-violating experiments using light.

Now, everyone agrees that the plethora of experiments (and let's be honest here, there have been PLENTY OF THEM with NOT ONE SINGLE evidence pointing to the contrary) that showed violation of Bell, etc. inequality, even for multipartite system. ALL of these experiments closed the locality loophole. However, the biggest complaint here is the detection loophole, and somehow, there are still people not convinced of the results of the experiments (even though ALL of them give the IDENTICAL conclusion).
Yes, that is the thing that puzzles me. So I was looking for what is common to all these experiments.
And you know, I think I know one thing that is common to them: you have to keep the coincidence detection rate as low as reasonably possible for the minimum correlation settings.
That is reasonable, because this is an indicator of how pure the entanglement is. Isn't it so?
So the question is whether LHV models can be constructed that restore local realism, in which quasi-decoherence takes place but is removed by unintentionally biased settings. And I just gave one such model.
 
  • #71
zonde said:
There are of course 3 simultaneous values for Alice at 0, 45, 67.5 - they are calculated independently for Alice and Bob. But it does not mean that all pairs are detected at 0 deg.

Let me illustrate this. We have a photon pair whose photons have the same POL value, but it is off by 45 deg from the polarizers of Alice and Bob. Depending on the PH value, a photon is detected or not. But the PH value differs between the photons in a pair (according to the model), so depending on the PH values of the photons, both of them could be detected, or only one photon from the pair could be detected (no coincidence), or both photons could go undetected (this last case cannot result in a detected coincidence if we manipulate only Bob's polarizer or only Alice's polarizer).
Let's say we detected Bob's photon but not Alice's. Now we turn Alice's polarizer by 45 deg, and sure enough we now detect Alice's photon, and we have a coincidence that didn't show up in the 0 deg measurement.

So according to the model, you don't detect all relevant pairs (for possible 45 and 67.5 coincidences) at 0 deg.

I am OK with you not detecting all of the relevant pairs (because you have a subset). But for the subset of the ones you DO detect, you should be able to see the values for all 3 angles. That is the essence of realism.
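
The counting argument behind this (a standard Wigner/d'Espagnat-style sketch, not a claim about zonde's specific model): if every detected pair carries a definite $\pm$ value at all three settings, then

$$N(0^+, 45^-) = N(0^+, 45^-, 67.5^+) + N(0^+, 45^-, 67.5^-) \le N(67.5^+, 45^-) + N(0^+, 67.5^-),$$

because each term on the left is contained in one of the (larger) counts on the right. A model that only ever commits to two values at a time is not constrained this way, which is why the simultaneous triples are the real test.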
 
  • #72
akhmeteli said:
DrChinese.
Thank you very much for a prompt and detailed reply. Let me try to comment.

1. Indeed, if your theory has two contradictory conclusions, or assumptions, that means the theory is, strictly speaking, wrong. By the way, for this very reason quantum theory, in the specific form used to prove the Bell theorem, is, strictly speaking, wrong. Mathematically wrong. It does contain two contradictory assumptions. One of these assumptions must be wrong – logic does not allow any other conclusion. Specifically, I believe that unitary evolution (UE) is right, and the projection postulate (PP) is, strictly speaking, wrong. This is just my opinion, so you may agree or disagree, but you just cannot have both UE and PP, for the simple reason that they contradict each other, and you don’t seem to dispute that. If you do, please advise. In the following I won’t repeat this caveat and will assume that it is PP that is wrong. PP may be a good approximation, it may be a very good approximation, it may be an excellent approximation, it may be an amazingly great approximation, but the bottom line is it’s just an approximation. It just cannot be precise, because if it is, then UE has its share of problems.

2. Maybe I don’t quite understand you, or my English fails me, but I don’t quite see what is wrong about going against the weakest argument of the opponent. I would think in any contest the opponent’s weakest point is fair game. Furthermore, we are not in a court room, I think we both are just trying to understand something better, so I would think we should just agree with each other’s strongest argument, rather than waste time refusing to concede what we believe is actually correct in the opponent’s arguments.

3. I don’t quite get it. Such people as Shimony and Zeilinger, who are no fans of LR, admit that the “detection loophole” (and, consequently, the fair sampling assumption) presents a serious problem (see the relevant quotes at https://www.physicsforums.com/showpost.php?p=1702189&postcount=13 and https://www.physicsforums.com/showpost.php?p=1705826&postcount=65 ). Do you really believe we should accept the fair sampling assumption without discussion? You yourself gave an example where this assumption may be less than obvious – “An example would be celestial objects used as "standard candles".” I guess the following reasoning by Santos makes some sense: “In the context of LHV theories the fair sampling assumption is, simply, absurd. In fact, the starting point of any hidden variables theory is the hypothesis that quantum mechanics is not complete, which essentially means that states which are considered identical in quantum theory may not be really identical. For instance if two atoms, whose excited states are represented by the same wave-function, decay at different times, in quantum mechanics this fact may be attributed to an ”essential indeterminacy”, meaning that identical causes (identical atoms) may produce different effects (different decay times). In contrast, the aim of introducing hidden variables would be to explain the different effects as due to the atomic states not being really identical, only our information (encapsuled in the wave-function) being the same for both atoms. That is, the essential purpose of hidden variables is to attribute differences to states which quantum mechanics may consider identical. Therefore it is absurd to use the fair sampling assumption -which rests upon the identity of all photon pairs- in the test of LHV theories, because that assumption excludes hidden variables a priori.”

4. I agree, there are “new and improved Bell tests every year”. However, so far the result is always the same: no violation of the genuine Bell inequalities. For some reason there is always something: either the detection loophole, or the locality loophole, you name it. 45 years and counting – no violations. That reminds me of the following words from Heller’s “Catch-22”:
"I've got just the twelve-year-old virgin you're looking for," he announced jubilantly. "This twelve-year-old virgin is really only thirty-four, but she was brought up on a low-protein diet by very strict parents and didn't start sleeping with men until"

5. I don’t know, I fail to see how entanglement can eliminate LR, as existence of entanglement is not enough to prove the Bell theorem. You need the projection postulate. You are a knowledgeable person, so I am sure you appreciate that “entanglement of particles that are outside of each other's light cones” per se does not eliminate LR. In general, the only thing that could be fatal to LR is genuine BIV (that is, if we forget about superdeterminism). So far genuine BIV have not been demonstrated, and I don’t hold my breath.

I don’t get it. I specifically indicated the two mutually contradictory assumptions that are both predictions of QM and necessary to prove the Bell theorem. So while I could agree that “If you think QM itself is based on mutually contradictory assumptions (which some claim is the case), that is NOT equivalent to saying Bell itself is based on those assumptions.”, this is not relevant, because the proof of the Bell theorem is indeed based on two mutually contradictory assumptions, and I specifically indicated that, showing where the proof uses UE and PP. As for the cos^2(theta) rule, when you use it for both particles of the singlet, I believe you need the projection postulate (to count the QM correlations), and PP directly contradicts UE.

6. I don’t know. My impression was that the Copenhagen interpretation’s grip on physics was much stronger then than now. But I may be mistaken.

7. Again, entanglement does not eliminate LR. And Einstein is no relative of mine. It is my understanding he opposed the uncertainty principle. So he was wrong on this issue (at least I believe so). But the uncertainty principle per se does not eliminate LR either. On the other hand, Einstein’s EPR paper led to significant progress.

8. I readily admit that I don’t know much about GHZ, Leggett etc., but I suspect they basically have the same problems as the Bell theorem. For example, I have not heard anybody state that they were successfully used to conduct loophole-free experiments eliminating LR.

My assessment is that there are neither no-go theorems nor experimental data eliminating LR. But I certainly respect your point of view.

1. QM is not considered self-contradictory, although a lot of folks don't like the collapse rules. But that is 100% irrelevant to Bell's Theorem, which merely points out that the predictions of QM and LR are different in specific areas. One has nothing to do with the other, and it is plain wrong to say "Bell is inconsistent because QM is inconsistent".

2. The answer is that it doesn't convince anyone. Which explains why the LR position is completely ignored professionally except by Santos and a few others.

3. True, they have elevated the detection loophole to a higher status. They even published a paper with Santos on the subject. For the reasons ZapperZ explained about loopholes above, I respectfully disagree with their assessment; but I understand their position as being for the sake of bringing a final and complete end to the "loopholes" discussion. I think Santos' statement you quote is ridiculous; I have seen it before and it always makes me mad. No one is a priori ignoring hidden variables. If they existed, context free, they should be noticeable and yet they never are. There is absolutely NOTHING about the setups that can be said to select a subset which is biased in any way. If such bias occurs, it must be natural and subtle (like my standard candles example). The problem with that approach is that even then, there is NO known way to get the Bell results from a biased LR sample... as we see with Santos' repeated failures. And as detection efficiency improves, the Bell result simply gets stronger, in complete violation of LR predictions. And finally, there is substantial independent corroboration from other experiments.

4. You are completely wrong again, the violations are there every time. The thing you ignore is called the scientific method. There is no requirement in the method - EVER - that all loopholes be closed simultaneously to accept the results of an experiment. I would say in fact that this almost NEVER occurs in any scientific experiment. The normal technique is to vary one variable at a time and chart relationships. That is why science accepts the Bell test results. If everyone stuck their heads in the ground until "perfect" experiments were done (as you seem to suggest), we would have no science at all.

5. Now you are just trying to be contradictory. You say that correlations outside of Alice and Bob's light cones are within the scope of LR? As far as I know, there has not been any attempt by a local realist to address that one. Once again, your argument circles back to "I ignore all evidence in contradiction to my viewpoint" even though this one completely contradicts every possible LR perspective.

6. The local realistic school, of which Einstein was a member, is virtually non-existent now. So you are wrong again. QM has more interpretations now, but they are all either non-local or non-realistic.

7. Of course entanglement refutes LR. That is by definition! Or more precisely, LR flatly predicts that entanglement does not exist (correlations are spurious).

8. As with Bell, the other no-gos compare the predictions of LR with the predictions of QM. They use different techniques, and they are generally not statistical. They are instead considered "all-or-nothing" and soundly support QM. I guess you will next tell us that is even more support for LR because QM is contradictory and should not be supported.

You see, your starting premise - that QM is contradictory - flies in the face of the science of the last 100 years. While you see problems, everyone else is using the theory to make new predictions and new advances. That is because QM is useful. Now, is it also true? That is not a scientific question, it is a philosophical one. QM is a model, and should not be confused with reality. See my tag line below.
 
  • #73
akhmeteli said:
The problem of measurement in QM is well-known.

Apparently not as well known as you seem to think. I probably saw 10 papers last year on that subject (measurement problems), compared to perhaps 1000 on entanglement. So I would say the problem you identify is much less of a problem for the practicing physicist than you suggest.

Why don't you start a separate thread on the subject? Then we could discuss the evidence for your perspective.
 
  • #74
zonde said:
Well, it took some time
...
Probability that there is a photon with the given values of the hidden variables:
abs(sin(2*ph))

Polarization of the photon, i.e. whether it will pass the polarizer or not (+ sign means it will pass):
sign(sin(alpha + pol)^2 - cos(ph)^2)
this function actually determines whether the polarizer angle falls in the "passing" interval or the "absorbing" interval of the photon, so it can be described with intervals without using sine and cosine functions.

Detection (+ sign means it will be detected):
sign(cos(ph)^2-K) where K=sin(Pi/8)^2
again, this determines whether "ph" falls in a certain interval, and so it can be described without the cosine function
...
I have tried to incorporate your model into a simulation program (basically I have used the one from de Raet mentioned earlier and made it more object-oriented). Would you say the code below represents your proposal for the effect of the filter on a particle (I hope it is clear enough)?

Code:
       private void ParticleHitfromZonde(Particle Particle)
        {
            bool Pass = true;
            double HvProbability = Math.Abs(h.Sin(Particle.StaticPhaseDifference)); //Calculate HvProbability = |sin(ph)|; note the formula quoted above uses sin(2*ph)

            if (HvProbability < h.GetRandom())                                      //Get a random value between 0 and 1 
                                                                                    // and check whether HvProbability is lower
            {
                Pass=false;
            }
            if (Pass)
            {
                //use the other proposed formulas:
                int WillItPass = Math.Sign(h.SinSquare(this.Angle + Particle.Polarization) - h.CosSquare(Particle.StaticPhaseDifference));
                int Detection = Math.Sign(h.CosSquare(Particle.StaticPhaseDifference) - h.SinSquare(h.PiOver8));
                if (WillItPass < 0 || Detection < 0)
                {
                    Pass = false;
                }
            }
            Particle.Absorbed = !Pass;                                              //Absorbed is opposite of pass

        }
(this.Angle is the angle of the polarization filter)
 
Last edited:
  • #75
ajw1 said:
I have tried to incorporate your model in a simulation program (basically I have used the one from de Raedt mentioned earlier and made it more object-oriented). Would you say the code below represents your proposal for the effect of the filter on a particle (I hope it is clear enough)?

Code:
       private void ParticleHitfromZonde(Particle Particle)
        {
            bool Pass = true;
            double HvProbability = Math.Abs(h.Sin(Particle.StaticPhaseDifference)); //Calculate HvProbability

            if (HvProbability < h.GetRandom())                                      //Get a random value between 0 and 1 
                                                                                    // and check whether HvProbability is lower
            {
                Pass=false;
            }
            if (Pass)
            {
                //use the other proposed formulas:
                int WillItPass = Math.Sign(h.SinSquare(this.Angle + Particle.Polarization) - h.CosSquare(Particle.StaticPhaseDifference));
                int Detection = Math.Sign(h.CosSquare(Particle.StaticPhaseDifference) - h.SinSquare(h.PiOver8));
                if (WillItPass < 0 || Detection < 0)
                {
                    Pass = false;
                }
            }
            Particle.Absorbed = !Pass;                                              //Absorbed is opposite of pass

        }
(this.Angle is the angle of the polarization filter)

A couple of questions:

1. Each particle has properties .StaticPhaseDifference and .Polarization - are there any others?
2. Also, is .Polarization randomly assigned or similar?
3. What about .StaticPhaseDifference? How is its value assigned?

I want to follow the analogy myself because I am concerned about sleight of hand that subtly puts in a non-local factor.
 
  • #76
DrChinese said:
A couple of questions:

1. Each particle has properties .StaticPhaseDifference and .Polarization - are there any others?
2. Also, is .Polarization randomly assigned or similar?
3. What about .StaticPhaseDifference? How is its value assigned?

I want to follow the analogy myself because I am concerned about sleight of hand that subtly puts in a non-local factor.

1. Currently I have no other properties for the particles except .Absorbed and .DelayTime (the delay time is used for the de Raedt model)
2 and 3. On creation, both properties get a random value between 0 and 2pi. The properties of the second particle are then related to those of the first as specified by zonde

Code:
    public class Particle
    {
        public Particle()
        {
            this.Polarization = h.GetRandomTwoPiAngle();          // random polarization angle in [0, 2pi)
            this.StaticPhaseDifference = h.GetRandomTwoPiAngle(); // random phase hidden variable in [0, 2pi)
        }


Code:
                //Initiate entangled particles
                Particle Particle1 = new Particle();
                Particle Particle2 = new Particle();

                //polarization relation
                Particle2.Polarization = Particle1.Polarization + h.PiOver2; // polarization of particle 2

                //Zonde
                Particle2.StaticPhaseDifference = Particle1.StaticPhaseDifference + h.PiOver4; // phase hidden variable of particle 2
I can attach all the classes or, if you have access to a Visual Studio environment, the complete project
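
For reference, tying these pieces together, a minimal driver loop might look like the sketch below. This is my illustration only, not code from the actual project; it assumes a hypothetical Filter class exposing the ParticleHitfromZonde method shown earlier and a public Angle field.

Code:
                // Hypothetical driver sketch (not from the actual project):
                // assumes a Filter class exposing ParticleHitfromZonde and Angle.
                Filter filterAlice = new Filter { Angle = 0.0 };
                Filter filterBob   = new Filter { Angle = Math.PI / 8.0 };   // 22.5 degrees
                int iterations = 5000;

                int coincidences = 0;
                for (int i = 0; i < iterations; i++)
                {
                    // create an entangled pair with the relations above
                    Particle particle1 = new Particle();
                    Particle particle2 = new Particle();
                    particle2.Polarization = particle1.Polarization + h.PiOver2;
                    particle2.StaticPhaseDifference = particle1.StaticPhaseDifference + h.PiOver4;

                    filterAlice.ParticleHitfromZonde(particle1);
                    filterBob.ParticleHitfromZonde(particle2);

                    // count a coincidence when neither photon is absorbed
                    if (!particle1.Absorbed && !particle2.Absorbed)
                        coincidences++;
                }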
 
  • #77
You have Particle2.StaticPhaseDifference = Particle1.StaticPhaseDifference + h.PiOver4;

Is the piOver4 correct? Or is it supposed to be piOver2 as is polarization? That may be OK, I want to make sure though. It seems strange not to make them identical, when that is the premise.
 
  • #78
DrChinese said:
You have Particle2.StaticPhaseDifference = Particle1.StaticPhaseDifference + h.PiOver4;

Is the piOver4 correct? Or is it supposed to be piOver2 as is polarization? That may be OK, I want to make sure though. It seems strange not to make them identical, when that is the premise.
Yes, you're probably right:
zonde said:
... Here I just hypothesize that "phase" vectors are orthogonal for entangled pair...
The spreadsheet code however seems to be using PiOver4, or maybe I am missing something...
 
  • #79
I tried to follow the De Raedt example from the web site, but they hide their algorithm and dataset. I see some formulas here and there but how do you know what to do unless you can see the code? It should be very simple/straightforward - like yours - but I cannot find it. I do see a downloadable app but it is an EXE so I probably won't see how the data is generated. Oh well, I guess I will check it out.
 
  • #80
DrChinese said:
I tried to follow the De Raedt example from the web site, but they hide their algorithm and dataset. I see some formulas here and there but how do you know what to do unless you can see the code? It should be very simple/straightforward - like yours - but I cannot find it. I do see a downloadable app but it is an EXE so I probably won't see how the data is generated. Oh well, I guess I will check it out.

Their (Fortran) code is at the end of http://rugth30.phys.rug.nl/pdf/COMPHY3339.pdf
 
  • #81
ajw1 said:
Their (Fortran) code is at the end of http://rugth30.phys.rug.nl/pdf/COMPHY3339.pdf

Excellent, thanks. This should allow me to understand what they are doing. I am working through the code and should be able to provide an explanation of how it works. I should also be able to verify whether the model respects realism, which is of course a requirement.
 
  • #82
ajw1 said:
Their (Fortran) code is at the end of http://rugth30.phys.rug.nl/pdf/COMPHY3339.pdf

Can you help me decipher this statement:

k2=ceiling(abs(1-c2*c2)**(d/2)*r0/tau) ! delay time

this looks to me like:

k2=ceiling(abs(1-(c2*c2))**((d/2)*(r0/tau))) ! delay time

and since d=2 in the static case, this reduces to:

k2=ceiling( abs(1-(c2*c2))**(r0/tau) ) ! delay time

------------------------------------------------------------------------

After examining this statement, I believe I can find an explanation of how the computer algorithm manages to produce its results. It helps to know exactly how the bias must work. :smile: The De Raedt et al model uses the time window as the method of varying which events are detected (because that is how their fair sampling algorithm works). That means the time delay function must be - on average - such that events at some angle settings are more likely to be included, and events at other angle settings are less likely to be included. It actually does not matter what physical model they propose, because eventually they must all accomplish the same thing: the bias function must account for the difference between the graphs of the QM and LR correlation functions.
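
To make the mechanism concrete, the selection step amounts to something like this sketch (my illustration of the idea only, with hypothetical names; each detection carries a locally computed time tag):

Code:
        // Sketch of time-tag coincidence selection (illustration only):
        // each side computes its own delay locally; a pair is counted only
        // if the two time tags fall within the window W of each other.
        static bool Coincident(double tagAlice, double tagBob, double window)
        {
            return Math.Abs(tagAlice - tagBob) <= window;
        }

Because each delay depends on the local hidden variables and the local polarizer setting, which pairs survive the window varies with the angle settings - and that is exactly where the bias enters.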

Put simply: the difference between the LR correlation function and the QM correlation function must be zero at 0, 45, 90 and 135 degrees, because there is no difference between the graphs at those angles. But there is a difference at other angles. That same difference must be positive and maximal at angles like 22.5, 157.5 etc., and negative and minimal at angles like 67.5 and 112.5 etc. (Or maybe vice versa :smile: )

So we need an embedded bias function that has those parameters, and if their computer program is to work, we will be able to find it. Once we find it, we can then assess whether it truly models the actual experimental data. If we see it does, they win. Otherwise, they lose. Of course, my job is to challenge their model. First, I must find out how they do it.

So we know that their function must: i) alternate between positive and negative bias, ii) have zero crossings every 45 degrees (pi/4), and iii) have a period of 90 degrees (pi/2). It does not need to be perfect, because the underlying data isn't going to be perfect anyway. Any of this starting to look familiar? Why yes, that is just the kind of thing we saw in zonde's model.
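
For concreteness, the simplest function with properties i)-iii) is a sin(4*theta) modulation. This is my illustration, not necessarily the exact function embedded in their algorithm:

Code:
        // Minimal sketch of a bias function meeting all three requirements:
        // period pi/2, zero crossings every pi/4, positive peak at 22.5
        // degrees, negative trough at 67.5 degrees.
        static double Bias(double theta)
        {
            return Math.Sin(4.0 * theta);   // theta in radians
        }
        // Spot checks: Bias(0) = 0, Bias(pi/8) = +1, Bias(pi/4) = 0,
        // Bias(3*pi/8) = -1, Bias(pi/2) = 0.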
 
  • #83
So now, per my prior post on the De Raedt model:

Let's assume I can demonstrate how the bias function uses the delay to do its work (by affecting which events fall within the time window and are therefore counted). The next question is: does it model all of the data of the relevant Bell tests? Well, yes and no. Obviously they claim to produce QM-like data as far as was reported - YES in this regard. But most likely we will see that the traditional Bell test experimenters did not consider this clever twist - so perhaps NO in some respects. It should be possible to extend the actual experiments to show whether the De Raedt model is accurate or not. In fact, I believe I can show this without performing an experiment, once I run their algorithm myself.

I think I can safely give the De Raedts an A for coming up with a simulation that works as it does. As I have said previously, a simulation which produces a QM-like result is NOT the same as a local realistic theory. So such a simulation - ALONE and BY ITSELF - is NOT a disproof of the Bell Theorem. Because there are additional consequences of any local realistic theory, and if those are not considered then it cannot be a candidate. Again, this is why Santos has failed with stochastic models.
 
  • #84
DrChinese said:
Can you help me decipher this statement:

k2=ceiling(abs(1-c2*c2)**(d/2)*r0/tau) ! delay time

this looks to me like:

k2=ceiling(abs(1-(c2*c2))**((d/2)*(r0/tau))) ! delay time

and since d=2 in the static case, this reduces to:

k2=ceiling( abs(1-(c2*c2))**(r0/tau) ) ! delay time


The model produces the expected results when you use only d/2 for the power (exponent):
(1 - c2^2)^(d/2)
and then multiply by r0/tau. (I had the same problem, so I used a Fortran debugger to check the calculation. Attached is a graph of my simulation. Green is when one only assumes Malus for the photons, without the time tag.)
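
In C#-like terms, that grouping reads as follows - a sketch of the parse only, not the De Raedt code itself. (Fortran's ** binds tighter than *, so the original line already parses this way.)

Code:
        // Delay time with the exponent d/2 alone; the resulting power is
        // then multiplied by r0/tau (sketch of the parse, not the original).
        static int DelayTime(double c2, double d, double r0, double tau)
        {
            return (int)Math.Ceiling(Math.Pow(Math.Abs(1.0 - c2 * c2), d / 2.0) * r0 / tau);
        }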
 

Attachment: eprb.jpg (graph of the simulation)
  • #85
ajw1 said:
The model produces the expected results when you use only d/2 for the power (exponent):
(1 - c2^2)^(d/2)
and then multiply by r0/tau. (I had the same problem, so I used a Fortran debugger to check the calculation. Attached is a graph of my simulation. Green is when one only assumes Malus for the photons, without the time tag.)

So verifying that we: DO multiply the entire result by r0/tau, and do NOT multiply the exponent d/2 by r0/tau?

Graph looks great by the way.
 
  • #86
DrChinese said:
So verifying that we: DO multiply the entire result by r0/tau, and do NOT multiply the exponent d/2 by r0/tau?
That is correct
 
  • #87
ajw1 said:
That is correct

Thanks. I should have some more soon.
 
  • #88
DrChinese said:
I am OK with you not detecting all of the relevant pairs (because you have a subset). But for the subset of the ones you DO detect, you should be able to see the values for all 3 angles. That is the essence of realism.
Yes, of course. What else can I say?
Detection values are calculated separately for Alice and Bob, and coincidences are again calculated row by row, taking one value from Bob's data and one value from Alice's data. The final result is the sum of the coincidence values over all rows. Every row produces some result, either 0 or some positive value.

I suppose the only satisfactory answer for you is to test the model yourself.
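
For what it's worth, one natural reading of that description in code (a hedged sketch with hypothetical names - the spreadsheet may compute the per-row value differently):

Code:
        // Row-by-row coincidence sum as described: one value from Alice's
        // column and one from Bob's column per row; every row contributes
        // either 0 or some positive value.
        static double CoincidenceSum(double[] alice, double[] bob)
        {
            double sum = 0.0;
            for (int row = 0; row < alice.Length; row++)
            {
                sum += alice[row] * bob[row];   // 0 unless both sides registered
            }
            return sum;
        }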
 
  • #89
ajw1 said:
I have tried to incorporate your model in a simulation program (basically I have used the one from de Raedt mentioned earlier and made it more object-oriented). Would you say the code below represents your proposal for the effect of the filter on a particle (I hope it is clear enough)?

Code:
            double HvProbability = Math.Abs(h.Sin(Particle.StaticPhaseDifference));
It seems to me that a *2 is missing in that row (it should be Particle.StaticPhaseDifference*2 - not sure about the syntax). Everything else seems OK.
 
  • #90
DrChinese said:
4. You are completely wrong again, the violations are there every time. The thing you ignore is called the scientific method. There is no requirement in the method - EVER - that all loopholes be closed simultaneously to accept the results of an experiment. I would say in fact that this almost NEVER occurs in any scientific experiment. The normal technique is to vary one variable at a time and chart relationships. That is why science accepts the Bell test results. If everyone stuck their heads in the ground until "perfect" experiments were done (as you seem to suggest), we would have no science at all.
Good argument about addressing loopholes separately. But for that to work, the experiments should be basically the same. That is not the case with violations of Bell inequalities.
Another good method is to vary the parameter in question and analyze how the results depend on it.
So in this case it would be good to see photon experiments where the detection efficiency is varied and the coincidence rate (along with the correlations) is analyzed. And for experiments with efficient detection, the distance between the two entities would be the varying parameter.
But of course experiments like that would be quite challenging, because of the additional errors that would have to be taken into account as the parameter in question is varied (in the case of photons, and even more so for efficient detection), so we might not see them soon, if ever.
 
  • #91
zonde said:
Good argument about addressing loopholes separately. But for that to work, the experiments should be basically the same. ...

Once a hypothetical effect has been demonstrated not to exist, there is no requirement that the setup be identical when each effect is tested separately. That is generally accepted science, and that is why no experiment can be said to be truly loophole free.

Now, here is the admittedly far-fetched possibility. I call it the "combination safe" analogy. We have a combination safe which has 2 (or more) digits. The analogy is that each digit is a different test loophole. Knowledge of the first digit is not enough to open the safe. Knowledge of the second digit is not enough to open the safe. You must know both (loopholes) simultaneously to open the safe and find the loot inside. This is technically possible, again for any experiment, although there are some strict requirements for the loopholes in such a case. They must themselves have a relationship (i.e. they cannot be fully independent).
 
  • #92
DrChinese said:
Once a hypothetical effect has been demonstrated not to exist, there is no requirement that the setup be identical when each effect is tested separately. That is generally accepted science, and that is why no experiment can be said to be truly loophole free.
Two setups don't have to be identical, but they should be comparable, so that observations in the first experiment can reasonably be extended to the second. So they should share a significant part of the setup.

But that is not the case with photon Bell tests and matter Bell tests. There the setups are radically different.
 
  • #93
I rewrote the algorithm without all those sines and cosines squared. I don't know if it's interesting.

Another thing: thinking about the physical interpretation of this model, detector efficiency does not come into play in any way - there can be fair sampling at the detectors.
The core of the unfair sampling comes from a specific local interaction (interference) at the polarizer between the photon's own context wave and the entangled photon's empty context wave traveling with the photon.
That seems more in line with QM.
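
For illustration, the pass rule can indeed be recast as interval membership, with the trig evaluated only once per photon (my sketch of the equivalence - not necessarily the exact rewrite):

Code:
        // Interval form of sign(sin(alpha + pol)^2 - cos(ph)^2) > 0:
        // the photon passes iff (alpha + pol) mod pi lies strictly inside
        // (halfGap, pi - halfGap), where halfGap = arcsin(|cos(ph)|) is
        // computed once per photon.
        static bool PassesByInterval(double alpha, double pol, double ph)
        {
            double halfGap = Math.Asin(Math.Abs(Math.Cos(ph)));
            double rel = (alpha + pol) % Math.PI;
            if (rel < 0.0) rel += Math.PI;      // keep in [0, pi)
            return rel > halfGap && rel < Math.PI - halfGap;
        }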
 
  • #94
zonde said:
I rewrote the algorithm without all those sines and cosines squared. I don't know if it's interesting.

Another thing: thinking about the physical interpretation of this model, detector efficiency does not come into play in any way - there can be fair sampling at the detectors.
The core of the unfair sampling comes from a specific local interaction (interference) at the polarizer between the photon's own context wave and the entangled photon's empty context wave traveling with the photon.
That seems more in line with QM.
I'm surely interested. Maybe you can just attach a spreadsheet file with only the significant lines included (with all lines filled it would probably produce a very large file).
 
  • #95
ajw1 said:
I'm surely interested. Maybe you can just attach a spreadsheet file with only the significant lines included (with all lines filled it would probably produce a very large file).

I am working on a spreadsheet version using Excel.
 
  • #96
ajw1 said:
I'm surely interested. Maybe you can just attach a spreadsheet file with only the significant lines included (with all lines filled it would probably produce a very large file).

https://www.physicsforums.com/attachments/23167
I am using manual recalculation settings in Excel when working with models.
Another change in this file is that the uneven distribution of PH values is achieved right at the generation of its values (you will notice that the arccos function is used there). And it is a joint distribution for both photons, because that seems to make more sense than non-matching distributions for the two photons.
And the PH value is directly expressed as the size of the interval of angles for which the photon will pass the polarizer.
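
For reference, the standard inverse-transform trick produces such a distribution with an arccos. For the density abs(sin(2*ph)) restricted to [0, pi/2], the CDF is (1 - cos(2*ph))/2, so inverting a uniform draw gives the sketch below (one way to do it - not necessarily the exact spreadsheet formula):

Code:
        // Inverse-transform sampling sketch: generates ph with density
        // sin(2*ph) on [0, pi/2] using arccos (illustration only).
        static double SamplePh(Random rng)
        {
            double u = rng.NextDouble();              // uniform in [0, 1)
            return Math.Acos(1.0 - 2.0 * u) / 2.0;    // inverts F(ph) = (1 - cos(2*ph)) / 2
        }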
 
  • #97
I have put together a model that generates the attached values when run for the range 0 to 90 degrees, incrementing by 1 degree, with 5000 iterations for each pair of angles. The coincidence time window is k=30 ns (scaling is by the algorithm). This is a good representation of their model for Type II PDC, and follows their formula faithfully.

The purple line shows the sample, which is "close" to the QM predicted values (close being relative - keep in mind that Bell tests do not match the QM predictions perfectly either). This matches what they wanted for their model. The green line shows the full universe plot, which respects the Bell Inequality. This also matches what they wanted for their model. A few points to keep in the back of your mind as the discussion continues:

a. Because their full universe matches the LR boundary condition (so as to obey Bell), it obviously does NOT respect Malus. You can see that on the chart. So that is a nasty little issue to deal with. That is one of the reasons that folks say that no LR theory can agree with ALL of the predictions of QM. I think it has been long realized that this would be a result of any algorithm that could address the entanglement side of things.

b. Also, while it appears from the attached chart that Bell's Inequality is not violated for the full sample... that too is somewhat misleading. My spreadsheet documents the event-by-event portion in an explicitly realistic fashion. It accomplishes this by displaying the results of every iteration for any trial you want to run. It then models what happens if you could test particle 2 at an extra angle setting, 45 degrees offset from the main setting for particle 1. So the simulation shows a total of 3 measurements. Only 2 are physically possible in an actual experiment, but in the computer program 3 are possible while respecting the model. Because the LR boundary condition only works when there are NO events of a certain type, the presence of those events could mean that Bell's Inequality is violated after all. I will have a picture of this shortly in case the reasoning is not clear from my verbiage; a code sketch follows after point c.

c. I should soon have a diagram showing my original objection to their model, described at the beginning of this thread: that their model does not handle photon pairs that are not polarization entangled, although they explicitly claim it does. That cannot be seen from this chart.
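
Regarding point b: the counterfactual bookkeeping is easy in software precisely because a realistic model's outcome is a deterministic function of the hidden variables and the polarizer angle. A sketch of the idea, using zonde's pass rule as the outcome function (the angle values are examples only, not my exact spreadsheet layout):

Code:
        // Evaluate the SAME particle at several polarizer settings - possible
        // in software, impossible in a physical experiment (sketch only).
        static int Outcome(double angle, double pol, double ph)
        {
            // zonde's pass rule: +1 means the photon passes a polarizer at 'angle'
            return Math.Sign(Math.Pow(Math.Sin(angle + pol), 2) - Math.Pow(Math.Cos(ph), 2));
        }

        // Per iteration: particle 1 at its setting, particle 2 at its setting,
        // plus particle 2 at 45 degrees offset from particle 1's setting -
        // 3 recorded outcomes, only 2 of which are physically accessible.
        double theta1 = 0.0, theta2 = Math.PI / 8.0;               // example settings
        double pol1 = h.GetRandomTwoPiAngle(), ph1 = h.GetRandomTwoPiAngle();
        double pol2 = pol1 + h.PiOver2, ph2 = ph1 + h.PiOver4;     // pair relations from post #76
        int alice     = Outcome(theta1, pol1, ph1);
        int bobActual = Outcome(theta2, pol2, ph2);
        int bobExtra  = Outcome(theta1 + h.PiOver4, pol2, ph2);    // the 45-degree offset setting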
 

Attachment: DeRaedt.UnfairSampling.TypeIIEntanglement1.jpg
  • #98
DrChinese said:
The purple line shows the sample, which is "close" to the QM predicted values (close being relative - keep in mind that Bell tests do not match the QM predictions perfectly either). This matches what they wanted for their model. The green line shows the full universe plot, which respects the Bell Inequality. This also matches what they wanted for their model.
The result does not seem very good. I think it should fluctuate around the QM prediction, but it is consistently closer to the straight line. Isn't that so?

DrChinese said:
A few points to keep in the back of your mind as the discussion continues:

a. Because their full universe matches the LR boundary condition (so as to obey Bell), it obviously does NOT respect Malus. You can see that on the chart. So that is a nasty little issue to deal with. That is one of the reasons that folks say that no LR theory can agree with ALL of the predictions of QM. I think it has been long realized that this would be a result of any algorithm that could address the entanglement side of things.
This cannot be seen from the graph, because the reference in the graph is the relative polarization angle between the two photons, and not the polarization of the individual photons on one side relative to the polarizer. The model is silent about that, so it cannot be judged by it.

DrChinese said:
b. Also, while it appears from the attached chart that Bell's Inequality is not violated for the full sample... that too is somewhat misleading. My spreadsheet documents the event-by-event portion in an explicitly realistic fashion. It accomplishes this by displaying the results of every iteration for any trial you want to run. It then models what happens if you could test particle 2 at an extra angle setting, 45 degrees offset from the main setting for particle 1. So the simulation shows a total of 3 measurements. Only 2 are physically possible in an actual experiment, but in the computer program 3 are possible while respecting the model. Because the LR boundary condition only works when there are NO events of a certain type, the presence of those events could mean that Bell's Inequality is violated after all. I will have a picture of this shortly in case the reasoning is not clear from my verbiage.
A picture might help. But from what I understood, there is nothing wrong with an LR model if it can demonstrate different angle settings for one side while keeping the other side intact. That just makes the point that an element of reality is present.
 
  • #99
zonde said:
1. The result does not seem very good. I think it should fluctuate around the QM prediction, but it is consistently closer to the straight line. Isn't that so?

2. This cannot be seen from the graph, because the reference in the graph is the relative polarization angle between the two photons, and not the polarization of the individual photons on one side relative to the polarizer. The model is silent about that, so it cannot be judged by it.

3. A picture might help. But from what I understood, there is nothing wrong with an LR model if it can demonstrate different angle settings for one side while keeping the other side intact. That just makes the point that an element of reality is present.

1. It's not too bad. Ideally they would have something closer to the QM value. Because they achieve the result by the introduction of a random fluctuation, the result lands about halfway between the QM curve and the straight line.

You don't notice the issue on their graphs because they sample only at a few pairs of angle settings. My simulation fills in the gaps by running degree by degree across the full 90 degrees. To be fair, I do not consider their presentation in this regard misleading.

2. I don't agree.

3. After finishing the model last night, I checked this element out. It turns out the "suppressed cases" (2 of 8 permutations) worked out fine in their model, so as to not cause an issue.
 
  • #100
OK, I am attaching the XLSM file of my recreation of the De Raedt model to the other thread discussing the model explicitly. If it does not come across, send me a message with your email and I will send it to you directly. Anyone is welcome to look at the results. :smile:
 