Photon entanglement and fair sampling assumption

Summary:
The discussion centers on the validity of the fair sampling assumption in photon entanglement experiments, which is crucial for applying Bell's inequalities. It argues that since only a portion of emitted photons is detected, the assumption that the detected sample is representative may not hold, raising doubts about claims of nonlocality and Bell's inequality violations. Three proposed experiments aim to challenge this assumption: one examines correlations in three-photon entanglement, another tests the superposition of wavefunctions post-polarization, and the third investigates how detection efficiency impacts coincidence rates. The conversation highlights the need for rigorous testing against systematic errors and questions why such discussions are infrequent in the field. Overall, the validity of the fair sampling assumption remains a contentious issue that warrants further exploration.
  • #91
zonde said:
Good argument about addressing loopholes separately. But for that to work, the experiments should be basically the same. ...

Once the hypothetical effect is demonstrated not to exist, there is no requirement that the setup be identical for each effect separately. That is generally accepted science, and that is why no experiment can be said to be truly loophole free.

Now, here is the admittedly far-fetched possibility. I call it the "combination safe" analogy. We have a combination safe with 2 (or more) digits. The analogy is that each digit is a different test loophole. Knowledge of the first digit is not enough to open the safe. Knowledge of the second digit is not enough to open the safe. You must know both (loopholes) simultaneously to open the safe and find the loot inside. This is technically possible, again for any experiment, although there are some strict requirements on the loopholes in such a case: they must themselves have a relationship (i.e. they cannot be fully independent).
 
  • #92
DrChinese said:
Once the hypothetical effect is demonstrated not to exist, there is no requirement that the setup be identical for each effect separately. That is generally accepted science, and that is why no experiment can be said to be truly loophole free.
Two setups don't have to be identical, but they should be comparable, so that observations from the first experiment can reasonably be extended to the second experiment.
So they should share a significant part of the setup between them.

But that is not the case with photon Bell tests and matter Bell tests. There the setups are radically different.
 
  • #93
I rewrote the algorithm without those numerous squared sines and cosines. I don't know if it's of interest.

Another thing: thinking about the physical interpretation of this model, detector efficiency does not come into play in any way - there can be fair sampling at the detectors.
The core of the unfair sampling comes from a specific local interaction (interference) at the polarizer between the photon's own context wave and the entangled photon's empty context wave traveling with the photon.
That seems more in line with QM.
 
  • #94
zonde said:
I rewrote the algorithm without those numerous squared sines and cosines. I don't know if it's of interest.

Another thing: thinking about the physical interpretation of this model, detector efficiency does not come into play in any way - there can be fair sampling at the detectors.
The core of the unfair sampling comes from a specific local interaction (interference) at the polarizer between the photon's own context wave and the entangled photon's empty context wave traveling with the photon.
That seems more in line with QM.
I'm certainly interested. Maybe you can just attach a spreadsheet file with only the significant lines included (filling in all lines would probably produce a very large file).
 
  • #95
ajw1 said:
I'm certainly interested. Maybe you can just attach a spreadsheet file with only the significant lines included (filling in all lines would probably produce a very large file).

I am working on a spreadsheet version using Excel.
 
  • #96
ajw1 said:
I'm certainly interested. Maybe you can just attach a spreadsheet file with only the significant lines included (filling in all lines would probably produce a very large file).

https://www.physicsforums.com/attachments/23167
I am using manual recalculation settings in Excel when working with models.
Another change in this file is that the uneven distribution of PH values is achieved right at the generation of its values (you will notice that the arccos function is used there). And it is a joint distribution for both photons, because that seems to make more sense than non-matching distributions for the two photons.
And the PH value is directly expressed as the size of the interval of angles for which the photon will pass the polarizer.
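
Roughly, the idea in Python form (a sketch only, not the spreadsheet's actual formulas - the arccos-based draw, the variable names and the pass rule below are illustrative paraphrases of the description above):

Code:
import random, math

def generate_pair():
    # One shared polarization angle for the pair (both photons are given the
    # same polarization here for simplicity), plus a shared hidden value PH
    # drawn with an uneven, arccos-based distribution. PH is used directly
    # as the half-width of the angular interval within which a photon passes.
    theta = random.uniform(0.0, math.pi)
    ph = math.acos(random.uniform(-1.0, 1.0)) / 2.0   # assumed form, in [0, pi/2]
    return theta, ph

def passes(theta, ph, polarizer_angle):
    # Pass iff the polarization lies within +/- PH of the polarizer axis
    # (angles compared modulo 180 degrees).
    d = abs((theta - polarizer_angle + math.pi / 2) % math.pi - math.pi / 2)
    return d <= ph

# Example: estimate the coincidence rate for settings 0 and 30 degrees
a, b = math.radians(0), math.radians(30)
n, coinc = 20000, 0
for _ in range(n):
    theta, ph = generate_pair()
    if passes(theta, ph, a) and passes(theta, ph, b):
        coinc += 1
print(coinc / n)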
 
  • #97
I have put together a model that generates the attached values when run over the range 0 to 90 degrees, incrementing by 1 degree, with 5000 iterations for each pair of angles. The coincidence time window is k = 30 ns (scaling is handled by the algorithm). This is a good representation of their model for Type II PDC, and it follows their formula faithfully.
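
Before the points below, a rough Python sketch of the kind of event-by-event, time-window selection being simulated may help. This is an illustration of the mechanism only: the Malus-law outcome rule, the sin^2-based delay and its constants are placeholders, not the spreadsheet's or De Raedt's actual formulas; only the 30 ns window is taken from above.

Code:
import random, math

WINDOW_NS = 30.0    # coincidence window k = 30 ns (from the description above)
T0_NS     = 1000.0  # delay scale - an assumed placeholder value
D_EXP     = 4       # delay exponent - an assumed placeholder value

def measure(theta, setting):
    # Outcome: a simple Malus-law coin flip (a stand-in for the model's rule).
    # Time tag: the delay shrinks as the polarization aligns with the polarizer
    # axis; the functional form and constants are assumptions for illustration.
    delta = theta - setting
    outcome = 1 if random.random() < math.cos(delta) ** 2 else 0
    delay = T0_NS * abs(math.sin(2 * delta)) ** D_EXP * random.random()
    return outcome, delay

def correlate(a_deg, b_deg, iterations=5000):
    a, b = math.radians(a_deg), math.radians(b_deg)
    full = sampled = n_sampled = 0
    for _ in range(iterations):
        theta = random.uniform(0.0, math.pi)       # shared hidden polarization
        o1, t1 = measure(theta, a)
        o2, t2 = measure(theta + math.pi / 2, b)   # Type II: orthogonal partner
        match = 1 if o1 == o2 else 0
        full += match                              # "full universe" tally
        if abs(t1 - t2) <= WINDOW_NS:              # time-window subsample tally
            sampled += match
            n_sampled += 1
    return full / iterations, sampled / max(n_sampled, 1)

# Sweep the relative angle 0..90 degrees in 1-degree steps, as described above
for rel in range(0, 91):
    full_rate, sampled_rate = correlate(0, rel)
    print(rel, round(full_rate, 3), round(sampled_rate, 3))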

The purple line shows the sample, which is "close" to the QM predicted values (close being relative - keep in mind that Bell tests do not match the QM predictions perfectly either). This matches what they wanted for their model. The green line shows the full universe plot, which respects the Bell Inequality. This also matches what they wanted for their model. A few points to keep in the back of your mind as the discussion continues:

a. Because their full universe matches the LR boundary condition (so as to obey Bell), it obviously does NOT respect Malus. You can see that on the chart. So that is a nasty little issue to deal with. That is one of the reasons that folks say that no LR theory can agree with ALL of the predictions of QM. I think it has been long realized that this would be a result of any algorithm that could address the entanglement side of things.

b. Also, while it appears from the attached chart that Bell's Inequality is not violated for the full sample... that too is somewhat misleading. My spreadsheet documents the event-by-event portion in an explicitly realistic fashion. It accomplishes this by displaying the results of every iteration for any trial you want to run. It then models what happens if you could test particle 2 at an extra angle setting, offset 45 degrees from the main setting for particle 1. So the simulation shows a total of 3 measurements. Only 2 are physically possible in an actual experiment, but in the computer program 3 are possible while respecting the model. Because the LR boundary condition only works when there are NO events of a certain type, the presence of those events could mean that Bell's Inequality is violated after all. I will have a picture of this shortly in case the reasoning is not clear from my verbiage (a rough sketch of the bookkeeping follows after point c).

c. I should soon have a diagram showing my original objection to their model, described at the beginning of this thread: that their model does not handle photon pairs that are not polarization entangled, although they explicitly claim it does. That cannot be seen from this chart.
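
To make the bookkeeping in point b concrete, here is a rough Python sketch: every simulated event carries outcomes for all three settings, so a Bell-type counting inequality can be checked directly on the full universe. The pass rule, the specific angles, and the particular inequality used are illustrative assumptions, not the spreadsheet's actual formulas.

Code:
import random, math

def outcome(theta, ph, setting):
    # Deterministic pass/no-pass rule (an "element of reality" per event):
    # pass (1) iff the polarization lies within +/- PH of the polarizer axis.
    d = abs((theta - setting + math.pi / 2) % math.pi - math.pi / 2)
    return 1 if d <= ph else 0

A = math.radians(0)     # particle 1's main setting
B = math.radians(30)    # particle 2's actual setting (illustrative choice)
C = math.radians(45)    # the extra setting, offset 45 degrees from A

n = 50000
n_AB = n_AC = n_CB = 0
for _ in range(n):
    theta1 = random.uniform(0.0, math.pi)
    theta2 = theta1 + math.pi / 2                    # Type II: orthogonal partner
    ph = math.acos(random.uniform(-1.0, 1.0)) / 2.0  # shared hidden variable
    a = outcome(theta1, ph, A)   # particle 1 at A
    b = outcome(theta2, ph, B)   # particle 2 at B (physically measurable)
    c = outcome(theta2, ph, C)   # particle 2 at C (the counterfactual third result)
    # All three outcomes exist per event in the program, so tally Bell-type counts:
    n_AB += a * (1 - b)
    n_AC += a * (1 - c)
    n_CB += c * (1 - b)

# For any event-by-event realistic assignment the relation holds term by term,
# so it must hold for the full universe of events:
assert n_AB <= n_AC + n_CB
print(n_AB / n, n_AC / n, n_CB / n)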
 

Attachments

  • DeRaedt.UnfairSampling.TypeIIEntanglement1.jpg
  • #98
DrChinese said:
The purple line shows the sample, which is "close" to the QM predicted values (close being relative - keep in mind that Bell tests do not match the QM predictions perfectly either). This matches what they wanted for their model. The green line shows the full universe plot, which respects the Bell Inequality. This also matches what they wanted for their model.
The result does not seem very good. I think it should fluctuate around the QM prediction, but it is consistently closer to a straight line. Isn't that so?

DrChinese said:
A few points to keep in the back of your mind as the discussion continues:

a. Because their full universe matches the LR boundary condition (so as to obey Bell), it obviously does NOT respect Malus. You can see that on the chart. So that is a nasty little issue to deal with. That is one of the reasons that folks say that no LR theory can agree with ALL of the predictions of QM. I think it has been long realized that this would be a result of any algorithm that could address the entanglement side of things.
This cannot be seen from the graph, because the reference in the graph is the relative polarization angle between the two photons, not the polarization of the individual photons on one side relative to the polarizer. The model is silent about that, so it cannot be judged on that point.

DrChinese said:
b. Also, while it appears from the attached chart that Bell's Inequality is not violated for the full sample... that too is somewhat misleading. My spreadsheet documents the event-by-event portion in an explicitly realistic fashion. It accomplishes this by displaying the results of every iteration for any trial you want to run. It then models what happens if you could test particle 2 at an extra angle setting, offset 45 degrees from the main setting for particle 1. So the simulation shows a total of 3 measurements. Only 2 are physically possible in an actual experiment, but in the computer program 3 are possible while respecting the model. Because the LR boundary condition only works when there are NO events of a certain type, the presence of those events could mean that Bell's Inequality is violated after all. I will have a picture of this shortly in case the reasoning is not clear from my verbiage (a rough sketch of the bookkeeping follows after point c).
A picture might help. But from what I understood, there is nothing wrong with an LR model if it can demonstrate different angle settings for one side while keeping the other side intact. That just makes the point that an element of reality is present.
 
  • #99
zonde said:
1. The result does not seem very good. I think it should fluctuate around the QM prediction, but it is consistently closer to a straight line. Isn't that so?

2. This cannot be seen from the graph, because the reference in the graph is the relative polarization angle between the two photons, not the polarization of the individual photons on one side relative to the polarizer. The model is silent about that, so it cannot be judged on that point.

3. A picture might help. But from what I understood, there is nothing wrong with an LR model if it can demonstrate different angle settings for one side while keeping the other side intact. That just makes the point that an element of reality is present.

1. It's not too bad. Ideally they would have something closer to the QM value. Because they achieve the result by introducing a random fluctuation, the amount lands about halfway between.

You don't notice the issue on their graphs because they sample at only a few pairs of angle settings. My simulation fills in the gaps by running across the full 90 degrees, degree by degree. To be fair, I do not consider their presentation misleading in this regard.

2. I don't agree.

3. After finishing the model last night, I checked this element out. It turns out the "suppressed cases" (2 of the 8 permutations) worked out fine in their model, so they do not cause an issue.
 
  • #100
OK, I am attaching the XLSM file of my recreation of the De Raedt model to the other thread discussing the model explicitly. If it does not come across, send me a message with your email and I will send it to you directly. Anyone is welcome to look at the results. :smile:
 
