AndreiB said:
Either my explanation is true or it is not. I think the word "excuse" here is used to avoid accepting that my explanation is perfectly valid.
Your "explanations" contain two elements which contradict each other. On the one hand, you use typical common-sense reasoning that some interactions are not strong enough to have an influence; on the other hand, you use superdeterminism, where even the smallest imaginable modification would destroy the whole construction by destroying the correlation. One would be valid in a normal world, the other in that superdeterministic world. Using both together is inconsistent.
AndreiB said:
1. The model I proposed is in terms of classical EM. The Big Bang, the inflation period and all that cannot be described in terms of this model. So, let's stay in a regime where this model makes sense.
Photons flowing from far away toward the devices, and detectors of such photons which would turn the spin measurement detectors as necessary, can be described by classical EM (a photon can simply be described by a particular classical solution fulfilling the quantization condition). So, start the computation one second before the initialization, with the measurement done one second after the initialization, and with those photons which could modify the detector angles, if they exist, being 1.9 light-seconds away from their target detectors.
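The timing in this setup can be checked with a short sketch (my own illustration; the concrete coordinates, including the source-detector distance, are assumptions, only the 1.9 light-second figure comes from the post): a photon 1.9 light-seconds away when the computation starts can still reach its detector before the measurement, yet its initial position lies outside the past light cone of the pair preparation and cannot reach the other detector in time.

```python
# Toy 1+1D light-cone check (units: c = 1, times in seconds, distances in
# light-seconds). Coordinates below are illustrative assumptions.

def can_influence(event_a, event_b):
    """True if a signal at speed <= c starting at event_a can reach event_b."""
    t_a, x_a = event_a
    t_b, x_b = event_b
    return t_b >= t_a and abs(x_b - x_a) <= (t_b - t_a)

d = 1.0                        # assumed source-detector distance
preparation = (0.0, 0.0)       # pair preparation at the source
measurement_A = (1.0, d)       # measurement one second later at detector A
measurement_B = (1.0, -d)      # ... and at detector B
# Incoming photon, 1.9 light-seconds beyond detector A when the
# computation starts one second before the preparation:
photon_start = (-1.0, d + 1.9)

print(can_influence(photon_start, measurement_A))  # True: it can still turn detector A
print(can_influence(photon_start, preparation))    # False: outside that past light cone
print(can_influence(photon_start, measurement_B))  # False: cannot reach the other detector
```

So the photon's state at the start time is exactly the kind of initial value that can causally influence one detector but neither the pair preparation nor the other measurement.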
AndreiB said:
2. I agree that if you can prove that "the event which has created these photons has not been in the past light cone of the preparation of the pair" SD is dead. The question is, can you?
In GR without inflation this is well-known and trivial; in GR with inflation one would have to look back further, finding earlier causal events coming from even farther away. Or, similar to the classical EM picture above, we can start with the initial values not at the singularity but later, after these photons have been emitted. Last but not least, if you don't allow the computation to start at some quite arbitrary time, your computation fails even in your own theory, even with almighty computing power.
AndreiB said:
As far as I know there is no theory at this time that is capable of describing the Big-Bang. So all this is pure speculation.
The Big Bang, meaning the hot early phase of the universe where the average density was, say, similar to that of a neutron star, is understood quite well in standard cosmology. The singularity itself only shows that GR becomes invalid once the density becomes too large. But for the discussion of superdeterminism this is irrelevant anyway. The point I have made is that in your construction there will in any case be initial values which can have a causal influence on each of the detectors but not on the pair preparation or the corresponding other detector. If you accept this, there is no need for BB theory.
AndreiB said:
I don't think so. "Normal science" uses a certain model. The conclusions only apply if the model is apt for the experiment under investigation. For example, the kinetic theory of gases applies to an ideal gas. It works when the system is well approximated by such a model. If your gas is far from ideal you don't stubbornly insist on this model, you change it. In the case of Bell's theorem the model is Newtonian mechanics with contact forces only. Such a model is inappropriate for describing EM phenomena, even classical EM phenomena like induction. So, it is no wonder that the model fails to reproduce QM.
Sorry, but this is nonsense. Bell's theorem presupposes only EPR realism (which does not even mention Newton or contact forces) and Einstein causality. Classical EM fits into the theorem and thus cannot violate the BI. GR too. Realistic quantum interpretations like dBB fulfill EPR realism but violate Einstein causality (though not classical causality).
Other variants of the proof rely only on causality, and what they need is the common cause principle. That's all.
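What "violating the BI" means can be made concrete with a short sketch (my own illustration, not from the post): for the singlet state the quantum correlation is ##E(a,b) = -\cos(a-b)##, and with the standard angle choices the CHSH combination reaches ##2\sqrt{2}##, above the bound of 2 that the theorem imposes on any model satisfying EPR realism and Einstein causality.

```python
# Sketch: CHSH value of the quantum singlet state vs. the local-realist
# bound of 2. Angles are the standard choices maximizing the violation.
import math

def E(a, b):
    # Quantum correlation of spin measurements along angles a, b
    # on the singlet state
    return -math.cos(a - b)

a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

chsh = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(chsh)   # ≈ 2.828 = 2*sqrt(2), above the local bound of 2
```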
AndreiB said:
"Sufficiently simple" is a loaded term.
Once you start thinking about computations with ##10^{26}## particles, let's take much fewer, say, ##10^9## particles. And let's call an explanation sufficiently simple if it can be shown using computations with fewer than ##10^9## particles. That would be fair enough, would it not?
AndreiB said:
And I didn't claim that the statistics do not remain stable in this case, I just don't know.
If you are right and the function behaves well, great: we will be able to compute the classical EM prediction for a Bell test. When such a computation is done, we will see whether it turns out right or wrong.
AndreiB said:
I don't get your point about PI. Clearly, the digits of PI are not independent since they are determined by a quite simple algorithm. Two machines calculating PI would be perfectly correlated.
But if you have two quite correlated sequences of digits, and add the sequence of the digits of ##\pi## mod 10 to one but not the other, the correlation disappears.
AndreiB said:
IF Bell correlations are caused by long-range interactions between the experimental parts one should be able to prepare macroscopic entangled states. I am not aware of any attempt of doing so.
I see no basis for this. Entanglement of macroscopic states is destroyed by any interaction with the environment; this is called decoherence.
AndreiB said:
Clearly, all theories with long-range interactions are falsifiable. Classical EM, GR, fluid mechanics have been tested a lot. I have laid out my argument why such theories could be superdeterministic. We will know that when the function is computed. Until then you cannot rule them out.
No. There is a level of insanity at which such philosophical theories become impossible in principle to rule out. You cannot rule out, say, solipsism. Superdeterminism is in the same category: it is impossible to rule it out. Your computations would be simple computations within normal theories, without any connection to superdeterminism beyond your words. The only thing one can do with superdeterminism is to recognize that it makes no sense and to ignore it.
AndreiB said:
But there is a causal justification. There is a long-range interaction involved that determines the hidden variable. "Normal" science does not assume independence when this is the case.
False. Common sense makes a sharp distinction between accidental influences and systematic influences which can explain stable correlations.
AndreiB said:
As far as I can say, the independence assumption was falsified by Bell tests + the EPR argument. Locality can only be maintained if the independence assumption fails. And no violation of locality was ever witnessed in "normal science", right?
Completely wrong. The independence assumption is much more fundamental than the experimental upper bounds found so far for causal influences. You cannot falsify GR by observations on soccer fields. If you evaluate what you see on a soccer field, you will not even start to question GR. Same here: you will not even start questioning the independence assumption or the common cause principle because of some observations on the quantum field. Instead, you use them to find out what the experiment tells us. And it tells us that Einstein causality is violated.
AndreiB said:
There is nothing wrong with my logic. If you can't prove a violation (and you can't) you cannot just assume one.
Thanks for making my point. Except that you have to replace your "you" with "I". You just assume a violation of the common cause principle.
AndreiB said:
I think you forget the really important point that without SD you have non-locality. Your arguments based on what "normal science" assumes or not do not work for this scenario, since, when you factor in the strong evidence for locality, the initial probability for a violation of the statistical independence is increased many orders of magnitude.
As explained, non-locality is not really an issue; science developed nicely during the time of non-local Newtonian gravity. Moreover, quantum non-locality is unproblematic because it does not appear at all without some special preparation procedures.
Instead, with superdeterminism we can no longer do any statistical science at all.