# Superdeterminism and the hidden variable

Summary:
Bell's superdeterminism seems to imply that there is a hidden variable.
Bell's superdeterminism seems to imply that there is a hidden variable. Please correct me if I am wrong, but that seems like classical physics.

bobob
Gold Member
Superdeterminism has no physical content.

Nugatory
Mentor
Bell's superdeterminism....
Bell's? When did he embrace superdeterminism?
.... that seems like classical physics.
Of course - that's why it was invented. Superdeterminism is one way (the only way, now that the other "loopholes" are closing) of reconciling classical physics with the experimentally observed violations of Bell's inequality.

Superdeterminism has a number of problems: it is not experimentally testable (as @bobob says, "no physical content") and the mechanisms it requires seem wildly implausible to most people. It is pretty much a matter of personal taste whether retaining the classical worldview is worth accepting these problems.

You wrote, "and the mechanisms it requires seem wildly implausible to most people."
What are those wildly implausible requirements? Superdeterminism seems reasonable. OK, it is not experimentally testable. It is also not experimentally possible to determine whether there are unicorns outside the light cone, yet I think it is reasonable to say that there are no unicorns beyond the observable universe. Do I absolutely know that there are none? No! Science is not based on absolute certainty. Can I say that the earth goes around the sun? I am "only" 99.99999% sure.

swampwiz

I love it when he says that when students ask how such a weird thing can be true, the answer is to give up physics. He also says that modern physicists are like a guy who tells you how to get the right results from your calculator, but your question as to how the calculator works is forbidden! I want to know about reality, not how to push the right buttons to get the answer to a math problem.

Nugatory
Mentor
What are those wildly implausible requirements?
Suppose I design a clever automated device with a polarizing filter and a chamber into which we can insert a billet of uranium; the device sets its orientation for each measurement according to the pattern of random radioactive decay in that uranium billet. I make two copies of my design blueprints; one goes into storage on earth and the other goes into something like the Voyager spacecraft. A few tens of millennia later the spacecraft reaches an inhabited planet, and the alien physicists there build the machine according to the blueprints I sent them, including locating an ore deposit and mining and refining some uranium. Meanwhile my remote descendants are doing the same thing with the blueprints left back on earth. After a decade or so of exchanging radio messages to confirm that both sides have set up their devices, some entangled photon pairs are generated and sent to both detectors (another few years), and then the results are shared by radio (even more years).... and it is seen that Bell's inequality has been violated.

The superdeterminist explanation is that there is a relationship between the decay patterns of two ostensibly independent pieces of uranium mined and refined on different planets light-years apart and the BBO crystal we’re using to generate our entangled photon pairs. It’s possible - all three deterministically evolved from the same cloud of intergalactic schmutz a few billion years ago - but I feel justified in applying adjectives like “extraordinary” and “implausible” to that possibility.

Bobob wrote, "Superdeterminism has no physical content."
I don't know what that means. Are you saying that there is no experiment that verifies superdeterminism? OK, so does the theory that there are not 8,000,000 unicorns just beyond the observable universe have no physical content? If a theory is reasonable, such as superdeterminism, and doesn't require abandoning logic (Copenhagen violates the law of the excluded middle), I am more likely to accept it.

Nugatory, I wrote that before your response. I have to go to my fiancée's house. I will print it and ponder it.

Thank you! I love pondering!

Bell's? When did he embrace superdeterminism?
Of course - that's why it was invented. Superdeterminism is one way (the only way, now that the other "loopholes" are closing) of reconciling classical physics with the experimentally observed violations of Bell's inequality.

No. Classical physics (or, better in this case, classical metaphysics) does not have any problem with violations of Bell's inequality. The straightforward classical way is to accept a preferred frame - as in the classical interpretation of special relativity following Lorentz. Non-locality is not a big problem for classical physics, given that Newtonian gravity was non-local too. It is relativity (in its fundamental interpretation) which has a problem with violations of Bell's inequality.

atyy
Summary:: Bell's superdeterminism seems to imply that there is a hidden variable.

Bell's superdeterminism seems to imply that there is a hidden variable. Please correct me if I am wrong, but that seems like classical physics.

Yes, superdeterminism is a sort of hidden variable theory. Unlike a hidden variable theory like Bohmian Mechanics, superdeterministic hidden variables can be local. However, it is difficult (impossible?) for us to come up with a useful hidden variable theory, since it is likely to depend on fine tuning of many details in the past.

All time/CPT-symmetric formulations (with boundary conditions in the past and future), like the least action principle or Feynman ensembles of paths/diagrams/histories of the universe, are superdeterministic.

For example, the scattering matrix uses boundary conditions in the past and future ( https://en.wikipedia.org/wiki/S-matrix#Interaction_picture ):
##S_{fi} = \lim_{t_2\to +\infty} \lim_{t_1\to-\infty} \langle \Phi_f|U(t_2,t_1)|\Phi_i\rangle ##

Demystifier
Gold Member
All time/CPT-symmetric formulations (with boundary conditions in the past and future), like the least action principle or Feynman ensembles of paths/diagrams/histories of the universe, are superdeterministic.

For example, the scattering matrix uses boundary conditions in the past and future ( https://en.wikipedia.org/wiki/S-matrix#Interaction_picture ):
##S_{fi} = \lim_{t_2\to +\infty} \lim_{t_1\to-\infty} \langle \Phi_f|U(t_2,t_1)|\Phi_i\rangle ##
That has nothing to do with superdeterminism.

The best person for this topic is probably Ken Wharton, who claims that superdeterministic explanations (now regaining popularity thanks to Sabine Hossenfelder) are equivalent to time-symmetric ones - some of Ken's papers and slides are on our seminar webpage: http://th.if.uj.edu.pl/~dudaj/QMFNoT

Let me elaborate on how I see it.
We usually think about determinism as evolution from past boundary conditions, as in the Euler-Lagrange or Schrödinger equation.
But in time/CPT-symmetric formulations, e.g. the least action principle or Feynman path integrals, trajectories are derived from boundary conditions in both the past and the future.
The difference is that e.g. the least action principle requires only values at both boundaries, while Euler-Lagrange requires values and derivatives at one boundary.

In superdeterminism we imagine that the state is already chosen to be compatible with all future measurements.
In symmetric formulations this is not a surprise, as everything depends on both past and future; we can imagine that the state contains this additional future-dependent information, e.g. in the additional derivatives required by Euler-Lagrange.
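The contrast between the two kinds of boundary data can be illustrated on a toy system. Below is a minimal sketch (assuming NumPy and SciPy are available; the harmonic oscillator and the interval [0, 1] are arbitrary illustrative choices) showing that prescribing value and derivative at one boundary, or values at both boundaries, picks out the same trajectory:

```python
import numpy as np
from scipy.integrate import solve_ivp, solve_bvp

# Toy model: harmonic oscillator x'' = -x on t in [0, 1].
# With x(0)=1, x'(0)=0 the exact solution is x(t) = cos(t).
T = 1.0

# Initial-value formulation (Euler-Lagrange style):
# value AND derivative prescribed at t = 0.
ivp = solve_ivp(lambda t, y: [y[1], -y[0]], (0, T), [1.0, 0.0],
                dense_output=True, rtol=1e-9, atol=1e-9)

# Boundary-value formulation (least-action style):
# only the values x(0) and x(T) are prescribed.
def rhs(t, y):
    return np.vstack([y[1], -y[0]])

def bc(ya, yb):
    return np.array([ya[0] - 1.0, yb[0] - np.cos(T)])

t_grid = np.linspace(0, T, 50)
bvp = solve_bvp(rhs, bc, t_grid, np.zeros((2, t_grid.size)), tol=1e-8)

# Both specifications single out the same trajectory x(t) = cos(t).
t_mid = 0.5
err_ivp = abs(ivp.sol(t_mid)[0] - np.cos(t_mid))
err_bvp = abs(bvp.sol(t_mid)[0] - np.cos(t_mid))
print(err_ivp < 1e-6, err_bvp < 1e-6)   # True True
```

The same physical trajectory is fixed either by data at one time (value plus derivative) or by data at two times (values only), which is the distinction drawn above.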

Nugatory
Mentor
Closed while the moderators fetch a mop and a bucket.

... which has now been employed to clean up a long digression from the original thread topic. The thread is open again.

RUTA
Here are some things on superdeterminism you might find interesting:

Sabine Hossenfelder

Gerard ‘t Hooft

Gerard ‘t Hooft

’t Hooft is upset by the fact that QM only gives us averages, it does not tell us what each particle is doing, i.e., it doesn’t tell us “what reality is.” But, as I pointed out in this ScienceX News article regarding Bell states, that’s because there is no hidden reality. Reality is not fundamentally governed by causal mechanisms, but by principles.

But, as I pointed out in this ScienceX News article regarding Bell states, that’s because there is no hidden reality. Reality is not fundamentally governed by causal mechanisms, but by principles.
So you essentially propose to give up causality as a guiding principle, without anything in exchange. Here you will probably object, given that you propose to use the "principle fashion" instead. But, sorry, nothing prevents the use of principle explanations together with causal explanations. There is no either-or between principle explanations and causal explanations. So the principle explanations are not a replacement, not a gain we obtain by throwing away causal explanations. Your proposal therefore means giving up causal explanations without gaining anything.

But, of course, it may happen that we are forced to give up causal explanations. How can we find out if we are in such a situation? The answer is straightforward and simple: there will be something which does not allow any causal explanation. Imaginary examples of such falsifications of causality can be found already in many myths, like the story of Oedipus, and, in fact, in almost every story where somebody learns about a prophecy not in his favor and tries everything imaginable to prevent it, without success. Of course, behind those myths are plausibly real experiences of children faced with predictions by adults of what would be the long-term consequences of their stupid behavior, predictions which have causal explanations, but the children don't know them. Similarly, if we see such an empirical "falsification" of causality, we should be aware of the possibility that there nonetheless exists a causal explanation which is yet unknown to us.

So, before rejecting causality, there should first be some experiment which superficially falsifies causality - meaning that we actually don't have any causal explanation. But then there should be even more: sufficiently strong evidence that no causal explanation is possible even in principle.

What is the situation for Bell's theorem? It is far away from this. We have causal explanations for the observed violations of the Bell inequalities, namely all the realist causal interpretations of QT (de Broglie-Bohm, Nelsonian stochastics, Caticha's entropic dynamics). The only objection against them is metaphysical: They require a (hidden) preferred frame in the relativistic context.

So, relativistic symmetry is emergent, not fundamental. Big deal. We know anyway that the actual field theories we have are emergent, effective theories (mainly because gravity is non-renormalizable), so the hypothesis that their symmetries are not emergent but fundamental is pure speculation based on nothing. Then, we have everyday examples of emergent Lorentz symmetry: every wave equation ##(\partial_t^2 - c_m^2 \Delta + V(u))\, u(\vec{x},t) = 0##, with ##c_m## a constant characteristic of this sort of waves, has a Lorentz symmetry with that ##c_m## instead of ##c##: it allows one to compute a Doppler-shifted solution for every solution of that wave equation. Of course, for sound waves we know that this symmetry is only a large-scale approximation, thus emergent.
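This symmetry is easy to verify symbolically for the 1D wave equation without potential. A minimal sketch (assuming SymPy; F and G are arbitrary function symbols standing in for the general d'Alembert solution) checks that a solution written in coordinates boosted with the wave speed still satisfies the original equation:

```python
import sympy as sp

x, t, v, c = sp.symbols('x t v c', positive=True)
gamma = 1 / sp.sqrt(1 - v**2 / c**2)

# Lorentz boost built with the wave speed (playing the role of c_m)
xp = gamma * (x - v * t)
tp = gamma * (t - v * x / c**2)

F, G = sp.Function('F'), sp.Function('G')
# general d'Alembert solution of the 1D wave equation, written in the
# boosted coordinates: u = F(x' - c t') + G(x' + c t')
u = F(xp - c * tp) + G(xp + c * tp)

# apply the wave operator in the ORIGINAL coordinates
residual = sp.diff(u, t, 2) - c**2 * sp.diff(u, x, 2)
print(sp.simplify(residual))  # 0
```

The residual vanishes identically: boosting a solution with the wave's own propagation speed yields another solution, exactly the emergent Lorentz symmetry described above.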

Are there other objections against the realist causal interpretations? No.

What is the situation for Bell's theorem? It is far away from this. We have causal explanations for the observed violations of the Bell inequalities, namely all the realist causal interpretations of QT (de Broglie-Bohm, Nelsonian stochastics, Caticha's)
Could you elaborate?

I prefer Mermin's inequality as more obvious than Bell's, hence its violation is harder to accept:
For binary variables A, B, C: Pr(A=B) + Pr(A=C) + Pr(B=C) >= 1
Intuitively: "tossing 3 coins, at least 2 give the same value" - it looks completely obvious, but the QM formalism allows one to violate it (e.g. https://arxiv.org/pdf/1212.5214 ).

The derivation of this inequality doesn't use a locality assumption, only the existence of a joint probability distribution and the Kolmogorov axioms ( https://en.wikipedia.org/wiki/Probability_axioms ) - could you briefly say how these interpretations explain its violation?
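The classical side of the claim is easy to check numerically: any joint distribution Pr(ABC) whatsoever satisfies the bound. A small Monte Carlo sketch (assuming NumPy; the sample count and seed are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
worst = np.inf
for _ in range(10_000):
    p = rng.random(8)
    p /= p.sum()    # random joint distribution Pr(ABC) over outcomes 000..111
    pr = {format(i, "03b"): p[i] for i in range(8)}
    s = (pr["000"] + pr["001"] + pr["110"] + pr["111"]    # Pr(A=B)
       + pr["000"] + pr["010"] + pr["101"] + pr["111"]    # Pr(A=C)
       + pr["000"] + pr["100"] + pr["011"] + pr["111"])   # Pr(B=C)
    worst = min(worst, s)

# every sampled joint distribution respects the bound
print(worst >= 1 - 1e-12)   # True
```

No sampled distribution ever dips below 1, consistent with the algebraic fact that the sum equals 1 + 2(Pr(000) + Pr(111)).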

RUTA
The point of the addendum to my post about principle explanation is in response to ’t Hooft’s complaint that QM only gives averages. He thinks that fact entails that something is missing from QM. Our paper shows that average-only conservation is necessarily the best instantiation the conservation principle can have, if the QM correlations are correct (and they have been verified to 8 sigma).

Keep in mind that a principle explanation is simply the mathematical consequence of some empirical fact. So, whatever constructive counterpart you want to propose will have to be in accord with the principle explanation, unless you are refuting the mathematical consequence, which in this case has been tested to 8 sigma.

Whatever causal mechanism you want to propose that leads to average-only conservation is going to be very deviant from what is typically meant by a causal mechanism, e.g., "Principle theories, constructive theories, and explanation in modern physics." Wesley Van Camp. Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 42(1), 23-31 (2011):
The interpretive work that must be done is less in coming up with a constructive theory and thereby explaining puzzling quantum phenomena, but more in explaining why the interpretation counts as explanatory at all given that it must sacrifice some key aspect of the traditional understanding of causal-mechanical explanation.
So, the search for a constructive account of Bell state entanglement is starting to sound a lot like the aether as a causal mechanism for SR.

The point is, we have one and the same principle explanation for the mysteries of SR and QM while both are still without consensus constructive counterparts after 115 and 85 years, respectively. This is my response to ‘t Hooft’s motivation for superdeterminism.

Derivation of this inequality doesn't use locality assumption, only existence of joint probability distribution and Kolmogorov axioms ( https://en.wikipedia.org/wiki/Probability_axioms ) - could you briefly say how these interpretations can explain its violation?
Your link refers to a variant of the theorem which starts with the assumption of counterfactual existence. This is not less, but more than what Bell uses, because the first part of his proof (in the paper itself just a few lines, with a verbal reference to EPR before the first formula, ignored by many readers; only later did he recognize this and emphasize this part in more detail as a very essential one) shows that counterfactual existence follows from Einstein causality and the EPR criterion of reality.

Because Einstein causality does not hold in the realist interpretations, one cannot use it to prove counterfactual existence. And in itself there is no counterfactual existence: the measurement device of a general measurement also has a trajectory ##q_{dev}(t)\in Q_{dev}##, and the "measurement result" of a general measurement depends not only on the trajectory of the system but also on the trajectory of the measurement device (and in particular on its initial value). The only exception would be the measurement of the configuration itself; whenever some non-configuration measurement is involved, the measurement result depends on the measurement device too. So let's follow Bell's recommendation in "Against Measurement" and rename "measurement of A" to something like "result of interaction with device A", and so on.

So only the configuration variables have counterfactual existence; everything else does not - all other "measurements" are contextual. (This also includes measurements of other maximal sets of commuting variables: even if they contain some configuration variables, the "measurement" even of those configuration variables can then be influenced by the rest of the context.)

The derivation of the Pr(A=B) + Pr(A=C) + Pr(B=C) >= 1 inequality uses just two assumptions:
1) There exists a joint probability distribution Pr(ABC) ##(\sum_{ABC} Pr(ABC) = 1)##,
2) the Kolmogorov axioms, especially the 3rd one: https://en.wikipedia.org/wiki/Probability_axioms#Third_axiom

Derivation: from 3rd axiom
Pr(A=B) = Pr(000) + Pr(001) + Pr(110) + Pr(111)
Pr(A=C) = Pr(000) + Pr(010) + Pr(101) + Pr(111)
Pr(B=C) = Pr(000) + Pr(100) + Pr(011) + Pr(111)
getting
Pr(A=B) + Pr(A=C) + Pr(B=C) = 2Pr(000) + 2Pr(111) + ##\sum_{ABC} Pr(ABC)## >= 1
But the QM formalism allows one to violate it (e.g. section 4.2.1 in Preskill's notes http://theory.caltech.edu/~preskill/ph229/notes/chap4_01.pdf ).
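The violation can be reproduced with a direct calculation on the Bell state ##(|00\rangle+|11\rangle)/\sqrt{2}##, measuring both sides along directions separated by 120°. A minimal NumPy sketch (the angle choice 0°, 120°, 240° is the standard one yielding the value 3/4):

```python
import numpy as np

# Bell state |Phi+> = (|00> + |11>)/sqrt(2), basis order |00>,|01>,|10>,|11>
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

def up(theta):
    # '+1' eigenvector of cos(theta)*sigma_z + sin(theta)*sigma_x
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def p_same(a, b):
    # probability both sides obtain the same outcome for settings a, b
    ua, ub = up(a), up(b)
    da = np.array([-ua[1], ua[0]])        # orthogonal ('-1') eigenvector
    db = np.array([-ub[1], ub[0]])
    p_plus_plus = abs(np.kron(ua, ub) @ phi) ** 2
    p_minus_minus = abs(np.kron(da, db) @ phi) ** 2
    return p_plus_plus + p_minus_minus

A, B, C = 0.0, 2 * np.pi / 3, 4 * np.pi / 3
total = p_same(A, B) + p_same(A, C) + p_same(B, C)
print(round(total, 3))   # 0.75 - below the classical bound of 1
```

Each pair of settings gives P(same) = cos²(60°) = 1/4, so the sum is 3/4 < 1: the "at least two coins agree" bound fails for these quantum statistics.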

So at least one of the two above assumptions is not satisfied by physics - which one?

I would say that QM replaces the 3rd axiom with the Born rule:
- 3rd axiom: the probability of an alternative of disjoint events is the sum of their probabilities,
- Born rule: the probability of an alternative of disjoint events is proportional to the square of the sum of their amplitudes.
The problem is understanding the Born rule, but using the mathematical similarity between Feynman and Boltzmann path ensembles, we can get a Born-like rule in the Ising model, which brings nice intuitions: https://www.physicsforums.com/threa...sing-model-feynman-boltzmann-ensemble.995234/
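The difference between the two rules can be shown in one line: for two disjoint alternatives with opposite-phase amplitudes, adding probabilities gives 1 while adding amplitudes first gives 0. A minimal sketch (assuming NumPy; the equal-magnitude amplitudes are an arbitrary illustrative choice):

```python
import numpy as np

# two "disjoint" alternatives (think: two slits) with equal-magnitude amplitudes
a1 = 1 / np.sqrt(2)
a2 = -1 / np.sqrt(2)          # relative phase pi

p1, p2 = abs(a1) ** 2, abs(a2) ** 2
third_axiom = p1 + p2         # Kolmogorov: add probabilities of disjoint events
born = abs(a1 + a2) ** 2      # Born rule: add amplitudes first, then square

print(round(third_axiom, 12))  # 1.0
print(born)                    # 0.0  (destructive interference)
```

The third axiom can only add non-negative probabilities, while the Born rule lets amplitudes cancel, which is exactly the freedom used to violate the inequality above.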

1) There exists joint probability distribution Pr(ABC) ##(\sum_{ABC} Pr(ABC) =1)##,
2) Kolmogorov axioms,
So at least one of the two above assumptions is not satisfied by physics - which one?
The first one. It assumes that the "measurement" results are predefined - that measuring A on the first copy does not change A, B or C for the second one. In the Bell tests, one hopes to ensure this by making both measurements spacelike separated, so that if Einstein causality holds, neither measurement can causally influence the other measurement or its result. But in the realist interpretations such an influence explicitly exists: if Alice measures A and obtains 1, an immediate causal influence guarantees that Bob measuring A afterward will obtain 1 too.

Keep in mind that principle explanation is simply the mathematical consequence of some empirical fact. So, whatever constructive counterpart you want to propose will have to be in accord with the principle explanation unless you are refuting the mathematical consequence, which in this case has been tested to 8 sigma.
Ok. Even if there is a difference between a principle and the empirical facts supporting it, such principles are indeed quite close to the empirical facts and, therefore, unable to explain them. A "principle" based on observational facts cannot be an explanation of those observational facts.

The number of sigma is, in fact, quite irrelevant: even for a "principle" which holds only at one or two sigma it is worth searching for causal explanations, and even a "principle" which holds at 10 sigma can fail at a more fundamental level. That the causal explanation has to agree with the facts is clear, and for this purpose principle theories remain useful as a mathematical tool in the search for a constructive theory, which gives a causal explanation. All one needs, quite independently of the number of sigma given by the experiments, is that the equations of the principle theory can be derived from the constructive theory in some limit.
"Principle theories, constructive theories, and explanation in modern physics." Wesley Van Camp. Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 42(1), 23-31 (2011):
The interpretive work that must be done is less in coming up with a constructive theory and thereby explaining puzzling quantum phenomena, but more in explaining why the interpretation counts as explanatory at all given that it must sacrifice some key aspect of the traditional understanding of causal-mechanical explanation.
So, the search for a constructive account of Bell state entanglement is starting to sound a lot like the aether as a causal mechanism for SR.
The point being? That discussing interpretations of quantum theory is now allowed here, while discussing interpretations of relativity is still anathema?
The point is, we have one and the same principle explanation for the mysteries of SR and QM while both are still without consensus constructive counterparts after 115 and 85 years, respectively. This is my response to ‘t Hooft’s motivation for superdeterminism.
Both have constructive counterparts.

For SR, there was always the Lorentz ether. Today, there is more, namely for the SM of particle physics, we have
Schmelzer, I. (2009). A Condensed Matter Interpretation of SM Fermions and Gauge Fields, Found. Phys. 39(1) 73-107, resp. arxiv:0908.0591.

And for GR, we have
Schmelzer, I. (2012). A Generalization of the Lorentz Ether to Gravity with General-Relativistic Limit. Advances in Applied Clifford Algebras 22(1), 203-242, resp. arxiv:gr-qc/0205035.

There is also Jacobson's Einstein aether, but it does not really look like a constructive theory, it seems it is more a technical way to find out where one can search for preferred frame effects violating GR.

For quantum theory, we have all the realist interpretations, in particular

Bohm, D. (1952). A suggested interpretation of the quantum theory in terms of "hidden" variables, Phys. Rev. 85, 166-193.

Nelson, E. (1966). Derivation of the Schrödinger Equation from Newtonian Mechanics, Phys.Rev. 150, 1079-1085

Caticha, A. (2011). Entropic Dynamics, Time and Quantum Theory, J. Phys. A44:225303, arxiv:1005.2357

All of them are constructive, given that they explain the probabilities by postulating a continuous configuration space trajectory ##q(t)\in Q##, and they are also essentially different, more fundamental theories (dBB because it is defined also outside quantum equilibrium, Nelson and Caticha because of the Wallstrom objection).

That there is no consensus, in the sense of no acceptance by the mainstream, is the result of a definite decision of the mainstream to reject or ignore them essentially without discussion. With Caticha, the story of Bohm is actually being repeated, with a lot of people, starting with PBR, proving impossibility theorems for something which has already been explicitly constructed and presented. Today, ignorance is an argument, and a quite strong one.

No consensus, in the sense of different proposals, is what has to be expected. Principle theories restrict themselves to observable effects; constructive theories do not - they propose something more fundamental which allows one to explain, in a nontrivial way, what we observe. Here, different possibilities are to be expected.

PeterDonis
Mentor
in the realistic interpretations, such influence explicitly exists. If Alice measures A, and gains 1, then an immediate causal influence guarantees that after this Bob measuring A will give 1 too.

Note that when you try to reconcile such an interpretation with relativity - more precisely, with the fact that the time ordering of A's and B's measurements is frame-dependent, since they are spacelike separated - you end up either admitting that your interpretation is not valid because it is incompatible with relativity, or claiming that Lorentz invariance is not fundamental but only emergent and so should be expected to be violated under appropriate conditions. The latter position, while logically consistent, carries a heavy burden of proof, since we have no experimental evidence whatsoever of Lorentz invariance being violated.