# The thermal interpretation of quantum physics

#### charters

I think we eventually clarified that the TI is not superdeterministic, but since there still seems to be confusion about what this means, I have developed a simple thought experiment to distinguish these two types of hidden variable approaches.

Consider an experiment with two sources (X, Y) of polarization-entangled photons, which are anti-correlated, and four polarization detectors (arranged from left to right as A, B, C, D), where A and B are set up to measure the photons from X, and C and D measure the photons from Y. Now imagine that detectors B and C sit on a lazy-Susan gear contraption: the experimenter can turn a crank, and B and C will switch positions, so that X photons now go to A & C, and Y photons go to B & D.

Finally, imagine a demon with access to the hidden variable information that perfectly predicts each detector measurement.

Now, regardless of whether the detectors are ordered ABCD or ACBD, the measurement results read from left to right can only be one of the following: HVVH, HVHV, VHHV, VHVH, where H = horizontal and V = vertical. This is due to the prepared Bell states.

We will now focus on the case where i) the detectors were originally arranged as ABCD, ii) the photons are emitted at t=0, iii) the experimenter turns the crank at t=1, changing the order to ACBD, and iv) the measurement at t=2 reveals the result HVHV.

Finally, we ask the demon to tell us the status of the hidden variables at t=0 and t=1 in this postselected subset of runs. As in the TI, the hidden variables here are assigned to the detectors, not the photons.

In a nonlocal deterministic interpretation, the demon will say that at t=0 the hidden variables were arbitrary, but that by t=2 they were nonlocally steered into the right correlations. For example, it could be that at t=0 the hidden variables predicted HVHV, but then the crank rotation at t=1 turned this into HHVV. Since HHVV is not a possible outcome, between t=1 and t=2 the nonlocal beables had to steer the B and C local beables, so that the prediction returned to HVHV.

In a superdeterministic interpretation, the demon will instead say that at t=0 the hidden variable state was always, in every run, HHVV, and that it was in fact the crank rotation itself that set it to the acceptable HVHV. There is no nonlocal beable to nudge or steer or correct the hidden variables here, so they had to be fine-tuned from the start, as if they already knew the future experimental procedure.

In short, superdeterminism means the hidden variables are fine-tuned to always *exploit the crank* in order to deliver the correct quantum correlations, seemingly knowing in advance that the crank will be turned, or being astoundingly lucky. In contrast, nonlocality means hidden variables exist to *fix the situation when the crank spoils* the correct quantum correlations.

Retrocausal approaches are something of a compromise between these two pictures.

#### Auto-Didact

The concept of predeterminism covers a much larger and wider class of theories and explanations than merely what is called superdeterminism; if the TI actually conforms to some implementation of predeterminism other than superdeterminism, then this would render the point made above through the thought experiment moot.

#### charters

> The concept of predeterminism covers a much larger and wider class of theories and explanations than merely what is called superdeterminism; if the TI actually conforms to some implementation of predeterminism other than superdeterminism, then this would render the point made above through the thought experiment moot.

I wasn't making an argument, just illustrating nonlocal vs superdeterministic HVs. I've never heard of an interpretation of QM with some alternative "predeterministic" HVs, which would tell a different story than the two I sketched above. Can you give me a concrete example of an interpretation of QM with "predeterministic but not superdeterministic" HVs?

#### eloheim

> No. The correlations are caused by the interaction - without interaction there is of course no measurement result. The beam produces a bilocal field characterized (among other things) by local and bilocal beables, namely the q-expectations of $A(x)$, $B(y)$, $A(x)B(y)$ and $B(y)A(x)$ at spacetime positions $x$ and $y$. When reaching the detectors at $x$ and $y$ (including the prepared controls manipulated by Alice and Bob, respectively), these interact according to the deterministic, covariant dynamics of the system beam+detector and result in correlated local measurement results at $x$ and $y$, depending on these controls.

> Thus the explanation is similar to the one you accepted as explanatory in the other interpretations:

Maybe I'm wrong, but I thought the TI says that in such a Bell test there is an electromagnetic beam of expectation value = 0. The detectors are basically in a metastable state where tiny random perturbations from the environment will break the symmetry and knock the detector into one of the two possible measurement results (pseudo)randomly. It seems to me that if what the TI calls measurement errors are part of the beam in the first place, then how can you really say the beables are EVs and not regular eigenstates?

> Having read a significant portion of this thread though, it feels to me that the TI is a form of superdeterminism, wherein even what seems to be truly random (i.e. measurement outcomes as dictated by the Born rule) is actually a completely deterministic consequence of the initial condition of the universe, in conjunction with a novel 'thermalization' scheme for generating quasiprobabilities.

This is where I was going with my line of questioning, but I wasn't sure and didn't want to presume anything. I understood the TI to coordinate the microstates of the detectors on each side of a Bell-type experiment through either special conditions in the initial state of the universe (which I don't think most here would consider a satisfactory solution for this type of interpretation), or some kind of acausal or 4D constraint on the total evolution of the universe (in other words, the universe "sniffs out" an acceptable path between the initial and final states, and "chooses" one that respects its physical laws).

I don't find either of these answers compelling when it comes to the TI, however, because the interpretation makes no mention of any additional constraints or special initial conditions whatsoever.

#### A. Neumaier

> It seems to me that if what the TI calls measurement errors are part of the beam in the first place

They are not part of the beam (which is in a stable state as long as it meets no obstacle) but results of the interaction beam+detector, which is in an unstable state and hence moves into a random one among the stable directions. The randomness appears precisely when the instability begins, and the measurement error is due to the fact that an instability magnifies tiny random fluctuations into much larger motions. Thus the errors are an effect of the measurement process, and not a property of the beam.
> the TI to coordinate the microstates of the detectors on each side of a Bell-type experiment through [...]

Through neither of these. The nonlocal dynamics depends on the local and bilocal input just prior to the beginning of the interaction, and because it is unstable it forces the combined state (and in particular the measured pointer q-expectation) into one of a few preferred pathways, just as a particle at a saddle of a 2D potential moves into one of the few valleys accessible from the saddle.
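The saddle picture can be caricatured in one dimension with a short script (this is my own toy model, not anything from the TI papers): fully deterministic overdamped motion in a double-well potential, started a hair's breadth from the unstable hilltop, magnifies the tiny initial offset into one of the two stable outcomes.

```python
def settle(eps, dt=0.01, steps=5000):
    """Overdamped motion x' = -V'(x) in the double well V(x) = x**4/4 - x**2/2.

    x = 0 is the unstable hilltop; x = -1 and x = +1 are the two valleys
    (the "stable directions"). eps is a tiny initial fluctuation.
    """
    x = eps
    for _ in range(steps):
        x += dt * (x - x**3)  # Euler step of x' = -V'(x)
    return x

# The dynamics is fully deterministic, yet the outcome is decided by a
# fluctuation nine orders of magnitude smaller than the final displacement.
print(round(settle(+1e-9)))  # settles in the right valley: 1
print(round(settle(-1e-9)))  # settles in the left valley: -1
```

This is the sense in which an instability "magnifies tiny random fluctuations to much larger motions" while the equations of motion stay deterministic throughout.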

#### A. Neumaier

> So far, the TI still seems to be more deterministic than other physical theories are;

See the preceding post #680. What is here more deterministic than in classical mechanics?


#### Auto-Didact

> I've never heard of an interpretation of QM with some alternative "predeterministic" HVs, which would tell a different story than the two I sketched above. Can you give me a concrete example of an interpretation of QM with "predeterministic but not superdeterministic" HVs?

Predeterminism is a concept much broader than QT, or even than physics as a whole for that matter. Superdeterminism in the QT literature is one specific mathematical implementation of the far broader concept of predeterminism. I'm not aware of any other specific mathematical implementations that are popular or frequently referenced in the modern physics literature.

Note however that this in no way implies that superdeterminism is the sole possible implementation of predeterminism, nor does it imply that other, qualitatively different implementations of predeterminism - i.e. based in different forms of mathematics - cannot exist; to paraphrase Feynman, to presume the opposite would be wagging the dog by the tail.
> See the preceding post #682. What is here more deterministic than in classical mechanics?

Classical mechanics doesn't depend on carefully tuned initial conditions of the universe, nor does classical mechanics determine the initial condition of the universe. As you know - and I would argue probably understand better than most - this is because classical analytical mechanics is essentially a purely applied mathematical model which can be reduced to a set of reversible differential equations.

It isn't clear to me whether or not the TI depends on either of those factors (fine-tuned initial conditions, or determining the initial conditions); it actually isn't necessarily problematic if there is an underlying dynamical model, but in either case I would be interested to know whether the TI does depend on such factors. As I said, I haven't read the papers yet; I plan to do that in the coming week.
> They are not part of the beam (which is in a stable state as long as it meets no obstacle) but results of the interaction beam+detector, which is in an unstable state and hence moves into a random one among the stable directions. The randomness appears precisely when the instability begins, and the measurement error is due to the fact that an instability magnifies tiny random fluctuations into much larger motions. Thus the errors are an effect of the measurement process, and not a property of the beam.

This sounds extremely similar to Penrose's proposal; in fact you merely need to replace "interaction beam+detector" with "superposed gravitational fields of the interacting system" and the two proposals are indistinguishable.

#### A. Neumaier

> Classical mechanics doesn't depend on carefully tuned initial conditions of the universe, nor does classical mechanics determine the initial condition of the universe.

The same holds for quantum mechanics in the TI. If things are sufficiently well isolated, one can consider small systems and replace everything else, as in classical mechanics, by a standard (but quantum) heat bath, as is always done in statistical mechanics. One just needs something that carries matter or energy a macroscopic distance away from where the interaction happens, to get the required dissipation.
> if there is an underlying dynamical model

There is, namely the Ehrenfest dynamics for q-expectations. See Section 2.1 of Part II.
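For a single qubit, the standard Ehrenfest-type relation for expectations, d⟨A⟩/dt = ⟨i[H, A]⟩ (with ħ = 1), can be checked numerically. The Hamiltonian, state, and all names below are my own generic choices to illustrate a dynamics of q-expectations, not the TI's specific model from Part II:

```python
import numpy as np

# Pauli matrices and a simple qubit Hamiltonian H = (omega/2) * sigma_z.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
omega = 2.0
H = 0.5 * omega * sz

# A state with nonzero <sigma_y>, so that <sigma_x> actually moves.
psi = np.array([1, np.exp(1j * np.pi / 4)], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

def qexp(A, r):
    """q-expectation <A> = Tr(rho A)."""
    return np.trace(r @ A).real

def evolve(r, t):
    """Exact evolution rho(t) = U rho U^dagger for a diagonal H."""
    U = np.diag(np.exp(-1j * np.diag(H) * t))
    return U @ r @ U.conj().T

# Left side: finite-difference derivative of <sigma_x>(t) at t = 0.
dt = 1e-6
lhs = (qexp(sx, evolve(rho, dt)) - qexp(sx, rho)) / dt
# Right side: the Ehrenfest expression <i[H, sigma_x]>.
rhs = qexp(1j * (H @ sx - sx @ H), rho)
print(abs(lhs - rhs) < 1e-3)  # the two sides agree
```

Here both sides come out to −√2: the q-expectation ⟨σ_x⟩ evolves deterministically even though individual eigenvalue outcomes would be random.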
> As I said I haven't read the papers yet, I plan to do that the coming week.

Maybe you should hold off on insinuating remarks until you have read it.
> This sounds extremely similar to Penrose's proposal; in fact you merely need to replace "interaction beam+detector" with "superposed gravitational fields of the interacting system" and the two proposals are indistinguishable.

Yes, there is some similarity. But
1. the electromagnetic field is fully sufficient to achieve that;
2. Penrose didn't propose the reality of q-expectations.

#### charters

> I'm not aware of any other specific mathematical implementations that are popular or frequently referenced in the modern physics literature.

> Note however that this in no way implies that superdeterminism is the sole possible implementation of predeterminism, nor does it imply that other, qualitatively different implementations of predeterminism - i.e. based in different forms of mathematics - cannot exist; to paraphrase Feynman, to presume the opposite would be wagging the dog by the tail.

If you admit you can't propose a concrete implementation of "predeterminism" distinct from the types of HV models I outlined, then what is the merit of this claim? The burden is on you to affirmatively establish the existence of this alternative before anyone needs to worry about it. Otherwise, you are just saying "well, maybe there's something nobody has ever thought of" which is just a generic exception to a huge class of claims across all topics.

#### Auto-Didact

> The same holds for quantum mechanics in the TI. If things are sufficiently well isolated, one can consider small systems and replace everything else, as in classical mechanics, by a standard (but quantum) heat bath, as is always done in statistical mechanics.
>
> There is, namely the Ehrenfest dynamics for q-expectations. See Section 2.1 of Part II.

The above two statements seem to contradict each other: if the TI is capable of being completely reduced to - not merely highly accurately approximated by - a reversible DE or any other similar pure object/function, then it cannot also simultaneously dynamically determine the initial conditions of the universe; orthodox QM is essentially incapable of the latter.
> Maybe you should hold off on insinuating remarks until you have read it.

Maybe, but I don't want to miss out on all the fun going on here before hermetically focusing on that task.
> Yes, there is some similarity. But
> 1. the electromagnetic field is fully sufficient to achieve that;
> 2. Penrose didn't propose the reality of q-expectations.

1. Yes, except that 1a) gravitational interactions cannot be shielded, and 1b) all detection events are always above the one-graviton level, thereby making the 'EM sufficiency argument' itself insufficient as a definitive argument.

In either case, fully resolving this matter actually requires statistical analysis of the experiment across a large range of experimental parameters to make a proper scientific distinction.

2. This is true, which is exactly why the TI is rather attractive: your underlying mathematics might naturally function as a possible completion of Penrose's admittedly incomplete conceptual model!

In fact, positing that the instability of the states always fundamentally results from the instability of the superposed gravitational fields associated with the quanta is not inconsistent with your proposal at all.

Moreover, if the initial conditions of the universe are in fact dynamically determined in the TI, this more gravitationally thermalizing picture might also form a natural route to unifying the Weyl curvature hypothesis with the TI.

#### Auto-Didact

> If you admit you can't propose a concrete implementation of "predeterminism" distinct from the types of HV models I outlined, then what is the merit of this claim? The burden is on you to affirmatively establish the existence of this alternative before anyone needs to worry about it. Otherwise, you are just saying "well, maybe there's something nobody has ever thought of", which is just a generic exception to a huge class of claims across all topics.

I'm not proposing or admitting anything of the sort; I'm saying that, mathematically speaking, the issue is genuinely open. Simply pretending that it isn't, merely because no alternative has been constructed yet, is de facto a hypothesis belonging not to the methodology of fundamental physics but to a general strategic scientific methodology focused more on getting simple short-term results than on more difficult long-term ones (i.e. fundamental matters of principle).

To propose otherwise literally requires giving an (in)existence proof for the conjecture 'superdeterminism = predeterminism'. In other words, this very general discussion has automatically led to a very specific statement for which a proof can or cannot be given; this challenge may of course be handed off to interested mathematicians and logicians, making the discussion far more productive than it seems from a more naive short-term focus, whether that was our intention or not.

#### A. Neumaier

> Classical mechanics doesn't depend on carefully tuned initial conditions of the universe, nor does classical mechanics determine the initial condition of the universe.
>
> The same holds for quantum mechanics in the TI.
>
> it cannot also simultaneously dynamically determine the initial conditions of the universe

This is a misunderstanding: I never claimed that; see the quotes that I just repeated. The initial conditions are arbitrary, as in each dynamical system.

#### Auto-Didact

> This is a misunderstanding: I never claimed that; see the quotes that I just repeated. The initial conditions are arbitrary, as in each dynamical system.

Understood, which means that the answer to
> It isn't clear to me whether or not the TI depends on either of those factors (fine-tuned initial conditions, or determining the initial conditions); it actually isn't necessarily problematic if there is an underlying dynamical model

is actually definitively negative, i.e. the initial conditions of the universe aren't determined by the Ehrenfest dynamics in the TI; alas, I got my hopes up due to post #683.

#### PeterDonis (Mentor)

> I don't want to miss out on all the fun going on here before hermetically focusing on that task.

Since this thread is specifically about the thermal interpretation, this is the wrong attitude to take. You need to be familiar with what the interpretation says in order to discuss it. Since you have admitted you are not familiar with what the interpretation says, you are now banned from further posting in this thread.

