More on the thermal interpretation

  • #1

A. Neumaier

TL;DR Summary
This thread continues the discussion of the thermal interpretation of
quantum physics, announcing Part IV of my series of papers on the subject.
I just finished a 4th part of my series of papers on the thermal interpretation:
This paper continues the discussion of the thermal interpretation of quantum physics. While Part II and Part III of this series of papers explained and justified the reasons for the departure from tradition, the present Part IV summarizes the main features and adds intuitive explanations and new technical developments.
  • It is shown how the spectral features of quantum systems and an approximate classical dynamics arise under appropriate conditions.
  • Evidence is given for how, in the thermal interpretation, the measurement of a qubit by a pointer q-expectation may result in a binary detection event with probabilities given by the diagonal entries of the reduced density matrix of the prepared qubit.
  • Differences in the conventions about measurement errors in the thermal interpretation and in traditional interpretations are discussed in detail.
  • Several standard experiments (the double slit, Stern-Gerlach, and particle decay) are described from the perspective of the thermal interpretation.
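The second bullet's claim about detection probabilities can be illustrated numerically. A minimal sketch (the qubit amplitudes and the two-level pointer coupling are hypothetical, purely illustrative): tracing out the pointer gives the reduced density matrix of the qubit, whose diagonal entries carry the Born probabilities of the two detection events.

```python
import numpy as np

# Hypothetical amplitudes for the prepared qubit (illustrative only)
psi = np.array([0.6, 0.8j])                  # |psi> = 0.6|0> + 0.8i|1>

# Toy pre-measurement coupling: |0> drags the pointer to |p0>, |1> to |p1>
e0, e1 = np.eye(2)
joint = psi[0] * np.kron(e0, e0) + psi[1] * np.kron(e1, e1)

# Reduced density matrix of the qubit: partial trace over the pointer
rho = np.outer(joint, joint.conj())
rho_qubit = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

# Diagonal entries are the probabilities of the two detection events
probs = np.real(np.diag(rho_qubit))
print(probs)   # [0.36 0.64]
```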
Some of the material in this paper may be familiar to you from my contributions to one of the other recent threads on the subject. The analyses of the classical dynamics and of the qubit, however, are new. An explanation of the relation to decoherence is given here.
 

Answers and Replies

  • #2
I find the following on p. 5 very interesting:

"Even before the advent of quantum mechanics, it turned out that, in classical statistical mechanics, atoms are indistinguishable not only due to practical limitations but in principle, and that there is no theoretically conceivable way to distinguish them as individuals – if it were possible, the resulting predictions would have an additional entropy of mixing, which is in conflict with the observed thermodynamical properties of bulk systems. This means that there are fundamental constraints that forbid the atom in a classical multiparticle system to have individual values."

In order for this to "directly lead" to the thermal interpretation, as the paper says (in the next sentence after what I have just quoted), doesn't there need to be an additional premise, to the effect that in order to make a measurement, some macroscopic system must be involved to which the indistinguishability argument just given can apply? After all, if we run a Stern-Gerlach experiment on a single electron, the electron is not a multi-particle system and is distinguishable from other electrons because we specifically prepared it that way. But the detector is a many-particle system, and we can't know how the experiment turned out without the detector.
 
  • #3
I also want to check my understanding of a statement on p. 6:

"Quantized measurement results (as observed angular momentum measurements) are explained by environment-induced randomness and environment-induced dissipation, as for a classical, environment-induced diffusion process in a double-well potential. Born’s statistical interpretation follows, in the limited range where it applies, from this and the deterministic rules."

So just to be clear, the thermal interpretation is not the MWI: if we make a single run of a Stern-Gerlach experiment, there is only one result. Which result it is depends on random fluctuations in the environment (which in this case would, I think, be random fluctuations in the huge number of degrees of freedom in the detector).
 
  • #4
  • It is shown how the spectral features of quantum systems and an approximate classical dynamics arise under appropriate conditions.
  • ...
Some of the material in this paper may be familiar to you from my contributions to one of the other recent threads on the subject. The analyses of the classical dynamics and of the qubit, however, are new.

What prevents the TI from being wholly classical?
 
  • #5
Which result it is depends on random fluctuations in the environment (which in this case would, I think, be random fluctuations in the huge number of degrees of freedom in the detector).

The predictive detector hidden variables will not generally be dictated by random fluctuations, but by nonlocal hidden variables.

Consider an experiment in which an (SG-like) device A makes a non-demolition spin measurement of an N=1 beam, and in doing so diverts the spin-up component of the beam upwards, so that it enters into SG device B. Likewise, the spin-down component is diverted downwards, so that it feeds into device C instead. Devices B and C can be arbitrarily far away from A in space and time.

According to TI, there is no collapse due to measurement A, so the q-expectation value of the beam entering devices B and C is equal. But despite this, in the case that device A already clicked "up", it is now impossible for device C to click, and mandatory (up to experimental error) that device B click. Device A (or the universe as a whole) has to non-locally constrain the fluctuations of the degrees of freedom in devices B and C, which the TI takes as predictive of the local measurement outcome. In realistic settings apart from optics, these non-local, historical constraints will dominate the behavior of the local detector HVs, rather than local environmental fluctuations.

This inter-detector holism is equivalent in function to a pilot wave, but because the TI demands the particle beams are non-collapsing and (as I understand it) not assigned their own HVs, the TI has to utilize non-locality more broadly than Bohmians. At least for the above experiment, Bohmians can get away without relying on any non-local effects. But from a philosophical perspective, if you are already comfortable with a pilot wave, I don't think this should really feel like too much of a leap.
 
  • #6
According to TI, there is no collapse due to measurement A

No, according to TI there is no measurement at all at A, because the beam never hits a detector; it just gets deflected. Which is the same thing that standard QM says.

in the case that device A already clicked "up"

Device A can't click anything because the A "up" beam does not hit a detector; it passes through a second SG apparatus. Similarly for the A "down" beam.
 
  • #7
So you are saying non-demolition spin measurements are impossible in the TI? That would be a problem because they are possible in reality. It is possible to macroscopically record the spin of the system at A and then let the system proceed to additional detectors.
 
  • #8
So you are saying non-demolition spin measurements are impossible in the TI?

No. Sorry, I missed the "nondemolition" in your post before.

they are possible in reality

Yes, but not, AFAIK, with a Stern-Gerlach apparatus.

As I understand it, TI has no problem with nonlocal q-expectations, and that, broadly speaking, is how it would handle the scenario you describe.
 
  • #9
So you are saying non-demolition spin measurements are impossible in the TI? That would be a problem because they are possible in reality. It is possible to macroscopically record the spin of the system at A and then let the system proceed to additional detectors.
That is speculative and seems unnecessary. It is the magnitude of the spin that is quantised, not the direction. The importance of the experiment is that the two beams have equal intensity. The direction of the spin does not enter the Hamiltonian unless there is an interaction with an inhomogeneous magnetic field. In that case the deflection can be described by classical evolution. The separation of the beam is a demolition-neutral measurement of a classical force.
 
  • #10
As I understand it, TI has no problem with nonlocal q-expectations, and that, broadly speaking, is how it would handle the scenario you describe.

Right, this is all I was saying - that these nonlocal variables are doing a good deal of the work in ensuring it is a deterministic interpretation that conforms to QM. They're more central to this task than the local evolution of the local detector hidden variables/degrees of freedom.

Fwiw, I'd also say the same thing to a Bohmian who suggested measurements were a straightforward function of their local particle trajectories. In BM, the non-local trajectory corrections are similarly not to be underestimated in their importance (though I often find they are).
 
  • #11
That is speculative and seems unnecessary. It is the magnitude of the spin that is quantised, not the direction. The importance of the experiment is that the two beams have equal intensity. The direction of the spin does not enter the Hamiltonian unless there is an interaction with an inhomogeneous magnetic field. In that case the deflection can be described by classical evolution. The separation of the beam is a demolition-neutral measurement of a classical force.

Sorry, I don't understand your point/what this has to do with the issue of how deterministic hidden variable interpretations work.
 
  • #12
Sorry, I don't understand your point/what this has to do with the issue of how deterministic hidden variable interpretations work.
Sorry, I was actually commenting (obliquely) on this
This inter-detector holism is equivalent in function to a pilot wave, but because the TI demands the particle beams are non-collapsing and (as I understand it) not assigned their own HVs, the TI has to utilize non-locality more broadly than Bohmians. At least for the above experiment, Bohmians can get away without relying on any non-local effects. But from a philosophical perspective, if you are already comfortable with a pilot wave, I don't think this should really feel like too much of a leap.
My point about the classicality is that this ensures there is always a local HV and makes the speculation about non-local guide waves irrelevant.
 
  • #13
these nonlocal variables are doing a good deal of the work in ensuring it is a deterministic interpretation that conforms to QM.

Agreed.

In BM, the non-local trajectory corrections are similarly not to be underestimated in their importance (though I often find they are).

Agreed.
 
  • #14
My point about the classicality is that this ensures there is always a local HV and makes the speculation about non-local guide waves irrelevant.

The HVs in this interpretation live on the various detectors, which are spatially separated. So it doesn’t make sense to say they are always local.

And my point has nothing specific to do with any details of spin. It goes through the same with abstract operators and detectors.
 
  • #15
The HVs in this interpretation live on the various detectors, which are spatially separated. So it doesn’t make sense to say they are always local.

...but because the TI demands the particle beams are non-collapsing and (as I understand it) not assigned their own HVs, ...
I'm referring to an HV for each particle. Like you, I'm not sure if the TI allows this, but it predicts the correct outcomes.

And my point has nothing specific to do with any details of spin. It goes through the same with abstract operators and detectors.
Hmm. Probably too abstract for me so I'll refrain from commenting further.
 
  • #16
"This means that there are fundamental constraints that forbid the atom in a classical multiparticle system to have individual values."

In order for this to "directly lead" to the thermal interpretation, as the paper says (in the next sentence after what I have just quoted)
Well, it leads directly only to the main innovation of the thermal interpretation, that expectations are beables. Of course, classical statistical mechanics says nothing about quantum mechanics, and hence nothing about the thermal interpretation of it. Thus I'd have been more careful in the formulation...

doesn't there need to be an additional premise, to the effect that in order to make a measurement, some macroscopic system must be involved to which the indistinguishability argument just given can apply? After all, if we run a Stern-Gerlach experiment on a single electron, the electron is not a multi-particle system and is distinguishable from other electrons because we specifically prepared it that way. But the detector is a many-particle system, and we can't know how the experiment turned out without the detector.
Yes, in a Stern-Gerlach experiment, a lone electron in the beam is distinguished. Without the detector, there is no measurement.

But it is not the indistinguishability of the detector particles that causes the measurement but the fact that the energy from the impact of the silver is dissipated into detector oscillations (phonons) that leave the impact region and hence have an irreversible asymptotic effect, similar to the one discussed for the pointer in Section 3 of Part IV. Because of the bilocality, the details are more complicated than for the pointer discussed there, and I haven't yet worked them out, but the principle is the same.
 
  • #17
I also want to check my understanding of a statement on p. 6:

"Quantized measurement results (as observed angular momentum measurements) are explained by environment-induced randomness and environment-induced dissipation, as for a classical, environment-induced diffusion process in a double-well potential. Born’s statistical interpretation follows, in the limited range where it applies, from this and the deterministic rules."

So just to be clear, the thermal interpretation is not the MWI: if we make a single run of a Stern-Gerlach experiment, there is only one result. Which result it is depends on random fluctuations in the environment (which in this case would, I think, be random fluctuations in the huge number of degrees of freedom in the detector).
According to the thermal interpretation, if we make a single run of a Stern-Gerlach experiment, there is indeed only one result. Which result it is (i.e., whether a silver atom is deposited at the left or the right spot) depends on details of the deterministic environment - the whole universe apart from the beam and the two detector spots - that are unpredictable in practice. For all practical purposes, these are random fluctuations.
 
  • #18
What prevents the TI being wholly classical?
In a classical world governed by the thermal interpretation, the particle operators (in a discrete world) or field operators (in a continuum world) would commute.
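The criterion can be checked on the simplest example. A minimal numpy sketch (illustrative only): the quantum spin components, represented by Pauli matrices, fail to commute, whereas any classical phase-space quantities would.

```python
import numpy as np

# Pauli spin operators for a single spin-1/2 system
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Commutator [sx, sz]; for commuting (classical) quantities this would vanish
comm = sx @ sz - sz @ sx
print(comm)  # nonzero: equals -2i * sigma_y, so the spin components are genuinely quantum
```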
 
  • #19
The separation of the beam is a demolition-neutral measurement of a classical force.
No, unless you measure one of the resulting two beams. The separation of the beam is a unitary transformation in the full state space of the particle, hence no measurement is involved.
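The reversibility of the beam separation can be made concrete in a toy model (entirely illustrative, not part of the paper): take the state space as spin ⊗ path with a two-valued path degree of freedom, and let the Stern-Gerlach magnet act as a controlled flip that sends the spin-down component to the lower path. The resulting operator is unitary, so applying its adjoint undoes the split exactly, with no measurement anywhere.

```python
import numpy as np

# Toy model: spin (first factor) tensor path (second factor), path in {upper, lower}
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])       # flips upper <-> lower path
P_up = np.diag([1.0, 0.0])           # |up><up| on spin
P_dn = np.diag([0.0, 1.0])           # |down><down| on spin
U = np.kron(P_up, I2) + np.kron(P_dn, X)   # spin-controlled path flip

# U is a genuine unitary: U† U = identity, so the splitting is reversible
assert np.allclose(U.conj().T @ U, np.eye(4))

# Split a superposition, then undo it exactly with U†
spin = np.array([0.6, 0.8])          # hypothetical spin state
path = np.array([1.0, 0.0])          # beam starts on the upper path
psi = np.kron(spin, path)
split = U @ psi                      # beams separated, still one pure state
restored = U.conj().T @ split
assert np.allclose(restored, psi)    # nothing irreversible happened
```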

I'm referring to an HV for each particle. Like you, I'm not sure if the TI allows this but it predicts the correct outcomes.
There are local variables at each detector position and nonlocal variables at pairs of detector positions. In addition, the particle in the beam is fully described only by nonlocal variables.
 
  • #20
No, unless you measure one of the resulting two beams. The separation of the beam is a unitary transformation in the full state space of the particle, hence no measurement is involved.
So no work is done on the particles during the separation, and it is reversible? I could believe that if the magnetic field were homogeneous.

There are local variables at each detector position and nonlocal variables at pairs of detector positions. In addition, the particle in the beam is fully described only by nonlocal variables.
Thanks for the clarification.
 
  • #21
So no work is done on the particles during the separation, and it is reversible? I could believe that if the magnetic field were homogeneous.
It is reversible even classically. Mechanical work is done, but in a reversible way.

The irreversible stuff happens upon impact at the screen.
 
  • #22
It is reversible even classically. Mechanical work is done, but in a reversible way.

The irreversible stuff happens upon impact at the screen.
That clears up my problem, I guess.

I found this paper in a journal called 'Entropy', which is peer reviewed:

A Quantum Description of the Stern–Gerlach Experiment
Håkan Wennerström and Per-Olof Westlund
https://www.mdpi.com/1099-4300/19/5/186/pdf
They appear to use q-expectations without calling them by that name. They claim that their conclusions are the same as those of other more conventional approaches (references are given in the above).
Do you have any thoughts on their approach ?
 
  • #23
They appear to use q-expectations without calling them by that name. They claim that their conclusions are the same as those of other more conventional approaches (references are given in the above).
Do you have any thoughts on their approach ?
They interpret the q-expectations in the conventional way; see p.4 after (14).

Note that books and articles on quantum physics are full of computations with and of q-expectations. But their interpretation has in the past always been in terms of an ensemble mean over many independent realizations. In the thermal interpretation, the same computations are given a different meaning.

By the way, a nice paper about the history of the Stern-Gerlach experiment and its reception and interpretation is given in https://arxiv.org/pdf/1609.09311.pdf
 
  • #24
I noticed a very bold claim in the third paper on measurement.
"claims that the unmodeled environment influences the results enough to cause all randomness in quantum physics. "

I really like the thermal interpretation, for the main reason that humans can only manipulate macroscopic objects, and our ability to control microscopic objects is manifest through our ability to manipulate macroscopic objects. However, I find this conclusion very farfetched and probably wrong.

Why do you think you can make such a claim? How is this implied by the Thermal Interpretation?
 
  • #25
I noticed a very bold claim in the third paper on measurement.
"claims that the unmodeled environment influences the results enough to cause all randomness in quantum physics. " [...]
How is this implied by the Thermal Interpretation?
The quoted sentence is from the abstract of Part III. The justification is a cumulative argument developed in Sections 4.2, 4.3, 5.1 and 5.2.

Section 4.2 discusses why all our knowledge about consequences of the quantum formalism for macroscopic objects and hence for measurement comes from coarse-graining the microscopic dynamics of all q-expectations to an effective dynamics for a much smaller collection of relevant quantities.

Section 4.3 demonstrates how coarse graining the linear dynamics of q-expectations produces a nonlinear, generally chaotic dynamics of the relevant quantities. (For example, turbulent Navier-Stokes equations.)

Section 5.1 shows that results by Breuer & Petruccione may be reinterpreted in the thermal interpretation as implying that the piecewise deterministic processes for some well-known model quantum systems are obtainable by coarse-graining. The last paragraph reads:
Section 5.1 of Part III said:
The arguments show that to go from unitarity to irreversible discrete events in Hamiltonian quantum mechanics one does not need to assume more than to go from reversibility to irreversibility in Hamiltonian classical mechanics – namely a suitable form of the Markov approximation. Statistical assumptions are not needed to make pointers acquire a well-defined position or to create photocurrents – the standard dissipation arguments are enough. This gives stochastic equations for definite macroscopic outcomes.

Finally, Section 5.2 gives an intuitive reason that allows a better understanding of these technical developments in B&P. It is shown that the behavior discussed is fully analogous to how chaotic nonlinear phenomena in classical chaotic systems lead through bistability to binary outcomes for the observation of continuous outcomes, as in the tossing of a coin, where the common understanding is that the randomness in the unmodeled environment (i.e., details about how the coin is thrown and lands before it settles into one of two possible equilibrium states) influences the results enough to cause all randomness in the coin toss. The coin toss is paradigmatic for the way in which randomness appears through coarse-graining in classical physics. It is generally believed to be the only such mechanism, without discussing each individual case again.

By analogy, the randomness in the unmodeled environment (i.e., details about how the state of system plus environment behaves before it settles into one of two possible collapsed states) influences the results enough to cause all randomness in the observation of a qubit.
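The coin-toss mechanism invoked above can be simulated directly. A minimal sketch (all parameters illustrative): overdamped Langevin dynamics in the double-well potential V(x) = (x² − 1)²/4, started at the unstable midpoint x = 0, where tiny unmodeled noise alone selects which of the two wells the system settles into, producing binary outcomes from continuous dynamics.

```python
import numpy as np

# Overdamped Langevin dynamics in the double well V(x) = (x^2 - 1)^2 / 4,
# drift = -V'(x) = -x (x^2 - 1). Started at the unstable point x = 0,
# environmental noise decides the binary outcome (x -> +1 or x -> -1).
rng = np.random.default_rng(0)

def settle(noise=0.05, dt=0.01, steps=5000):
    x = 0.0
    for _ in range(steps):
        x += -x * (x**2 - 1) * dt + noise * np.sqrt(dt) * rng.normal()
    return np.sign(x)  # which well the run settled into

outcomes = [settle() for _ in range(200)]
print(sum(o > 0 for o in outcomes), "of 200 runs settled in the right well")
```

The noise is far too weak to hop the barrier once a well is reached, so every run ends sharply in one well or the other, with roughly equal frequencies: discrete, random-looking events from a deterministic-plus-environment dynamics.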

This is paradigmatic for all quantum processes, and leads to the assertion in the abstract.
 
