The thermal interpretation of quantum physics

  • #651
A. Neumaier said:
In the wave function, there is no actual motion of particles, just a motion of probability amplitude.
Moreover, everything happens in a fixed frame in which the spatial Fourier transform is performed; one cannot argue with time dilation or length contraction.

From the Schrödinger equation, the support of ##\psi## is, after a sufficiently short but positive time, definitely the union of the initial support and that of ##H\psi##. But there is no reason to suppose that the fairly arbitrary ##H## allowed by the construction in K/P leads to an ##H\psi## with bounded support; note that ##M## can be quite arbitrary. To preserve the relativistic probability interpretation in concrete cases, one would have to construct very special ##M## that preserve a bounded support - but this seems quite a nontrivial mathematical task.
Again, the Green function for the Dirac equation is causal. So I don't see how your Section 3.3 contains the argumentation.

And what is K/P? I guess I missed something.
 
  • #652
akhmeteli said:
Again, the Green function for the Dirac equation is causal. So I don't see how your Section 3.3 contains the argumentation.

And what is K/P? I guess I missed something.
K/P is Keister & Polyzou. I am not discussing the Dirac equation, which has acausal solutions!
 
  • #653
A. Neumaier said:
I am not discussing the Dirac equation, which has acausal solutions!
What acausal solutions of the Dirac equation do you have in mind? And again, if the Dirac equation has problems, why is this a problem for Born's rule? Again, I fail to find the argumentation in your Section 3.3.
 
  • #654
akhmeteli said:
What acausal solutions of the Dirac equation do you have in mind? And again, if the Dirac equation has problems, why is this a problem for Born's rule? Again, I fail to find the argumentation in your Section 3.3.
The revised version of Part I no longer mentions the Dirac equation, hence there is no point bringing it up again.

My argument is solely about the consistent relativistic multiparticle dynamics in K/P.
 
  • #655
A. Neumaier said:
I am not discussing the Dirac equation, which has acausal solutions!
What acausal solutions of the Dirac equation do you have in mind?
A. Neumaier said:
The revised version of Part I no longer mentions the Dirac equation, hence there is no point bringing it up again.

My argument is solely about the consistent relativistic multiparticle dynamics in K/P.
Again, I have yet to understand your arguments in post 646, but you also stated there:
A. Neumaier said:
The argumentation is completely contained in point 4 of Subsection 3.3 of my Part I.


I explained that there is no valid argumentation in point 4 of Subsection 3.3, as the argumentation in footnote 16 is not valid for the Dirac equation, and that is how the Dirac equation is relevant. Note that point 4 does not mention K/P, and footnote 16 has nothing to do with multiple particles. The reference to K/P in point 5 is just that, a reference to a huge article, so there is no valid argumentation in your arxiv article. Again, maybe the reasoning in your post 646 is correct, and I will try to understand it, but there is no argumentation in the article for your statement about relativistic particles in point 5.
 
  • #656
akhmeteli said:
If momentum is large, the speed still cannot exceed the velocity of light in the relativistic case.


Not if the momentum is timelike, no. But in quantum field theory, there is a nonzero amplitude for the momentum to be spacelike (at least, that's one way of describing what the math says).
 
  • #657
PeterDonis said:
Not if the momentum is timelike, no. But in quantum field theory, there is a nonzero amplitude for the momentum to be spacelike (at least, that's one way of describing what the math says).
I referred to spatial momentum, in a fixed frame, in a covariant multiparticle setting defined in the papers cited. It lacks the cluster decomposition property and is not easily related to quantum field theory.
 
  • #658
PeterDonis said:
Not if the momentum is timelike, no. But in quantum field theory, there is a nonzero amplitude for the momentum to be spacelike (at least, that's one way of describing what the math says).
While we were not discussing QFT, could you please specify what situation or result you have in mind?
 
  • #659
akhmeteli said:
could you please specify what situation or result you have in mind?

It's a fairly general statement about QFT. Another way of stating it is that if you look at the Feynman propagators for various quantum fields, they won't vanish for a pair of events that are spacelike separated. The key property that preserves causality is that field operators at spacelike separated events commute. See, for example, the discussion in sections 2.6.1 and 2.7 of these lectures:

http://www.damtp.cam.ac.uk//user/tong/qft.html
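This point can be made quantitative with a short numerical sketch (not taken from the thread). It assumes the standard closed form for a free scalar field of mass ##m## (units ##\hbar = c = 1##): at spacelike separation ##r = \sqrt{|\mathbf{x}|^2 - t^2}## the propagator amplitude is ##D(r) = \frac{m}{4\pi^2 r} K_1(mr)##, which is exponentially small for ##mr \gg 1## but never exactly zero.

```python
# Sketch: the free-scalar propagator amplitude at spacelike separation,
# D(r) = m / (4 pi^2 r) * K_1(m r), assuming the standard textbook result.
import numpy as np
from scipy.special import kv  # modified Bessel function of the second kind

def spacelike_amplitude(r, m=1.0):
    """Free scalar field propagator amplitude at spacelike separation r."""
    return m / (4 * np.pi**2 * r) * kv(1, m * r)

for r in (1.0, 5.0, 10.0):
    # Strictly positive, decaying roughly like exp(-m r) for large r
    print(f"r = {r:5.1f}  D(r) = {spacelike_amplitude(r):.3e}")
```

The exponential (rather than strict) vanishing outside the light cone is exactly why causality in QFT has to be protected by the commutators instead, as noted above.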
 
  • #660
PeterDonis said:
It's a fairly general statement about QFT. Another way of stating it is that if you look at the Feynman propagators for various quantum fields, they won't vanish for a pair of events that are spacelike separated. The key property that preserves causality is that field operators at spacelike separated events commute. See, for example, the discussion in sections 2.6.1 and 2.7 of these lectures:

http://www.damtp.cam.ac.uk//user/tong/qft.html
Well, this is indeed a specific property of QFT. In this case one cannot be sure that the particle detected outside the light cone is the same particle that was created initially. So I am not sure this is quite relevant in the context of A. Neumaier's critique of Born's rule (see also his post 657).
 
  • #661
akhmeteli said:
if the Dirac equation has problems, why is this a problem for Born's rule?
The only problem is that you apply a faulty equation to the reasoning in my paper. The K/P Hamiltonians have no such problems.
 
  • #662
A. Neumaier said:
The argumentation is completely contained in point 4 of Subsection 3.3 of my Part I. The Keister-Polyzou paper just contains dynamical relativistic examples. If you want a definite example, you may take the example of spinless quarks in Section 2.3 (p.26 in the copy cited in post #642). But the details do not matter.

The only relevant points for my argument are that, although the setting is Poincare-covariant,
  1. the wave function at fixed time is a function of several spatial momenta, which after Fourier transform to the position representation becomes a wave function that is a function of spatial positions,
  2. Born's rule makes claims about the probabilities of measuring positions,
  3. the Hamiltonian and the position operators have a nonlocal commutator.
As a result, the dynamics introduces (as claimed in Part I) after arbitrarily short times nonzero probabilities of finding an initially locally prepared particle (initial wave function with compact support) at almost any other point in the universe.

Thus the position probability interpretation itself contradicts the principles of relativity!
I failed to understand how the example in K/P is relevant. Could you please explain?

I understand items 1 and 2 in your quoted post. I don't understand item 3, as I don't know what Hamiltonian you have in mind. (Neither do I understand how you get your conclusion "the dynamics introduces ... after arbitrarily short times nonzero probabilities of finding an initially locally prepared particle", but maybe this will be clearer after you explain item 3.)
Could you please explain? Thank you.
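The spreading claim quoted above can at least be illustrated in the simplest nonrelativistic setting (a sketch under stated assumptions, not the K/P dynamics): a free Schrödinger particle prepared with a compactly supported wave function acquires nonzero probability density well outside its initial support after a short time.

```python
# Sketch (free nonrelativistic Schroedinger evolution, hbar = m = 1):
# a wave function with compact support [-1, 1] develops nonzero probability
# density far outside that support already at small t.
import numpy as np

N, L = 8192, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)  # angular wavenumbers

# Normalized box state supported on [-1, 1]
psi0 = np.where(np.abs(x) <= 1, 1.0, 0.0).astype(complex)
psi0 /= np.sqrt(np.sum(np.abs(psi0) ** 2) * dx)

# Evolve with H = p^2/2 exactly in the Fourier representation
t = 0.01
psi_t = np.fft.ifft(np.exp(-0.5j * k**2 * t) * np.fft.fft(psi0))

tail = np.abs(x) > 3  # region well outside the initial support
tail_max = np.max(np.abs(psi_t[tail]) ** 2)
print(tail_max)  # nonzero already at t = 0.01
```

Of course, for the nonrelativistic Schrödinger equation this is expected; the point under dispute is whether the covariant K/P Hamiltonians behave the same way.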
 
  • #663
Do you write "Many-Worlds Interpretation" or "many-worlds interpretation"?

Why did you choose "thermal interpretation" instead of "Thermal Interpretation"?
 
  • #664
  • #665
I’ve been following some of the discussion on here about the Thermal Interpretation, and I’ve reviewed the linked papers. I like the idea of positing expectation values, rather than eigenstates, as primary, and taking something like a fluid mechanics approach to quantum physics. In the end though I can’t understand how the TI is supposed to solve the problems of other interpretations, beyond merely decreeing, “It’s resolved. Don’t worry about it.”

Say we’re doing a standard Bell-type test on entangled photons, with anticorrelated results for measurements with identical orientations. We will put one of the detectors (Bob’s) millions of miles away in space for clarity. According to the thermal interpretation, the perfectly anticorrelated results we get for each entangled pair are NOT due to any property of the beams themselves (aka what other interpretations call the individual particles), but instead due to FAPP-unknowable details of the macroscopic detectors.

Alice’s detector is near the photon source and is ready to take a measurement immediately. Bob, on the other hand, doesn’t even start building his detector until his photon is well en route. Despite the fact that detector A(lice) was already in existence when the experiment began, and detector B(ob) wasn’t assembled until well after Alice’s particle was measured, the two detectors happen to be composed in such a way that they “magically” give opposite results for measurements of the beam performed at the same angle. These two detectors have nothing in common; they were produced completely separately, millions of miles apart and at different times, and yet essentially their random micro-details conspire in intimate coordination!

(To add one more layer to further illustrate the point, say Bob built 3 detectors instead of one. At the last minute before detection he directs his photon to one of the three, chosen at random (say, conditioned on a photon from a distant star in the opposite direction from Alice). At the same time, he also does his own (completely unrelated) Bell experiment, using the other two detectors. We now have a situation where the microscopic details of just one of the detectors are, again for seemingly no reason, aligned in such a way as to produce perfect anticorrelation with Alice, while the other two detectors happen to be composed such that they correlate with one another instead. And yet we can randomly switch which detector is used for what purpose at the last moment with no ramifications. How is this at all plausible?)

Essentially my question is, “how are these coincidences explained by the TI?” Other interpretations point to the entangled particles, or a pilot wave, for example, but the TI doesn’t have that same luxury. If this was a foundations-agnostic interpretation that presented itself as (just) a new tool for calculation, I wouldn’t complain, but the TI is billed as solving the measurement problem and fixing quantum physics’ outstanding contentions. IMHO resolving quantum foundations via fiat doesn’t do the trick.
 
  • #666
eloheim said:
I can’t understand how the TI is supposed to solve the problems of other interpretations, beyond merely decreeing, “It’s resolved. Don’t worry about it.”
eloheim said:
Essentially my question is, “how are these coincidences explained by the TI?”
No interpretation explains this, except by voicing the mantra ''nonlocality''. How Nature manages to realize these nonlocal coincidences is a secret of its creator. Bell's analysis just shows that one needs explicitly nonlocal beables if one wants to avoid all sorts of other weird assumptions. Nonlocal beables depend on simultaneous values at very far away points, and once one acknowledges that these influence local beables, nonlocal correlations between the latter are explained (in some sense).

The TI is nonlocal enough to be consistent with these findings. Moreover, it explains why there is nothing to worry about, since the usual worry has to do with a seeming incompatibility with special relativity. But there is no such incompatibility, as discussed in Subsections 4.4 and 4.5 of Part II. Thus Nature can consistently be relativistic and have nonlocal coincidences of Bell type.
eloheim said:
Other interpretations point to the entangled particles, or a pilot wave
How does this pointing provide an explanation? It only says that certain dynamical calculations lead to the result, but does not explain the result, unless calculation is deemed explanation. (But the same calculations then work for the TI.)
eloheim said:
the TI is billed as solving the measurement problem
The problem you posed in your post is not the measurement problem but the nonlocality puzzle.

The measurement problem is the problem of why there are unique and discrete outcomes for single quantum systems although the wave function produces only superpositions of (measurement,detector state) pairs. This problem is solved by the TI; see Subsection 5.1 of Part III and Section 3 of Part IV.
 
  • #667
A. Neumaier said:
How Nature manages to realize these nonlocal coincidences is a secret of its creator.
You are being sarcastic, right? You don't actually believe what you said as a physicist, do you? :smile:
 
  • #668
ftr said:
You are being sarcastic, right? You don't actually believe what you said as a physicist, do you? :smile:
Physicists have lifted many secrets of Nature, but not (yet?) this one.
 
  • #669
A. Neumaier said:
The problem you posed in your post is not the measurement problem but the nonlocality puzzle.

The measurement problem is the problem of why there are unique and discrete outcomes for single quantum systems although the wave function produces only superpositions of (measurement,detector state) pairs. This problem is solved by the TI; see Subsection 5.1 of Part III and Section 3 of Part IV.
You are absolutely right about this; my apologies for a poor choice of words. I was trying to suggest that the TI has the same issues with how it treats (or rather doesn't treat) such foundational problems in general. However, I didn't defend any other criticisms in my post (and I'm not sure that I could) so please disregard my mention of the measurement problem if you will.
A. Neumaier said:
How does this pointing provide an explanation? It only says that certain dynamical calculations leads to the result, but does not explain the result, unless calculation is deemed explanation. (But the same calculations then work for the TI.)
The difference is that, according to most interpretations, the singlet state (i.e. the two entangled particles) is a nonlocal beable, so when you do something to one of the particles (like measure spin) the other is affected accordingly. This, combined with the local detector settings on each end of the experiment, is enough to produce the desired QM statistics. In this case the detectors would be entangled only after interacting with the particles, and no nonlocal correlations at all between the detectors themselves are necessary to explain the phenomenon. Conversely, the TI wants to say that the correlations are in the detectors from the beginning, and are only acting on noise in the particle beam.
 
  • #670
eloheim said:
Conversely, the TI wants to say that the correlations are in the detectors from the beginning, and are only acting on noise in the particle beam.
No. The correlations are caused by the interaction - without interaction there is of course no measurement result. The beam produces a bilocal field characterized (among others) by local and bilocal beables, namely the q-expectations of ##A(x)##, ##B(y)##, ##A(x)B(y)## and ##B(y)A(x)## at spacetime positions ##x## and ##y##. When reaching the detectors at ##x## and ##y## (including the prepared controls manipulated by Alice and Bob, respectively), these interact according to the deterministic, covariant dynamics of the system beam+detector and result in correlated local measurement results at ##x## and ##y##, depending on these controls.

Thus the explanation is similar to the one you accepted as explanatory in the other interpretations:
eloheim said:
according to most interpretations, the singlet state (i.e. the two entangled particles) is a nonlocal beable, so when you do something to one of the particles (like measure spin) the other is affected accordingly. This, combined with the local detector settings on each end of the experiment, is enough to produce the desired QM statistics. In this case the detectors would be entangled only after interacting with the particles, and no nonlocal correlations at all between the detectors themselves are necessary to explain the phenomenon.
 
  • #671
A. Neumaier said:
No interpretation explains this, except by voicing the mantra ''nonlocality''. How Nature manages to realize these nonlocal coincidences is a secret of its creator.
The reference to not just QM but Nature itself being nonlocal is an implication that QM is - as the realists claim - an incomplete theory, which can be completed by creating the correct mathematization of the concept of nonlocality. This means inventing or identifying the branch of mathematics for this concept, and then applying it to QM such that QM itself may be reformulated in this new mathematical language, which naturally captures and makes the nonlocality explicit in a mathematically useful form, in the hope that this will naturally lead to a completion of QM.
 
  • #672
Auto-Didact said:
such that QM itself may be reformulated in this new mathematical language, which naturally captures and makes the nonlocality explicit in a mathematically useful form, in the hope that this will naturally lead to a completion of QM.
Well, my claim is that the thermal interpretation does just this!
 
  • #673
A. Neumaier said:
Well, my claim is that the thermal interpretation does just this!
I'm aware that you think that and I applaud your effort. I haven't had enough time to chew on the TI yet, haven't read the three papers yet in depth.

Having read a significant portion of this thread though, it feels to me that the TI is a form of superdeterminism, wherein even what seems to be truly random (i.e. measurement outcomes as dictated by the Born rule) is actually completely a deterministic consequence of the initial condition of the universe, in conjunction with a novel 'thermalization' scheme for generating quasiprobabilities.
 
  • #674
Auto-Didact said:
Having read a significant portion of this thread though, it feels to me that the TI is a form of superdeterminism, wherein even what seems to be truly random (i.e. measurement outcomes as dictated by the Born rule) is actually completely a deterministic consequence of the initial condition of the universe, in conjunction with a novel 'thermalization' scheme for generating quasiprobabilities.
What you state here makes the TI a form of determinism (which it is). Superdeterminism, I was told, is a more specific label that does not apply to the TI.
 
  • #675
A. Neumaier said:
What you state here makes the TI a form of determinism (which it is). Superdeterminism, I was told, is a more specific label that does not apply to the TI.
It isn't clear to me from that post alone that the TI actually is or is not superdeterministic, nor whether it conceptually adheres to any other form of predeterminism which has yet to be mathematicized. So far, the TI still seems to be more deterministic than other physical theories are; that isn't a good thing.

It goes without saying that accepting any form of predeterminism is not merely a death blow to some theory, but to science itself: the very idea of experimental verification and falsification would then turn out to be so hopelessly misguided that the entire human scientific enterprise would be rendered a completely ridiculous, enormous waste of time.
 
  • #676
I think we eventually clarified that the TI is not superdeterministic, but since there seems to still be confusion around what this means, I have developed a simple thought experiment to distinguish these two types of hidden variable approaches.

Consider an experiment of two sources (X, Y) of polarization entangled photons, which are anti correlated, and 4 polarization detectors (arranged from left to right as A, B, C, D), where A, B are set up to measure the photons from X. C, D measure photons from Y. Now imagine that detectors B and C are on a lazy susan gear contraption, where the experimenter can turn a crank, and B and C will switch positions, so now X photons go to A & C, and Y photons go to B & D.

Finally, imagine a demon with access to the hidden variable information that perfectly predicts each detector measurement.

Now, regardless of whether the detectors are ordered ABCD or ACBD, the measurement results from left to right can only be one of the following: HVVH, HVHV, VHHV, VHVH where H = horizontal, V = vertical. This is due to the prepared Bell states.
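The counting above can be checked by brute force (purely illustrative). In both arrangements each anticorrelated photon pair lands on the first two and the last two left-to-right positions (A,B and C,D before the crank; A,C and B,D after), so the same four outcome strings survive:

```python
# Brute-force enumeration of the allowed left-to-right outcome strings.
# In both ABCD and (after the crank) ACBD, the anticorrelated pairs occupy
# positions (0, 1) and (2, 3), so the constraint is identical.
from itertools import product

def allowed(s, pairs):
    """s: outcome string like 'HVHV'; pairs: positions fed by one entangled pair."""
    return all(s[i] != s[j] for i, j in pairs)

pairs = [(0, 1), (2, 3)]  # same for ABCD and ACBD
outcomes = ["".join(p) for p in product("HV", repeat=4) if allowed("".join(p), pairs)]
print(sorted(outcomes))  # ['HVHV', 'HVVH', 'VHHV', 'VHVH']
```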

We will now focus on the case where i) the detectors were originally arranged as ABCD, ii) the photons are emitted at t=0, iii) the experimenter turns the crank at t=1, changing the order to ACBD, and iv) the measurement at t=2 reveals the result HVHV.

Finally, we ask the demon to tell us the status of the hidden variables at t=0 and t=1 in this postselected subset of runs. As in the TI, the hidden variables here are assigned to the detectors, not the photons.

In a nonlocal deterministic interpretation, the demon will say that at t=0 the hidden variables were arbitrary, but that by t=2 they were nonlocally steered into the right correlations. For example, it could be that at t=0 the hidden variables predicted HVHV, but then the crank rotation at t=1 turned this into HHVV. Since HHVV is not a possible outcome, between t=1 and t=2 the nonlocal beables had to steer the B and C local beables, so the prediction returned to HVHV.

In a superdeterministic interpretation, the demon will instead say that at t=0, the hidden variable state was always, in every run, HHVV, and it was in fact the crank rotation itself that set it to the acceptable HVHV. There is no nonlocal beable to nudge or steer or correct the hidden variables here, so they had to be fine tuned from the start, as if they already knew the future experimental procedure.

In short, superdeterminism means the hidden variables are fine tuned to always *exploit the crank* in order to deliver correct quantum correlations, seemingly knowing in advance that the crank will be turned, or being astoundingly lucky. In contrast, nonlocality means hidden variables exist to *fix the situation when the crank spoils* the correct quantum correlations.

Retrocausal approaches are sort of a compromise of these two approaches.
 
  • #677
The concept of predeterminism consists of a much larger and wider class of theories and explanations rather than just merely what is called superdeterminism; if the TI actually conforms to any other implementation of predeterminism apart from superdeterminism then this would render the point made above through the thought experiment completely obsolete.
 
  • #678
Auto-Didact said:
The concept of predeterminism consists of a much larger and wider class of theories and explanations rather than just merely what is called superdeterminism; if the TI actually conforms to any other implementation of predeterminism apart from superdeterminism then this would render the point made above through the thought experiment completely obsolete.

I wasn't making an argument, just illustrating nonlocal vs superdeterministic HVs. I've never heard of an interpretation of QM with some alternative "predeterministic" HVs, which would tell a different story than the two I sketched above. Can you give me a concrete example of an interpretation of QM with "predeterministic but not superdeterministic" HVs?
 
  • #679
A. Neumaier said:
No. The correlations are caused by the interaction - without interaction there is of course no measurement result. The beam produces a bilocal field characterized (among others) by local and bilocal beables, namely the q-expectations of ##A(x)##, ##B(y)##, ##A(x)B(y)## and ##B(y)A(x)## at spacetime positions ##x## and ##y##. When reaching the detectors at ##x## and ##y## (including the prepared controls manipulated by Alice and Bob, respectively), these interact according to the deterministic, covariant dynamics of the system beam+detector and result in correlated local measurement results at ##x## and ##y##, depending on these controls.

Thus the explanation is similar to that that you accepted as explanatory in the other interpretations:
Maybe I'm wrong, but I thought the TI says that in such a Bell test there is an electromagnetic beam of expectation value 0. The detectors are basically in a metastable state where tiny random perturbations from the environment will break the symmetry and knock the detector into one of the two possible measurement results (pseudo)randomly. It seems to me that if what the TI calls measurement errors are part of the beam in the first place, then how can you really say the beables are EVs and not regular eigenstates?

Auto-Didact said:
Having read a significant portion of this thread though, it feels to me that the TI is a form of superdeterminism, wherein even what seems to be truly random (i.e. measurement outcomes as dictated by the Born rule) is actually completely a deterministic consequence of the initial condition of the universe, in conjunction with a novel 'thermalization' scheme for generating quasiprobabilities.
This is where I was going with my line of questioning but I wasn't sure and didn't want to presume anything. I understood the TI to coordinate the microstates of the detectors on each side of a bell-type experiment through either special conditions in the initial state of the universe (which I don't think most here would consider to be a satisfactory solution for this type of interpretation), or maybe through some kind of acausal or 4D constraints on the total evolution of the universe (in other words the universe "sniffs out" an acceptable path between the initial and final states, and "chooses" one that respects its physical laws).

I don't find either of these answers to be compelling when it comes to the TI, however, because the interpretation makes no mention of any additional constraints or special initial conditions, whatsoever.
 
  • #680
eloheim said:
It seems to me that if what the TI calls measurement errors are part of the beam in the first place
They are not part of the beam (which is in a stable state as long as it meets no obstacle) but results of the interaction beam+detector, which is in an unstable state and hence moves into a random one among the stable directions. The randomness appears precisely when the instability begins, and the measurement error is due to the fact that an instability magnifies tiny random fluctuations to much larger motions. Thus the errors are an effect of the measurement process, and not a property of the beam.
eloheim said:
the TI to coordinate the microstates of the detectors on each side of a bell-type experiment through [...]
Through neither of these. The nonlocal dynamics depends on the local and bilocal input just prior to the beginning of the interaction, and because it is unstable it forces the combined state (and in particular the pointer q-expectation measured) into one of a few preferred pathways, just like a particle at a saddle of a 2D potential moves into one of the few valleys accessible from the saddle.
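The saddle-point analogy can be sketched with a toy dynamics (purely illustrative, not the TI's actual equations): an arbitrarily tiny fluctuation at the saddle is exponentially amplified and selects one of two macroscopically distinct valleys.

```python
# Toy illustration (not the TI's dynamics): overdamped motion near a saddle,
# with a cubic term that saturates the runaway at the two "valleys" x = +/-1.
#   dx/dt = x - x**3   (unstable direction: amplifies fluctuations)
#   dy/dt = -y         (stable direction: relaxes back)
def settle(eps, steps=5000, dt=0.01):
    x, y = eps, 1.0  # start at the saddle, displaced by a tiny fluctuation eps
    for _ in range(steps):
        x += dt * (x - x**3)
        y += dt * (-y)
    return x, y

# The sign of the microscopic fluctuation selects the macroscopic outcome:
print(settle(+1e-9))  # x settles near +1
print(settle(-1e-9))  # x settles near -1
```

The names and the specific potential are of course hypothetical; the point is only that a deterministic but unstable dynamics turns unresolvable micro-detail into one of a few discrete macro-outcomes.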
 
  • #681
Auto-Didact said:
So far, the TI still seems to be more deterministic than other physical theories are;
See the preceding post #680. What is here more deterministic than in classical mechanics?
 
  • #682
charters said:
I've never heard of an interpretation of QM with some alternative "predeterministic" HVs, which would tell a different story than the two I sketched above. Can you give me a concrete example of an interpretation of QM with "predeterministic but not superdeterministic" HVs?
Predeterminism is a concept much broader than QT, or even physical theory i.e. physics as a whole for that matter. Superdeterminism in the QT literature is one specific mathematical implementation of the far broader concept of predeterminism. I'm not aware of any other specific mathematical implementations that are popular or frequently referenced in modern physics literature.

Note however that this in no way implies that superdeterminism is the sole possible unique implementation of predeterminism, nor does it imply that any other qualitatively different - i.e. based in different forms of mathematics - implementations of predeterminism can not exist; to paraphrase Feynman, to presume the opposite would be wagging the dog by the tail.
A. Neumaier said:
See the preceding post #680. What is here more deterministic than in classical mechanics?
Classical mechanics doesn't depend on carefully tuned initial conditions of the universe, nor does classical mechanics determine the initial condition of the universe. As you know - and I would argue probably understand better than most - this is because classical analytical mechanics is essentially a purely applied mathematical model, which can be reduced to a set of reversible differential equations.

It isn't clear to me whether or not the TI depends on either of those factors (fine tuned initial conditions or determining the initial conditions); it actually isn't necessarily problematic if there is an underlying dynamical model, but in either case I would be interested to know if the TI does depend on such factors. As I said I haven't read the papers yet, I plan to do that the coming week.
A. Neumaier said:
They are not part of the beam (which is in a stable state as long as it meets no obstacle) but results of the interaction beam+detector, which is in an unstable state and hence moves into a random one among the stable directions. The randomness appears precisely when the instability begins, and the measurement error is due to the fact that an instability magnifies tiny random fluctuations to much larger motions. Thus the errors an effect of the measurement process, and not a property of the beam.
This sounds extremely similar to Penrose's proposal; in fact you merely need to replace "interaction beam+detector" with "superposed gravitational fields of the interacting system" and the two proposals are indistinguishable.
 
  • #683
Auto-Didact said:
Classical mechanics doesn't depend on carefully tuned initial conditions of the universe neither does classical mechanics determine the initial condition of the universe.
The same holds for quantum mechanics in the TI. If things are sufficiently well isolated, one can consider small systems and replace everything else, as in classical mechanics, by a standard (but quantum) heat bath, as always done in statistical mechanics. One just needs something that carries matter or energy a macroscopic distance away from where the interaction happens, to get the required dissipation.
Auto-Didact said:
if there is an underlying dynamical model
There is, namely the Ehrenfest dynamics for q-expectations. See Section 2.1 of Part II.
Auto-Didact said:
As I said I haven't read the papers yet, I plan to do that the coming week.
Maybe you should hold off on insinuating remarks until you have read it.
Auto-Didact said:
This sounds extremely similar to Penrose's proposal; in fact you merely need to replace "interaction beam+detector" with "superposed gravitational fields of the interacting system" and the two proposals are indistinguishable.
Yes, there is some similarity. But
  1. the electromagnetic field is fully sufficient to achieve that;
  2. Penrose didn't propose the reality of q-expectations.
 
  • #684
Auto-Didact said:
I'm not aware of any other specific mathematical implementations that are popular or frequently referenced in modern physics literature.

Note however that this in no way implies that superdeterminism is the sole possible unique implementation of predeterminism, nor does it imply that any other qualitatively different - i.e. based in different forms of mathematics - implementations of predeterminism can not exist; to paraphrase Feynman, to presume the opposite would be wagging the dog by the tail.

If you admit you can't propose a concrete implementation of "predeterminism" distinct from the types of HV models I outlined, then what is the merit of this claim? The burden is on you to affirmatively establish the existence of this alternative before anyone needs to worry about it. Otherwise, you are just saying "well, maybe there's something nobody has ever thought of" which is just a generic exception to a huge class of claims across all topics.
 
  • #685
A. Neumaier said:
The same holds for quantum mechanics in the TI. If things are sufficiently well isolated one can consider small systems and replace everything else as in classical mechanics by a standard (but quantum) heat bath, as always done in statistical mechanics.
A. Neumaier said:
There is, the Ehrenfest dynamics for q-expectations. See Section 2.1 of Part II.
The above two statements (seem to) contradict each other: if the TI can be completely reduced to - not merely highly accurately approximated by - a reversible DE or some other similar pure object/function, then it cannot also simultaneously dynamically determine the initial conditions of the universe: orthodox QM is essentially incapable of the latter.
A. Neumaier said:
Maybe you should hold off on insinuating remarks until you have read it.
Maybe, but I don't want to miss out on all the fun going on here before hermetically focusing on that task.
A. Neumaier said:
Yes, there is some similarity. But
  1. the electromagnetic field is fully sufficient to achieve that;
  2. Penrose didn't propose the reality of q-expectations.
1. Yes, except that 1a) gravitational interactions cannot be shielded and 1b) all detection events are always above the one graviton level, thereby automatically making the 'EM sufficiency argument' itself insufficient as a definitive argument.

In either case, definitively resolving this matter requires statistical analysis of the experiment across a large range of experimental parameters in order to make a proper scientific distinction.

2. This is true, which is exactly why the TI is rather attractive: your underlying mathematics might naturally function as a possible completion of Penrose's admittedly incomplete conceptual model! :smile:

In fact, positing that the instability of the states always fundamentally results from the instability of the superposed gravitational fields associated with quanta is not inconsistent with your proposal at all.

Moreover, if the IC of the universe are in fact dynamically determined in the TI, this more gravitationally thermalizing picture might also form a natural route to unifying the Weyl curvature hypothesis with the TI.
 
  • #686
charters said:
If you admit you can't propose a concrete implementation of "predeterminism" distinct from the types of HV models I outlined, then what is the merit of this claim? The burden is on you to affirmatively establish the existence of this alternative before anyone needs to worry about it. Otherwise, you are just saying "well, maybe there's something nobody has ever thought of" which is just a generic exception to a huge class of claims across all topics.
I'm not proposing or admitting anything of the sort; I'm saying that mathematically speaking the issue is genuinely open. Simply pretending that it isn't, merely because it hasn't been settled yet, is de facto a hypothesis belonging not to the methodology of fundamental physics, but to a general strategic scientific methodology focused on getting simple short-term results instead of more difficult long-term ones (i.e. fundamental matters of principle).

To propose otherwise literally requires giving an (in)existence proof for the conjecture 'superdeterminism = predeterminism': in other words, this very general discussion has automatically led to a very specific statement for which a proof can or cannot be given. This challenge may of course be handed off to interested mathematicians and logicians, making the discussion far more productive than it seems from a more naive short-term focus, whether that was our intention or not.
 
  • #687
Auto-Didact said:
Classical mechanics doesn't depend on carefully tuned initial conditions of the universe, nor does classical mechanics determine the initial conditions of the universe.
A. Neumaier said:
The same holds for quantum mechanics in the TI.
Auto-Didact said:
it cannot also simultaneously dynamically determine the initial conditions of the universe
This is a misunderstanding: I never claimed that; see the quotes that I just repeated. The initial conditions are arbitrary, as in each dynamical system.
 
  • #688
A. Neumaier said:
This is a misunderstanding: I never claimed that; see the quotes that I just repeated. The initial conditions are arbitrary, as in each dynamical system.
Understood, which means that the answer to
Auto-Didact said:
It isn't clear to me whether or not the TI depends on either of those factors (fine tuned initial conditions or determining the initial conditions); it actually isn't necessarily problematic if there is an underlying dynamical model
is actually definitively negative, i.e. the IC of the universe aren't determined by Ehrenfest dynamics in the TI; alas, I got my hopes up due to post #683.
 
  • #689
Auto-Didact said:
I don't want to miss out on all the fun going on here before hermetically focusing on that task.

Since this thread is specifically about the thermal interpretation, this is the wrong attitude to take. You need to be familiar with what the interpretation says in order to discuss it. Since you have admitted you are not familiar with what the interpretation says, you are now banned from further posting in this thread.
 
  • #690
akhmeteli said:
I failed to understand how the example in K/P is relevant. Could you please explain?

I understand items 1 and 2 in your quoted post. I don't understand item 3, as I don't know what Hamiltonian you have in mind. (Neither do I understand how you get your conclusion "the dynamics introduces ... after arbitrarily short times nonzero probabilities of finding an initially locally prepared particle", but maybe this will be clearer after you explain item 3.)
Could you please explain? Thank you.

Actually, the form of the Hamiltonian does not matter. See Hegerfeldt's paper
Instantaneous spreading and Einstein causality in quantum theory,
Annalen der Physik 7 (1998), 716--725.
 
  • #691
A. Neumaier said:
The official description of the thermal interpretation of quantum physics can be found in my just finished papers
A. Neumaier said:
When performing on a quantum system a measurement of an operator A with a physical meaning, one gets an approximation for its value. The thermal interpretation treats this value as an approximation not of an eigenvalue of A but of the q-expectation of A, the formal expectation value defined as the trace of the product of A with a density operator describing the state of the system. This deviation from the tradition has important theoretical implications.

To understand.

The q-expectation of A is the expectation value of the eigenvalues of A, based on an infinite number of measurements, right? Isn't it counterfactual?

[attached textbook images omitted]

Mean value of an observable in a given state:

/Patrick
 
  • #692
microsansfil said:
The q-expectation of A is the expectation value of the eigenvalues of A, based on an infinite number of measurements, right?
No. The q-expectation of ##A## is the number ##\langle A\rangle:=Tr~\rho A##. Neither eigenvalues nor measurements are involved in this definition.

To get an interpretation in terms of eigenvalues one has to work hard - namely through a derivation of the spectral theorem - and then one must derive a representation of this short and concise definition as an integral over the spectrum. Nothing like that is needed in the thermal interpretation.

To get an interpretation in terms of measurements one needs even more, as the notion of measurement is not even mathematical. Thus one needs to invoke Born's rule. But this rule is not assumed in the thermal interpretation; it is derived there as an approximate rule, valid under the appropriate conditions.
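The relation between the two readings discussed in this post can be checked numerically. A minimal sketch in Python/NumPy (the 70/30 mixed spin-1/2 state is an arbitrary illustration, not anything from the papers): the q-expectation ##Tr~\rho A## is computed directly from the definition, without reference to measurements, and only afterwards compared with the Born-rule reading as a probability-weighted average of eigenvalues.

```python
import numpy as np

hbar = 1.0  # work in units of hbar

# Spin-z operator for a spin-1/2 particle: eigenvalues +/- hbar/2
Sz = (hbar / 2) * np.array([[1, 0], [0, -1]], dtype=complex)

# An illustrative density operator: mixed state, 70% up, 30% down
rho = np.array([[0.7, 0], [0, 0.3]], dtype=complex)

# q-expectation as defined in the post: <A> := Tr(rho A).
# No eigenvalues, no measurements enter this line.
q_expectation = np.trace(rho @ Sz).real

# Born-rule reading of the same number: eigenvalues of Sz weighted
# by the probabilities <e_i|rho|e_i> of their eigenvectors
eigvals, eigvecs = np.linalg.eigh(Sz)
probs = np.array([(eigvecs[:, i].conj() @ rho @ eigvecs[:, i]).real
                  for i in range(2)])
born_average = float(probs @ eigvals)

print(q_expectation)  # -> 0.2 (up to rounding): 0.7*(+0.5) + 0.3*(-0.5)
print(born_average)   # same number, via the spectral decomposition
```

The point of the comparison is that the first number needs only the trace, while the second route needs the whole spectral machinery to give the same answer.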
 
  • #694
microsansfil said:
Does thermal interpretation lead to a questioning of the postulates of quantum mechanics? such as
[Eigenvalue link and Born's rule]
and replace them with more "primitive" postulates?
It replaces the two postulates mentioned by the weaker postulate that the measurement results deviate from the q-expectation by at most a small multiple of the uncertainty.

This weaker postulate is universally valid, while the traditional eigenvalue link and Born's rule are valid only for small systems with rational eigenvalues.
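The weaker postulate can be illustrated numerically: with the uncertainty ##\sigma_A=\sqrt{\langle A^2\rangle-\langle A\rangle^2}##, one checks how far the possible eigenvalue outcomes lie from the q-expectation in units of ##\sigma_A##. A sketch in Python/NumPy (the mixed spin-1/2 state is an arbitrary illustration):

```python
import numpy as np

hbar = 1.0
Sz = (hbar / 2) * np.diag([1.0, -1.0])
rho = np.diag([0.7, 0.3])            # illustrative mixed spin-1/2 state

mean = np.trace(rho @ Sz)            # q-expectation <Sz>
second = np.trace(rho @ Sz @ Sz)     # <Sz^2>
sigma = np.sqrt(second - mean**2)    # uncertainty of Sz in this state

# The two possible eigenvalue outcomes of a projective spin measurement,
# expressed as deviations from the q-expectation in units of sigma:
for outcome in (+hbar / 2, -hbar / 2):
    deviation = abs(outcome - mean) / sigma
    print(outcome, deviation)        # both within ~2 sigma here
```

In this example both eigenvalue outcomes deviate from the q-expectation by less than two uncertainties, so the weaker postulate is satisfied wherever the traditional eigenvalue link is.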
 
Last edited:
  • #695
This is simply not true! Standard quantum theory works very well for large systems, and eigenvalues are of course usually not rational (in which units anyway? an angular momentum is rational in units of ##\hbar## but obviously not in units of ##h=2 \pi \hbar##).

What's measured doesn't depend on the state the system is prepared in but only on the construction of the measurement device. You have to distinguish between the statistics of observables in a given quantum state and the statistics describing the accuracy of the measurement device. Of course, to understand your measurement device you must in addition account for the systematic errors.

So, whether you measure the "q-expectation value" in a given state or not depends on the experimental setup.

If your measurement device is constructed such that it is accurate enough, you'll always find eigenvalues of the operators describing observables. Of course, the measured value must then be in the discrete spectrum of the observable's operator. If you measure the spin component of a particle accurately enough, you'll always find one of the discrete values ##\sigma \hbar## with ##\sigma \in \{-s,\ldots,s \}##, no matter which state the particle is prepared in. This may be the expectation value in this state or not, depending on the specific state the system is prepared in.

When measuring a continuous observable, of course you always have a finite accuracy. This accuracy again doesn't depend on the state the system is prepared in but on the construction of the measurement device.
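The distinction drawn in this post - single accurate measurements yield eigenvalues, while the expectation value only emerges from the statistics - can be illustrated with a toy Born-rule simulation in Python/NumPy (the 70/30 preparation is an arbitrary choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
hbar = 1.0

p_up = 0.7  # illustrative preparation: 70% spin-up along z
outcomes = rng.choice([+hbar / 2, -hbar / 2], size=100_000,
                      p=[p_up, 1 - p_up])

# Each single accurate measurement yields one of the two eigenvalues...
assert set(np.unique(outcomes)) == {+hbar / 2, -hbar / 2}

# ...while only the sample mean over many runs approaches the
# expectation value 0.7*(+0.5) + 0.3*(-0.5) = 0.2
print(outcomes.mean())
```

Whether the experimental setup records the individual eigenvalue outcomes or only their average is thus a property of the setup, not of the prepared state.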
 
  • #696
vanhees71 said:
eigenvalues are of course usually not rational (in which units anyway, an angular momentum is rational in units of ##\hbar## but obviously not in units of ##h=2 \pi \hbar##).
Of course - though recently, ##h## (not ##\hbar##) was defined to be rational!

But measurements usually produce rational numbers of not too high precision. Thus the ''third postulate'' in post #693 is usually at least slightly violated in measurements of observables corresponding to operators with irrational eigenvalues.

The thermal interpretation postulates the (by the theorem in Section 2.6 of Part III) strictly weaker property mentioned in post #694, hence is more cautious than the traditional postulates that the measured value is an eigenvalue, obtained with the probability given by Born's rule. Being strictly weaker, it remains valid in all cases where the traditional postulates apply.

But the new postulate does not produce the nonsensical claim that the quite inaccurate spin measurements of the early days of quantum mechanics provided as result one of the eigenvalues ##\pm\hbar/2## of the angular momentum component ##S_z##. They provided only (compared to today) crude approximations to these eigenvalues.
 
Last edited:
  • #697
Sure, it's always true that measurements are only as accurate as the measurement devices are constructed to be. Of course, in a physical theory the corresponding statistical and systematic error analysis is not built in. That wouldn't make much sense, since it would mean you'd have to formulate a separate theory for every measurement device. This is precisely what you do not want when formulating a theory that is supposed to be generally valid. Of course, to compare any experiment with the general theory you have to do a careful analysis of the statistical and systematic errors, in order to be able to decide (objectively!) whether the result of your measurements/experiments/observations is in accordance with the generally valid theory or not.

No matter how you interpret ##\langle A \rangle = \mathrm{Tr}(\hat{A} \hat{\rho})## within the formalism of QT, it doesn't contain the specifics of a measurement device.
 
  • #698
Hi,

I searched in vain for a peer-reviewed publication of the thermal interpretation of quantum physics. Has there been no peer review of the thermal interpretation?

/Patrick
 
  • #699
microsansfil said:
I searched in vain for a peer-reviewed publication of the thermal interpretation of quantum physics. Has there been no peer review of the thermal interpretation?
The material is too new. Peer-reviewed publication takes time.
 
  • #700
microsansfil said:
Hi,

I searched in vain for a peer-reviewed publication of the thermal interpretation of quantum physics. Has there been no peer review of the thermal interpretation?

/Patrick
I'm very curious whether this idea will ever pass peer review... Nowadays that's not too unlikely, though...
 
