The thermal interpretation of quantum physics

In summary: I like your summary, but I disagree with the philosophical position you take. I think Dr Neumaier has a good point: QFT may indeed be a better place for interpretations. I do not know enough of his thermal interpretation to comment on its specifics.
  • #211
A. Neumaier said:
No, you asserted that:

Yes, I should have said "isn't that ...".
 
  • #212
A. Neumaier said:
But even asking questions above your level of grasping things is unproductive.

Theoretically that is true; in practice, however, it is very hard to know exactly how much, so it is hit and miss (I do stay away from things I don't understand fully). Although I am an EE major, I think I understand much more physics than the typical undergrad who cranks through problems without thinking. I have also obtained patents in the mechanical engineering field, which means I have solved problems that MEs should have. I am not bragging, just indicating that there is nothing God-given about it.
 
  • #213
ftr said:
Theoretically that is true; in practice, however, it is very hard to know exactly how much, so it is hit and miss (I do stay away from things I don't understand fully).
The point is that while your understanding is limited you should assume that the answers you get from better-informed people are reasonable, and adapt your mental picture rather than spitting out your momentary thoughts. Usually you need to do some background reading in parallel that lets you understand why the answer is meaningful. You cannot get an understanding of quantum field theory just from taking part in dialogues...
 
  • Like
Likes weirdoguy
  • #214
A. Neumaier said:
But not primarily. If this were the true goal of physics, work in physics would not be funded and would not be of interest to society. Physics is about understanding the properties of matter and radiation to an extent that it can be used for understanding and controlling the world at large. Observations and measurements are tools to ensure that the models we form to do this are indeed adequate.

No. Only knowing the detailed state is impossible. But its existence - and an interpretation of what it means - must be assumed even to apply the methods of statistical mechanics. Thus, unless quantum mechanics has intrinsic limitations, it must be possible - as in Laplace's classical clockwork universe - to phrase quantum mechanics such that it applies to (and models) everything in the universe, no matter how big or complex it is.

Because it is needed nowhere in classical physics, although there the same limitations you mention apply. Having measurement in the foundations makes the foundations depend on human activities. But physical laws must also apply to physical systems never measured by humans, like distant galaxies of which we measure only very little light, or the early stages of the solar system, of which we can measure nothing but only infer information by assuming the validity of physical laws.
Well, physics (and all the other sciences) is so well funded because of this very pragmatic approach. It describes what's observed via real-world devices, and it's not primarily about some fundamental philosophical questions about fictions. That's the reason why the outcome of the fundamental sciences can be successfully used "to control the world" in the sense of applied sciences like engineering. That's what made the semiconductor revolution possible, and that's why we can sit in front of a little box and discuss very efficiently in this forum. Quantum theory is understood well enough to describe semiconductors and to create transistors and ICs.

Concerning the "detailed state", I've still not understood what's new in your interpretation. So far, I'm only forbidden to use the very plausible statistical interpretation and the "q-expectations" as usual expectation values of the model to associate probabilities for measurement outcomes (i.e., for outcomes about the observationable macroscopic phenomena).

I'm pretty sure that QT, too, has limitations, although we don't know precisely where they are, and there's no clear empirical evidence for such limitations (except the fact that gravitation is not understood on the level of QT). I think, however, it's very clear what QT means: it's a formalism to associate probability distributions with measurement outcomes, and as far as it has been tested, it is very successful, including the "apparently weird" consequences of the strong correlations described by entanglement.

In classical physics, as in all physics, you need clear definitions of observables and states from the very beginning. The Theo 1 lecture on classical mechanics starts with the very definition of the observables, i.e., position, velocity, and acceleration, based on Euclidean geometry and time. This is of course brought into a formal setting called "Galilei-Newton spacetime", i.e., a fiber bundle, but that's just a convenient mathematical formulation of the theoretical description; the physics is about measurable outcomes, e.g., in celestial mechanics, which turns out to be accurate enough to fly to the Moon or to bring a little lab to a small body like an asteroid. Despite the very formal description, physics is always about real-world measurements and observations. Losing contact with these operational foundations has never led to much progress in the natural sciences. In this sense Hossenfelder is right in saying that some parts of theoretical physics are "lost in math".
 
  • Like
Likes Spinnor
  • #215
A. Neumaier said:
It is not a fiction but simply something partially unknown, as anything in physics that has more than a few discrete possible values.

According to your reasoning, the state of a single photon (a general uncharged 1-particle state of QED) is also in principle unobservable, since even when assumed pure (in reality it is never pure) the state is parameterized by the solutions of the free Maxwell equations, and one can measure or prepare only a crude approximation of it.

The fact is that most of what physics models theoretically is unobservable in this sense. But we nevertheless assume that these unobservable states actually exist, since only then can we talk about approximating them with the things we use to represent our knowledge about them.

The state of the universe is approachable similarly. First of all, it has a huge algebra of q-observables localized on Earth and hence susceptible to observation. Each of these observations reveals something about the state of the universe. All this combined gives us quite good (though quite coarse) knowledge about the universe, not only on Earth but even far away - where we can infer what happens because we assume (and find that we can assume consistently) that the q-observables of the universe that refer to other regions of the universe follow the same laws as we know them from Earth. We can then use the maximum entropy principle to get an approximate state of the universe based on the q-expectations we believe we know (primarily smeared values of various effective fields) and obtain an approximate hydrodynamic (1PI) description of the state of the universe. In this approximation it looks essentially classical except very close to the big bang, and hence can be (and is) described by classical physics.

Nowhere any fiction, everywhere only the usual approximations we know from the study of all real physical systems.
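(Editorial aside on the maximum entropy step just quoted, a standard construction of statistical mechanics rather than anything specific to the thermal interpretation: maximizing the von Neumann entropy ##S=-\mathrm{Tr}\,\rho\ln\rho## subject to prescribed q-expectations ##\mathrm{Tr}\,\rho A_k=\bar A_k## yields a generalized Gibbs state
$$\rho=\frac{1}{Z}\exp\Big(-\sum_k\lambda_k A_k\Big),\qquad Z=\mathrm{Tr}\,\exp\Big(-\sum_k\lambda_k A_k\Big),$$
with the Lagrange multipliers ##\lambda_k## fixed by the constraints.)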
Let's stick to the single-photon example, because it's well-defined and perfect for making the point we discuss. I still think "the state of the entire universe" is an empty phrase, and it's hard to discuss such fictions on a sound basis.

What's really observed concerning photons (or more generally any state of the electromagnetic field) are events due to the interaction of the em. field with macroscopic devices, usually via the photoelectric effect, described by QED in the simplest case as the probability for a macroscopic detector located at some macroscopically determined place to make a "click". To establish that the "preparation procedure" used to provide "a single photon" really does so, one has to perform many such experiments on corresponding ensembles and apply statistical evaluation procedures to establish the corresponding probability distributions expected from theory. The naive view of the old quantum theory a la Einstein, of the photon as a "little bullet" a la classical "point particles", was flawed, and after all Planck, with his view about what's really observable, namely the interaction of the field with macroscopic matter, was right!
 
  • #216
vanhees71 said:
Let's stick to the single-photon example, because it's well-defined and perfect for making the point we discuss. I still think "the state of the entire universe" is an empty phrase, and it's hard to discuss such fictions on a sound basis.
A discussion of the single photon (which I gave in Subsections 3.4 and 3.5 of Part III) does not at all make the point I am trying to get across.

The state of the universe is indeed a fiction in the statistical interpretation (and in the Copenhagen interpretation). But in the thermal interpretation, the state of the universe is no more a fiction than it was in Laplace's classical theory ("Le vrai système du monde"). It is no longer an empty phrase but a density operator on some universal Hilbert space. This universal Hilbert space carries a representation of quantum fields - those of the standard model plus gravity, in whatever form it will be made definite once we know how to model it correctly. This state contains all the information about anything we can observe in the universe, and hence it contains all of physics. It is the state modeled in some coarse-grained approximation by cosmologists.

It is the very basis upon which one must discuss what you call the real problems of quantum physics:
vanhees71 said:
The real problems are [...] the open unsolved questions of contemporary physics, which are

- a consistent quantum description of the gravitational interaction [...]

- the nature of what's dubbed "Dark Energy" and "Dark Matter"
We know of these problems (thoroughly discussed, e.g., in the book by Calcagni, Classical and Quantum Cosmology, Springer 2017) only because of cosmological models - for the observation of tiny systems, everything is already consistent with treating gravitation as an external classical potential and ignoring dark matter and dark energy. These cosmological models at present use primarily semiclassical approximations, but there can be no doubt that more accurate models should be fully quantum. See
M. Bojowald, Quantum cosmology: a review, Rep. Prog. Phys. 78 (2015), 023901.
Martin Bojowald said:
Quantum cosmology is based on the idea that quantum physics should apply to anything in nature, including the whole universe.
See also Hartle's The quantum mechanics of cosmology.
James Hartle said:
It is an inescapable inference from the physics of the last sixty years that we live in a quantum mechanical universe — a world in which the basic laws of physics conform to that framework for prediction we call quantum mechanics. If this inference is correct, then there must be a description of the universe as a whole and everything in it in quantum mechanical terms. The nature of this description and its observable consequences are the subject of quantum cosmology.[...]
The “Copenhagen” frameworks for quantum mechanics [...] are inadequate for quantum cosmology [...] these formulations characteristically assumed a possible division of the world into “observer” and “observed”, assumed that “measurements” are the primary focus of scientific statements and, in effect, posited the existence of an external “classical domain”. [...]
Measurements and observers cannot be fundamental notions in a theory that seeks to describe the early universe when neither existed.
Unlike the Copenhagen or statistical interpretation, the thermal interpretation provides a sound basis for their discussion, without the weird features of the many-worlds interpretation that needs to be invoked by Calcagni at several places (pp.188,197,391) and (in a many-histories variation called post-Everett) by Hartle (pp.5,17,89).
 
Last edited:
  • Like
Likes julcab12
  • #217
A. Neumaier said:
The point is that while your understanding is limited you should assume that the answers you get from better-informed people are reasonable, and adapt your mental picture rather than spitting out your momentary thoughts. Usually you need to do some background reading in parallel that lets you understand why the answer is meaningful. You cannot get an understanding of quantum field theory just from taking part in dialogues...

I usually take what you and some others say as 90% correct, and I do google things. I also go back and forth using my https://lh3.googleusercontent.com/-VDpdADUwSTM/XI-Crv4QMGI/AAAAAAAAJls/lKTuk66fX8MWus2gPGxWkQRkbL34O5SVgCL0BGAs/w663-d-h884-n-rw/20190318_141542.jpg. I make a lot of money using science, especially math, so I am a firm believer in (and a practitioner of) the process.
 
  • #218
How often should I emphasize that Green's functions do not directly describe observable facts but are calculational tools enabling one to calculate them? E.g., the Wigner-transformed one-particle Green's function (the (connected) two-point function) gets a directly observable description in the sense of a phase-space distribution function only after it's appropriately coarse-grained over "macroscopically small, microscopically sufficiently large" space-time regions. Often you can coarse-grain even further and use hydrodynamical descriptions (e.g., close to local thermal equilibrium). At first I thought your thermal interpretation goes precisely in this direction, and I'd really appreciate a mathematically more solid argument towards it; but if it is impossible to say what the meaning of the q-averages actually is in terms of their relation to observations, it's rather obscuring the issue than helping.
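(Editorial aside: in one common convention, the Wigner transform referred to here reads
$$f(x,p)=\int\mathrm{d}^4\xi\;\mathrm{e}^{\mathrm{i}p\cdot\xi}\,G^{<}\!\Big(x+\tfrac{\xi}{2},\,x-\tfrac{\xi}{2}\Big),$$
and it acquires the meaning of a phase-space distribution function only after the coarse-graining over spacetime cells described above.)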

Also, in physics averages are often not meant a la Gibbs; rather, the averaging is already built into the measurement device. E.g., to measure the "intensity of light" you in general don't prepare many single photons, but you just "shine light at a photoplate". The intensity given by the developed plate then directly gives the corresponding macroscopic observable, which is a coarse-grained (spatially, through the finite resolution) and "time-summed" (suitably normalized time-averaged) quantity, as defined already in classical electrodynamics (as the space-time averaged/integrated energy density). This "coarse-graining" is achieved automatically by the measurement device (in this case a simple photoplate or a more modern equivalent of it like a CCD cam).
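(Editorial aside: in the simplest classical case, a monochromatic plane wave of amplitude ##E_0##, the time-averaged energy flux recorded by the plate is
$$\bar I=\frac{1}{T}\int_0^T|\vec E\times\vec H|\,\mathrm{d}t=\frac{1}{2}\,\varepsilon_0 c\,E_0^2,$$
independent of the unresolved oscillation at the optical frequency.)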

In high-energy particle physics or in quantum optics with single photons, on the other hand, the measurements are often in the sense of a Gibbs ensemble, i.e., you observe a given scattering process very often on equally prepared systems, e.g., proton-proton collisions at a quite well-defined cm energy at the LHC leading to Higgs production (and the Higgs of course is observed in as many of its decay channels as possible).

Interpretation is about the meaning of the formalism concerning observations, i.e., measurements with real-world devices in the lab, no more, no less. So what are your "q-expectations" if one is not allowed to use the very intuitive probabilistic interpretation for the outcome of a measurement?
 
  • #219
vanhees71 said:
How often should I emphasize that Green's functions do not directly describe observable facts but are calculational tools enabling one to calculate them?
Until you realize that - as the thermal interpretation asserts - the same holds for everything in quantum physics, including all q-expectations, not only the ones in 2PI calculations. Comparison with experiment (and hence the empirically testable meaning) concerns only a minute fraction of the q-expectations manipulated, primarily:
  • those that describe cross sections or impact rates, which are experimentally determined by statistics over many events,
  • those that describe spectroscopic information (spectral lines and widths), which are experimentally determined by nonstatistical brightness measurements, and
those that describe smeared field expectations and linear or nonlinear response functions to small external stimuli, which are experimentally determined by a few measurements, likewise not involving any statistics.
Thus the statistical interpretation is secondary, and restricted to experiments of the first group.
vanhees71 said:
So what are your "q-expectations" if it is not allowed to use the very intuitive probabilistic interpretation for the outcome of measurement?
They are the stuff that is manipulated by theory, until one ends up with one of the quantities in the above three groups and interprets them in the way appropriate for these groups.
 
  • #220
A. Neumaier said:
A discussion of the single photon (which I gave in Subsections 3.4 and 3.5 of Part III) does not make at all the point I am trying to get across.

The state of the universe is indeed a fiction in the statistical interpretation (and in the Copenhagen interpretation). But in the thermal interpretation, the state of the universe is no more a fiction than it was in Laplace's classical theory ("Le vrai système du monde"). It is no longer an empty phrase but a density operator on some universal Hilbert space. This universal Hilbert space carries a representation of quantum fields - those of the standard model plus gravity, in whatever form it will be made definite once we know how to model it correctly. This state contains all the information about anything we can observe in the universe, and hence it contains all of physics. It is the state modeled in some coarse-grained approximation by cosmologists.

It is the very basis upon which one must discuss what you call the real problems of quantum physics:

We know of these problems (thoroughly discussed, e.g., in the book by Calcagni, Classical and Quantum Cosmology, Springer 2017) only because of cosmological models - for the observation of tiny systems, everything is already consistent with treating gravitation as an external classical potential and ignoring dark matter and dark energy. These cosmological models at present use primarily semiclassical approximations, but there can be no doubt that more accurate models should be fully quantum. See
M. Bojowald, Quantum cosmology: a review, Rep. Prog. Phys. 78 (2015), 023901.

Unlike the statistical interpretation, the thermal interpretation provides a sound basis for their discussion, without the weird features of the many-worlds interpretation that needs to be invoked by Calcagni at several places (pp. 188, 197, 391).
Well, Laplace's demon was already a pure fiction, even taking into account only the then-known "universe". In the lab there are no "density operators on some universal Hilbert space" but real-world devices (in HEP something like silicon chips, calorimeters, RICH detectors, etc., with some read-out electronics). It's the relation of the "density operators on some universal Hilbert space" to the corresponding observations with these real-world devices which makes an interpretation. You might say the minimal statistical interpretation is unsatisfactory for one or another philosophical reason, but at least it gives a clear concept for making this relation between real-world measurement devices and the formalism.

It's of course a bit difficult to discuss a theory, quantum gravitation, before it's even formulated as a formalism. Let's rather look at the cosmology we really have today. First of all, it's a description of "the world" which starts with a very coarse-grained picture, based on the fundamental (and in principle not empirically testable) assumption of the "cosmological principle", i.e., the assumption that on a sufficiently coarse-grained scale any place in the universe is like any other, i.e., that there's a (local) reference frame (defining "fundamental observers") with no preferred place (homogeneity) or direction (isotropy). This leads to the FLRW models, a big bang, etc. The observables are, naturally, local observables around the Earth, like the redshift-distance relations of supernovae and the fluctuations of the CMBR. Nowhere does one come close to an understanding of what a "quantum state of the entire universe" might be.
 
  • #221
A. Neumaier said:
Until you realize that - as the thermal interpretation asserts - the same holds for everything in quantum physics, including all q-expectations, not only the ones in 2PI calculations. Comparison with experiment (and hence the empirically testable meaning) concerns only a minute fraction of the q-expectations manipulated, primarily:
  • those that describe cross sections or impact rates, which are experimentally determined by statistics over many events,
  • those that describe spectroscopic information (spectral lines and widths), which are experimentally determined by nonstatistical brightness measurements, and
those that describe smeared field expectations and linear or nonlinear response functions to small external stimuli, which are experimentally determined by a few measurements, likewise not involving any statistics.
Thus the statistical interpretation is secondary, and restricted to experiments of the first group.

They are the stuff that is manipulated by theory, until one ends up with one of the quantities in the above three groups and interprets them in the way appropriate for these groups.
For me, all three bullets are clearly explained by the standard probabilistic interpretation of QT. Bullet 1 is indeed what comes closest to a "Gibbs-ensemble view". Bullet 2, spectroscopy (spectral lines and widths), also involves generic averages/integrations over many "elementary photon-detection events". The same holds for bullet 3, where the detector response already provides enough coarse-graining/integrating/averaging over many microscopic degrees of freedom to lead to a macroscopic observable. It's not that we have to resolve all these single microscopic events; rather, the device directly provides the relevant coarse-grained observables.

It's like measuring "the amount of matter" of a given substance: In everyday life we just weigh something, e.g., in terms of 12 g carbon rather than counting ##N_A## carbon atoms (although in May we'll indeed define the underlying mass unit kg, precisely by just counting particles, but that's another story).
 
  • #222
vanhees71 said:
Let's rather look at the cosmology we really have today.
It is classical; so you seem to advocate that physics at large scales is necessarily classical. But at which size should one draw the borderline? You have the standard Heisenberg cut of the Copenhagen interpretation, and move it wherever you think current observational limits allow it to be placed without an observable contradiction. This may be enough for the practitioner...

... but why then worry about quantum gravity or dark matter or dark energy? They only matter on the classical level you describe, without leaving any quantum trace in the foreseeable future!
 
Last edited:
  • #223
vanhees71 said:
It's not that we have to resolve all these single microscopic events; rather, the device directly provides the relevant coarse-grained observables.
Yes. But which microscopic events? You cannot even point to them at the theoretical level, except for ideal gases, let alone measure them experimentally. Therefore - by your stated standards, that what cannot be measured is fiction - these events are pure fiction, at least as much as the state of the universe. Only the q-expectation of the coarse-grained field used for comparison has an experimental (nonstatistical) meaning.

The thermal interpretation does not interpret these fictitious microevents at all but is content to interpret the final q-expectations as the slightly uncertain values measured by some experiment.
 
Last edited:
  • Like
Likes dextercioby
  • #224
vanhees71 said:
For me all the three bullets are clearly explained by the standard probabilistic interpretation of QT
Because you allow yourself extreme liberties in what to call a measurement, and don't care to be precise in your arguments. Thus for you everything is in full order, the philosophical quibbles (which arise when looking at the details) are irrelevant, and you have no empathy for all those who have higher standards of consistency.

Even living Nobel prize winners such as Steven Weinberg (and he is not the only one) are discontented with the statistical interpretation. Weinberg has the same theoretical background as you and the same information about experimental practice. He discusses (after the Copenhagen interpretation) the statistical interpretation on pp. 92-95 of his 2013 textbook Lectures on Quantum Mechanics and concludes:
Steven Weinberg said:
There is nothing absurd or inconsistent about [...] the general idea that the state vector serves only as a predictor of probabilities, not as a complete description of a physical system. Nevertheless, it would be disappointing if we had to give up the “realist” goal of finding complete descriptions of physical systems, and of using this description to derive the Born rule, rather than just assuming it. We can live with the idea that the state of a physical system is described by a vector in Hilbert space rather than by numerical values of the positions and momenta of all the particles in the system, but it is hard to live with no description of physical states at all, only an algorithm for calculating probabilities. My own conclusion (not universally shared) is that today there is no interpretation of quantum mechanics that does not have serious flaws.
You cannot point to his age of 80 when he published the book, for the whole book is written in an excellent style showing no signs of senility.

The thermal interpretation, on the other hand, presents such a realist description.
 
  • Like
Likes dextercioby
  • #225
A. Neumaier said:
content to interpret the final q-expectations as the slightly uncertain values measured by some experiment.

What do you mean by "slightly"? Are you saying that measuring the electron position in a hydrogen atom will always give a value very close to the expectation?
Also, you seem to deny superposition in general; is that correct?
 
  • #226
ftr said:
What do you mean by "slightly"? Are you saying that measuring the electron position in a hydrogen atom will always give a value very close to the expectation?
It would be close to the nucleus, not significantly further away than the atomic radius.
ftr said:
Also, you seem to deny superposition in general; is that correct?
No. Superpositions are pure states, hence generally idealizations, except when only very few degrees of freedom are involved. Thus they don't deserve the attention they traditionally get. Most states encountered in Nature are mixed states, and are treated as such in real experiments.
 
  • Like
Likes dextercioby
  • #227
A. Neumaier said:
not significantly further

Then would you say that a distance of two Bohr radii is extremely unlikely?
 
  • #228
ftr said:
Then would you say that a distance of two Bohr radii is extremely unlikely?
Not extremely. In any case, it would be very difficult to measure, and whatever measurement is made, it would by definition have at least this uncertainty.
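(Editorial sanity check of this exchange, a minimal sketch assuming the textbook hydrogen 1s radial density ##p(r)=4r^2e^{-2r}## in units of the Bohr radius ##a_0##: it gives ##\langle r\rangle=1.5\,a_0## and roughly a 24% chance of finding the electron beyond ##2a_0##, consistent with "not extremely" unlikely.)

```python
import numpy as np

# Radial probability density of the hydrogen 1s state in units of the
# Bohr radius a0 (textbook result): p(r) = 4 r^2 exp(-2 r).
r = np.linspace(0.0, 50.0, 200_001)
dr = r[1] - r[0]
p = 4.0 * r**2 * np.exp(-2.0 * r)

mean_r = np.sum(r * p) * dr             # analytic value: <r> = 1.5 a0
prob_beyond = np.sum(p[r > 2.0]) * dr   # analytic value: 13*e^{-4} ~ 0.238

print(f"<r>         = {mean_r:.3f} a0")
print(f"P(r > 2 a0) = {prob_beyond:.3f}")
```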
 
  • #229
A. Neumaier said:
Yes. But which microscopic events? You cannot even point to them at the theoretical level, except for ideal gases, let alone measure them experimentally. Therefore - by your stated standards, that what cannot be measured is fiction - these events are pure fiction, at least as much as the state of the universe. Only the q-expectation of the coarse-grained field used for comparison has an experimental (nonstatistical) meaning.

The thermal interpretation does not interpret these fictitious microevents at all but is content to interpret the final q-expectations as the slightly uncertain values measured by some experiment.
Obviously QT is too difficult for us to come to a common understanding :-(.

Let's thus go one step back and look at light as a classical electromagnetic wave. One important "macroscopic observable" is the "intensity" we measure with a CCD camera. The "microscopic detail" is the time-dependent electromagnetic field, or in our case its energy density at the CCD screen. We don't resolve the time dependence of this field with that device but directly get a time-averaged quantity, which we call intensity. I don't know why you would call the time dependence of the electromagnetic field a "pure fiction". Nevertheless, at the resolution of the device, and for the purpose of describing the phenomenon, it's just the intensity which is of interest to us. This does not mean that the underlying dynamics of the em. field at a finer time scale is "pure fiction".

Another example from classical electromagnetism is the notion of a DC current at finite (say room) temperature, measured with an old-fashioned galvanometer, whose functioning is the simple mechanics of a coil in a magnetic field and a spring with appropriate damping. It's clear that this device (on purpose!) has enough inertia to show just a constant current if you hook it up to a battery and a resistor in series. On the macroscopic level, the current of this setup is indeed time-independent (for a sufficiently long time after the current has been turned on, which however is usually very short on the time scale the galvanometer is able to resolve). That's the "macroscopic level of description" and sufficient for almost all purposes. However, on a microscopic scale it's clear that the current is due to the motion of the thermalized conduction electrons in the wire and the resistor. This is a nearly free Fermi gas of conduction electrons at a finite temperature, and thus each electron has a thermal velocity on top of its "macroscopic" drift velocity, which is given by the (thermal) average; you can understand this average either as a temporal coarse-graining over a time scale given by the typical relaxation time of the galvanometer, or as an average over the very many electrons in a sufficiently large volume of the wire, both leading to the same result. On a more microscopic time scale there are of course thermal fluctuations of the electrons' velocities, and this is no fiction but can be measured as shot noise with some device of better resolution.

So, even at the classical level, it depends on the measurement devices used at which resolution you look at phenomena, and fluctuations/noise are treated statistically (including the use of stochastic differential equations like the Langevin equation on top of the macroscopic "hydro-like" equations), although classical physics is completely deterministic. (A minimal numerical sketch of this point follows below.)
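(Editorial sketch with made-up parameters: an Ornstein-Uhlenbeck Langevin equation for a single conduction-electron velocity, ##\mathrm{d}v=-\gamma(v-v_{\rm drift})\,\mathrm{d}t+\sigma\,\mathrm{d}W##, integrated by the Euler-Maruyama method. The instantaneous velocity fluctuates, while a galvanometer-like time average recovers the deterministic drift.)

```python
import numpy as np

# Ornstein-Uhlenbeck "Langevin" model (hypothetical units and parameters):
#   dv = -gamma * (v - v_drift) * dt + sigma * dW
rng = np.random.default_rng(0)
gamma, v_drift, sigma = 1.0, 1.0, 2.0
dt, n_steps = 1e-3, 500_000

v = np.empty(n_steps)
v[0] = 0.0
for i in range(1, n_steps):
    dW = rng.normal(0.0, np.sqrt(dt))   # Wiener increment
    v[i] = v[i - 1] - gamma * (v[i - 1] - v_drift) * dt + sigma * dW

# The instantaneous value fluctuates (stationary std ~ sigma/sqrt(2*gamma));
# the coarse-grained time average sits close to v_drift.
print(f"instantaneous v (last step):      {v[-1]:+.3f}")
print(f"time-averaged v (coarse-grained): {v.mean():+.3f}")
```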

For me, the only difference between the classical picture, which is in contradiction with very common phenomena such as the stability of the matter around us, and the quantum picture is that on a fundamental ("microscopic") level there's (within standard QT) no deterministic description, but only a probabilistic one. It may be that QT is not the final answer in the quest for ever better theoretical descriptions of nature, but so far there's no hint of how a possibly existing more comprehensive theory might look, and it is not clear at all whether there's even the possibility of a (most probably non-local) deterministic theory (as Bohmian mechanics in fact is for non-relativistic QT).

Your last sentence stays nearly correct if you just exchange "thermal interpretation" with "minimal statistical interpretation". It's only clear that "microevents" are not fictitious as far as known microscopic facts about the structure of matter are concerned, as in the classical example above: the macroscopic DC current is defined at the resolution of a galvanometer with sufficient inertia and appropriate damping to provide the coarse-grained macroscopic DC current. At a higher resolution it's well possible to measure the shot noise due to thermal fluctuations of the electrons (a la Nyquist).

I still don't see any difference between your "thermal interpretation" and the "minimal statistical interpretation", except that you substitute the clear probabilistic meaning of the standard formalism with some other concept that you have not yet made clear to me.
 
  • Like
Likes Spinnor
  • #230
A. Neumaier said:
Because you allow yourself extreme liberties in what to call a measurement, and don't care to be precise in your arguments. Thus for you everything is in full order, the philosophical quibbles (which arise when looking at the details) are irrelevant, and you have no empathy for all those who have higher standards of consistency.

Even living Nobel prize winners such as Steven Weinberg (and he is not the only one) are discontented with the statistical interpretation. Weinberg has the same theoretical background as you and the same information about experimental practice. He discusses (after the Copenhagen interpretation) the statistical interpretation on pp. 92-95 of his 2013 textbook Lectures on Quantum Mechanics and concludes:

You cannot point to his age of 80 when he published the book, for the whole book is written in an excellent style showing no signs of senility.

The thermal interpretation, on the other hand, presents such a realist description.
Well, I'm a big fan of Weinberg and particularly of his textbooks. The trouble is that, as he says in the quoted paragraph, on the one hand there's today no other description of the observations than standard QT, including Born's rule as one of the independent postulates of the formalism (he carefully analyzes in this very concise chapter on "interpretation" that it cannot be derived from the other postulates), but on the other hand he obviously also has no alternative theory or interpretation overcoming what he (and obviously many others) considers a problem, namely the probabilistic meaning of the state.
 
  • #231
vanhees71 said:
For me, the only difference between the classical picture, which is in contradiction with very common phenomena such as the stability of the matter around us, and the quantum picture is that on a fundamental ("microscopic") level there's (within standard QT) no deterministic description, but only a probabilistic one.
vanhees71 said:
he obviously also has no alternative theory or interpretation overcoming what he (and obviously many others) considers a problem, namely the probabilistic meaning of the state.
But I have one. The thermal interpretation makes quantum physics as deterministic as classical physics, and explains all probabilistic quantum effects as resulting from coarse-graining, in the same way as you did for the classical case in your previous post #229. In particular, Born's probabilistic interpretation follows, where it applies, from the deterministic rules and coarse-graining.

Thus there is no need to assume a fundamental probabilistic description. This is the essential difference to the statistical interpretation.
 
  • #232
I think what is not clear to you is similar to my problem when I ask: the expectation value of the "density" of what physical things? Is that correct?
 
  • #233
That's precisely what's not clear to me. You give formal equations which are NOT interpreted or somehow motivated from physics at all. Of course, everything is just set up such that the standard interpretation results at the end, but this is already implicitly assumed, just not stated.
 
  • #234
ftr said:
I think what is not clear to you is similar to my problem when I ask: the expectation value of the "density" of what physical things? Is that correct?
A measurable density is the q-expectation of the 0-component of a current - an effective relativistic vector field associated to some property distributed in space.
 
  • #235
ftr said:
I think what is not clear to you is similar to my problem when I ask: the expectation value of the "density" of what physical things? Is that correct?
I think so. My main problem is to make sense of the term "expectation value" if there's no probability theory behind it. So far I don't see what motivates the formal rules or makes them at least plausible as referring to physics rather than to a purely axiomatic "game" of math; but that's what "interpretation" is all about, i.e., how to make sense of the formalism in its application to real-world observations (and measurements are just refined quantitative observations!).

It's also clear that physics never provides a "final answer" to the question of the interpretation of a given theory (nor is any given theory the "final answer" in the quest for ever better descriptions of what we can objectively observe about nature). "Today's signal is tomorrow's background!"
 
  • #236
A. Neumaier said:
A measurable density is the q-expectation of the 0-component of a current - an effective relativistic vector field associated to some property distributed in space.
This is a circular definition. In the standard interpretation it's clear that density is a spatially coarse-grained observable in the sense of an average over many microscopic degrees of freedom. What is this mysterious "q-expectation" if it's not such an average in the usual probabilistic sense? That's the key issue preventing a physical understanding of the proposed "thermal interpretation"!
 
  • #237
vanhees71 said:
everything is just set up such that the standard interpretation results at the end, but this is already implicitly assumed, just not stated.
No. Like you did in the classical case, I nowhere assume anything probabilistic in the quantum case. Randomness appears, as in classical physics, through the breaking of metastability by the deterministic noise neglected in the coarse-graining. Of course it will agree with the standard interpretation where the latter is based on actual measurement.
 
Last edited:
  • Like
Likes julcab12
  • #238
vanhees71 said:
I still don't see any difference between your "thermal interpretation" and the "minimal statistical interpretation", except that you substitute the clear probabilistic meaning of the standard formalism with some other concept that you have not yet made clear to me.

It seems to me that the big difference is that @A. Neumaier's interpretation does NOT make measurement the center of the interpretation. His macroscopic quantities - coarse-grained field values, correlations, and whatever else they are - are assumed to have values whether or not anybody is measuring them.

Which actually brings up an issue with the "minimal interpretation" that has occurred to me. You don't actually need the Born rule to apply to all observables. The empirical content of quantum mechanics is completely described by applying the Born rule to macroscopic, coarse-grained observables. You don't need to assume that a measurement of the z-component of spin always produces ##\pm \frac{1}{2}##. It is enough to assume that a macroscopic device such as a Stern-Gerlach apparatus always produces a definite, macroscopically consistent state: the spot made by the electron on the photographic plate is somewhere definite, instead of being a superposition of possibilities, with probabilities given by the Born rule applied to the composite state of electron + device. Then the rule that a measurement of the spin gives ##\pm \frac{1}{2}## should be derivable, rather than assumed.
 
  • Like
Likes dextercioby and julcab12
  • #239
A. Neumaier said:
A measurable density is the q-expectation of the 0-component of a current - an effective relativistic vector field associated to some property distributed in space.
vanhees71 said:
This is a circular definition. In the standard interpretation it's clear that density is a spatially coarse-grained observable in the sense of an average over many microscopic degrees of freedom. What is this mysterious "q-expectation" if it's not such an average in the usual probabilistic sense? That's the key issue preventing a physical understanding of the proposed "thermal interpretation"!
It is the trace of the product of the q-observable ##j_0(h):=\int_\Omega h(x)j_0(x)\,dx## with the density operator, ##\langle j_0(h)\rangle:=\mathrm{Tr}\,\rho\, j_0(h)##, where ##\Omega## is the region in which the coarse-grained current is observed and ##h(x)## is an appropriate smearing function determined by the sensitivity of the measuring instrument or the coarse-graining. This is mathematically well-defined; no mystery and no circularity is present (unless you impose your interpretation in addition to mine). The result can be compared with experiment in a single reading, without any statistics involved, giving an operational definition of the meaning of this q-expectation. The average taken is not over microscopic degrees of freedom but over a small spacetime region where the measurement is performed (needed to turn the distribution-valued operator current into a well-defined q-observable). A toy version is sketched below.
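(Editorial toy model, a finite-dimensional caricature rather than the field-theoretic construction itself: the "current density" is a family of site projectors, ##h## is a hypothetical Gaussian detector profile, and the q-expectation is literally a single trace.)

```python
import numpy as np

# Finite-dimensional caricature of  <j0(h)> = Tr( rho * j0(h) ):
# a lattice of n sites, a local "density" operator per site, and a
# smearing function h(x) encoding the detector sensitivity.
n = 8
rng = np.random.default_rng(1)

# A random density operator rho: positive semidefinite with unit trace.
M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
rho = M @ M.conj().T
rho /= np.trace(rho).real

# j0(h) = sum_x h(x) |x><x|: site densities smeared with a Gaussian profile.
h = np.exp(-0.5 * (np.arange(n) - n / 2) ** 2)
j0_h = np.diag(h).astype(complex)

# One definite number, no statistics involved.
q_expectation = np.trace(rho @ j0_h).real
print(f"<j0(h)> = {q_expectation:.4f}")
```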
 
  • #240
vanhees71 said:
I think so. My main problem is to make sense of the term "expectation value" if there's no probability theory behind it. So far I don't see what motivates the formal rules or makes them at least plausible as referring to physics rather than to a purely axiomatic "game" of math; but that's what "interpretation" is all about, i.e., how to make sense of the formalism in its application to real-world observations (and measurements are just refined quantitative observations!).

It's also clear that physics never provides a "final answer" to the question of the interpretation of a given theory (nor is any given theory the "final answer" in the quest for ever better descriptions of what we can objectively observe about nature). "Today's signal is tomorrow's background!"

Thanks. Actually I was going to write something very similar in response. However, I had ideas similar to Arnold's TI myself, and I tried very hard to convince myself but could not come to a conclusion because of the said problem. So it seems people reached the probability interpretation (taking into consideration many other issues) not by choice but because they were forced into it.
 
  • #241
It seems the TI tries to solve the deterministic evolution of the wavefunction and the collapse in one go, which is the holy grail. My thinking is that the right solution includes both the TI and the probabilistic picture, but I don't know how to formulate it :cry:

Edit: the nice thing about TI-type interpretations is that reality once again becomes objective, which is reassuring, unlike other interpretations that might leave room for consciousness ... :eek:
 
Last edited:
  • #242
vanhees71 said:
My main problem is to make sense of the term "expectation value" if there's no probability theory behind it.
It doesn't matter what one calls it. Tradition calls it an expectation value and denotes it by pointed brackets, even in situations like:
A. Neumaier said:
the terms in (3.1.34) of your lecture notes on Nonequilibrium Relativistic Quantum Many-Body Theory from August 16, 2017, where - against your minimal interpretation - you, like everyone in the field, refer to expectation values of operator products that are not Hermitian, let alone self-adjoint!
where it is very clear (and where you agree) that it cannot have this meaning:
vanhees71 said:
How often should I emphasize that Green's functions do not directly describe observable facts but are calculational tools enabling one to calculate them?
Therefore the right way is to regard all these as calculational tools, as you emphasized in this particular case. I call them q-expectations to emphasize that they are always calculational tools and not expectation values in the statistical sense; using any other name (e.g., "reference values", as I did very early in the development of the thermal interpretation) would not change anything.

In some instances (often at the end of long calculations) q-expectations may refer to sample means of actual measurements, but in as many other instances they refer to a single actual measurement only. No matter which interpretation of quantum mechanics is used, the interpretation happens only at this very last stage, namely where actual measurement is involved. There, depending on the situation, one must give them a deterministic (currents, densities) or a statistical (event counts) interpretation.

Thus to account for the actual practice of quantum mechanics in all applications, one needs both deterministic measurements and statistical measurements. The thermal interpretation accounts for both, without having to start with fundamental probability.
 
Last edited:
  • #243
For me, this interpretation, though surprising at first, may be the one that best represents what physicists actually do in experiments.

I need to take a second closer look at it, and think a little bit more about it, but for the moment I see it as a perfectly valid and very interesting interpretation.
 
  • #244
akhmeteli said:
Could you please give a reference?
(concerning: "an analogous statement about a free relativistic particle somehow prepared at time t in a small region of spacetime suffers the same problem."
I have cited Hegerfeldt's https://arxiv.org/abs/quant-ph/9806036, which is general enough to include both relativistic and nonrelativistic cases because it depends only on positivity of the energy. Reeh-Schlieder can be construed as essentially the same property for QFT. Also significant, in my view, is "Anti-Locality of Certain Lorentz-Invariant Operators", I. E. Segal and R. W. Goodman, Journal of Mathematics and Mechanics, Vol. 14, No. 4 (1965), pp. 629-638.
In another comment (also two weeks old), the locality of the retarded propagator was mentioned, but quantum field theory is mostly concerned with the propagator ##\int \frac{\mathrm{d}^4k}{(2\pi)^4}\,2\pi\,\mathrm{e}^{\mathrm{i}k\cdot x}\,\delta(k\cdot k-m^2)\,\theta(k_0)## (note the restriction to positive frequency, ##\theta(k_0)##, which puts it within the scope of Hegerfeldt's result), and with its time-ordered variant, the Feynman propagator, both of which are nonlocal in that they are nonzero at space-like separation. (A numerical check is sketched below.)
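(Editorial check of the last claim, assuming the standard closed form of the free scalar two-point function at spacelike separation, ##W(r)=\frac{m\,K_1(mr)}{4\pi^2 r}## at ##t=0##: it is positive for every ##r>0##, decaying like ##e^{-mr}##, hence nonzero.)

```python
import numpy as np
from scipy.special import k1

# At purely spatial (spacelike) separation r and mass m, the
# positive-frequency two-point function of a free scalar field has the
# standard closed form  W(r) = m * K1(m r) / (4 pi^2 r),
# which is positive for all r > 0 and decays like exp(-m r).
m = 1.0
for r in (0.5, 1.0, 2.0, 5.0):
    W = m * k1(m * r) / (4.0 * np.pi**2 * r)
    print(f"r = {r:4.1f}:  W(r) = {W:.3e}")
```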
 
  • Like
Likes dextercioby
  • #245
From what I can gather as a layperson, after a quick glance and mostly guessing what it is about, the thermal interpretation superficially appears like a redressing of the ensemble interpretation.

Measurement seems to be redefined like an ensemble of measurements, so you are no longer measuring single outcomes but instead the expectation value, and so terms such as deterministic, realistic, objective, and local no longer describe the same thing as in most other interpretations. (I'm not sure this is a positive feature, given all the confusion that already exists over these terms.)

I like to think of Bell's theorem as proving that QM must be non-local when assuming objectively unique measurement results, full stop; however, the TI seems to evade this by redefining what a measurement is and what is being measured. (Again, I'm not sure this is a positive feature, given the amount of confusion people seem to have over the consequences of Bell's theorem.)

Personally, I would reject accepting the TI on its own terms, and I would say at least that it is not objective or deterministic, that it is silent on locality vs. objectively unique measurement results, and that it may be incomplete by not being able to fully explain single measurement results.

However, as I said at the beginning, I haven't read the source material closely, so I could be missing a lot; still, so far the TI has helped me understand why I feel uncomfortable with the ensemble interpretation, by looking at it from a different point of view, which I am also uncomfortable with.
 
