Neumaier Thermal Interpretation of QM, valid?

In summary: The thermal interpretation claims to resolve the long-standing paradox of seemingly contradictory single-particle experiments, which appear to show particles in superpositions of mutually exclusive states, and of wave-particle duality, which insists that particles are both waves and particles. On the thermal interpretation, wave-particle duality is an artifact of a poor understanding of how measurements are actually performed. All measurements are statistical, and therefore always produce a result that is in effect a weighted average over many trials; the apparent duality reflects our failure to recognize that these weighted averages (expectation values) are the objectively real properties of the system.
  • #1
rodsika
For those of us trapped in a corner with difficult choices of whether to believe that many worlds are split off billions of times every second, or that there is a Godlike power collapsing the wave function throughout the universe, or that waves can travel forward and backward in time (Transactional), or that the wave function acts instantaneously from one end of the entire universe to the other (Bohmian), etc., the Neumaire Thermal Interpretation of Quantum Mechanics may offer us peace of mind and contentment that the mystery of quantum mechanics is solved. But the question is: is the Neumaire Thermal Interpretation valid, and does it tally with all experimental facts? If you have detected any conflict with experiments that could falsify his model, please share it. If it's valid, maybe someone can put it on Wikipedia. Neumaire, can you please write a more layman-friendly introduction to it, for example describing in detail how it explains a buckyball made up of 430 atoms that can still interfere with itself? I can't understand the vague and incomplete explanation you put forth in your paper. Thanks.

Arnold Neumaire said:

"
I have my own interpretation.
I call it the thermal interpretation since it agrees with how one does measurements in thermodynamics, the macroscopic part of QM (derived via statistical mechanics), and therefore explains naturally the classical properties of our quantum world. It is outlined in my slides at http://arnold-neumaier.at/ms/optslides.pdf and the entry ''Foundations independent of measurements'' of Chapter A4 of my theoretical physics FAQ at http://arnold-neumaier.at/physfaq/physics-faq.html#found0 . It is described in detail in Chapter 7 of my book ''Classical and Quantum Mechanics via Lie algebras'' at http://lanl.arxiv.org/abs/0810.1019 . See also the following PF posts:
https://www.physicsforums.com/showthread.php?p=3187039&highlight=thermal#post3187039
https://www.physicsforums.com/showthread.php?p=3193747&highlight=thermal#post3193747


The thermal interpretation
It is superior to any I found in the literature, since it
-- acknowledges that there is only one world,
-- is observer-independent and hence free from subjective elements,
-- satisfies the principles of locality and Poincare invariance, as defined in relativistic quantum field theory,
-- is by design compatible with the classical ontology of ordinary thermodynamics,
-- has no split between classical and quantum mechanics,
-- applies both to single quantum objects (like a quantum dot, the sun or the universe) and to statistical ensembles,
-- allows one to derive Born's rule in the limit of a perfect von Neumann measurement (the only case where Born's rule has empirical content),
-- has no collapse (except approximately in non-isolated subsystems),
-- uses no concepts beyond what is taught in every quantum mechanics course.
No other interpretation combines these merits.

The thermal interpretation leads to a gain in clarity of thought, which results in saving a lot of time otherwise spent in the contemplation of meaningless or irrelevant aspects arising in poor interpretations.


The thermal interpretation is based on the observation that quantum mechanics does much more than predict probabilities for the possible results of experiments done by Alice and Bob. In particular, it quantitatively predicts the whole of classical thermodynamics.

For example, it is used to predict the color of molecules, their response to external electromagnetic fields, the behavior of material made of these molecules under changes of pressure or temperature, the production of energy from nuclear reactions, the behavior of transistors in the chips on which your computer runs, and a lot more.

The thermal interpretation therefore takes as its ontological basis the states occurring in the statistical mechanics for describing thermodynamics (Gibbs states) rather than the pure states figuring in a quantum mechanics built on top of the concept of a wave function. This has the advantage that the complete state of a system completely and deterministically determines the complete state of every subsystem - a basic requirement that a sound, observer-independent interpretation of quantum mechanics should satisfy.

The axioms for the formal core of quantum mechanics are those specified in the entry ''Postulates for the formal core of quantum mechanics'' of Chapter A4 of my theoretical physics FAQ at http://arnold-neumaier.at/physfaq/physics-faq.html#postulates . There only the minimal statistical interpretation agreed by everyone is discussed. The thermal interpretation goes far beyond that, assigning states and an interpretation for them to individual quantum systems, in such a way that large quantum systems are naturally described by essentially classical observables (without the need to invoke decoherence or collapse). The new approach is consistent with assigning a well-defined (though largely unknown) state to the whole universe, whose properties account for everything observable within this universe.

The fundamental mathematical description of reality is taken to be standard quantum field theory. It doesn't matter for the thermal interpretation whether or not there is a deeper underlying deterministic level.


In my thermal interpretation of quantum physics, the directly observable (and hence obviously ''real'') features of a macroscopic system are the expectation values of the most important fields Phi(x,t) at position x and time t, as they are described by statistical thermodynamics. If it were not so, thermodynamics would not provide the good macroscopic description it does.

However, the expectation values have only a limited accuracy; as discovered by Heisenberg, quantum mechanics predicts its own uncertainty. This means that <Phi(x)> is objectively real only to an accuracy of order 1/sqrt(V) where V is the volume occupied by the mesoscopic cell containing x, assumed to be homogeneous and in local equilibrium. This is the standard assumption for deriving from first principles hydrodynamical equations and the like. It means that the interpretation of a field gets more fuzzy as one decreases the size of the coarse graining - until at some point the local equilibrium hypothesis is no longer valid.
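To make the 1/sqrt(V) scaling concrete, here is a minimal numerical sketch (an illustrative classical toy model, not taken from Neumaier's text): averaging a fluctuating quantity over ever larger cells shrinks the spread of the cell average as the inverse square root of the cell size.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D lattice of microscopic field values: mean 1.0 plus unit fluctuations.
# (Illustrative stand-in for Phi(x) with quantum/thermal noise; not Neumaier's model.)
phi = 1.0 + rng.normal(0.0, 1.0, size=2**20)

for cell in (10, 100, 1000, 10000):
    # Coarse-grain: average phi over non-overlapping cells of `cell` sites.
    n_blocks = phi.size // cell
    coarse = phi[:n_blocks * cell].reshape(n_blocks, cell).mean(axis=1)
    print(f"cell size {cell:6d}: std of cell average = {coarse.std():.4f},"
          f"  1/sqrt(V) = {1/np.sqrt(cell):.4f}")
```

The printed spread of the cell average tracks 1/sqrt(V), so the coarse-grained field is sharply defined for large cells and increasingly fuzzy as the cells shrink.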

This defines the surface ontology of the thermal interpretation. There is also a deeper ontology concerning the reality of inferred entities - the thermal interpretation declares as real but not directly observable any expectation <A(x,t)> of operators with a space-time dependence that satisfy Poincare invariance and causal commutation relations.
These are distributions that produce measurable numbers when integrated over sufficiently smooth localized test functions.


Deterministic chaos is an emergent feature of the thermal interpretation of quantum mechanics, obtained in a suitable approximation. Approximating a multiparticle system in a semiclassical way (mean field theory or a little beyond) gives an approximate deterministic system governing the dynamics of these expectations. This system is highly chaotic at high resolution. This chaoticity seems enough to enforce the probabilistic nature of the measurement apparatus. Neither an underlying exact deterministic dynamics nor an explicit dynamical collapse needs to be postulated.

The same system can be studied at different levels of resolution. When we model a dynamical system classically at high enough resolution, it must be modeled stochastically since the quantum uncertainties must be taken into account. But at a lower resolution, one can often neglect the stochastic part and the system becomes deterministic. If it were not so, we could not use any deterministic model at all in physics but we often do, with excellent success.

This also holds when the resulting deterministic system is chaotic. Indeed, all deterministic chaotic systems studied in practice are approximate only, because of quantum mechanics. If it were not so, we could not use any chaotic model at all in physics but we often do, with excellent success.
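The point about resolution can be illustrated with a toy stochastic model (an illustrative sketch, not part of the quoted text): individual high-resolution trajectories are noisy, yet their ensemble mean follows the deterministic coarse-grained law.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: Ornstein-Uhlenbeck process dx = -x dt + sigma dW.
# Each trajectory (high resolution) is stochastic; the ensemble mean
# (low resolution) follows the deterministic law x(t) = x0 * exp(-t).
n_traj, n_steps, dt, sigma, x0 = 5000, 1000, 0.001, 0.5, 1.0
x = np.full(n_traj, x0)
for _ in range(n_steps):
    x += -x * dt + sigma * np.sqrt(dt) * rng.normal(size=n_traj)

t = n_steps * dt
print(f"one noisy trajectory : {x[0]:+.3f}")
print(f"ensemble mean        : {x.mean():+.3f}")
print(f"deterministic x(t)   : {x0 * np.exp(-t):+.3f}")   # exp(-1) ~ 0.368
```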
 
Last edited by a moderator:
  • #2


One major idea being put forth by Neumaire is that fields are now primary and particles are just momenta of the fields. So the particle concept is now outdated, and it makes no sense to think of the double-slit experiment as a particle that moves between the emitter and the detector; it is more like a field interacting in between (perhaps like Feynman interaction vertices). If that is true, then the buckyball composed of 430 atoms can be considered a field too... but does it make sense to think in terms of a 430-atom buckyball field?
 
  • #3


A. Neumaier is clearly a very erudite and impressive scholar, but his interpretation is unlikely to be correct, as modern experiments suggest a fundamentally probabilistic character to Nature which his interpretation does not agree with. I doubt he can explain all the results of Zeilinger, Aspect et al. in a very coherent manner.

He is a genius clinging to old-school deterministic ideas about nature a la Einstein, but no amount of obfuscating the microscopic nature of reality will make its fundamental probabilistic nature go away. And also I'm not sure he should be allowed to promote such a non-peer-reviewed philosophy so strongly on these forums. His upcoming book has excellent sections on Lie groups and their applications, but the interpretation stuff is really not scientific, and his excellence in many science and mathematical areas should not be confused with correct understanding of the deep implications of quantum theory.
 
Last edited:
  • #4


unusualname said:
A. Neumaier is clearly a very erudite and impressive scholar, but his interpretation is unlikely to be correct, as modern experiments suggest a fundamentally probabilistic character to Nature which his interpretation does not agree with. I doubt he can explain all the results of Zeilinger, Aspect et al. in a very coherent manner.

He is a genius clinging to old-school deterministic ideas about nature a la Einstein, but no amount of obfuscating the microscopic nature of reality will make its fundamental probabilistic nature go away. And also I'm not sure he should be allowed to promote such a non-peer-reviewed philosophy so strongly on these forums. His upcoming book has excellent sections on Lie groups and their applications, but the interpretation stuff is really not scientific, and his excellence in many science and mathematical areas should not be confused with correct understanding of the deep implications of quantum theory.

No, the probabilistic character of nature is not necessarily fundamental. Gerard 't Hooft, one of the leading theoretical physicists today, has written a paper called "Determinism Beneath Quantum Mechanics", available at

http://arxiv.org/abs/quant-ph/0212095

which says:

"Contrary to common belief, it is not difficult to construct deterministic models where stochastic behavior is correctly described by quantum mechanical amplitudes, in precise accordance with the Copenhagen-Bohr-Bohm doctrine"

So the Neumaire approach, where randomness is not fundamental, is not refuted.

What makes the Neumaire approach possibly valid is that it needs no new assumptions, only an updated point of view... it looks like all those other interpretations used the old concept of particles. We know that the field is primary and particles are just momenta of it. So if we look at the double-slit experiment with this new, updated point of view, perhaps it explains everything? Or are there subtle differences that can refute Neumaier's idea? Again, don't make the objection that random probability is fundamental, because one of the leading theoretical physicists has given reasons it is not and that determinism may be more fundamental. See the famous paper above.

In these drastic hours, when the public is being convinced that Many Worlds may be the only logical option left, the Neumaire approach means going back to sanity, and I think it's valid.

Arnold, can you please write a Wikipedia article about it and give the basics as your articles are so vague and disorganized. The thermal approach may let us go back to sanity in this crazy world.
 
  • #5


unusualname said:
A. Neumaier is clearly a very erudite and impressive scholar, but his interpretation is unlikely to be correct, as modern experiments suggest a fundamentally probabilistic character to Nature which his interpretation does not agree with.

From this I get the feeling you haven't closely studied his book in detail? I've found it difficult to see how his interpretation is fundamentally different from the better known, and well-regarded, minimal statistical interpretation, though phrased in different language and extended so that connections with thermodynamics are clarified.

Anyway, the thing about interpretations is that if you can't find an experiment that distinguishes between them quantitatively, then the only guiding principle left is Occam's razor.


And also I'm not sure he should be allowed to promote such a non peer reviewed philosophy so strongly on these forums.

I understand that Arnold is in the process of seeking a thread to be opened in the Independent Research forum, but perhaps that will take a while to pass moderation given the size of the book. Let us remain patient and polite until then.

rodsika said:
...Neumaire...

His surname is spelled "Neumaier".

BTW, (rodsika), while waiting, you might be interested to read Ballentine's articles on the statistical interpretation of QM (for which I gave references earlier) if you haven't already done so. I've found it helpful to be aware of this simpler (but related) perspective when reading Arnold's book.
 
Last edited:
  • #6


Neumaier gives the example of a beam of photons. But a beam of photons has a real wave, whereas the electron wave is just a probability wave. So one can't argue about whether, before observation, a beam of photons is there or not: it has a real wave. Matter waves, on the other hand, are not actual waves. There goes my first attempt to refute his interpretation.

I have a question. Feynman mentions in his book "QED: The Strange Theory of Light and Matter" the reflection of light. He said that in reflection from a glass surface, 4% of the photons are always reflected. How do the photons know to make it 4%, Feynman asks. What I want to know is: can reflection happen with matter waves too, like an electron wave, such that you would also see 4% of the electrons being reflected?
 
  • #7


A. Neumaier said:
The thermal interpretation
It is superior to any I found in the literature, since it
-- acknowledges that there is only one world,
-- is observer-independent and hence free from subjective elements,
-- satisfies the principles of locality and Poincare invariance, as defined in relativistic quantum field theory
I'd be curious to know more about that last one (related to my conversation with A. Neumaier on another thread), if he or someone else with a decent understanding of this type of interpretation like strangerep would be willing to elaborate on it. As mentioned on p. 3 of this paper, the specific sense in which quantum field theory "satisfies the principles of locality and Poincare invariance" is as follows:
The dynamical variables of the theory are field operators defined at each point in space, whose dynamical evolution is described by local (Lorentz-invariant, in the relativistic case) differential equations.
I haven't studied QFT, but would the field operators at a given point in spacetime give the probabilities or expectation values for the outcome of some measurement at that point? Neumaier goes on to say that expectation values are the basic elements of his interpretation:
In my thermal interpretation of quantum physics, the directly observable (and hence obviously ''real'') features of a macroscopic system are the expectation values of the most important fields Phi(x,t) at position x and time t, as they are described by statistical thermodynamics. If it were not so, thermodynamics would not provide the good macroscopic description it does.
But if he wants to avoid collapse, how does he go from expectation values to actual measured values of microscopic systems? (or macroscopic 'pointer states' associated with those measurements) Certainly we can make measurements where the outcome differs from one trial to another, and these different outcomes cannot be determined from the expectation values alone. And assuming the interpretation contains some elements corresponding to definite observed outcomes in particular regions of spacetime, then according to Bell's theorem the value of these elements cannot behave in a purely local way, it seems to me.
 
  • #8


JesseM said:
I haven't studied QFT, but would the field operators at a given point in spacetime give the probabilities or expectation values for the outcome of some measurement at that point?

The field operators (and (integrals over) powers thereof) determine all the quantities which can (in principle) be measured. One takes vacuum expectation values of these quantities to produce ordinary numbers. It's a bit hard to describe helpfully if you haven't studied QFT. (I'd move that up a bit higher on the priority list. :-)

Neumaier goes on to say that expectation values are the basic elements of his interpretation [...]
But if he wants to avoid collapse, how does he go from expectation values to actual measured values of microscopic systems? (or macroscopic 'pointer states' associated with those measurements) Certainly we can make measurements where the outcome differs from one trial to another, and these different outcomes cannot be determined from the expectation values alone.

The central point of a minimal statistical interpretation is that statistics (i.e., probability distributions over a set of outcomes) is all we have.

Enlarge your picture of "expectation values" to include variance and higher moments of a probability distribution. E.g., for a quantity "A", its mean in a given state is of the form [tex]\bar{A} = \langle A \rangle[/tex], while its variance is of the form [tex]var(A) = \langle (A - \bar{A})^2\rangle[/tex]. I.e., the quantity [tex](A - \bar{A})^2[/tex] is just as valid an element of the algebra of observable quantities as A, for the class of system being considered.

If var(A) is very small, we can think of the system in that state as having an "almost" definite value <A> corresponding to the quantity A.
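For concreteness, here is a small sketch (illustrative; the spin-1/2 observable and states are chosen for simplicity, not taken from the post) of how <A> and var(A) are computed from a density matrix, and how a small variance corresponds to an "almost definite" value.

```python
import numpy as np

sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def moments(rho, A):
    """Mean <A> = Tr(rho A) and variance <(A - <A>)^2> in the state rho."""
    mean = np.trace(rho @ A).real
    dev = A - mean * I2
    var = np.trace(rho @ dev @ dev).real
    return mean, var

# |+z>: sigma_z has a definite value, so the variance vanishes.
up_z = np.array([1, 0], dtype=complex)
# |+x>: sigma_z is maximally uncertain, so the variance is 1.
up_x = np.array([1, 1], dtype=complex) / np.sqrt(2)

for name, psi in (("|+z>", up_z), ("|+x>", up_x)):
    rho = np.outer(psi, psi.conj())
    mean, var = moments(rho, sigma_z)
    print(f"{name}: <sigma_z> = {mean:+.3f}, var(sigma_z) = {var:.3f}")
```

In the first state the variance is zero, so <sigma_z> can be read as an (almost) definite value; in the second it is maximal, so only the distribution is meaningful.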
 
  • #9


strangerep said:
The central point of a minimal statistical interpretation is that statistics (i.e., probability distributions over a set of outcomes) is all we have.
But Neumaier says that his interpretation "acknowledges that there is only one world" and that it "is consistent with assigning a well-defined (though largely unknown) state to the whole universe". Shouldn't that mean the interpretation has to give more than just a collection of probabilities for different states at different points in spacetime, since in our "one world" we see "states" consisting of definite outcomes rather than just probabilities? I'm not saying that the interpretation has to give a deterministic formula for saying which outcome actually occurs, it would be fine if there was just some rule like "once you have the probability distributions throughout spacetime, then at each point randomly select one as the actual outcome using the probability distribution for that point in spacetime". But if the random choice at each point was made independently using only the probability distribution at that point, you wouldn't get the correct correlations between entangled particles. In order to select a single set of definite outcomes from the local probability distributions, you'd need some sort of nonlocal procedure, even if the probability distributions themselves obeyed locality in some sense.
strangerep said:
If var(A) is very small, we can think of the system in that state as having an "almost" definite value <A> corresponding to the quantity A.
But there are plenty of cases where var(A) would be large even for macroscopic systems, like the state of macroscopic "pointers" which show the results of experiments on quantum particles, no? So again, if the interpretation wants to account for what we see in the real world, it seems to me it needs some procedure for "choosing" definite values (even if the procedure involves an element of randomness), not just giving us probability distributions.
 
  • #10


rodsika said:
No, the probabilistic character of nature is not necessarily fundamental. Gerard 't Hooft, one of the leading theoretical physicists today, has written a paper called "Determinism Beneath Quantum Mechanics", available at

http://arxiv.org/abs/quant-ph/0212095

which says:

"Contrary to common belief, it is not difficult to construct deterministic models where stochastic behavior is correctly described by quantum mechanical amplitudes, in precise accordance with the Copenhagen-Bohr-Bohm doctrine"

So the Neumaire approach, where randomness is not fundamental, is not refuted.

It's refuted by mainstream peer-reviewed interpretations of modern experiments. 't Hooft's ideas are another (crackpot) deterministic attempt that has no mainstream acceptance.

strangerep said:
From this I get the feeling you haven't closely studied his book in detail? I've found it difficult to see how his interpretation is fundamentally different from the better known, and well-regarded, minimal statistical interpretation, though phrased in different language and extended so that connections with thermodynamics are clarified.

Anyway, the thing about interpretations is that if you can't find an experiment that distinguishes between them quantitatively, then the only guiding principle left is Occam's razor

Well, the minimal statistical interpretation isn't correct, is it? Or at least, at best it is a vacuous interpretation. Similarly, Neumaier obfuscates when explaining microscopic reality; in fact he doesn't explain it, which is probably because you can't with purely deterministic ideas.

And I have looked through the draft of his book, maybe the final version will clarify some ideas.
 
  • #11


Do you guys agree that the Ensemble Interpretation (a requirement for the Neumaier Interpretation) is already falsified? It is said at the bottom of http://en.wikipedia.org/wiki/Ensemble_Interpretation that

"However, hopes for turning quantum mechanics back into a classical theory were dashed. Gribbin continues:
"There are many difficulties with the idea, but the killer blow was struck when individual quantum entities such as photons were observed behaving in experiments in line with the quantum wave function description. The Ensemble interpretation is now only of historical interest."[11]""

I presume that the Ensemble Interpretation is the same as the Statistical Interpretation? Neither of these can handle a single system. But the Neumaier Interpretation (actually not an interpretation but just a QFT way of looking at it, or a QFT point of view) can handle a single system. Why is it that Neumaier's can handle a single system while the Ensemble and Statistical ones can't, if they are identical? What are the differences?
If we can refute Neumaier, then we can eliminate all statistical interpretations and maybe focus on and accept Many Worlds or Bohmian mechanics.
 
  • #12


JesseM said:
in our "one world" we see "states" consisting of
definite outcomes rather than just probabilities?

No. States do not consist of "definite outcomes". Although one might like to think of individual events in experiments as definite outcomes, all experiments involve some level of statistical analysis. No one trusts an isolated event as being reliable, without many more similar events to make it statistically significant.

In an experiment, we arrange an interaction between an object system in some state S and an apparatus (whose initial state A is uncorrelated with S) that results in a new state A' of the apparatus for which there is a correlation between A' and S, if the experiment is performed enough times. If the variance of such apparatus post-states A' is very small, we interpret S as a deterministic (definite) state for the observable quantity represented by the apparatus.

it would be fine if there was just some rule like "once you have the probability distributions throughout spacetime, then at each point randomly select one as the actual outcome using the probability distribution for that point in spacetime"

That's not really how probability distributions work. We use them to calculate expectation values, variances, co-variances, correlations, etc., interpreted over an ensemble.


But there are plenty of cases where var(A) would be large even for macroscopic systems, [...] ?

Yes. In that case we'd say that the macroscopic system doesn't have a "definite" or "deterministic" value of the observable quantity represented by A. Nevertheless, we can still report something useful in the form of an observed probability distribution.

(BTW, I prefer "deterministic" over "definite" since it conveys a more useful meaning.)


So again, if the interpretation wants to account for what we see in the real world, it seems to me it needs some procedure for "choosing" definite values (even if the procedure involves an element of randomness), not just giving us probability distributions.

The statistical nature of all scientific experiments in physics (as distinct from informal anecdotal stories) suggests the contrary.
 
  • #13


unusualname said:
Well, the minimal statistical interpretation isn't correct, is it?

For which experiments does (QM + minimal statistical interpretation) predict incorrect results? References?

Or at least, at best it is a vacuous interpretation.

That's a very different statement from saying that it's "incorrect".

(Personally, I find minimizing interpretational baggage as much as possible to be quite attractive.)
 
  • #14


strangerep said:
No. States do not consist of "definite outcomes". Although one might like to think of individual events in experiments as definite outcomes, all experiments involve some level of statistical analysis.
I think you're talking about statistical analysis used in coming up with values of variables for the quantum system itself, but I was talking about the macroscopic "pointer state", like the number that appears on a computer monitor after it runs its statistical analysis program (or the numbers representing raw data before analysis, which may not directly correspond to any quantum observable). That's an element of the physical world too, one which we can directly observe. If Neumaier's interpretation only gives probability distributions for such macro-states rather than definite values, then I would say it isn't a full model of the "one world" we find ourselves in. Again, I'm not requiring that a full model allow such states to be predicted in a deterministic way, it'd be fine if it had a stochastic element which randomly picks one macrostate based on the probability distribution, but as I said this element would have to be a nonlocal one.

Think of it this way: suppose you want to build a simulated universe running on a computer (or collection of computers, see below), and the simulation is supposed to model all the types of macrostates we can directly observe (while it doesn't need to have any model of microstates which we only infer based on macrostates). The model need not predict the results of particular trials of any real-world experiment, but we should be able to create a model of the same type of experiment on our computer(s), with the simulation yielding a series of macroscopic pointer states whose overall statistics should match the results of analogous experiments performed in the real world. If we require that the simulation be a "local" one, then we could imagine a bunch of computers which were each responsible for simulating a small element of space, and on each time-increment the computer should give an output based only on inputs from other computer outputs that lie within its past light cone (this is assuming the laws of physics can be approximated arbitrarily well by a simulation with discrete "pixels" of space and time; if not, you could imagine replacing the finite array of computers with a perfectly continuous array of "functions" at each point in space, which continuously produce outputs at each instant of time based only on inputs from points in their past light cone). And the computers can have stochastic random number generators built in, so if part of their output consisted of a probability distribution, they could also use that probability distribution to randomly select one specific output based on that distribution.

If observable macrostates in a region of space at a particular time are just a function of all the computers' outputs in that region at that time (outputs which may be thought of as "microstates" for specific points in space), then the point here is that no "local" simulation of this type, where the computers have no access to inputs outside their past light cone when generating outputs, can ever give a pattern of macrostates consistent with QM. Even if computers at each point can generate probability distributions in a local way, a stochastic rule for generating specific outcomes based on these probability distributions would have to operate nonlocally, with computers representing points at a spacelike separation coordinating their random choices to make sure they created the correct entanglement correlations. This is just a natural consequence of Bell's theorem. So, I think it's misleading to call Neumaier's interpretation a "local" one: it either fails to model the fact that we see particular outcomes for macroscopic pointer states (which all other interpretations attempt to account for) rather than just probability distributions, or, if the model is made to include a stochastic rule for generating a series of particular macrostates, then the rule must operate in a nonlocal fashion.
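This locality argument can be made concrete with a toy CHSH comparison (an illustrative sketch; the particular local response rule below is just an assumed example, not a claim about any specific interpretation): any model in which each side's output depends only on its own setting and on data fixed in the shared past keeps |S| <= 2, while the quantum singlet prediction reaches 2*sqrt(2).

```python
import numpy as np

rng = np.random.default_rng(2)

# CHSH settings (angles in radians): Alice uses a, a2; Bob uses b, b2.
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

def E_local(alpha, beta, n=200_000):
    """Correlation from a toy *local* model: each output depends only on the
    local setting and a hidden variable lam fixed at the source (shared past)."""
    lam = rng.uniform(0.0, 2 * np.pi, n)
    A = np.sign(np.cos(lam - alpha))   # Alice's local response rule (assumed)
    B = -np.sign(np.cos(lam - beta))   # Bob's local response rule (assumed)
    return (A * B).mean()

def E_qm(alpha, beta):
    """Quantum singlet-state prediction E(a, b) = -cos(a - b)."""
    return -np.cos(alpha - beta)

for name, E in (("local toy model", E_local), ("quantum singlet", E_qm)):
    S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
    print(f"{name}: |S| = {abs(S):.3f}")   # local <= 2, quantum -> 2*sqrt(2) ~ 2.83
```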
 
Last edited:
  • #15


rogerl said:
Do you guys agree that the Ensemble Interpretation [...] is already falsified?

Speaking for myself: No.

It is said at the bottom of http://en.wikipedia.org/wiki/Ensemble_Interpretation that

"However, hopes for turning quantum mechanics back into a classical theory were dashed. Gribbin continues:
"There are many difficulties with the idea, but the killer blow was struck when individual quantum entities such as photons were observed behaving in experiments in line with the quantum wave function description. The Ensemble interpretation is now only of historical interest."[11]""

You seem to be getting "classical" and "statistical" mixed up. Quantum theory is not classical, and the kind of probability that occurs in quantum theory is not exactly the same as classical, because noncommuting quantities cause difficulty with the probability axiom concerning "A and B" types of events. But that's a different issue.

BTW, that part of the Wikipedia page quoting Gribbin is not supported by peer-reviewed references, but only by a link to Gribbin's Wiki page mentioning his early training in astrophysics and his career as a science writer.

To say that "the Ensemble interpretation is now only of historical interest" is inaccurate.
 
  • #16


JesseM said:
I think it's misleading to call Neumaier's interpretation a "local" one,

I'll leave that one for Arnold to answer in due course.
 
  • #17


strangerep said:
For which experiments does (QM + minimal statistical interpretation) predict incorrect results? References?

Non-interactive quantum Zeno experiments, for example. At least that's what I think Omnes mentioned in his 1992 Rev Mod Phys article on interpretations of QM:

R. Omnès, 'Consistent interpretations of quantum mechanics', Rev. Mod. Phys. 64, 339-383 (1992)
http://rmp.aps.org/abstract/RMP/v64/i2/p339_1 (http://puhep1.princeton.edu/~mcdonald/examples/QM/omnes_rmp_64_339_92.pdf)

However, I can't find the relevant paragraph, so I might be thinking of another paper. But in any case, the minimal statistical interpretation says so little beyond what the basic mathematical shut up and calculate method says that I'm not sure it can be regarded as an interpretation at all.
That's a very different statement from saying that it's "incorrect".

(Personally, I find minimizing interpretational baggage as much as possible
to be quite attractive.)

Me too; the final correct interpretation should be a minimal set of ideas that correctly describes a linear, probabilistically evolving universe, with a mathematical equation that exactly describes microscopic reality.
 
Last edited by a moderator:
  • #18


I'm concerned about one aspect of the "statistical interpretation". It appears that most people take it to be defined by Ballentine's 1970 article, which I haven't read myself. The reason why I'm concerned is that the quotes I've seen seem to contradict Bell's theorem. This is from section 4.4 of "Ensemble interpretations of quantum mechanics. A modern perspective", by Home and Whitaker. PDF.

Also on p. 361 of ref. [3], he says, “the Statistical Interpretation considers a particle to always be at some position in space, each position being realized with relative frequency [itex]|\psi(\mathbf{r})|^2[/itex] in an ensemble of similarly prepared experiments”. Later [3, p. 379] he states, “there is no conflict with quantum theory in thinking of a particle as having definite (but, in general, unknown) values of both position and momentum”.​

Reference [3] is of course Ballentine's article. What does this have to do with Bell? I don't have a rigorous argument, because I don't even know if there are Bell inequalities for position and momentum like the ones we've all seen for spin component operators, so I can only argue by analogy. What Ballentine said in 1970 about position and momentum can definitely not be said about spin components, because that statement would lead directly to a Bell inequality called the CHSH inequality (see pages 215, 216 in Isham's book for a derivation), which is violated by QM and by nature. This makes me believe that Ballentine's statements about position and momentum can't be correct either.
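For reference, here is the standard CHSH calculation for spin components (a worked sketch using the usual optimal settings; this is not taken from Ballentine's paper or Isham's book): the singlet state gives |<S>| = 2*sqrt(2), whereas any assignment of pre-existing ±1 values to all four observables is bounded by 2.

```python
import numpy as np

# Pauli matrices and the spin singlet (|01> - |10>)/sqrt(2).
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def spin(theta):
    """Spin observable along an axis at angle theta in the x-z plane."""
    return np.cos(theta) * sz + np.sin(theta) * sx

# The usual optimal settings: a = 0, a' = pi/2 (Alice); b = pi/4, b' = 3*pi/4 (Bob).
A, A2 = spin(0.0), spin(np.pi / 2)
B, B2 = spin(np.pi / 4), spin(3 * np.pi / 4)

# CHSH operator S = A(B - B') + A'(B + B') on the two-spin system.
S = np.kron(A, B - B2) + np.kron(A2, B + B2)
expectation = (singlet.conj() @ S @ singlet).real

print(f"<S> in the singlet state = {expectation:+.4f}")   # magnitude 2*sqrt(2) ~ 2.828
print("Pre-assigned +/-1 values for all four observables can never exceed |S| = 2.")
```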

I suspect that Ballentine didn't know Bell's theorem (which was published in 1964) when he wrote the article in 1970, and that if he had, he wouldn't have said those things.

unusualname said:
But in any case, the minimal statistical interpretation says so little beyond what the basic mathematical shut up and calculate method says that I'm not sure it can be regarded as an interpretation at all.
The way I see it, the only thing that really differs between ensemble/statistical, "shut up and calculate", and a many-worlds interpretation that isn't crippled because someone insisted on removing the Born rule, is how they answer the question of whether QM describes what "actually happens" to the system at all times (even between state preparation and measurement).

Ensemble/Statistical: It doesn't.
MWI: It does.
"Shut up and calculate": I don't care.

Since the question can't be answered by experiment, "shut up and calculate" is the only one of these "interpretations" that doesn't say anything unscientific. (If anyone is wondering how I define "describes" and "actually happens" in this context, the answer is that I don't. I consider them primitives, not terms that need to be defined. I don't think they can be defined in a way that improves on the situation we have if we take them as primitives).
 
Last edited:
  • #19


unusualname said:
Non-interactive quantum Zeno experiments, for example. At least that's what I think Omnes mentioned in his 1992 Rev Mod Phys article on interpretations of QM:

R. Omnès, 'Consistent interpretations of quantum mechanics', Rev. Mod. Phys. 64, 339-383 (1992)
http://rmp.aps.org/abstract/RMP/v64/i2/p339_1 (http://puhep1.princeton.edu/~mcdonald/examples/QM/omnes_rmp_64_339_92.pdf)

However, I can't find the relevant paragraph, so I might be thinking of another paper.

Ballentine published this paper:

Comment on ‘‘Quantum Zeno effect’’, Phys. Rev. A 43, 5165–5167 (1991)
Ballentine (abstract) said:
The quantum Zeno effect is not a general characteristic of continuous measurements. In a recently reported experiment [Itano et al., Phys. Rev. A 41, 2295 (1990)], the inhibition of atomic excitation and deexcitation is not due to any ‘‘collapse of the wave function,’’ but instead is caused by a very strong perturbation due to the optical pulses and the coupling to the radiation field. The experiment should not be cited as providing empirical evidence in favor of the notion of ‘‘wave-function collapse.’’

It has 60 citations, but I haven't followed them through to see what, if any, counter-comments are advanced.

But in any case, the minimal statistical interpretation says so little beyond what the basic mathematical shut up and calculate method says that I'm not sure it can be regarded as an interpretation at all.
It makes a connection between the maths and experiments, which is enough to satisfy me.

unusualname said:
the final correct interpretation should be a minimal set of ideas that correctly describes a linear, probabilistically evolving universe, with a mathematical equation that exactly describes microscopic reality

"exactly describes microscopic reality" sounds like a different theory rather than
a mere interpretation.
 
Last edited by a moderator:
  • #20


Fredrik said:
I suspect that Ballentine didn't know Bell's theorem (which was published in 1964) when he wrote the article in 1970, and that if he had, he wouldn't have said those things.
It might be wise to read Ballentine's 1970 paper (which does indeed have a section on Bell's theorem), and also the chapter in his textbook on the same subject, before forming a firm view on such things.
 
  • #21


strangerep said:
Ballentine published this paper:

Comment on ‘‘Quantum Zeno effect’’, Phys. Rev. A 43, 5165–5167 (1991)


It has 60 citations, but I haven't followed them through to see what, if any, counter-comments are advanced.


It makes a connection between the maths and experiments, which is enough to satisfy me.



"exactly describes microscopic reality" sounds like a different theory rather than
a mere interpretation.

I'm not sure how Ballentine's thinking has developed with the huge number of sophisticated experimental results in the last 20 years, but perhaps it is possible to make the ensemble interpretation consistent with everything so far discovered, since it doesn't say much beyond the basic mathematical model of QM. But it's terribly dull ;-) .

The point is that the really correct interpretation will probably naturally explain microscopic reality, even if it has to be a fundamentally probabilistic equation. I'm not sure why you think that's a different theory; do you really think the correct (and simplest) theory of QG will still rely on a vague "interpretation"? Well, maybe you'll be correct, but I hope it is not like this.
 
  • #22
strangerep said:
It might be wise to read Ballentine's 1970 paper (which does indeed have a section on Bell's theorem), and also the chapter in his textbook on the same subject, before forming a firm view on such things.
OK, that proves that my guess about why he said those things was wrong. I have downloaded the 1970 article now. PDF. So far, I have only had time for a very quick look at the section on Bell's theorem. It doesn't seem to address my concern. I might read some more later, but I'm not sure I can really take the time to do that now.

I haven't read all the relevant parts of his book, but I've read some of it. He certainly didn't say anything similar to the statements that bother me now in the parts I've read. (I would definitely have remembered that). But he might have said it in some part I haven't read, so I can't rule it out.

The statements that bother me say that it's possible that a particle has a well-defined position and a well-defined momentum at all times. It would imply that as far as position and momentum are concerned, QM probabilities are really just ignorance probabilities. Since this is definitely not the case with spin components, I find it very hard to believe that it can be true for position and momentum.

Aren't there any Bell inequalities for position and momentum that can rule that out? (This question is for everyone, not just Strangerep).
 
Last edited:
  • #23


JesseM said:
<SNIP>

So, I think it's misleading to call Neumaier's interpretation a "local" one,<SNIP>

Jesse, I note that Arnold hasn't yet responded here.

Maybe he did not recognize his misspelt name in the THREAD TITLE, etc?

It should be Neumaier.

PS: If it's not fixed, searches may miss it. Can you fix it?

Cheers, GW
 
  • #24


Gordon Watson said:
PS: If it's not fixed, searches may miss it. Can you fix it?

Cheers, GW
Unfortunately no, only the moderators ("mentors") can change thread titles (being a "science advisor" doesn't mean any mod-like powers like editing other people's posts), for this forum it would have to be Doc Al or ZapperZ.
 
  • #25


JesseM said:
Unfortunately no, only the moderators ("mentors") can change thread titles (being a "science advisor" doesn't mean any mod-like powers like editing other people's posts), for this forum it would have to be Doc Al or ZapperZ.
Moderator's Note: Thread Title has been corrected :wink:
 
  • #26


Gordon Watson said:
Jesse, I note that Arnold hasn't yet responded here.
I am currently traveling, hence responses will be a bit sporadic. Moreover, there is now a special thread in the Independent Research forum devoted to the discussion of the thermal interpretation (and other stuff in my book): https://www.physicsforums.com/showthread.php?t=490492
I'll answer there the issues brought up here.
 

1. What is the Neumaier Thermal Interpretation of QM?

The Neumaier Thermal Interpretation of QM is an interpretation of quantum mechanics proposed by the mathematician Arnold Neumaier (University of Vienna). It is an alternative to the traditional Copenhagen interpretation, which aims to provide a more intuitive understanding of quantum phenomena in terms of classical thermodynamics and statistical mechanics.

2. How does the Neumaier Thermal Interpretation explain quantum phenomena?

The Neumaier Thermal Interpretation explains quantum phenomena in terms of expectation values of fields, treated as the directly observable, real quantities, with mesoscopic cells assumed to be in local thermal equilibrium. The uncertainty and randomness observed in quantum measurements are attributed to the fluctuations and chaotic dynamics of these quantities rather than to a collapse of the wave function.

3. Is the Neumaier Thermal Interpretation of QM valid?

The validity of the Neumaier Thermal Interpretation of QM is still a topic of debate among physicists. While some argue that it provides a more intuitive understanding of quantum mechanics, others criticize it for lacking empirical evidence and being inconsistent with some fundamental principles of quantum mechanics.

4. How does the Neumaier Thermal Interpretation differ from the Copenhagen Interpretation?

The Neumaier Thermal Interpretation differs from the Copenhagen Interpretation in that it dispenses with wave function collapse, taking expectation values of fields under local thermal equilibrium as the basic real quantities. It also suggests that the randomness and uncertainty observed in quantum measurements result from fluctuations and chaotic dynamics, rather than being irreducible properties of individual particles.

5. What are some potential implications of the Neumaier Thermal Interpretation of QM?

If the Neumaier Thermal Interpretation is proven to be valid, it could have significant implications for our understanding of quantum mechanics and its applications. It could potentially lead to the development of new technologies and further advancements in fields such as quantum computing and communication.
