I Are there signs that any Quantum Interpretation can be proved or disproved?

  • #101
gentzen said:
If we write the dispersion relation as ##\omega^2/c^2=k_x^2+k_y^2+k_z^2## and assume that ##k_x## and ##k_y## are real, then we see that ##k_z^2## will get negative if ##\omega^2/c^2<k_x^2+k_y^2##. If ##k_z^2## is negative then ##k_z## is imaginary, which corresponds to an evanescent wave.

At a horizontal planar interface (perpendicular to the z-axis) between two homogeneous regions, ##k_x## and ##k_y## cannot change, because they describe the modulation of the electromagnetic field along the interface. So you can have an optical wave in a glass substrate with well defined ##k_x## and ##k_y## based on the direction of the wave. If the direction of the wave is sufficiently grazing with respect to a horizontal planar interface to vacuum, then it will become evanescent in the vacuum below the interface.
(The wave will quickly (exponentially) vanish with increasing distance from the interface. Additionally, the time average of the z-component of the Poynting vector is zero, i.e. there is no energy transported in the z-direction on average by the evanescent wave in vacuum.)
In the vacuum ##\vec{k}## is a real vector. That's precisely what I don't understand!

If there is a planar interface between two homogeneous regions, there's no vacuum of course, and then there are evanescent waves (aka total reflection).
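The sign change in ##k_z^2## described in the quoted dispersion relation is easy to check numerically. A minimal sketch (units with ##c=1##; the function name and sample values are illustrative, not from the thread):

```python
import cmath

def kz(omega, kx, ky, c=1.0):
    """Longitudinal wavenumber from the dispersion relation
    omega^2/c^2 = kx^2 + ky^2 + kz^2. Real for propagating waves,
    purely imaginary for evanescent ones."""
    return cmath.sqrt((omega / c) ** 2 - kx ** 2 - ky ** 2)

# Propagating: omega/c exceeds the transverse wavenumber
print(kz(2.0, 1.0, 1.0))  # real, ~1.414
# Evanescent: omega/c below the transverse wavenumber
print(kz(1.0, 1.0, 1.0))  # 1j -> field decays like exp(-|kz| z)
```

In the grazing-incidence case described above, the ##k_x^2+k_y^2## fixed by the glass side exceeds ##\omega^2/c^2## on the vacuum side, which is exactly the second branch.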
 
  • #102
vanhees71 said:
"nothing is definite in the statistical interpretation", but that's no bug but a feature
... that leaves unexplained why we see definite things in our world.
vanhees71 said:
That a single point on the screen is blackened for each particle registered is first of all an empirical fact. It is also well understood quantum mechanically as already shown as early as 1929 in Mott's famous paper about α-particle tracks in a cloud chamber.
He didn't show this or claim to have shown it. Mott leaves unexplained why there is a first definite ionization in the first place. He explains only that subsequent ionizations are approximately along a straight line. Thus he explains the tracks assuming the first definite ionization happened somehow.
vanhees71 said:
I believe that your thermal interpretation is the answer as soon as you allow your q-expectation values to be interpreted in the standard probabilistic way
Many q-expectations can be interpreted in the standard probabilistic way, namely all cases where an ensemble of many essentially equally prepared systems is measured. The latter is the assumption on which the statistical interpretation rests. Thus whenever the statistical interpretation applies it is fully compatible with the thermal interpretation.

But there are many instances (in particular most macroscopic measurements) where the statistical interpretation cannot apply since only a single measurement is taken. In these cases the statistical interpretation has no explanatory power at all, while the thermal interpretation still applies.
vanhees71 said:
of course you cannot describe the macroscopic observables by microscopic dynamics, because it is their very nature to be only a coarse-grained description of the relevant macroscopic degrees of freedom
Your ''of course you cannot'' is a fallacy. Nothing forbids a coarse-grained description from being fully determined by the underlying microscopic reality.

Indeed, all our physical knowledge suggests that it is. In many cases we have two description levels amenable to complete mathematical analysis, of which one is a coarse-grained version of the other. In all these cases, the coarse-grained level turned out to be a well-determined approximation of the finer one, with rigorously established conditions for the validity of the approximation.

I expect that in the context of the thermal interpretation, the analysis of the quantum measurement process along the lines of Breuer & Petruccione and Allahverdyan, Balian & Nieuwenhuizen will, as outlined in my book, sooner or later reach the same status.

vanhees71 said:
But macroscopic properties are statistical averages over many microscopic degrees of freedom.
Only for an ideal gas. For real matter they are integrals of complicated expressions without any statistics in them. You cannot get the measurable free energy of a substance by averaging microscopic free energies.
vanhees71 said:
A single measurement, no matter whether you measure "macroscopic" or "microscopic" properties, never establishes a value, let alone, can test any theoretical prediction, as one learns in the first session of the introductory beginner's lab!
Ask an engineer or a medical doctor, and he will tell you the contrary. Only highly volatile quantities (such as pointer readings of a macroscopically oscillating pointer or measurements of a single spin) need multiple measurements.
 
  • #103
We see definite things in our world because we simply don't look in too much detail. Our senses take averages over very many microscopic "events" all the time. The same "mechanisms" apply to all kinds of other "measurement devices". You don't need to prepare many Gibbs ensembles of a gas in a container to observe thermal equilibrium; Gibbs ensembles are often just a useful "Gedankenexperiment" to derive statistical predictions. When putting a thermometer into the gas to measure its temperature, the "averaging" is done dynamically by many hits of the gas molecules on the thermometer, and after some time, with overwhelming probability, thermal equilibrium is established, so that we can "read off a temperature" on a calibrated scale. Looking in more detail we can of course, also in thermal equilibrium, observe (thermal) fluctuations, which is why, at the beginning of the 20th century, the statistical-physics approach and the atomistic nature of matter finally got accepted by the physics community.

Of course, sometimes we indeed have to work with "Gibbs ensembles", as, e.g., in particle (and heavy-ion) physics. It's not too surprising that statistical physics works very well there too, sometimes even equilibrium statistical physics. Of course, this is partially just due to looking at sufficiently coarse-grained observables: e.g., the particle abundances in heavy-ion collisions, averaged over very many events, are amazingly accurately described by (grand) canonical ensembles (reaching from the most abundant species like pions to very rare ones like light nuclei and antinuclei).

Also, I didn't claim that the macroscopic behavior is not described by the underlying microscopic dynamics. To the contrary, it is derivable from it by quantum statistics. The only point is that it describes the macroscopic coarse-grained observables relevant for the description at the level of accuracy of the observed phenomena, and not the microscopic irrelevant details. The latter can become relevant at higher resolution of the observables, and then you have to change the level of description to describe these more detailed observables, which then become "relevant". That's all understandable within the paradigm of the statistical approach. It's only blurred by your thermal interpretation, if you forbid interpreting the expectation values in the usual statistical/probabilistic way.
 
  • Like
Likes physicsworks and WernerQH
  • #104
I think "thermal interpretation" is a misnomer. How can you call it an interpretation if it carefully avoids the question of what quantum theory is about? It is more like an empty shell, big enough to contain theories as diverse as quantum mechanics and thermodynamics.
 
  • Skeptical
Likes PeroK
  • #105
vanhees71 said:
Our senses take averages over very many microscopic "events" all the time.
This is a hypothesis without sufficient basis.

Our senses are physical objects. Thus they don't do mathematical operations of averaging. Instead they behave according to physical laws governed by quantum theory. So whatever they do must be deducible from quantum mechanics. But in quantum mechanics there is no notion of microscopic "events" happening in time - unless you define these in quantum mechanical terms, which you never did.

vanhees71 said:
Gibbs ensembles often are just a useful "Gedankenexperiment" to derive statistical predictions.
But Nature doesn't care about our gedanken but operates on the basis of dynamical laws. Without an actual ensemble there is no actual averaging, hence no actual statistics, hence Born's rule does not apply.

vanhees71 said:
the "averaging" is done dynamically by many hits of the gas molecules
This assumes that the gas molecules are classical objects that can hit the thermometer. But in standard quantum mechanics all you have is potentialities, nothing actually happening, except in a measurement.


vanhees71 said:
Also I didn't claim that the macroscopic behavior is not described by the underlying microscopic dynamics. To the contrary it is derivable from it by quantum statistics.
Then you need to derive from quantum statistics that, upon interacting with a particle to be detected, the detector is not - as the Schrödinger equation predicts - in a superposition of macroscopic states with pointer positions distributed according to Born's rule, but that it is in a macroscopic state where one of the pointer positions is actually realized, so that we can see it with our senses.

Since you claim that this is derivable, please show me a derivation! If valid, it would solve the measurement problem!
 
  • Like
Likes gentzen
  • #106
WernerQH said:
I think "thermal interpretation" is a misnomer. How can you call it an interpretation if it carefully avoids the question of what quantum theory is about?
If you look at my book (or the earlier papers), you'll see that it does not at all avoid the question of what quantum theory is about, but makes many statements about it that are different from the standard interpretations.
WernerQH said:
It is more like an empty shell, big enough to contain theories as diverse as quantum mechanics and thermodynamics.
Thus it is not an empty shell but a full shell - able to interpret both.
 
  • Like
Likes PeroK and gentzen
  • #107
A. Neumaier said:
This is not a fact but your assumption. No known fact contradicts the possibility of a "pure quantum" description of the world; in contrast, there is no sign at all that a classical description must be used in addition.
Except the failure to deliver such a description. (I don't accept MWI as a consistent interpretation. At least I have never seen a consistent version.) Nonexistent theories cannot be discussed, and therefore they cannot count.
A. Neumaier said:
... it postulates a classical world in addition to the quantum world. How the two can coexist is unexplained.
Yes. That's the problem. Which does not exist in realistic interpretations.
A. Neumaier said:
It does not exist for quantum field theory, which is needed for explaining much of our world!
It exists for QFT. All you need for this is to use the fields to define the configuration space.

Bohm, D., Hiley, B.J., Kaloyerou, P.N. (1987). An ontological basis for the quantum theory, Phys. Reports 144(6), 321–375.
 
  • #108
Sunil said:
Except the failure to deliver such a description. (I don't accept MWI as a consistent interpretation. At least I have never seen a consistent version.) Nonexistent theories cannot be discussed, and therefore they cannot count.
MWI is not the only "pure quantum" description, the thermal interpretation too has no additional non-unitary dynamics. But since you are obviously unaware of it, it would probably not help if I tried to discuss it with you. So let me try something else instead.

The Bose-Einstein statistics was derived and published in 1924 before the invention of the Born rule (and other elements of the Copenhagen interpretation). So I guess that its derivation does not depend on any specific interpretation of quantum mechanics. And I guess the same is true for derivations of other thermal states. For a quantum experiment in a laboratory, instead of trying to prepare a special pure state, it can also make sense to just prepare a "nice" thermal state. Now state preparation and measurement are closely related. So for the measurement too, it can make sense to just measure some nice smooth thermodynamical property, instead of going for some discrete property exhibiting quantum randomness. And again, it is quite possible that the derivation of such smooth thermodynamical properties does not depend on any specific interpretation of quantum mechanics like MWI, Copenhagen, or de Broglie-Bohm.
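To illustrate the point that the 1924 statistics stands on its own: the mean occupation of a photon mode follows from counting alone, with no projection postulate, and reduces to the classical Boltzmann factor at high energies. A small sketch (function names and sample values are illustrative):

```python
import math

def bose_einstein(x):
    """Mean occupation of a photon mode at x = E/(kT), chemical
    potential zero: n = 1/(exp(x) - 1)."""
    return 1.0 / math.expm1(x)

def maxwell_boltzmann(x):
    """Classical Boltzmann factor, the x >> 1 limit of the above."""
    return math.exp(-x)

for x in (0.1, 1.0, 5.0):
    print(x, bose_einstein(x), maxwell_boltzmann(x))
# at x = 5 the two already agree to better than 1%
```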
 
  • #109
gentzen said:
MWI is not the only "pure quantum" description, the thermal interpretation too has no additional non-unitary dynamics. But since you are obviously unaware of it, it would probably not help if I tried to discuss it with you. So let me try something else instead.

The Bose-Einstein statistics was derived and published in 1924 before the invention of the Born rule (and other elements of the Copenhagen interpretation). So I guess that its derivation does not depend on any specific interpretation of quantum mechanics. And I guess the same is true for derivations of other thermal states. For a quantum experiment in a laboratory, instead of trying to prepare a special pure state, it can also make sense to just prepare a "nice" thermal state. Now state preparation and measurement are closely related. So for the measurement too, it can make sense to just measure some nice smooth thermodynamical property, instead of going for some discrete property exhibiting quantum randomness. And again, it is quite possible that the derivation of such smooth thermodynamical properties does not depend on any specific interpretation of quantum mechanics like MWI, Copenhagen, or de Broglie-Bohm.
There are many approaches that aim to present a "pure quantum" description, but IMHO they all fail to deliver. (I specifically mentioned MWI only because I have a quite special opinion about MWI, namely that it is not even a well-defined interpretation; the phrase "credo quia absurdum" seems to fit MWI believers.) Roughly, these are all interpretations which attempt to solve the measurement problem without really solving it. (In realistic interpretations it does not exist, given that they have a trajectory already at the fundamental level.)

I don't get the point of your consideration. Of course, one can use thermodynamics in state preparation as well as in measurements.
 
  • Haha
  • Skeptical
Likes PeroK and atyy
  • #110
A. Neumaier said:
No known fact contradicts the possibility of a "pure quantum" description of the world; in contrast, there is no sign at all that a classical description must be used in addition. Once the latter is assumed, one must show how to define the classical in terms of the more comprehensive quantum. This is the measurement problem. You simply talk it away by making this assumption.
The "sign" I see is that the degree to which the quantum framework is well defined depends directly on the degree to which the reference (classical background) is not limiting in terms of information capacity and processing power. The laws of quantum dynamics as inferred depend on an inference machinery living in the classical domain (which in the ideal picture is unlimited).

Once we relax the firmness of this reference and inference machinery, we likely need to acknowledge that some of the deductive power of the QM formalism becomes invalid, and we need to reconstruct our "measurement theory" based on a non-classical reference.

So if by "pure quantum" we mean the quantum formalism, including Hamiltonians and laws as inferred relative to a classical background, but with the "classical baggage" discarded, that really makes no sense to me. To me it is a mathematical extrapolation that I think is unlikely to be the right way to remove the classical background. If it works, I agree it would be easier, so it makes sense to entertain the idea, but conceptually I think it's flawed.

/Fredrik
 
  • #111
Sunil said:
There are many approaches who aim to present a "pure quantum" description
Do you mean neo-Copenhagen non-representationalist approaches like QBism or RQM? Or approaches like consistent histories that remain agnostic on realism (or representation) and have no problems with being incomplete (at least as far as Roland Omnès or Robert Griffiths are concerned)? Or the minimal statistical interpretation?
MWI and the thermal interpretation are different from those in being realist (i.e. representationalist) approaches that aim for completeness. I am not aware of other candidates in this group that maintain a "pure quantum" description without additional non-unitary dynamics.

Sunil said:
I don't get the point of your consideration. Of course, one can use thermodynamics in state preparation as well as in measurements.
The point is that you don't need the projection postulate (and probably not even the Born rule in any form whatsoever) to derive the statistics of the thermal state or prepare a system in it. To prepare a system in it, you probably just have to control the thermodynamical degrees of freedom and keep them constant long enough for the system to settle/converge sufficiently to the thermal state. So you can get away with the much weaker assumptions that the thermodynamical degrees of freedom can be controlled and measured. At least I guess that this assumption is weaker than assuming the existence of a classical description plus some version of the Born rule.
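The "smooth thermodynamical property" in question can be sketched for a discrete spectrum: the canonical (Gibbs) expectation value needs only the spectrum and the temperature, with no projection postulate anywhere. A toy sketch (the two-level energies and function name are illustrative):

```python
import math

def thermal_expectation(energies, beta):
    """Canonical expectation value of the energy for a discrete
    spectrum: <E> = sum_n E_n exp(-beta E_n) / Z."""
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)  # partition function
    return sum(e * w for e, w in zip(energies, weights)) / z

# Two-level toy system with energies 0 and 1 (arbitrary units)
print(thermal_expectation([0.0, 1.0], beta=1.0))  # 1/(e+1), ~0.269
print(thermal_expectation([0.0, 1.0], beta=0.0))  # 0.5, infinite temperature
```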
 
  • Like
Likes A. Neumaier
  • #112
gentzen said:
Do you mean neo-Copenhagen non-representationalist approaches like QBism or RQM? Or approaches like consistent histories that remain agnostic on realism (or representation) and have no problems with being incomplete (at least as far as Roland Omnès or Robert Griffiths are concerned)? Or the minimal statistical interpretation?
Yes. One should exclude "interpretations" which are in fact different theories (GRW and so on) that explicitly introduce collapse dynamics, and all the realistic interpretations which explicitly introduce a configuration-space trajectory. Then there is Copenhagen and similar interpretations which do not get rid of the classical part. Everything else can be roughly classified as attempts to construct a pure quantum interpretation.

Interpretations that accept being incomplete raise an interesting question. On the one hand, this removes the most serious fault of Copenhagen. Then one can ask whether there is a measurement problem at all if one accepts that the theory is incomplete. That would mean that the Schrödinger evolution is approximate. If so, what would be the problem with a collapse? It would simply be the place where the subquantum effects reach the quantum level.

gentzen said:
MWI and the thermal interpretation are different from those in being realist (i.e. representationalist) approaches that aim for completeness. I am not aware of other candidates in this group that maintain a "pure quantum" description without additional non-unitary dynamics.
gentzen said:
The point is that you don't need the projection postulate (and probably not even the Born rule in any form whatsoever) to derive the statistics of the thermal state or prepare a system in it. To prepare a system in it, you probably just have to control the thermodynamical degrees of freedom and keep them constant long enough for the system to settle/converge sufficiently to the thermal state.
Hm, does this method cover all preparation procedures? Plausibly yes, if one applies the same argument as in dBB theory for the non-configuration observables: whatever we can measure, the result can be identified from the classical measurement device by looking only at its configuration. Looking only at the thermodynamical degrees of freedom of the measurement device would be sufficient too.

But what about preparation with devices which shoot single particles from time to time, say, a piece of radioactive material? To prepare something in a stable state is one thing - the source itself can be described as being in a stable state. But states of fast-moving particles being prepared by "keeping them long enough" does not sound very plausible.

The other point is that thermodynamics is also an incomplete description of something else. (Especially in the Bayesian approach.) And this something else can become visible. Say, the early states of the universe may have been quite close to equilibrium, but with changing temperature this became unstable, and the place where we live today appeared out of some fluctuation. We could not have seen that place being different from others by looking at the thermodynamic variables of the early universe.
 
  • #113
Sunil said:
But what about preparation with devices which shoot single particles from time to time? Say, a piece of radioactive material? To prepare something in a stable state is one thing - the source itself can be described as being in a stable state. But states of fast moving particles being prepared with "keep them long enough" does not sound very plausible.
I basically agree that there are many experiments that are not covered by preparation in the thermal state or by measurement of thermal degrees of freedom. I try to elaborate a bit below on why I nevertheless brought up the thermal state.

Sunil said:
Hm, does this method allow to cover all preparation procedures?
Probably not, but it covers more preparation procedures than the MWI mindset typically acknowledges. Even if an experiment is prepared in a thermal state, typical quantum interference effects still occur at interfaces or surfaces, because locally the state can look much more pure than it looks from a less local perspective.
It doesn't cover all preparation procedures from my instrumentalistic point of view. And I guess also not from the point of view of the thermal interpretation, but for different reasons.

If I have to simulate the interaction of an electron beam with a sample, then for me (as instrumentalist) what counts as prepared is the sample (in a thermal state) and the beam (in a rather pure state) incident on the sample. Independent of whether the electron source was in a thermal state, there was some filtering going on (with associated randomness - quantum or not) in the preparation of the beam. For me, this filtering is part of the preparation procedure, and it probably depends on interpretative assumptions. And in any case, the combined system of sample and beam does not seem to be in a thermal state.

Even more extreme, error correction schemes in (superconducting) quantum computing require measuring some qubits and then modifying the unitary dynamics based on results of that measurement. I can't believe that measurements of thermodynamical degrees of freedom are good enough for that. After all, the quantum randomness is crucial here for that correction scheme to work.

(That quantum computing example is part of the reason why I bring up the thermal state. I have the impression that when Stephen Wolfram and Jonathan Gorard conclude (1:25:08.0 SW: in the transcript) "..., it’s still a little bit messy, but it seems that you can never win with quantum computing, ...", then they are slightly too much in the MWI mindset, which cares more about the supposedly pure state of the universe as a whole than about the mostly thermal states locally here on earth.)
 
  • #114
vanhees71 said:
In the vacuum ##\vec{k}## is a real vector. That's precisely what I don't understand!

If there is a planar interface between two homogeneous regions, there's no vacuum of course, and then there are evanescent waves (aka total reflection).
The tricky part is probably that when I try to measure the evanescent wave, I will bring some kind of detector (for example a photoresist) close to the planar interface. Then all that is left of the vacuum is a very thin layer, thin enough that the evanescent wave has not yet vanished at the detector surface. A (huge) part of the evanescent wave will be reflected by that surface, and the time average of the z-component of the Poynting vector of the superposition of the reflected part with the "initial" evanescent wave is no longer zero. So you no longer have perfect total reflection if you (successfully) measure the evanescent wave.

Maybe it helps you to imagine the very thin layer of vacuum between the planar interface and the detector surface as some sort of waveguide.
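The "thin enough" condition can be made rough-and-ready quantitative: the evanescent field amplitude falls off like exp(-kappa z) across the gap, so the intensity surviving a gap of width d scales roughly like exp(-2 kappa d). A crude sketch that ignores the multiple reflections inside the gap (the decay constant and gap widths are illustrative numbers, not from the thread):

```python
import math

def gap_transmission(kappa, d):
    """Rough relative intensity surviving a vacuum gap of width d for
    an evanescent wave with decay constant kappa (consistent units):
    ~ exp(-2*kappa*d). Ignores interference of multiple reflections."""
    return math.exp(-2.0 * kappa * d)

# Illustrative optical decay constant kappa = 1e7 / m
print(gap_transmission(1e7, 100e-9))  # ~0.135: a 100 nm gap still transmits
print(gap_transmission(1e7, 1e-6))   # ~2e-9: a 1 um gap is effectively opaque
```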
 
  • Like
Likes A. Neumaier
  • #115
It's not vacuum if there is a planar interface! There are no evanescent waves in the vacuum.
 
  • #116
vanhees71 said:
It's not vacuum if there is a planar interface! There are no evanescent waves in the vacuum.
Yes, far away from interfaces, there are no evanescent waves in vacuum. Just to clarify: your remark about the planar interface only applies to evanescent waves. It is still valid to talk about optical waves in vacuum, even though there is a planar interface, because they will still be there far away from the interface, where it is fine to say that there is vacuum.
 
  • Like
Likes vanhees71
  • #117
With the advent of QFT, which successfully unites the quantum and classical scales, these so-called interpretations are no longer needed. They are a relic of the '40s, '50s and '60s, when QM was the sole actor on the stage.

The world as described by the successor of QM (QFT) is composed entirely of fields, and the everyday stuff like chairs and tables are secondary manifestations (particular states of the fields). This is the comprehensive picture that unites the scales. It might be at odds with your preconceived notions, but Nature doesn't care. The world is what it is.
These so-called interpretations also make a lot of sacrifices and introduce various kinds of weirdness, so it's not totally unexpected that some prejudices need to go. The world of the senses is real but is not fundamental. I believe this is what Niels Bohr meant when he said "If you are not shocked, you have not understood it yet".
 
  • Like
Likes vanhees71
  • #118
EPR said:
With the advent of QFT which successfully unites the quantum and classical scale, these so called interpretations are no longer needed. They are a relic of the 40's , 50's and 60's when QM was the sole actor on the stage.
Would this mean that the so-called measurement problem is mere imagination? If not, does QFT resolve it?
 
  • #119
timmdeeg said:
Would this mean that the so-called measurement problem is mere imagination? If not, does QFT resolve it?
No. It's less obvious but is still there. In QFT 'particles' are created and annihilated at a particular location.
 
  • #120
gentzen said:
The Bose-Einstein statistics was derived and published in 1924 before the invention of the Born rule (and other elements of the Copenhagen interpretation). So I guess that its derivation does not depend on any specific interpretation of quantum mechanics.
It’s not clear to me that Bose-Einstein statistics are particularly “quantum”. Their derivation is the same as for Maxwell-Boltzmann statistics, except for the assumption of indistinguishability (in counting the number of states).
 
  • #121
EPR said:
In QFT 'particles' are created and annihilated at a particular location.
Exactly. Only the locations are real. What "travels" between them (particles or waves) are figments of our classical imagination. The field quanta are identical, which means the locations can be connected in different, but indistinguishable ways. All Feynman diagrams contribute and have to be summed over.
 
  • Skeptical
Likes weirdoguy
  • #122
stevendaryl said:
It’s not clear to me that Bose-Einstein statistics are particularly “quantum”.
That is a bit unfair; I already went from Planck's less "quantum" derivation in 1900 to Bose's in 1924, for this tiny bit of additional legitimacy. And what about the thermal state? In a sense, the possibility to define what is meant by "constant" (in time) gives significance to the energy eigenstates here. (In general relativity, it gets hard to define what is meant by "constant".) So you get an operational meaning from a symmetry. I find this very quantum.

Many energy eigenstate computations that students will do in their first course on quantum mechanics won't need more for being interpreted in the sense of providing predictions about the real world. (In a subsequent answer, I already stated that this does not cover all practically relevant preparation and measurement procedures.)
 
  • #123
stevendaryl said:
It’s not clear to me that Bose-Einstein statistics are particularly “quantum”. Their derivation is the same as Maxwell-Boltzmann statistics except the assumption of indistinguishability (in counting the number of states).

I probably should amend that. The most convenient way to "count states" is to start with discrete energy levels, and count the number of ways those states can fill up with particles. So I guess quantum mechanics is implied by starting with energy levels.
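The counting difference in question is just stars-and-bars versus independent choices; a minimal sketch (function names are illustrative):

```python
from math import comb

def bose_count(n, g):
    """Ways to distribute n indistinguishable bosons over g
    single-particle states (stars and bars): C(n+g-1, n)."""
    return comb(n + g - 1, n)

def boltzmann_count(n, g):
    """Classical counting with labelled particles: each of the n
    particles independently picks one of the g states."""
    return g ** n

# 2 particles in 3 states: 6 boson configurations vs 9 classical ones
print(bose_count(2, 3), boltzmann_count(2, 3))  # 6 9
```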
 
  • Like
Likes gentzen
  • #124
The way to "count states" is what's particularly quantum for both fermions and bosons. One of the great puzzles solved by QT is the resolution of Gibbs's paradox. Before the discovery of quantum statistics by Bose and Einstein or Pauli, Jordan, and Dirac it was solved by Boltzmann with one of the many ad-hoc adjustments of the classical theory.
 
  • Informative
Likes gentzen
  • #125
@Demystifier - is Bohmian Mechanics consistent with the standard lore of quantum statistical mechanics and identical particles? In classical mechanics, there are no identical particles since particles have distinct trajectories at all times, so the resolution of the Gibbs paradox is a fudge. It is often said that quantum mechanics gives the true resolution of the Gibbs paradox, since quantum particles don't have trajectories and can truly be swapped. But in BM, particles have trajectories, so does it mean that the proper derivation of the Gibbs factor is also a fudge in QM?
 
  • Like
Likes gentzen, Demystifier and vanhees71
  • #126
atyy said:
@Demystifier - is Bohmian Mechanics consistent with the standard lore of quantum statistical mechanics and identical particles? In classical mechanics, there are no identical particles since particles have distinct trajectories at all times, so the resolution of the Gibbs paradox is a fudge. It is often said that quantum mechanics gives the true resolution of the Gibbs paradox, since quantum particles don't have trajectories and can truly be swapped. But in BM, particles have trajectories, so does it mean that the proper derivation of the Gibbs factor is also a fudge in QM?
The answer to the last question is - no. To understand it, one must first fix the language. In the Bohmian language, the particle is, by definition, an object with well defined position ##x##. So a wave function ##\psi(x)## is not a particle; it is a wave function that guides the particle. So when in quantum statistical mechanics we say that "particles cannot be distinguished", what it really means is that the wave function has a certain symmetry (or anti-symmetry). In other words, quantum statistical mechanics is not a statistical mechanics of particles; it is a statistical mechanics of wave functions. This means that all standard quantum statistical mechanics is correct in Bohmian mechanics too, with the only caveat that one has to be careful with the language: standard quantum statistical mechanics is not a statistics of particles, but a statistics of wave functions. In particular, the Gibbs factor in standard quantum statistical mechanics is perfectly correct and well justified, provided that you have in mind that this factor counts wave functions, not particles.
 
  • Informative
Likes gentzen and atyy
  • #127
vanhees71 said:
Also I didn't claim that the macroscopic behavior is not described by the underlying microscopic dynamics. To the contrary it is derivable from it by quantum statistics.
A. Neumaier said:
Then you need to derive from quantum statistics that, upon interacting with a particle to be detected, the detector is not - as the Schrödinger equation predicts - in a superposition of macroscopic states with pointer positions distributed according to Born's rule, but that it is in a macroscopic state where one of the pointer positions is actually realized, so that we can see it with our senses.

Since you claim that this is derivable, please show me a derivation! If valid, it would solve the measurement problem!
This last quote misses the point. Because the detector entangled with the microscopic system in question is a macroscopic object consisting of typically more than ##N=10^{20}## atoms and because our observations are based on its collective coordinates, i.e. quantities averaged over macroscopic numbers of order ##10^{20}##, it turns out that quantum interference effects between two (approximately) classical states of these collective coordinates are suppressed by the double exponential factor of ##e^{-10^{20}}##, and are, even in principle, unobservable. This double exponent is so astonishingly small, that its inverse, viewed as a time interval, doesn't even need a unit of time to be attached to it, because it is basically the same HUMONGOUS number, no matter if one measures it in Planck units or in ages of the Universe. In addition, since macroscopic detectors are almost impossible to completely isolate from the environment, especially on such gigantic time scales, the usual environmental decoherence makes the above overlap between states even tinier.

Thus, in dealing with such collective coordinates, we may employ the usual rules of classical probability, if we are willing to make an error of order less than ##e^{-10^{20}}##. In particular, we may employ the rules of conditional probability and, after monitoring individual histories of collective coordinates (corresponding to specific positions of a needle on the dial of our macroscopic detector), apply Bayes’ rule to throw out the window the portion of the probability distribution that was not consistent with our observation of the collective coordinate, and to rescale the rest of the distribution. This is essentially what "collapse" is all about.
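The scale of the suppression can be made concrete with a toy model (my own sketch, not from the post): if two macroscopically distinct pointer states are modeled as product states over ##N## constituents, with each pair of single-constituent states overlapping by some assumed ##s<1##, then the total overlap is ##s^N##, i.e. exponentially small in ##N##.

```python
import math

# Toy model: per-constituent overlap s < 1 (s = 0.9 is an arbitrary
# assumption), total overlap s**N for a product state of N constituents.

def log10_overlap(s, N):
    """log10 of the total overlap s**N, computed in log space,
    since s**N underflows to 0.0 for macroscopic N."""
    return N * math.log10(s)

s = 0.9
for N in (10, 1_000, 10**20):
    print(f"N = {N:>22}: log10(overlap) = {log10_overlap(s, N):.3e}")
# For N ~ 1e20 the exponent itself is of order -1e19: this is the
# e^{-10^20}-type suppression quoted above.
```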

All of the above is nicely and pedagogically explained in the new QM book by Tom Banks (a great addendum to Ballentine's book, in my opinion).
 
  • #128
In terms of interpretation being a misnomer for interpretations which make new predictions, is there a formal definition of what a prediction in this sense is? Do they need to be analytical predictions? Or could they be predictions which are only derivable through simulation, and how about recursively enumerable but not co-recursively enumerable predictions (sets of predictions which cannot be analytically determined to be predictions, and are only determinable to be predictions if they are predictions)?

In other words, in some cases, maybe it is not knowable whether an interpretation is an interpretation, and whether it is testable, until/unless it is successfully tested. If that were the case, then we could be in store for a never-ending quest to determine if an interpretation adds predictions on top of QM, or if it is correct, and in the meantime, nobody could ever rule the question at hand to be a matter of philosophy rather than science.
 
  • #129
physicsworks said:
in dealing with such collective coordinates, we may employ the usual rules of classical probability
This is common practice, but not a derivation from first principle. Nobody doubts that quantum mechanics works in practice, the question is whether this working can be derived from a purely microscopic description of the detectors (plus measured system and environment) and unitary quantum mechanics.

I'd be very surprised if such a derivation is in the book by Banks. Can you point to the relevant pages?
 
  • #130
A. Neumaier said:
This is common practice, but not a derivation from first principle.
It may be helpful to step back and have a closer look at what those "first" principles are. Otherwise you may be setting yourself an unattainable goal.
 
  • #131
WernerQH said:
you may be setting yourself an unattainable goal.
It is a cheap way out to declare a goal that is not yet achieved to be unattainable.

Progress is made by making the unreachable reachable.
 
  • #132
physicsworks said:
This last quote misses the point. Because the detector entangled with the microscopic system in question is a macroscopic object consisting of typically more than ##N=10^{20}## atoms and because our observations are based on its collective coordinates, i.e. quantities averaged over macroscopic numbers of order ##10^{20}##, it turns out that quantum interference effects between two (approximately) classical states of these collective coordinates are suppressed by the double exponential factor of ##e^{-10^{20}}##, and are, even in principle, unobservable.
By a similar argument, one could argue that the detailed Hamiltonian for such a system + macroscopic detector is in principle not inferrable by an observer?

If it is not inferrable, does it have to exist? We have high standards for the observable status of certain contextual things, but not for the Hamiltonian. Why? Isn't this deeply disturbing?

/Fredrik
 
  • #133
A. Neumaier said:
This is common practice, but not a derivation from first principle.
a derivation of what? The above estimation by Banks is just a way to show that the quoted statement
A. Neumaier said:
you need to derive from quantum statistics that, upon interacting with a particle to be detected, the detector is not - as the Schrödinger equation predicts - in a superposition of macroscopic states with pointer positions distributed according to Born's rule, but that it is in a macroscopic state where one of the pointer positions is actually realized, so that we can see it with our senses.
is a common mistaken belief and, more importantly, to demonstrate in what sense classical mechanics emerges as an approximation of QM. It uses simple counting of states, it is very general (i.e. not relying on a particular model of the macroscopic detector) and is based on first principles, such as the principle of superposition, locality of measurements, etc. It is not a derivation per se because we are dealing with the dynamics of a macroscopic number of particles constituting the detector, for which humans have not yet developed exact solutions of EOMs and probably never will. However, this is OK, and we don't need to wait for them to make such an amazing accomplishment, because we know from the above estimation (rather, underestimation) that the discussed interference effects are not observable even in principle. Not for all practical purposes, but in principle, since any experiment that is set to distinguish between classical and quantum-mechanical predictions of these effects would have to ensure the system is isolated over times that are unimaginably longer than the age of the Universe.
A. Neumaier said:
Nobody doubts that quantum mechanics works in practice, the question is whether this working can be derived from a purely microscopic description of the detectors (plus measured system and environment) and unitary quantum mechanics.
I am sorry, this sounds like circular logic to me. Are you really talking about deriving quantum mechanics from quantum mechanics here? The above estimation is quantum mechanical. We are trying to interpret classical mechanics, not quantum mechanics. Otherwise we would look like Dr. Diehard from the celebrated lecture by Sidney Coleman titled "Quantum mechanics in your face", who thought that deep down, it's classical.
A. Neumaier said:
Progress is made by making the unreachable reachable.
In this regard, and in the context of the above discussion, I can quote Banks:
the phrase “With enough effort, one can in principle measure the quantum correlations in a superposition of macroscopically different states”, has the same status as the phrase “If wishes were horses then beggars would ride”.
Fra said:
By a similar argument one could argue that the detailed hamiltonian for such system + macroscopic detector is in principle not inferrable by an observer?
Why? An observer records particular values of collective coordinates associated with the macroscopic detector. As long as this detector, or any other macroscopic object like a piece of paper on which we wrote these values, continues to exist in a sense that it doesn't explode into elementary particles, we can, with fantastic accuracy, use Bayes' rule of conditioning (on those particular values recorded) to predict probabilities of future observations. If those macroscopic objects which recorded our observations by means of collective coordinates cease to exist in the above mentioned sense, then we must go back and use the previous probability distribution before such conditioning was done.
 
  • Like
Likes vanhees71
  • #134
physicsworks said:
a common mistaken belief
Your beliefs and standards are so different from mine that a meaningful discussion is impossible.
 
  • #135
physicsworks said:
However, this is OK, and we don't need to wait for them to make such an amazing accomplishment, because we know from the above estimation (rather, underestimation) that the discussed interference effects are not observable even in principle.

This line of argumentation is not at all convincing to me. We agree on this fact:
  • If probabilities are classical (that is, they represent lack of information; a coin is either heads or tails, but we just don't know which, and we are using probabilities to reason about the uncertainty), then there are no interference effects.
But you seem to be arguing the converse, that if there are no interference effects, then the probabilities must be classical. That's just invalid reasoning, it seems to me.
 
  • #136
@stevendaryl, the argument is quite different from that. Quantum systems do not obey classical rules of probability, but a special class of compatible observables associated with macroscopic objects, the so-called collective coordinates by means of which we record our observations of the microscopic system in question, do approximately obey these rules with unprecedented accuracy. Classical probability theory, with its sum-over-histories rule and with probabilities representing our ignorance of initial conditions, is only an approximation of the probability theory in QM. As with any approximation, it eventually fails; in this case, when we talk about unavoidable quantum uncertainties in the initial position and velocity of a collective coordinate, like the center of mass of a detector, the approximation fails if you wait long enough.
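A minimal illustration of how the size of the interference terms controls the classical-probability approximation (my own toy sketch; the decoherence factor `eps` is an assumed parameter, not anything computed by Banks): for a qubit ##a|0\rangle + b|1\rangle## whose coherence has been scaled down by `eps`, the detection probability splits into a classical-mixture piece plus an interference piece proportional to `eps`.

```python
import numpy as np

# Toy qubit: state a|0> + b*e^{i*phi}|1>, off-diagonal coherence
# scaled by an assumed decoherence factor eps in [0, 1].

def interference_prob(a, b, eps, phi):
    """Probability of detecting |+> = (|0> + |1>)/sqrt(2):
    classical-mixture term plus eps-scaled interference term."""
    p0, p1 = abs(a)**2, abs(b)**2
    classical = 0.5 * (p0 + p1)                    # ignorance-style result
    interference = abs(a) * abs(b) * eps * np.cos(phi)
    return classical + interference

a = b = 1 / np.sqrt(2)
print(interference_prob(a, b, eps=1.0, phi=0.0))   # full coherence: 1.0
print(interference_prob(a, b, eps=0.0, phi=0.0))   # fully decohered: 0.5
# With eps of order exp(-1e20), the quantum and classical-probability
# calculations differ by an amount no experiment can resolve.
```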
 
  • #137
physicsworks said:
@stevendaryl, the argument is quite different from that. Quantum systems do not obey classical rules of probability, but a special class of compatible observables associated with macroscopic objects, the so-called collective coordinates by means of which we record our observations of the microscopic system in question, do approximately obey these rules with an unprecedented accuracy. Classical probability theory with its sum over histories rule and with probabilities representing the lack of information about initial conditions or our ignorance about it, is only an approximation of the probability theory in QM.
Even if you want to say it is only an approximation, it seems invalid to me. In the quantum case, we know that the probabilities are NOT due to ignorance.
 
  • #138
Quantum systems cannot obey classical probabilities, much less approximate classical probabilities for some obscure reason.
Assuming that they do is circular reasoning. Nothing in the theory says that quantum systems tend to or must approximate anything classical. Including classical probabilities.
 
  • #139
stevendaryl said:
Even if you want to say it is only an approximation, it’s bogus. In the quantum case, we know that the probabilities are NOT due to ignorance.
What is bogus, exactly? And, of course in QM they are not, the question is to explain in what sense classical world is an approximation to quantum, not the other way around.
EPR said:
Quantum systems cannot obey classical probabilities, much less approximate classical probabilities for some obscure reason.
I am not sure if this is addressed to me, but I will reply with almost an exact quote from the previous message which you probably missed or misunderstood:
physicsworks said:
Quantum systems do not obey classical rules of probability, but a special class of compatible observables associated with macroscopic objects [...] do
approximately, of course.
EPR said:
Assuming that they do is circular reasoning.
No one is assuming that.

I suggest reading Banks, he explains this in much more detail than I do (and much better).
 
  • #140
physicsworks said:
What is bogus, exactly? And, of course in QM they are not, the question is to explain in what sense classical world is an approximation to quantum, not the other way around.
The question is what, if anything, the lack of macroscopic interference terms tells us. I thought you were suggesting that if there aren’t any interference effects, then we might as well assume the ignorance interpretation of probabilities. If you weren’t suggesting that, then what is the relevance of the lack of interference effects?
 
  • #141
stevendaryl said:
The question is what, if anything the lack of macroscopic interference terms tells us. I thought you were suggesting that if there aren’t any interference effects, then we might as well assume the ignorance interpretation of probabilities.
I think I see now what you are asking. First, not to be too pedantic, but just to be super clear: it's not that there aren't any interference effects, it's that observing them would require time scales humongously longer than the age of the universe. Second, we don't assume, based on the above, the ignorance interpretation of probabilities. We merely note that (in this particular setting, when we talk about collective coordinates of macroscopic objects used to record our observations of a microscopic system entangled with them) predictions based on two calculations, one done in classical probability theory and the other done in full quantum theory, cannot be distinguished by any experiment, even in principle. In his book (and his papers) Banks uses this to explain an apparent classicality of the macroscopic world.
 
  • #142
physicsworks said:
I think I see now what you are asking. First, not to be too pedantic, but just to be super clear: it's not that there aren't any interference effects, it's that observing them would require time scales humongously longer than the age of the universe.
Understood, except I would say there are no observable interference effects involving macroscopically distinguishable alternatives, such as a dead versus live cat.
physicsworks said:
Second, we don't assume, based on the above, the ignorance interpretation of probabilities. We merely note that (in this particular setting, when we talk about collective coordinates of macroscopic objects used to record our observations of a microscopic system entangled with them) predictions based on two calculations, one done in classical probability theory and the other done in full quantum theory, cannot be distinguished by any experiment, even in principle. In his book (and his papers) Banks uses this to explain an apparent classicality of the macroscopic world.
Sure, I don’t disagree with that.
 
  • #143
physicsworks said:
We merely note that (in this particular setting, when we talk about collective coordinates of macroscopic objects used to record our observations of a microscopic system entangled with them) predictions based on two calculations, one done in classical probability theory and the other done in full quantum theory, cannot be distinguished by any experiment, even in principle. In his book (and his papers) Banks uses this to explain an apparent classicality of the macroscopic world.
The above is circular reasoning.

Of course in the classical limit both QM and classical mechanics make approximately the same predictions. If
"Banks uses this to explain an apparent classicality of the macroscopic world."

then Banks doesn't know what he is talking about. You can't assume what you are trying to explain, in much the same way that you don't assume that the reason you have a flat tire is because... you looked and saw that you have a flat tire.
 
  • #144
stevendaryl said:
except I would say there are no observable interference effects involving macroscopically distinguishable alternatives, such as a dead versus live cat.
Yes, it all needs to be taken in the above context of dealing with macroscopic objects and the recorded values of corresponding collective coordinates.
stevendaryl said:
Sure, I don’t disagree with that.
I'm glad we understood each other.
EPR said:
The above is circular reasoning.
An example of circular reasoning is the 2nd sentence in #129.
EPR said:
Of course in the classical limit both QM and classical mechanics make approximately the same predictions.
The usual discussion of the "classical limit" of QM is incomplete at best, because it addresses neither the decoherence of collective coordinates nor the locality of interactions.
EPR said:
then Banks doesn't know what he is talking about.
Clearly, you haven't read Banks, so you cannot tell what he is talking about.
 
  • #145
physicsworks said:
Because the detector entangled with the microscopic system in question is a macroscopic object consisting of typically more than ##N=10^{20}## atoms and because our observations are based on its collective coordinates, i.e. quantities averaged over macroscopic numbers of order ##10^{20}##, it turns out that quantum interference effects between two (approximately) classical states of these collective coordinates are suppressed by the double exponential factor of ##e^{-10^{20}}##, and are, even in principle, unobservable. [...] Thus, in dealing with such collective coordinates, we may employ the usual rules of classical probability,
Unfortunately, this claim does not solve the measurement problem. Let ##\psi## be the wave function of a macroscopic system where a macroscopic pointer has, as Banks asserts, the nonzero position ##x## with tiny relative uncertainty. Let ##T## be an operator that moves the pointer from ##x## to ##-x##, and consider the same system in the superposition proportional to ##\psi+T\psi##. (This state can be generated by coupling to a source that moves the pointer if the measured spin turns out to be up, while it does not move it when the spin turns out to be down.) In this nonclassical macroscopic state of the device, the pointer has the very uncertain position ##0\pm O(x)##. Thus Banks's statement about classical states is irrelevant for the measurement problem, since the collective coordinates do not behave classically on nonclassical states.
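The arithmetic of this counterexample is easy to check numerically (a toy sketch; the values of `x0` and `sigma` are assumed, not taken from Banks's book): an equal-weight superposition of two narrow wave packets centered at ##+x## and ##-x## has pointer expectation near zero but position uncertainty of order ##x##, vastly larger than the tiny width of either individual packet.

```python
import numpy as np

# Assumed numbers: pointer displacement x0 = 1.0, packet width sigma = 1e-3.
x0, sigma = 1.0, 1e-3
grid = np.linspace(-2.0, 2.0, 400_001)

def packet(center):
    """Narrow Gaussian wave packet centered at `center` (unnormalized)."""
    return np.exp(-(grid - center)**2 / (4 * sigma**2))

psi = packet(x0) + packet(-x0)      # the state psi + T psi (unnormalized)
w = psi**2 / (psi**2).sum()         # Born-rule weights on the grid

mean = (grid * w).sum()
std = np.sqrt(((grid - mean)**2 * w).sum())
print(f"mean = {mean:.2e}, std = {std:.4f}")   # mean ~ 0, std ~ x0
```

Each packet alone has position uncertainty `sigma = 1e-3`, yet the superposition's uncertainty comes out ##\approx x_0 = 1##, confirming the ##0\pm O(x)## claim.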
physicsworks said:
All of the above is nicely and pedagogically explained in the new QM book by Tom Banks (a great addendum to Ballentine's book, in my opinion).
physicsworks said:
The above estimation by Banks is just a way to show that the quoted statement is a common mistaken belief and, more importantly, to demonstrate in what sense classical mechanics emerges as an approximation of QM. It uses simple counting of states, it is very general (i.e. not relying on a particular model of the macroscopic detector) and is based on first principles
physicsworks said:
In this regard, and in the context of the above discussion, I can quote Banks
physicsworks said:
I suggest reading Banks, he explains this in much more detail than I do (and much better).
Did it ever occur to you that Thomas Banks, your source of quantum revelations, might be mistaken in his arguments? You learned from Banks the sophist's art of ''proving'' controversial statements by repeated assertion, but not the science of self-critical logical thinking.

Banks was more self-critical than you (though not enough to see his own errors): While your language tells everyone that you know the truth and the others are mistaken, he explicitly qualifies his controversial statements on p.1 of his book as his personal beliefs:
Thomas Banks: said:
Comparing it to one of the older texts, you will find some differences in emphasis and some differences in the actual explanations of the physics. The latter were inserted to correct what this author believes are errors, either conceptual or pedagogical, in traditional presentations of the subject.
In the piece quoted by you, Banks makes a very surprising probabilistic statement (that collective variables behave classically in arbitrary states) that I never saw anyone else make. Since you didn't want to give references to where he justified his claim I obtained his book and looked for myself. The only ''proof'' of his statement, made first on p.4 and repeated with variations numerous times throughout the book, is by manifold repeated assertion, not by a logical argument. My counterexample shows that there cannot be a valid proof of his assertion.
 
  • #146
A. Neumaier said:
Let ψ be the wave function of a macroscopic system where a macroscopic pointer has, as Banks asserts, the nonzero position x with tiny relative uncertainty. Let T be an operator that moves the pointer from x to −x, and consider the same system in the superposition proportional to ψ+Tψ. (This state can be generated by coupling to a source that moves the pointer if the measured spin turns out to be up, while it does not move it when the spin turns out to be down.) In this nonclassical macroscopic state of the device, the pointer has the very uncertain position 0±O(x). Thus Banks's statement about classical states is irrelevant for the measurement problem, since the collective coordinates do not behave classically on nonclassical states.
What the above and similar types of reasoning miss completely is that pointer states are ensembles with huge numbers of states, exponential in ##N=10^{20}##. This is the source of the exponentially small overlap of classical histories of collective coordinates. Now, Banks's book (Chapter 10) has not one but three different arguments to show the exponentially small overlap.

[Moderator's note: Off topic content deleted.]
 
Last edited by a moderator:
  • #147
physicsworks said:
To this regard and in the context of the above discussion, I can quote Banks:
the phrase “With enough effort, one can in principle measure the quantum correlations in a superposition of macroscopically different states”, has the same status as the phrase “If wishes were horses then beggars would ride”.
The same would apply to the claim of Banks that you are pressing in this thread, about the exponentially small interference terms that take much longer than the lifetime of the universe to observe. If it is wrong to attribute any "reality" to theoretical entities that are in principle unobservable, then it's wrong; but the position you (and Banks) are taking is that it's wrong for other people to do it, but not wrong for you.
 
  • #148
PeterDonis said:
If it is wrong to attribute any "reality" to theoretical entities that are in principle unobservable
these are your words, not Banks's and not mine, so the conclusion you draw from your own words in the next sentence has zero relevance to what Banks is writing.
 
  • #149
physicsworks said:
these are your words
If you want to suggest better words to describe what Banks is asserting in the quote you gave, which I referenced, feel free; I gave the quote to make sure it was clear what I was referring to, so there is no excuse for fixating on the words I used instead of the substantive point. The point I am making is based on the substance of what Banks is asserting in that quote: I am saying that that same substance also applies to Banks' own claims about the exponentially small interference terms. If you have a substantive response to that substantive point, by all means make it.

physicsworks said:
the conclusion you draw from your own words in the next sentence has zero relevance to what Banks is writing
Incorrect, since the conclusion I drew was based on the substance of what Banks said, not on the words I used to describe it.
 
  • #150
PeterDonis said:
If you want to suggest better words to describe what Banks is asserting in the quote you gave, which I referenced, feel free.
I suggest you read Banks's paper itself first, so you know in what context that quote was given and what actual conclusions he draws from the smallness of the above-discussed overlap: arXiv:0809.3764. Also, you used the word "reality", which is a heavily loaded term. Given that we are in the "quantum interpretation" branch of the forum, you may want to define what you mean by that before making conclusions about other people's positions.
PeterDonis said:
Incorrect, since the conclusion I drew was based on the substance of what Banks said, not on the words I used to describe it.
I may not be the best "medium" between you and Banks. If you have not read the above paper, you can't make sensible conclusions based on one quote from it.
 