Are there signs that any Quantum Interpretation can be proved or disproved?

  • #91
vanhees71 said:
I always thought that's also the explanation of your "thermal interpretation" until you told me that your expectation values must not be interpreted in the usual statistical sense but as something abstractly defined in the mathematical formalism without relation to an operational realization by a measurement device.
Based on the current discussion, it occurred to me that the non-ensemble interpretation of q-expectations in the thermal interpretation could be combined with Callen's criterion to arrive at an "operational falsification" interpretation of expectations (probability). That interpretation would be closely related to the frequentist interpretation, but would fix its problem with the assumed "virtual" ensembles that allow identical experiments to be repeated arbitrarily often (an assumption that makes the frequentist interpretation non-operational and inapplicable to many practically relevant scenarios).

In order not to hijack this thread, I will open a separate thread with more explanations when I find the time.
 
  • #92
  • #93
vanhees71 said:
I always thought that's also the explanation of your "thermal interpretation" until you told me that your expectation values must not be interpreted in the usual statistical sense but as something abstractly defined in the mathematical formalism without relation to an operational realization by a measurement device.
In special cases, namely for the measurement of macroscopic properties, the q-expectations are directly related to an operational realization by a measurement device - they give the measured value of extensive quantities without any statistics. No expectations are involved in this case, a single measurement gives the value predicted by the theory.

It is only in the general case that one cannot give a relation to an operational realization by a measurement device except statistically. But this is not a drawback. Already in classical physics, one can relate certain classical observable functions of the state to experiment - namely those that do not depend very sensitively on the state. Those with sensitive dependence can only be related statistically.
 
  • #94
A. Neumaier said:
There must be a microscopic explanation for the ''problem of definite outcomes'' because measurement devices are made out of quantum matter, so they are described by a quantum state. The observed pointer position is a property of the measurement device. According to the statistical interpretation all we can know about the quantum system constituted by the measurement device is encoded in its quantum state. The ''problem of definite outcomes'' is to show how this quantum state encodes the definite observed pointer position, and how the unitary dynamics postulated by quantum physics leads to such a definite observed pointer position.
The solution is quite simple and straightforward. It is sufficient to look at a measurement from two points of view, with different cuts between the classical and the quantum part. Then we see that the intermediate part is described, in one cut, as a quantum object with a wave function, and in the other cut with a classical trajectory.

All one has to do is accept this as the general picture: there is also a trajectory in the quantum part. The mathematics of how to make the two compatible is easy and well known: the Bohmian velocity defines the deterministic (in dBB) resp. average (in other realistic interpretations) velocity of that trajectory.
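For a single spinless particle with wave function ##\psi = R\,e^{iS/\hbar}##, this velocity is usually written as
$$\mathbf{v}(\mathbf{q},t) = \frac{\hbar}{m}\,\mathrm{Im}\,\frac{\nabla\psi(\mathbf{q},t)}{\psi(\mathbf{q},t)} = \frac{\nabla S(\mathbf{q},t)}{m},$$
and the trajectory is obtained by integrating ##\dot{\mathbf{q}}(t) = \mathbf{v}(\mathbf{q}(t),t)## along the actual configuration.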
 
  • #95
Sunil said:
The solution is quite simple and straightforward. It is sufficient to look at a measurement from two points of view, with different cuts between the classical and the quantum part. Then we see that the intermediate part is described, in one cut, as a quantum object with a wave function, and in the other cut with a classical trajectory.
But Nature has no cut. Thus you have only replaced the problem with the equivalent problem of explaining why we may replace the quantum description on one side of the cut by a classical description. Nobody has ever derived this from the pure quantum dynamics.
 
  • Like
Likes Lord Jestocost and vanhees71
  • #96
vanhees71 said:
I still don't understand how there can be evanescent em. waves in the vacuum. For me an evanescent wave is a non-propagating field like in a wave guide (a mode with a frequency below the cut-off frequency), but in the vacuum there is no such thing. The dispersion relation is always ##\omega=ck##, i.e., there are no evanescent modes in the vacuum.
If we write the dispersion relation as ##\omega^2/c^2=k_x^2+k_y^2+k_z^2## and assume that ##k_x## and ##k_y## are real, then we see that ##k_z^2## becomes negative if ##\omega^2/c^2<k_x^2+k_y^2##. If ##k_z^2## is negative then ##k_z## is imaginary, which corresponds to an evanescent wave.

At a horizontal planar interface (perpendicular to the z-axis) between two homogeneous regions, ##k_x## and ##k_y## cannot change, because they describe the modulation of the electromagnetic field along the interface. So you can have an optical wave in a glass substrate with well-defined ##k_x## and ##k_y## given by the direction of the wave. If the direction of the wave is sufficiently grazing with respect to a horizontal planar interface to vacuum, then the wave will become evanescent in the vacuum below the interface.
(The wave will quickly (exponentially) vanish with increasing distance from the interface. Additionally, the time average of the z-component of the Poynting vector is zero, i.e. there is no energy transported in the z-direction on average by the evanescent wave in vacuum.)
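As a concrete example (taking glass of refractive index ##n=1.5## and an angle of incidence of ##60^\circ##, beyond the critical angle of ##\approx 41.8^\circ##): then ##k_x^2+k_y^2 = (n\omega/c)^2\sin^2 60^\circ \approx 1.69\,(\omega/c)^2 > (\omega/c)^2##, so in the vacuum
$$k_z^2 \approx -0.69\,(\omega/c)^2, \qquad k_z \approx 0.83\,i\,\frac{\omega}{c},$$
and the field decays as ##e^{-|k_z| z}## with a ##1/e## depth of about ##0.19\,\lambda## (##\lambda## the vacuum wavelength).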
 
  • #97
A. Neumaier said:
But Nature has no cut. Thus you have only replaced the problem with the equivalent problem of explaining why we may replace the quantum description on one side of the cut by a classical description. Nobody has ever derived this from the pure quantum dynamics.
Of course. You start with a "pure quantum" description of the world, which in fact does not exist. The minimal interpretation is, essentially, only a reduced Copenhagen interpretation, so it prefers not to talk about that cut, classical part, and all that, but it has the results of the experiments formulated in the language of experiments in classical physics, with resulting classical probabilities (instead of many worlds or so). And you add the explicit hypothesis that there are no "hidden variables", in particular that there is no trajectory, even though we see one if we use the classical description between the two cuts - because that would not be "pure quantum". And then you wonder why you are unable to recreate those trajectories out of nothing after forbidding their existence?

The straightforward solution is, of course, that Nature has no cut; thus, once we see trajectories, it follows that there are trajectories even in the regions where we are unable to see them. This is not only possible but straightforward, with the simple mathematics of dBB theory which defines the (average in statistical interpretations) velocity from the phase of the wave function in configuration space, and which comes essentially without mathematical competitors.

Given that such a straightforward solution with trajectories exists, it would IMHO be reasonable to send all those who propose "pure quantum theory" home until they have done their homework of deriving, from their "pure quantum theory", the trajectories we see around us - trajectories they like to forbid on the fundamental level.
 
  • Skeptical
Likes PeroK
  • #98
A. Neumaier said:
I agree, but this alone does not solve the problem!

What remains unanswered by the statistical interpretation is why, in the measurement of a single particle by the screen, the screen is in a macroscopically well-defined state rather than in a superposition of states where the different pixels are activated with the probabilities determined by Born's rule for the particles. For the latter is the result of applying the Schrödinger equation to the combined system (particle + screen)! The statistical interpretation can never turn a superposition of widely spread possible outcomes (any pixel on the screen) into a state where the outcome is definite. Nothing ever is definite in the statistical interpretation; the definiteness is assumed in addition to the quantum formalism.

The thermal interpretation does not yet claim to have fully solved this problem but paves the way to its solution, since it says that certain q-expectations (rather than certain eigenvalues) are the observed things. Hence the macroscopic interpretation is immediate, since the highly coarse-grained macroscopic observables are such q-expectations.

The step missing is to prove from the microscopic dynamics of the joint system (particle + screen) that these macroscopic observables form a stochastic process with the correct probabilities. Here the thermal interpretation currently offers only suggestive hints, mainly through reference to work by others.
Indeed, "nothing is definite in the statistical interpretation", but that's no bug but a feature as the many highly accurate confirmations of the violation of Bell's inequalities show.

Also the famous double-slit experiment for single particles or photons confirms the predicted probability distributions for the detection of these particles or photons. That a single point on the screen is blackened for each particle registered is first of all an empirical fact. It is also well understood quantum mechanically as already shown as early as 1929 in Mott's famous paper about ##\alpha##-particle tracks in a cloud chamber.

I believe that your thermal interpretation is the answer as soon as you allow your q-expectation values to be interpreted in the standard probabilistic way. And of course you cannot describe the macroscopic observables by microscopic dynamics, because it is their very nature to be only a coarse-grained description of the relevant macroscopic degrees of freedom; that is also the reason for their classical behavior and the irreversibility of the measurement outcome.

If you see it as a problem to understand this irreversibility from a detailed microscopic dynamical description, then the same problem must also be considered unsolved within classical physics; but I don't know any physicist who does not accept the standard answer given by statistical physics (aka "the H theorem").
 
  • #99
A. Neumaier said:
In special cases, namely for the measurement of macroscopic properties, the q-expectations are directly related to an operational realization by a measurement device - they give the measured value of extensive quantities without any statistics. No expectations are involved in this case, a single measurement gives the value predicted by the theory.

It is only in the general case that one cannot give a relation to an operational realization by a measurement device except statistically. But this is not a drawback. Already in classical physics, one can relate certain classical observable functions of the state to experiment - namely those that do not depend very sensitively on the state. Those with sensitive dependence can only be related statistically.
But macroscopic properties are statistical averages over many microscopic degrees of freedom. It is not clear how to explain the measurement of such an observable without averages and the corresponding (quantum) statistics.

A single measurement, no matter whether you measure "macroscopic" or "microscopic" properties, never establishes a value, let alone tests any theoretical prediction, as one learns in the first session of the introductory beginner's lab!
 
  • #100
Sunil said:
You start with a "pure quantum" description of the world, which in fact does not exist.
This is not a fact but your assumption. No known fact contradicts the possibility of a "pure quantum" description of the world; in contrast, there is no sign at all that a classical description must be used in addition. Once the latter is assumed, one must show how to define the classical in terms of the more comprehensive quantum. This is the measurement problem. You simply talk it away by making this assumption.
Sunil said:
The minimal interpretation is, essentially, only a reduced Copenhagen interpretation, so it prefers not to talk about that cut, classical part, and all that, but
... it postulates a classical world in addition to the quantum world. How the two can coexist is unexplained.
Sunil said:
with the simple mathematics of dBB theory which defines the (average in statistical interpretations) velocity [...]
Given that such a straightforward solution with trajectories exists
It does not exist for quantum field theory, which is needed for explaining much of our world!
 
  • Like
Likes PeroK, gentzen and vanhees71
  • #101
gentzen said:
If we write the dispersion relation as ##\omega^2/c^2=k_x^2+k_y^2+k_z^2## and assume that ##k_x## and ##k_y## are real, then we see that ##k_z^2## becomes negative if ##\omega^2/c^2<k_x^2+k_y^2##. If ##k_z^2## is negative then ##k_z## is imaginary, which corresponds to an evanescent wave.

At a horizontal planar interface (perpendicular to the z-axis) between two homogeneous regions, ##k_x## and ##k_y## cannot change, because they describe the modulation of the electromagnetic field along the interface. So you can have an optical wave in a glass substrate with well-defined ##k_x## and ##k_y## given by the direction of the wave. If the direction of the wave is sufficiently grazing with respect to a horizontal planar interface to vacuum, then the wave will become evanescent in the vacuum below the interface.
(The wave will quickly (exponentially) vanish with increasing distance from the interface. Additionally, the time average of the z-component of the Poynting vector is zero, i.e. there is no energy transported in the z-direction on average by the evanescent wave in vacuum.)
In the vacuum ##\vec{k}## is a real vector. That's precisely what I don't understand!

If there is a planar interface between two homogeneous regions, there's no vacuum of course, and then there are evanescent waves (aka total reflection).
 
  • #102
vanhees71 said:
"nothing is definite in the statistical interpretation", but that's no bug but a feature
... that leaves unexplained why we see definite things in our world.
vanhees71 said:
That a single point on the screen is blackened for each particle registered is first of all an empirical fact. It is also well understood quantum mechanically as already shown as early as 1929 in Mott's famous paper about α-particle tracks in a cloud chamber.
He didn't show this or claim to have shown it. Mott leaves unexplained why there is a first definite ionization in the first place. He explains only that subsequent ionizations are approximately along a straight line. Thus he explains the tracks assuming the first definite ionization happened somehow.
vanhees71 said:
I believe that your thermal interpretation is the answer as soon as you allow your q-expectation values to be interpreted in the standard probabilistic way
Many q-expectations can be interpreted in the standard probabilistic way, namely all cases where an ensemble of many essentially equally prepared systems is measured. The latter is the assumption on which the statistical interpretation rests. Thus whenever the statistical interpretation applies it is fully compatible with the thermal interpretation.

But there are many instances (in particular most macroscopic measurements) where the statistical interpretation cannot apply since only a single measurement is taken. In these cases the statistical interpretation has no explanatory power at all, while the thermal interpretation still applies.
vanhees71 said:
of course you cannot describe the macroscopic observables by microscopic dynamics, because it is their very nature to be only a coarse-grained description of the relevant macroscopic degrees of freedom
Your ''of course you cannot'' is a fallacy. Nothing forbids a coarse-grained description from being fully determined by the underlying microscopic reality.

All our physical knowledge suggests the contrary. In many cases we have two description levels amenable to complete mathematical analysis, of which one is a coarse-grained version of the other. In all these cases, the coarse-grained description turned out to be a well-determined approximation of the finer one, with rigorously established conditions for the validity of the approximation.

I expect that in the context of the thermal interpretation, the analysis of the quantum measurement process along the lines of Breuer & Petruccione and Allahverdyan, Balian & Nieuwenhuizen will, as outlined in my book, sooner or later reach the same status.

vanhees71 said:
But macroscopic properties are statistical averages over many microscopic degrees of freedom.
Only for an ideal gas. For real matter they are integrals of complicated expressions without any statistics in them. You cannot get the measurable free energy of a substance by averaging microscopic free energies.
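In formulas (standard canonical-ensemble expressions):
$$F = -k_B T \,\ln \mathrm{Tr}\, e^{-H/k_B T},$$
equivalently ##F = \mathrm{Tr}(\rho H) + k_B T\,\mathrm{Tr}(\rho\ln\rho)## at the Gibbs state ##\rho##: a nonlinear functional of the state, not an expectation value ##\mathrm{Tr}(\rho A)## of any fixed microscopic observable ##A##.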
vanhees71 said:
A single measurement, no matter whether you measure "macroscopic" or "microscopic" properties, never establishes a value, let alone tests any theoretical prediction, as one learns in the first session of the introductory beginner's lab!
Ask an engineer or a medical doctor, and he will tell you the contrary. Only highly volatile quantities (such as pointer readings of a macroscopically oscillating pointer or measurements of a single spin) need multiple measurements.
 
  • #103
We see definite things in our world because we simply don't look in too much detail. Our senses take averages over very many microscopic "events" all the time. The same "mechanisms" apply to all kinds of other "measurement devices". You don't need to prepare many Gibbs ensembles of a gas in a container to observe thermal equilibrium. Gibbs ensembles often are just a useful "Gedankenexperiment" to derive statistical predictions. When putting a thermometer into the gas to measure its temperature, the "averaging" is done dynamically by many hits of the gas molecules against the thermometer, and after some time thermal equilibrium is established with overwhelming probability, so we can "read off a temperature" on a calibrated scale. Looking in more detail we can of course, also in thermal equilibrium, observe (thermal) fluctuations, which is why, at the beginning of the 20th century, the statistical-physics approach and the atomistic nature of matter finally became accepted by the physics community.
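The standard estimate behind this: in the canonical ensemble the energy fluctuations obey
$$\sigma_E^2 = k_B T^2 C_V, \qquad \frac{\sigma_E}{\langle E\rangle} \sim \frac{1}{\sqrt{N}},$$
so for ##N\sim 10^{23}## molecules the relative fluctuation of the reading is of order ##10^{-12}##, which is why a single reading looks perfectly definite.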

Of course, sometimes we indeed have to work with "Gibbs ensembles", as, e.g., in particle (and heavy-ion) physics. It's not too surprising that statistical physics works very well there too, sometimes even equilibrium statistical physics. Of course, this is partially just due to looking at sufficiently coarse-grained observables: e.g., the particle abundances in heavy-ion collisions, averaged over very many events, are described amazingly accurately by (grand) canonical ensembles (ranging from the most abundant species like pions to very rare ones like light nuclei and antinuclei).

Also I didn't claim that the macroscopic behavior is not described by the underlying microscopic dynamics. To the contrary it is derivable from it by quantum statistics. The only point is that it describes the macroscopic coarse-grained observables relevant for the description at the level of accuracy of the observed phenomena, and not the irrelevant microscopic details. The latter can become relevant at higher resolution of the observables, and then you have to change the level of description to cover these more detailed, then-relevant observables. That's all understandable within the paradigm of the statistical approach. It's only blurred by your thermal interpretation if you forbid interpreting the expectation values in the usual statistical/probabilistic way.
 
  • Like
Likes physicsworks and WernerQH
  • #104
I think "thermal interpretation" is a misnomer. How can you call it an interpretation if it carefully avoids the question what quantum theory is about. It is more like an empty shell, big enough to contain theories as diverse as quantum mechanics and thermodynamics.
 
  • Skeptical
Likes PeroK
  • #105
vanhees71 said:
Our senses take averages over very many microscopic "events" all the time.
This is a hypothesis without sufficient basis.

Our senses are physical objects. Thus they don't do mathematical operations of averaging. Instead they behave according to physical laws governed by quantum theory. So whatever they do must be deducible from quantum mechanics. But in quantum mechanics there is no notion of microscopic "events" happening in time - unless you define these in quantum mechanical terms, which you never did.

vanhees71 said:
Gibbs ensembles often are just a useful "Gedankenexperiment" to derive statistical predictions.
But Nature doesn't care about our gedanken but operates on the basis of dynamical laws. Without an actual ensemble there is no actual averaging, hence no actual statistics, hence Born's rule does not apply.

vanhees71 said:
the "averaging" is done dynamically by many hits of the gas molecules
This assumes that the gas molecules are classical objects that can hit the thermometer. But in standard quantum mechanics all you have is potentialities, nothing actually happening, except in a measurement.


vanhees71 said:
Also I didn't claim that the macroscopic behavior is not described by the underlying microscopic dynamics. To the contrary it is derivable from it by quantum statistics.
Then you need to derive from quantum statistics that, upon interacting with a particle to be detected, the detector is not - as the Schrödinger equation predicts - in a superposition of macroscopic states with pointer positions distributed according to Born's rule, but that it is in a macroscopic state where one of the pointer positions is actually realized, so that we can see it with our senses.
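Schematically, in the standard unitary (von Neumann) pre-measurement scheme, with schematic pointer states ##|P_k\rangle## and ready state ##|P_0\rangle##:
$$U\Big(\sum_k c_k\,|k\rangle \otimes |P_0\rangle\Big) = \sum_k c_k\,|k\rangle\otimes|P_k\rangle,$$
a superposition with Born weights ##|c_k|^2##, not a single definite pointer position.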

Since you claim that this is derivable, please show me a derivation! If valid, it would solve the measurement problem!
 
  • Like
Likes gentzen
  • #106
WernerQH said:
I think "thermal interpretation" is a misnomer. How can you call it an interpretation if it carefully avoids the question what quantum theory is about.
If you look at my book (or the earlier papers), you'll see that it does not avoid at all the question of what quantum theory is about, but makes many statements about it that differ from the standard interpretations.
WernerQH said:
It is more like an empty shell, big enough to contain theories as diverse as quantum mechanics and thermodynamics.
Thus it is not an empty shell but a full shell - able to interpret both.
 
  • Like
Likes PeroK and gentzen
  • #107
A. Neumaier said:
This is not a fact but your assumption. No known fact contradicts the possibility of a "pure quantum" description of the world; in contrast, there is no sign at all that a classical description must be used in addition.
Except the failure to deliver such a description. (I don't accept MWI as being a consistent interpretation. At least I have never seen a consistent version.) Nonexistent theories cannot be discussed, and therefore they cannot count.
A. Neumaier said:
... it postulates a classical world in addition to the quantum world. How the two can coexist is unexplained.
Yes. That's the problem. Which does not exist in realistic interpretations.
A. Neumaier said:
It does not exist for quantum field theory, which is needed for explaining much of our world!
It exists for QFT. All you need for this is to use the fields to define the configuration space.

Bohm, D., Hiley, B.J., Kaloyerou, P.N. (1987). An ontological basis for the quantum theory, Phys. Reports 144(6), 321–375.
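Schematically, for a real scalar field in the style of the cited paper: the beable is the field configuration ##\phi(\mathbf{x})##, the wave functional is ##\Psi[\phi] = R[\phi]\,e^{iS[\phi]/\hbar}##, and the guidance equation reads
$$\frac{\partial\phi(\mathbf{x},t)}{\partial t} = \frac{\delta S[\phi]}{\delta\phi(\mathbf{x})}\bigg|_{\phi=\phi(\cdot,t)}.$$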
 
  • #108
Sunil said:
Except the failure to deliver such a description. (I don't accept MWI as being a consistent interpretation. At least I have never seen a consistent version.) Nonexistent theories cannot be discussed, and therefore they cannot count.
MWI is not the only "pure quantum" description; the thermal interpretation too has no additional non-unitary dynamics. But since you are obviously unaware of it, it would probably not help if I tried to discuss it with you. So let me try something else instead.

The Bose-Einstein statistics was derived and published in 1924 before the invention of the Born rule (and other elements of the Copenhagen interpretation). So I guess that its derivation does not depend on any specific interpretation of quantum mechanics. And I guess the same is true for derivations of other thermal states. For a quantum experiment in a laboratory, instead of trying to prepare a special pure state, it can also make sense to just prepare a "nice" thermal state. Now state preparation and measurement are closely related. So for the measurement too, it can make sense to just measure some nice smooth thermodynamical property, instead of going for some discrete property exhibiting quantum randomness. And again, it is quite possible that the derivation of such smooth thermodynamical properties does not depend on any specific interpretation of quantum mechanics like MWI, Copenhagen, or de Broglie-Bohm.
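For reference, a sketch of the 1924-style counting argument (no Born rule anywhere): the number of ways to distribute ##n_i## indistinguishable quanta over ##g_i## cells is
$$W = \prod_i \binom{n_i+g_i-1}{n_i},$$
and maximizing ##\ln W## at fixed total particle number and energy gives the Bose-Einstein distribution ##\bar n_i = g_i/\big(e^{(\varepsilon_i-\mu)/k_B T}-1\big)##.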
 
  • #109
gentzen said:
MWI is not the only "pure quantum" description; the thermal interpretation too has no additional non-unitary dynamics. But since you are obviously unaware of it, it would probably not help if I tried to discuss it with you. So let me try something else instead.

The Bose-Einstein statistics was derived and published in 1924 before the invention of the Born rule (and other elements of the Copenhagen interpretation). So I guess that its derivation does not depend on any specific interpretation of quantum mechanics. And I guess the same is true for derivations of other thermal states. For a quantum experiment in a laboratory, instead of trying to prepare a special pure state, it can also make sense to just prepare a "nice" thermal state. Now state preparation and measurement are closely related. So for the measurement too, it can make sense to just measure some nice smooth thermodynamical property, instead of going for some discrete property exhibiting quantum randomness. And again, it is quite possible that the derivation of such smooth thermodynamical properties does not depend on any specific interpretation of quantum mechanics like MWI, Copenhagen, or de Broglie-Bohm.
There are many approaches that aim to present a "pure quantum" description, but IMHO they all fail to deliver. (I specifically mentioned MWI only because I have a quite special opinion about it, namely that it is not even a well-defined interpretation; the phrase "credo quia absurdum" seems to fit MWI believers.) Roughly, these are all interpretations which attempt to solve the measurement problem without really solving it. (In realistic interpretations it does not exist, given that they have a trajectory already on the fundamental level.)

I don't get the point of your consideration. Of course, one can use thermodynamics in state preparation as well as in measurements.
 
  • Haha
  • Skeptical
Likes PeroK and atyy
  • #110
A. Neumaier said:
No known fact contradicts the possibility of a a "pure quantum" description of the world; in contrast, there is no sign at all that a classical description must be used in addition. Once the latter is assumed, one must show how to define the classical in terms of the more comprehensive quantum. This is the measurement problem. You simply talk it away by making this assumption.
The "sign" I see is that degree to which the the quantum framework is well defined, is directly dependent on the degree which the reference (classical background) is not limiting in terms of information capacity and processing power. The laws of the quantum dynamics are inferred depdend on an inference machinery living in the classical domain (which in the ideal picture is unlimited).

Once we relax the firmness of this reference, and inference machinery, we likely need to acqknowledge that some of the deduce power of QM formalism are invalid, and we need to reconstruct our "measurement theory" based on a non-classical reference.

So if we by "pure quantum" means the quantum formalism including hamiltonians and laws - as inferred relative classical background - but what remains when one discards the "classical baggage", that really makes no sense to me. To me it's mathematical extrapolation that I think is unlikely to be the right way to remove the classical background. If it works, I agree it would be easier, so it makes sense to entertain the idea but conceptually I think it's flawed.

/Fredrik
 
  • #111
Sunil said:
There are many approaches that aim to present a "pure quantum" description
Do you mean neo-Copenhagen non-representationalist approaches like QBism or RQM? Or approaches like consistent histories that remain agnostic on realism (or representation) and have no problems with being incomplete (at least as far as Roland Omnès or Robert Griffiths are concerned)? Or the minimal statistical interpretation?
MWI and the thermal interpretation are different from those in being realist (i.e. representationalist) approaches that aim for completeness. I am not aware of other candidates in this group that maintain a "pure quantum" description without additional non-unitary dynamics.

Sunil said:
I don't get the point of your consideration. Of course, one can use thermodynamics in state preparation as well as in measurements.
The point is that you don't need the projection postulate (and probably not even the Born rule in any form whatsoever) to derive the statistics of the thermal state or prepare a system in it. To prepare a system in it, you probably just have to control the thermodynamical degrees of freedom and keep them constant long enough for the system to settle/converge sufficiently to the thermal state. So you can get away with the much weaker assumptions that the thermodynamical degrees of freedom can be controlled and measured. At least I guess that this assumption is weaker than assuming the existence of a classical description plus some version of the Born rule.
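In formulas, the target of such a preparation would be the Gibbs state
$$\rho = \frac{e^{-H/k_B T}}{\mathrm{Tr}\, e^{-H/k_B T}},$$
which is fixed entirely by the Hamiltonian and the controlled temperature, with no projection postulate in sight.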
 
  • Like
Likes A. Neumaier
  • #112
gentzen said:
Do you mean neo-Copenhagen non-representationalist approaches like QBism or RQM? Or approaches like consistent histories that remain agnostic on realism (or representation) and have no problems with being incomplete (at least as far as Roland Omnès or Robert Griffiths are concerned)? Or the minimal statistical interpretation?
Yes. One should exclude "interpretations" which are in fact different theories (GRW and so on) that explicitly introduce collapse dynamics, and all the realistic interpretations which explicitly introduce a configuration-space trajectory. Then there are Copenhagen and similar interpretations which do not get rid of the classical part. Everything else can be roughly classified as attempts to construct a pure quantum interpretation.

Interpretations that accept being incomplete raise an interesting question. On the one hand, this removes the most serious fault of Copenhagen. One can then ask if there is a measurement problem at all once one accepts that the theory is incomplete. That would mean that the Schrödinger evolution is only approximate. If so, what would be the problem with a collapse? It would simply be the place where the subquantum effects reach the quantum level.

gentzen said:
The point is that you don't need the projection postulate (and probably not even the Born rule in any form whatsoever) to derive the statistics of the thermal state or prepare a system in it. To prepare a system in it, you probably just have to control the thermodynamical degrees of freedom and keep them constant long enough for the system to settle/converge sufficiently to the thermal state.
Hm, does this method cover all preparation procedures? Plausibly yes, if one applies the same argument as in dBB theory for the non-configuration observables: whatever we can measure, the result can be identified from the classical measurement device by looking only at its configuration; looking only at the thermodynamical degrees of freedom of the measurement device would be sufficient too.

But what about preparation with devices which shoot single particles from time to time? Say, a piece of radioactive material? To prepare something in a stable state is one thing - the source itself can be described as being in a stable state. But states of fast moving particles being prepared with "keep them long enough" does not sound very plausible.

The other point is that thermodynamics is also an incomplete description of something else. (Especially in the Bayesian approach.) And this something else can become visible. Say, the early states of the universe may have been quite close to equilibrium, but with changing temperature this became unstable, and the place where we live today appeared out of some fluctuation. We could not have seen that place being different from others by looking at the thermodynamic variables of the early universe.
 
  • #113
Sunil said:
But what about preparation with devices which shoot single particles from time to time? Say, a piece of radioactive material? To prepare something in a stable state is one thing - the source itself can be described as being in a stable state. But states of fast moving particles being prepared with "keep them long enough" does not sound very plausible.
I basically agree that there are many experiments that are not covered by preparation in the thermal state or by measurement of thermal degrees of freedom. Below, I try to elaborate a bit on why I brought up the thermal state nevertheless.

Sunil said:
Hm, does this method cover all preparation procedures?
Probably not, but it covers more preparation procedures than the MWI mindset typically acknowledges. Even if an experiment is prepared in a thermal state, typical quantum interference effects still occur at interfaces or surfaces, because locally the state can look much purer than it does from a less local perspective.
It doesn't cover all preparation procedures from my instrumentalistic point of view. And I guess also not from the point of view of the thermal interpretation, but for different reasons.

If I have to simulate the interaction of an electron beam with a sample, then for me (as instrumentalist) what counts as prepared is the sample (in a thermal state) and the beam (in a rather pure state) incident on the sample. Independent of whether the electron source was in a thermal state, there was some filtering going on (with associated randomness - quantum or not) in the preparation of the beam. For me, this filtering is part of the preparation procedure, and it probably depends on interpretative assumptions. And in any case, the combined system of sample and beam does not seem to be in a thermal state.

Even more extreme, error correction schemes in (superconducting) quantum computing require measuring some qubits and then modifying the unitary dynamics based on results of that measurement. I can't believe that measurements of thermodynamical degrees of freedom are good enough for that. After all, the quantum randomness is crucial here for that correction scheme to work.

(That quantum computing example is part of the reason why I bring up the thermal state. I have the impression that when Stephen Wolfram and Jonathan Gorard conclude (1:25:08.0 SW: in the transcript) "..., it's still a little bit messy, but it seems that you can never win with quantum computing, ...", then they are slightly too much in the MWI mindset, which cares more about the supposedly pure state of the universe as a whole than about the mostly thermal states locally here on earth.)
 
  • #114
vanhees71 said:
In the vacuum ##\vec{k}## is a real vector. That's precisely what I don't understand!

If there is a planar interface between two homogeneous regions, there's no vacuum of course, and then there are evanescent waves (aka total reflection).
The tricky part is probably that when I try to measure the evanescent wave, I will bring some kind of detector (for example a photoresist) close to the planar interface. And then all that is left of the vacuum is a very thin layer, thin enough that the evanescent wave has not yet vanished at the detector surface. A (huge) part of the evanescent wave will be reflected by that surface, and the time average of the z-component of the Poynting vector of the superposition of the reflected part with the "initial" evanescent wave is no longer zero. So you no longer have perfect total reflection if you (successfully) measure the evanescent wave.
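A rough way to quantify this (treating the vacuum gap of width ##d## as a tunneling barrier): in frustrated total internal reflection the transmitted intensity scales like
$$T \propto e^{-2|k_z|\,d},$$
so an appreciable signal reaches the detector only when ##d## is a fraction of the wavelength.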

Maybe it helps you to imagine the very thin layer of vacuum between the planar interface and the detector surface as some sort of waveguide.
 
  • Like
Likes A. Neumaier
  • #115
It's not vacuum if there is a planar interface! There are no evanescent waves in the vacuum.
 
  • #116
vanhees71 said:
It's not vacuum if there is a planar interface! There are no evanescent waves in the vacuum.
Yes, far away from interfaces there are no evanescent waves in vacuum. Just to clarify: your remark about the planar interface only applies to evanescent waves. It is still valid to talk about optical waves in vacuum even though there is a planar interface, because they will still be there far away from the interface, where it is fine to say that there is vacuum.
 
  • Like
Likes vanhees71
  • #117
With the advent of QFT, which successfully unites the quantum and classical scales, these so-called interpretations are no longer needed. They are a relic of the 40s, 50s and 60s, when QM was the sole actor on the stage.

The world as described by the successor of QM (QFT) is composed entirely of fields, and the everyday stuff like chairs and tables consists of secondary manifestations (particular states of the fields). This is the comprehensive picture that unites the scales. It might be at odds with your preconceived notions, but Nature doesn't care. The world is what it is.
These so-called interpretations also make a lot of sacrifices and introduce various kinds of weirdness, so it's not totally unexpected that some prejudices need to go. The world of the senses is real but not fundamental. I believe this is what Niels Bohr had in mind when he said "If you are not shocked, you have not understood it yet".
 
  • Like
Likes vanhees71
  • #118
EPR said:
With the advent of QFT, which successfully unites the quantum and classical scales, these so-called interpretations are no longer needed. They are a relic of the 40s, 50s and 60s, when QM was the sole actor on the stage.
Would this mean that the so-called measurement problem is mere imagination? If not, does QFT resolve it?
 
  • #119
timmdeeg said:
Would this mean that the so-called measurement problem is mere imagination? If not, does QFT resolve it?
No. It's less obvious, but it is still there. In QFT 'particles' are created and annihilated at a particular location.
 
  • #120
gentzen said:
The Bose-Einstein statistics was derived and published in 1924 before the invention of the Born rule (and other elements of the Copenhagen interpretation). So I guess that its derivation does not depend on any specific interpretation of quantum mechanics.
It's not clear to me that Bose-Einstein statistics are particularly "quantum". Their derivation is the same as for Maxwell-Boltzmann statistics, except for the assumption of indistinguishability (in counting the number of states).
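Concretely, for ##n## particles distributed over ##g## single-particle states, the two countings are
$$W_{\text{MB}} = \frac{g^n}{n!} \qquad\text{vs.}\qquad W_{\text{BE}} = \binom{n+g-1}{n}$$
(with the usual ##1/n!## Gibbs correction in the Maxwell-Boltzmann case), and it is only this switch in how microstates are counted that turns the Maxwell-Boltzmann distribution into the Bose-Einstein one.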
 
