Are there signs that any Quantum Interpretation can be proved or disproved?

timmdeeg
The concept of decoherence seems to be a major advance in quantum mechanics. Does decoherence, or any other new finding, have the potential to prove a particular interpretation of quantum mechanics correct or incorrect in the foreseeable future?
 
  • Like
Likes LCSphysicist
Before proving a particular interpretation correct, it might be more interesting to prove a particular interpretation incorrect. (Interesting in the sense of observing how its proponents will react.) Just like elliptic and hyperbolic geometry proved some intuitive claims incorrect. Even an interpretation that has been proven incorrect will not just vanish, but will probably just slightly adapt itself such that it is either no longer incorrect, or at least needs significantly more effort to be proven incorrect.
 
  • Like
Likes timmdeeg and Demystifier
Agreed, I have edited #1 accordingly, thanks.

Does decoherence or any other progress improve our understanding of the measurement problem? This unresolved problem seems to be the starting point for contradictory interpretations.
 
  • Like
Likes gentzen
Decoherence has improved the understanding of the measurement problem, but only for those who recognize that it does not solve the measurement problem. For those who think that it has solved the measurement problem, it has added only confusion.

Decoherence solves a different problem: it allows one to distinguish the cases where the strange quantum predictions associated with superposition and entanglement, with all their interference effects, lead to probabilities different from classical predictions from the cases where classical probabilities are a good approximation.
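
In symbols, a toy sketch (pointer basis ##\{|i\rangle\}##, not tied to any specific model): decoherence takes the reduced density matrix from
$$\rho = \sum_{i,j} c_i c_j^*\,|i\rangle\langle j| \quad\longrightarrow\quad \rho' \approx \sum_i |c_i|^2\,|i\rangle\langle i| ,$$
so the interference terms with ##i \neq j## become negligible and classical probability calculus becomes a good approximation; a diagonal mixture, however, is still not a single outcome.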

But Schroedinger's cat is not about this. Knowing the classical observables registered by the Geiger counter, we can predict with certainty what happens to that poor cat, but this does not help us understand it at all.
 
  • Like
Likes timmdeeg
I was unaware that "decoherence" counts as an interpretation of QM, and I don't think it constitutes real progress. But one sign that does make me optimistic is the empirical fact that QM can be successfully applied without knowing the solution of the so-called "measurement problem". Another sign is the Schwinger / Keldysh formalism that seamlessly joins the disparate processes of unitary evolution and measurement.

In my view the various interpretations of QM are attempts to fit obsolete metaphysical baggage into quantum theory. In the case of electrodynamics it took physicists four decades to realize that it is a perfect theory without a mechanical model of the ether. After almost ten decades, it's about time to find a formulation of quantum theory that does not rely on the concept of "measurement".
 
WernerQH said:
I was unaware that "decoherence" counts as an interpretation of QM
It doesn't, but nobody has claimed that it does.
 
timmdeeg said:
It doesn't, but nobody has claimed that it does.
You were talking about "progress". Which interpretation, if any, has it made more plausible?
 
WernerQH said:
You were talking about "progress". Which interpretation, if any, has it made more plausible?
This is what I was asking in post #1.

Could you elaborate a bit about "the Schwinger / Keldysh formalism"?
 
Sunil said:
Decoherence has improved the understanding of the measurement problem, but only for those who recognize that it does not solve the measurement problem. For those who think that it has solved the measurement problem, it has added only confusion.
So from this point of view decoherence doesn't support a particular interpretation, correct?
 
  • #10
Neumaier's thermal interpretation of quantum physics proposes a novel solution to the measurement problem. I am not at all familiar with this proposal, but shouldn't it support a particular interpretation, depending on how it attempts to solve the measurement problem?
 
  • #11
timmdeeg said:
Could you elaborate a bit about "the Schwinger / Keldysh formalism"?
Schwinger's pioneering paper (J. Math. Phys. 2, 407) is sixty years old, and seems to have attracted most interest in the field of non-equilibrium processes (which is appropriate for real measurements involving Geiger counters, photomultipliers, bubble chambers, etc.!)
The central idea is to express observable quantities directly in terms of an integral over a closed time-path extending over a forward and a backward time branch. Unitary evolution and the Born rule are built in from the start; there's no need to talk about measurements. Interference effects are accounted for automatically, because on the backward branch a particle can take a different path than on the forward branch.
In QFT one frequently contents oneself with computing an S-matrix element ## S_{fi} ## and then taking the squared modulus. The correct way to include interference terms of competing processes is to compute the product of two S-matrices for forward and backward times (which can be distinct). Of course this is more work, and people do this only when they have to. :-)
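
To sketch the structure (schematically, with ##\hbar = 1## and a fixed initial state ##\rho##): an expectation value at time ##t## is
$$\langle \mathcal{O}(t)\rangle \;=\; \mathrm{Tr}\!\left[\rho\; U^\dagger(t,t_0)\,\mathcal{O}\,U(t,t_0)\right],$$
with one evolution operator ##U## for the forward time branch and its adjoint ##U^\dagger## for the backward branch; expanding both in the interaction picture gives the closed time-path integrals. Taking ##\rho = |i\rangle\langle i|## and ##\mathcal{O} = |f\rangle\langle f|## in the scattering limit reproduces ##|S_{fi}|^2 = S_{fi}\,S_{fi}^{*}##, i.e. a forward amplitude times a backward amplitude; for more general observables the forward and backward factors need not reduce to a single matrix element.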

John Cramer proposed a "transactional interpretation" of QM which introduced quite similar ideas. But I believe one cannot consider the "offer" and "confirmation" waves of that formalism as something physically real; they are just pieces of a mathematical apparatus ("propagators").
 
  • Like
Likes timmdeeg
  • #12
gentzen said:
Before proving a particular interpretation correct, it might be more interesting to prove a particular interpretation incorrect.
You can't prove any QM interpretation incorrect since they all make the same predictions for all experimental results.

timmdeeg said:
Does decoherence or any other progress improve our understanding of the measurement problem?
IMO no. It provides a lot of useful detail if you already have a solution to the measurement problem, but it doesn't help you to find any such solution.

timmdeeg said:
decoherence doesn't support a particular interpretation, correct?
Correct. It can't, since, as noted above, all QM interpretations make the same predictions for all experimental results, and that is because they all use the same (or at least equivalent) underlying math, and decoherence is part of the underlying math.
 
  • Like
Likes bhobba and timmdeeg
  • #13
timmdeeg said:
So from this point of view decoherence doesn't support a particular interpretation, correct?
Correct.
 
  • Like
Likes timmdeeg
  • #14
WernerQH said:
Schwinger's pioneering paper (J. Math. Phys. 2, 407) is sixty years old, and seems to have attracted most interest in the field of non-equilibrium processes (which is appropriate for real measurements involving Geiger counters, photomultipliers, bubble chambers, etc.!)
The central idea is to express observable quantities directly in terms of an integral over a closed time-path extending over a forward and a backward time branch. ...
Thanks for your explanations. I had suspected that the Schwinger-Keldysh formalism would shed some light on my question, but it seems it doesn't. Thanks again.
 
  • #15
PeterDonis said:
You can't prove any QM interpretation incorrect since they all make the same predictions for all experimental results.
Isn't the main issue our lack of understanding of the measurement process? The prediction that there are many worlds can't be proved experimentally but, if I see it correctly, would gain plausibility if an improved understanding of the measurement process suggested that the wave function doesn't collapse. Or, alternatively, a result showing that the wave function reduces to a single eigenstate, not in contradiction with QM, would strengthen the instrumentalist interpretation. Sorry, the latter is very vague and questionable.
 
  • #16
timmdeeg said:
Thanks for your explanations. I had suspected that the Schwinger-Keldysh formalism would shed some light on my question, but it seems it doesn't. Thanks again.
It may be helpful as a change of perspective. We may have been asking the wrong questions. As I said, QT can be applied successfully without solving the measurement problem (or identifying the "definitive" interpretation among the plethora of interpretations that have been proposed). I believe that none of the present interpretations will survive. On the other hand, the "forward" and "backward" times that Schwinger introduced may be more than a purely formal device. If we think of events (points in space-time) as really occurring in close pairs existing on two separate time "sheets", we can view QFT as what it is: a statistical theory describing patterns (correlations) of events in space-time.
 
  • #17
WernerQH said:
It may be helpful as a change of perspective. We may have been asking the wrong questions. As I said, QT can be applied successfully without solving the measurement problem.
"applied" yes, this is the instrumentalist perspective, but understood? There is the measurement problem, not understood till today. Of course you can say there is nothing to understand, Zeilinger argues in this sense. But this is an interpretation, so how can you be sure?
WernerQH said:
I believe that none of the present interpretations will survive.
Why do you think that? Do you argue from the "forward" and "backward" times that Schwinger introduced?
 
  • #18
timmdeeg said:
"applied" yes, this is the instrumentalist perspective, but understood? There is the measurement problem, not understood till today.
No, I am not an instrumentalist. I also strive for a deeper understanding, and I think there's a big "aha" moment in store, contrary to Bohr, who thought that human understanding will forever be tied to the "complementary" concepts of classical physics, and that QM must in some sense remain "transcendental".

timmdeeg said:
Why do you think that? Do you argue from the "forward" and "backward" times that Schwinger introduced?
Yes. I believe that QT is a microscopic theory that can (and must!) be formulated without recourse to classical concepts. Certainly not "measurements" using "classical" apparatus. The processes in the interior of the sun are now fairly well understood -- why insist on a description involving measurements?

Particles and fields are fundamentally classical ideas, though many physicists insist that quantum particles and quantum fields are quite different. Particles are problematic because they don't always have definite properties, or only when "measured". The two photons in the Aspect et al. experiments would have to engage in superluminal communication to produce the observed correlations. The description of the experiment is simple only when we focus on the production and detection events. On what happens between the source and the detectors the theory remains silent. That's why I said QFT is a theory of events.
 
  • Like
Likes timmdeeg
  • #19
timmdeeg said:
Isn't the main issue our lack of understanding of the measurement process?
No. The main issue is that in standard QM, there is no way to experimentally test whether the wave function collapses or not. See further comments below.

timmdeeg said:
The prediction that there are many worlds can't be proved experimentally but, if I see it correctly, would gain plausibility if an improved understanding of the measurement process suggested that the wave function doesn't collapse. Or, alternatively, a result showing that the wave function reduces to a single eigenstate, not in contradiction with QM, would strengthen the instrumentalist interpretation.
There is no way to test these alternatives within standard QM. It should be obvious that there can't be, since both MWI and collapse interpretations are interpretations of standard QM and share the same underlying math and make the same experimental predictions.

The only way to actually resolve the measurement problem will be to discover a different theory that makes different predictions from standard QM, and which is inconsistent with one of the two groups of interpretations of our current QM ("no collapse" or "collapse"). The problem is that every proposal so far for such a different theory has not worked; it has made predictions that were easily falsified.
 
  • Like
Likes timmdeeg
  • #20
PeterDonis said:
The only way to actually resolve the measurement problem will be to discover a different theory that makes different predictions from standard QM, and which is inconsistent with one of the two groups of interpretations of our current QM ("no collapse" or "collapse"). The problem is that every proposal so far for such a different theory has not worked; it has made predictions that were easily falsified.
Thanks for this clear statement. It seems to include Neumaier's proposal which I mentioned in post #10, right?
 
  • #21
timmdeeg said:
It seems to include Neumaier's proposal which I mentioned in post #10, right?
I'm not sure. Neumaier calls his proposal an "interpretation" (the thermal interpretation), but I suspect that if carried far enough it would make predictions different from standard QM for some situations.
 
  • Like
Likes timmdeeg
  • #22
WernerQH said:
Particles and fields are fundamentally classical ideas, though many physicists insist that quantum particles and quantum fields are quite different. Particles are problematic because they don't always have definite properties, or only when "measured". The two photons in the Aspect et al. experiments would have to engage in superluminal communication to produce the observed correlations. The description of the experiment is simple only when we focus on the production and detection events. On what happens between the source and the detectors the theory remains silent. That's why I said QFT is a theory of events.
As I understand it, Aspect et al. proved quantum non-locality, not superluminal communication. This, and the silence of the theory you mentioned, has to be accepted unless we have an advanced theory. I am not knowledgeable enough to understand your point regarding QFT, though.
 
  • #23
timmdeeg said:
As I understand it, Aspect et al. proved quantum non-locality, not superluminal communication.
Yes. I don't believe in superluminal communication either. That would be required only if you thought of photons as particles carrying polarization information with them.

timmdeeg said:
This, and the silence of the theory you mentioned, has to be accepted unless we have an advanced theory.
I don't think we will ever have an "advanced" theory (a theory of "measurements"). We have to make sense of QFT as it is.
 
  • #24
WernerQH said:
We have to make sense of QFT as it is.
Interesting point, are you aware of any attempts?
 
  • #25
timmdeeg said:
Interesting point, are you aware of any attempts?
Many people seem to think that if QM is beyond human understanding, then QFT must be even more so. But it's the other way round: it is the wrong approach to try to "understand" QM first, because QM is a chimera, half quantum, half classical. QFT is the real thing, which can also describe spontaneous emission from an excited atom, for example.

I consider it a waste of time to discuss the collapse of wave functions. To me it's obvious that a time-dependent wave function describes only an average of sorts; the time-dependent Schrödinger equation, with its continuous, even deterministic evolution, is completely at odds with the graininess and randomness of the real world. And "measurement" doesn't help, because a free neutron will decay even in the absence of a detector.

Condensed matter physicists have been dealing successfully with the graininess of matter. And they use QFT (or statistical field theory) as a phenomenological theory to describe fluctuations and correlations. They feel confident that they deal with really existing structures in, say, liquid helium or spin systems, not something that is brought about through their "measurements".
 
  • #26
I came across this paper

https://arxiv.org/pdf/1209.2665.pdf

where Figari & Teta discuss the Mott problem.

Page 6:
A further point to be emphasized concerns the relevance of the result obtained in the approach b). It is shown that, under the right conditions and with high probability, one can only observe straight tracks. This is a highly non trivial result obtained exploiting Schrödinger dynamics without having any recourse to the wave packet reduction rule. The reduction should only be invoked at the stage of the "observation of the actual track" described by the α-particle.

Does "without having any recourse to the wave packet reduction rule" mean that the wavefunction doesn't collapse? And if true shouldn't this paper add an enormous contribution to understanding the measurement problem?
 
  • #27
timmdeeg said:
Does "without having any recourse to the wave packet reduction rule" mean that the wavefunction doesn't collapse?
No. Wave function collapse is interpretation dependent. The paper you linked to is not talking about any particular interpretation; it is talking about Mott's derivation of straight-line cloud chamber tracks using basic QM, and showing that the projection postulate (rule 7 in PF's 7 Basic Rules) only has to be invoked at the very end, when the complete track is observed.

timmdeeg said:
And if true, shouldn't this paper make an enormous contribution to understanding the measurement problem?
You seem to equate "wave function collapse" with "the measurement problem". That's not correct. Even no collapse interpretations like the MWI still have the measurement problem.
 
  • Like
Likes timmdeeg
  • #28
PeterDonis said:
No. Wave function collapse is interpretation dependent. The paper you linked to is not talking about any particular interpretation; it is talking about Mott's derivation of straight-line cloud chamber tracks using basic QM, and showing that the projection postulate (rule 7 in PF's 7 Basic Rules) only has to be invoked at the very end, when the complete track is observed.
Ah I see, thanks.
PeterDonis said:
You seem to equate "wave function collapse" with "the measurement problem". That's not correct. Even no collapse interpretations like the MWI still have the measurement problem.
Wikipedia says simply "the measurement problem considers how, or whether, wave function collapse occurs." Do you agree with that, or instead, how do you define the "measurement problem"?
 
  • #29
PeterDonis said:
You can't prove any QM interpretation incorrect since they all make the same predictions for all experimental results.
The Penrose interpretation with gravitationally induced collapse, and other spontaneous collapse interpretations like GRW, do make different predictions for some experimental results, at least in theory. In practice, their parameters are adjusted appropriately such that it is very hard to disprove them by experiment. This is one of the things I had in mind when I wrote: "an interpretation that has been proven incorrect will ... slightly adapt itself such that it ... needs significantly more effort to be proven incorrect".

However, I don't want to trivialize my suggestion that proving a particular interpretation incorrect would be interesting, and that often an interpretation proven incorrect "will probably just slightly adapt itself such that it is ... no longer incorrect". (Well, I am not sure what PeterDonis is getting at, so I also want to offer a less ironclad answer which offers more opportunities for being attacked.)

Let me give a non-trivial example by answering a question based on an objection to Everett I found in Carl Friedrich von Weizsäcker's Der Garten des Menschlichen (1977):
Of course Hume is right that justifying induction by its success in the past is circular. Of course Copenhagen is right that describing measurements in terms of unitary quantum mechanics is circular. Of course Poincaré is right that defining the natural numbers as finite strings of digits is circular. …

But this circularity is somehow trivial, it doesn’t really count. It does make sense to use induction, describing measurement in terms of unitary quantum mechanics does clarify things, and the natural numbers are really well defined. But why? Has anybody ever written a clear refutation explaining how to overcome those trivial circularities? …
To clarify: Edward Teller tells a story about Bohr, according to which he (Teller) brought up Everett's idea and Bohr uttered some enigmatic words, which must be translated into von Weizsäcker's objection; and by chance von Weizsäcker happened to be there too. Who knows, but at least both Teller and von Weizsäcker are pupils of Heisenberg who consistently defend his version of the Copenhagen interpretation, and the objection is consistent with Heisenberg's philosophy.

The answer to the objection is that arguments using decoherence factor the Hilbert space into subsystems and the environment. This designation of an environment is a structure in addition to the Hilbert space and the Hamiltonian. Even more, this designation of an environment is typically based on the space-time structure of the world, and so implicitly brings the 3+1 dimensional structure (so important for our everyday experiences) back into Everett QM. As Jan-Markus Schwindt wrote in Nothing happens in the Universe of the Everett Interpretation:
If we take QM in the EI, the mathematical structure is the state vector and its time evolution. This structure has actually turned out to be empty if it stands on its own. It becomes a nontrivial structure only in relation to an external observer, or through the interaction with an environment which is not part of the state vector already.
...
A structure is a structure only with respect to some observer or environment outside the structure who reads off the structure as a structure in a specific way.
 
  • #30
timmdeeg said:
Does "without having any recourse to the wave packet reduction rule" mean that the wavefunction doesn't collapse?
It is important to distinguish the wave function of the alpha-particle from the wave function in the ##3(N+1)##-dimensional configuration space of the "whole" system including the gas of the cloud chamber. Tracing out the uninteresting coordinates of the gas molecules is certainly a non-unitary operation, and the result for the alpha-particle would again have spherical symmetry (and it would be misleading to still call it a "wave function").

What the formalism tells us is that the drops are vastly more likely to form straight lines than erratic zig-zag curves. Similarly the Feynman path integral tells us that the points at which the alpha-particle interacts are likely to be very close to a path that has the least action. Wave function collapse is just a figure of speech that has no basis in the formalism.
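
In symbols (a schematic sketch, suppressing all dynamical detail): with the full state ##|\Psi\rangle## living on ##\mathcal{H}_\alpha \otimes \mathcal{H}_{\text{gas}}##, the reduced description of the alpha-particle is
$$\rho_\alpha \;=\; \mathrm{Tr}_{\text{gas}}\big(|\Psi\rangle\langle\Psi|\big),$$
a partial trace, which is linear but not unitary; for an isotropic source ##\rho_\alpha## is again spherically symmetric, even though each individual recorded track points in one definite direction.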
 
  • Like
Likes timmdeeg
  • #31
timmdeeg said:
Wikipedia says
You've been here long enough to know not to use Wikipedia as a reference, particularly for a topic like this. :wink:

timmdeeg said:
how do you define the "measurement problem"?
The measurement problem is the problem of why we observe single definite outcomes when we do experiments on quantum systems. In the basic math of QM, this is simply put in by hand as the projection postulate: there is nothing anywhere else in the math that predicts it. The rest of the math says that when you do an experiment to make a measurement you entangle the system being measured with the measuring device and end up with a superposition of different possible outcomes. The only way to get a single definite outcome out of that in the basic math is to put in the projection postulate by hand--i.e., just declare by fiat that we collapse the wave function in the math whenever we have to to make predictions for future experimental results come out right.
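
Schematically (a standard textbook toy model of an ideal measurement; ##|A_0\rangle## denotes the apparatus "ready" state, notation introduced here just for illustration):
$$\Big(\sum_i c_i\,|i\rangle\Big)\otimes|A_0\rangle \;\xrightarrow{\ \text{unitary}\ }\; \sum_i c_i\,|i\rangle\otimes|A_i\rangle,$$
and the projection postulate then replaces this superposition by hand with a single term ##|k\rangle\otimes|A_k\rangle##, with probability ##|c_k|^2##.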

On a collapse interpretation, the projection postulate becomes an actual physical law and wave function collapses become actual physical events (instead of just mathematical devices to make correct predictions for future experimental results). But then they have to explain how these collapse events happen and how correlations between spacelike separated measurements on entangled particles can violate the Bell inequalities without actual faster than light signaling.

On a no collapse interpretation like the MWI, the projection postulate remains just a mathematical device, and the rationale for applying it is that, once decoherence happens after a measurement, the different branches of the wave function can never interact with each other again, so in each individual branch we can apply the projection postulate to get an "effective" wave function for that branch that works for predicting future measurement results in that branch, even though we "know" (if we accept the MWI as true) that there are other branches in the overall wave function. But this requires one to accept all of the other things that come along with the MWI and all of the other issues that have been raised with it.

On an interpretation like the Bohmian interpretation, while collapse of the wave function is not an actual physical process (the full wave function is always there), measurements have single outcomes because those outcomes are determined by the underlying, unobservable particle positions, and each particle always has a single definite position. But the equation of motion for these definite particle positions is highly nonlocal, which many people find very difficult to accept. Also, on this interpretation, when you dig into the details of how measurements of anything other than position actually work, you realize that what you are actually measuring when you think you are measuring, say, the spin of an electron, looks nothing like what you expect a spin measurement to look like.
 
  • Like
Likes Lord Jestocost and timmdeeg
  • #32
PeterDonis said:
On an interpretation like the Bohmian interpretation, while collapse of the wave function is not an actual physical process (the full wave function is always there), measurements have single outcomes because those outcomes are determined by the underlying, unobservable particle positions, and each particle always has a single definite position.
I think the "unobservable" is misleading, given that all what we see are those positions - of macroscopic devices, but those macroscopic devices consist of particles as well.
PeterDonis said:
But the equation of motion for these definite particle positions is highly nonlocal, which many people find very difficult to accept. Also, on this interpretation, when you dig into the details of how measurements of anything other than position actually work, you realize that what you are actually measuring when you think you are measuring, say, the spin of an electron, looks nothing like what you expect a spin measurement to look like.
Given the violation of Bell's inequalities, "nonlocality" (which is only non-Einstein-causality) is simply the explanation most compatible with common sense. Everything else, giving up realism (the extremely weak EPR criterion) or causality (in particular, necessarily, Reichenbach's common cause principle; astrologers will be happy), would be no alternative in normal life. So why should it be accepted as making sense in science?

That "measurement" is a misleading word has been criticized already by Bell in the paper "against measurement". The "measurement result" is the result of an interaction, not a property of one of the interacting systems.
 
  • Like
Likes WernerQH
  • #33
WernerQH said:
It is important to distinguish the wave function of the alpha-particle from the wave function in the ##3(N+1)##-dimensional configuration space of the "whole" system including the gas of the cloud chamber. Tracing out the uninteresting coordinates of the gas molecules is certainly a non-unitary operation, and the result for the alpha-particle would again have spherical symmetry (and it would be misleading to still call it a "wave function").

What the formalism tells us is that the drops are vastly more likely to form straight lines than erratic zig-zag curves. Similarly the Feynman path integral tells us that the points at which the alpha-particle interacts are likely to be very close to a path that has the least action. Wave function collapse is just a figure of speech that has no basis in the formalism.
Thanks, great explanation.
Could one, to simplify a little, say that the straight lines are based on the equations of motion? But then this result is to be expected. I was wondering why they call it "a highly non trivial result without having any recourse to the wave packet reduction rule".
 
  • #34
PeterDonis said:
You've been here long enough to know not to use Wikipedia as a reference, particularly for a topic like this. :wink:
Yes, therefore I wrote "Wikipedia says simply". :wink:

PeterDonis said:
The measurement problem is the problem of why we observe single definite outcomes when we do experiments on quantum systems. In the basic math of QM, this is simply put in by hand as the projection postulate: there is nothing anywhere else in the math that predicts it. The rest of the math says that when you do an experiment to make a measurement you entangle the system being measured with the measuring device and end up with a superposition of different possible outcomes. The only way to get a single definite outcome out of that in the basic math is to put in the projection postulate by hand--i.e., just declare by fiat that we collapse the wave function in the math whenever we have to to make predictions for future experimental results come out right.

On a collapse interpretation, the projection postulate becomes an actual physical law and wave function collapses become actual physical events (instead of just mathematical devices to make correct predictions for future experimental results). But then they have to explain how these collapse events happen and how correlations between spacelike separated measurements on entangled particles can violate the Bell inequalities without actual faster than light signaling.

On a no collapse interpretation like the MWI, the projection postulate remains just a mathematical device, and the rationale for applying it is that, once decoherence happens after a measurement, the different branches of the wave function can never interact with each other again, so in each individual branch we can apply the projection postulate to get an "effective" wave function for that branch that works for predicting future measurement results in that branch, even though we "know" (if we accept the MWI as true) that there are other branches in the overall wave function. But this requires one to accept all of the other things that come along with the MWI and all of the other issues that have been raised with it.
So in the MWI the projection postulate isn't in conflict with unitarity (though "put in by hand") because the wavefunction doesn't collapse. It confirms nothing more than what we observe: single definite outcomes.

Having searched the web, I couldn't find any comparably comprehensive explanation of the meaning of the "measurement problem". Perhaps you should consider adding it to the FAQ list.

Thanks!
 
  • #35
timmdeeg said:
Could one, to simplify a little, say that the straight lines are based on the equations of motion? But then this result is to be expected.
What is non-trivial depends very much on how you look at it. :-)
The equations of motion can be derived from the action principle, so it's basically the same argument.

I think the wave function (whether in 3 or more dimensions) is a red herring. The Heisenberg picture offers a more sensible view of the quantum formalism: the states are constant. There's no mention of collapse. A ##|\text{ket}\rangle## by itself is meaningless; it always has to be combined with a ##\langle\text{bra}|## and a trace taken. One always considers ensembles, and nobody has ever suggested that an operator, rather than a wave function, represents an individual system. In the Heisenberg picture you can consider the evolution over a certain period of time and compute expectation values of operator products at different times. Averages, and the likelihoods of particular histories, are the natural output of the formalism. Nobody has to clarify which "quantum state" the system is "really" in at any particular moment. Quantum theory is silent on that.
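
As a schematic example of what I mean: with the state ##\rho## fixed once and for all, the formalism delivers multi-time correlations like
$$\langle A(t_1)\,B(t_2)\rangle \;=\; \mathrm{Tr}\!\left[\rho\,A(t_1)\,B(t_2)\right], \qquad A(t) = U^\dagger(t)\,A\,U(t),$$
with all the time dependence carried by the operators; nothing here requires saying which state the system "is in" at ##t_1## or ##t_2##.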
 
  • Like
Likes timmdeeg
  • #36
Sunil said:
I think the "unobservable" is misleading, given that all what we see are those positions - of macroscopic devices
Exactly: we see positions of macroscopic devices, not positions of individual particles. (Btw, the "particles" in question aren't necessarily any of the particles that appear in our fundamental theories--they're not necessarily quarks or leptons. They could be some other kind of particles at a deeper level.) The individual particle positions, which are the basic ontology of the Bohmian interpretation, are unobservable.
 
  • #37
The only theory that unites the quantum world with the "classical" scale is QFT. From it we know that the world is not made of classical objects but of the 18 quantum fields. The fields are the fundamental nature of reality, at least as far as science goes.
Gravity isn't expected to change the field nature of reality either.
At least we now know how the world isn't.

To tackle the enigma of classical perception, physicists have conjured up fantastic ontologies: higher-dimensional space-time, or the multiverse, in which our universe is just one instance out of an infinitude. Other physicists have resorted to mysticism. There are more ontological questions than answers, but it's better not to know than to be fooled.
 
  • Like
  • Skeptical
Likes AlexCaledin and Motore
  • #38
This article seems to question the universal validity of the projection postulate.

If correct would this affect the interpretations of QM?

https://link.springer.com/article/10.1007/s10701-021-00452-x

"Specifically, quantum computing algorithms make heavy use of the projection postulate [2], the axiom that every measurement is strictly equivalent to random application of one of a set of mathematical projection operators, with probability governed by the Born rule.
...
So, is the projection postulate or any related measurement axiom fundamentally and literally true if you look closely enough? In this paper, I will attempt to analyze the internal dynamics of a specific real single-quantum detector, the cloud chamber.
...
I have formulated a mechanism for how the Hamiltonian structure of quantum decay, the physics of droplets in supersaturated vapors, and the mathematics of quantum Coulomb scattering from degenerate states can together account for the observed phenomenology of track origination in cloud chambers, without having to invoke measurement axioms."
 
  • Like
Likes gentzen
  • #39
timmdeeg said:
This article seems to question the universal validity of the projection postulate.

If correct would this affect the interpretations of QM?
Hard to say. Some physicists reject the projection postulate (as generally valid), but don't expect this to be a big deal:
vanhees71 said:
Where I strongly differ with the orthodox/minimal view (aka the 7 rules agreed on by the majority in this forum) is only in refusing the collapse/projection postulate as a fundamental generally valid postulate.
Arnold Neumaier's thermal interpretation also comes to the conclusion that Born's rule (and the projection postulate) are not universally valid, but seems to attach more importance to this:
A. Neumaier said:
Then I prove that under certain other circumstances and especially for ideal binary measurements (rather than assume that always, or at least under unstated conditions), Born's interpretation of the formal Born rule as a statistical ensemble mean is valid. Thus I recover the probabilistic interpretation in the cases where it is essential, and only there, without having assumed it anywhere.
Maybe the importance attached is related more to the Born rule itself than to the projection postulate.

Edit: I should probably clarify that my answer just tries to point out that the practical implications of the non-universality of the projection postulate will be very limited, because it is already well known that you should not interpret it too literally. So my guess is that the implications for quantum computing will be minimal. For interpretations, on the other hand, stressing its non-universality might be important.
 
  • Like
Likes timmdeeg
  • #40
For me the question whether or not the Born rule can be derived from other postulates is secondary. You can come to the same mathematical formalism via different heuristic routes. I'm not too convinced by the alternative @A. Neumaier calls "thermal interpretation".

What is clear for sure is that the projection postulate is not needed, and in almost all real-world experiments it is either not followed or not feasible. It's only a special case of a particular sort of preparation procedure, where it is possible to prepare a system in a pure quantum state by measurement of a certain observable (or a set of compatible observables) with a resolution such that all but one outcome are filtered away. This is possible, e.g., in the Stern-Gerlach experiment, where a beam of silver atoms is split in two, entangling the spin component chosen by the direction of the magnetic field almost ideally with the position, and thus making it possible to prepare a pure spin-component eigenstate. Another example is the polarization states of single photons, using just a good polarization filter. Whether you can prepare a certain given pure quantum state depends on the feasibility of such a high-resolution measurement and filtering, and most measurements are far from this. I don't think that such an exceptional case should be taken as one of the fundamental postulates of a general physical theory.
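
Schematically, for the Stern-Gerlach case (an idealized sketch, ignoring the detailed beam dynamics; ##|\psi_0\rangle## is the incoming spatial wave packet): the magnet entangles spin and position,
$$\big(c_{\uparrow}\,|\uparrow\rangle + c_{\downarrow}\,|\downarrow\rangle\big)\otimes|\psi_0\rangle \;\longrightarrow\; c_{\uparrow}\,|\uparrow\rangle\otimes|\psi_{\uparrow}\rangle + c_{\downarrow}\,|\downarrow\rangle\otimes|\psi_{\downarrow}\rangle,$$
with ##\psi_{\uparrow}## and ##\psi_{\downarrow}## (almost) non-overlapping partial beams; blocking one of them leaves, up to normalization, a pure spin eigenstate, which is exactly the filtering/preparation described above.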
 
  • Like
Likes timmdeeg and gentzen
  • #41
gentzen said:
Hard to say. Some physicists reject the projection postulate (as generally valid), but don't expect this to be a big deal:
The projection postulate is not an interpretation of QM, but how much importance this postulate has for a physicist perhaps depends on the interpretation he favors.
 
  • #42
vanhees71 said:
What is clear for sure is that the projection postulate is not needed, and in almost all real-world experiments it is either not followed or not feasible.
Would you say that this conclusion is interpretation independent?
 
  • #43
timmdeeg said:
Would you say that this conclusion is interpretation independent?


It's standard quantum mechanics. Some people don't like it because it leads to the measurement paradox and the issue with the cat.

I think it's the most revealing aspect of all QT.
 
  • Like
Likes timmdeeg
  • #44
timmdeeg said:
I have formulated a mechanism for how the Hamiltonian structure of quantum decay, the physics of droplets in supersaturated vapors, and the mathematics of quantum Coulomb scattering from degenerate states can together account for the observed phenomenology of track origination in cloud chambers, without having to invoke measurement axioms."
The mechanism described appears to me to be similar to Neumaier's thermal interpretation, which was mentioned in an earlier post. Basically, the mechanism is random variation in the huge number of degrees of freedom of the detector--in this case the molecules in the chamber--which leads to one particular direction for the cloud chamber track being selected out of all the possible ones.
 
  • Like
Likes timmdeeg
  • #45
timmdeeg said:
Would you say that this conclusion is interpretation independent?
Yes, you only have to look at how, in practice, QT is applied to describe and predict the outcomes of real-world measurements.
 
  • Like
Likes gentzen and timmdeeg
  • #46
PeterDonis said:
The mechanism described appears to me to be similar to Neumaier's thermal interpretation, which was mentioned in an earlier post. Basically, the mechanism is random variation in the huge number of degrees of freedom of the detector--in this case the molecules in the chamber--which leads to one particular direction for the cloud chamber track being selected out of all the possible ones.
This was discussed already in 1929 by Mott in a famous article about why one sees "straight trajectories" of ##\alpha## particles emitted from a radioactive source within a cloud chamber. The direction of emission of any individual ##\alpha## particle is random (in good approximation isotropic), and the magnitude of the momentum is determined by the energy of the emitted ##\alpha## particle, which is determined to an accuracy that can be estimated by the lifetime-energy uncertainty relation. Mott shows that once the direction of the emitted ##\alpha## particle has been fixed by the ionization of the first few droplets of the vapor in the cloud chamber, the probability for ionizing the next droplet is sharply peaked around the straight line. It's of course clear that the "trajectory" is never determined so accurately that it would violate the Heisenberg uncertainty relation. This follows, without any fancy interpretations, from the application of the minimal interpretation as given in the Insights article (you don't even need the projection postulate, which however in this case holds to a pretty good approximation).
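
A rough paraphrase of Mott's result, not the detailed calculation: for two gas atoms at positions ##\mathbf{x}_1## and ##\mathbf{x}_2##,
$$P(\mathbf{x}_1 \text{ and } \mathbf{x}_2 \text{ both ionized}) \;\approx\; 0 \quad \text{unless } \mathbf{x}_1,\ \mathbf{x}_2 \text{ and the source are (nearly) collinear},$$
which is why the droplets one actually sees form straight tracks, even though the initial outgoing wave is (nearly) an isotropic spherical wave.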
 
  • Like
Likes AlexCaledin
  • #47
vanhees71 said:
What is clear for sure is that the projection postulate is not needed, and in almost all real-world experiments it is either not followed or not feasible.
Yes. What is universally true (and therefore should replace the projection postulate and Born's rule associated with it) is the more general POVM view. Simple, elementary foundations for it are given in my paper 'Born's rule and measurement' (arXiv:1912.09906).
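
In its barest form (a sketch; the paper gives the careful statement): a measurement with outcomes ##k## is described by positive operators
$$E_k \ge 0, \qquad \sum_k E_k = \mathbf{1}, \qquad p(k) = \mathrm{Tr}\,(\rho\,E_k),$$
and the textbook (projective) Born rule is recovered in the special case where the ##E_k## are mutually orthogonal projectors.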
vanhees71 said:
I'm not too convinced by the alternative @A. Neumaier calls "thermal interpretation".
But as shown in the above paper, the thermal interpretation matches perfectly with the POVM view.
 
  • Like
Likes gentzen and vanhees71
  • #48
Well, as you well know, I have no objections to your formalism, but rather to the interpretational issues. What I don't like is that it is not clear what meaning the quantities you call "expectation values" have. Since you want to derive the Born rule, I cannot read them in their usual meaning, namely as probabilistic expectation values. On the other hand, the idea that these are actually the "observables" is also not convincing. That may be true in a "thermal" sense, i.e., when you consider macroscopic observables, where the fluctuations are "negligibly small" because you "coarse grain" over large enough space-time volumes; in this sense your interpretation is indeed really "thermal". But it doesn't apply to microscopic objects, for which we also want to use and interpret quantum theory.

That's why I still think from a physicist's point of view the "orthodox minimal interpretation" (no collapse, no quantum-classical cuts on a fundamental level but probabilities and only probabilities a la Born with Born's rule itself a fundamental postulate) is the most "economic approach" to state the scientific part of quantum theory (the only part which in my opinion belongs to physics and not metaphysics).

I'll have a look at your paper as soon as I find the time :-(.
 
  • #49
vanhees71 said:
That's why I still think from a physicist's point of view the "orthodox minimal interpretation" (no collapse, no quantum-classical cuts on a fundamental level but probabilities and only probabilities a la Born with Born's rule itself a fundamental postulate) is the most "economic approach" to state the scientific part of quantum theory (the only part which in my opinion belongs to physics and not metaphysics).
In his book "Einstein's Schleier" Zeilinger says, in effect, that it is sufficient to understand the wave function just as a mental construct, so that its collapse doesn't happen in real space. I was never sure whether that is his personal view. It seems to fit, though, with what you call the "orthodox minimal interpretation", right?
 
  • Like
Likes AlexCaledin and vanhees71
  • #50
I think so. I've also read this book, and it's always good to have the view of an experimentalist. I also talked briefly with Zeilinger after a colloquium some years ago, and there too he told me he's pretty much a "Bohrian Copenhagenian". AFAIK, also in his scientific papers he's pretty much an "orthodox minimal interpreter".
 
  • Like
Likes AlexCaledin and timmdeeg