Nature Physics on quantum foundations

Answers and Replies

  • #2
The first paragraph already tells me why the philosophical part of what they call "quantum foundations" is pretty questionable.

Just 2 simple points:

Quantum theory is NOT weird but the most comprehensive theory about Nature we have today.

Wave-particle duality is not a phenomenon but a theoretical concept that has been outdated for about 100 years.
 
  • Like
  • Love
Likes Paul Colby, bhobba, physicsworks and 1 other person
  • #3
I share the view that quantum foundations are important, but I admit I also share vanhees71's impression that some of the things they raise are old and somewhat outdated. That's not to say QM foundations are unimportant; I just don't see any of the modern arguments in that particular paper, which seems unfair.

How about just holding back the judgement and the dismissal of foundational research until we have included gravity properly in our theory? It remains to be seen whether the QFT framework (and indirectly QM foundations) comes out on top or not.

/Fredrik
 
  • Like
Likes hutchphd and vanhees71
  • #4
The editors of high impact journal Nature Physics explain why the field of quantum foundations is important for physics.
https://www.nature.com/articles/s41567-022-01766-x
I agree that there are loose ends. But I fail to see much progress:
Although a fresh view can invigorate any field, much of this work also manifests a disregard for the progress that has been made since quantum mechanics was established. The quantum foundations literature is the product of decades of careful thought about the issues involved in understanding and interpreting the physical world. As with any topic, a failure to constructively engage with existing work runs the risk of repeating earlier mistakes and misunderstandings.
Nothing fundamental has changed in the way we do quantum mechanical calculations. Now we just have more no-go theorems, peculiar inequalities, and a proliferation of interpretations, rather than having found the one and only natural interpretation of quantum theory. Most of the work on quantum foundations seems to have explored blind alleys, much like the search for a mechanical model of the ether, which was once an obvious and important field of research. Maxwell himself might have realized that there is no need for an ether if he had had more time. For his contemporaries, electrodynamics (or rather the ether) had some "weird" features that took four decades to get rid of. In the case of quantum theory it seems to be taking significantly longer to remove the superfluous metaphysical baggage.

I think it is not quantum theory that is weird, but the way it is phrased / taught:
But the particle can at the same time be entangled with another particle located elsewhere such that the outcome of measuring one particle determines the state of the other.
The particle concept is problematic, and even talk about quantum particles doesn't remove its misleading connotations. My view is that quantum theory makes much more sense if it is formulated without reference to "particles" and "measurements".
 
  • Like
Likes dextercioby, martinbn and vanhees71
  • #5
My view is that quantum theory makes much more sense if it is formulated without reference to "particles" and "measurements".
How do you formulate the Born rule in different bases (say position basis and momentum basis) without measurements?
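To make the question concrete, here is a small numerical sketch (my own illustration, not taken from any post above) of the Born rule applied in two bases to the same state: the position-space density is ## |\psi(x)|^2 ## and the momentum-space density is the squared modulus of the Fourier-transformed wave function.

```python
import numpy as np

# Illustrative sketch: Born rule in the position and momentum bases for a
# Gaussian wave packet. All parameters are arbitrary choices; hbar = 1.
N, L = 1024, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

p0, sigma = 2.0, 1.0  # mean momentum and width of the packet
psi_x = (np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (2 * sigma**2) + 1j * p0 * x)

# Born rule in the position basis: probability density |psi(x)|^2
prob_x = np.abs(psi_x) ** 2

# Change of basis via the (discrete) Fourier transform, then the Born rule
# again: probability density |psi(p)|^2 in the momentum basis
p = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))
dp = p[1] - p[0]
psi_p = np.fft.fftshift(np.fft.fft(psi_x)) * dx / np.sqrt(2 * np.pi)
prob_p = np.abs(psi_p) ** 2

print(np.sum(prob_x) * dx)      # both densities are normalized to 1
print(np.sum(prob_p) * dp)
print(np.sum(p * prob_p) * dp)  # mean momentum, close to p0
```

Both densities integrate to 1, so the same state gives consistent probabilities in either basis; what the formalism alone does not say is why registering an outcome singles out one basis, which is the point of the question.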
 
  • Like
Likes physika, vanhees71 and gentzen
  • #6
How do you formulate the Born rule in different bases (say position basis and momentum basis) without measurements?
Quantum theory makes predictions about events. It is customary to compute a probability amplitude and take the squared modulus as an extra step. But this dichotomy of unitary evolution and "measurements" is artificial. Schwinger fused them in a coherent (closed time path) formalism a long time ago. In his generalized quantum action principle the Born rule is built-in:
[...] if a system is suitably perturbed in a manner that depends upon the time sense, a knowledge of the transformation function referring to a closed time path determines the expectation value of any desired physical quantity [...]
[J.Math.Phys. 2, 407-432, (1960)]
 
  • #7
But this dichotomy of unitary evolution and "measurements" is artificial.
Agreed, but when Schwinger talks about a "perturbed" system in your quote above, doesn't he introduce a similar artificial dichotomy between perturbed and unperturbed?
 
  • Like
Likes physika and vanhees71
  • #8
This is just the introduction of the Schwinger-Keldysh time contour. It's a very important tool in non-equilibrium many-body QFT, but it's just standard QT. It doesn't add anything to the interpretation beyond what is contained in the minimal interpretation; it is a mathematical technique for solving the equations of motion of Q(F)T.
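The point can be stated in one line (a standard observation, not specific to any post here): every expectation value already contains both branches of the contour, since

$$ \langle A(t) \rangle = \mathrm{Tr}\!\left[\rho\, U^\dagger(t)\, A\, U(t)\right], $$

which one reads along a closed time path: evolve forward with ## U(t) ##, insert ## A ##, and return backward with ## U^\dagger(t) ##. The Schwinger-Keldysh contour systematizes this bookkeeping for correlation functions.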
 
  • #9
Agreed, but when Schwinger talks about "perturbed" system in your quote above, doesn't he introduce a similar artificial dichotomy between perturbed and unperturbed?
It is widely believed that QFT hinges on perturbation theory. But a more general viewpoint is that of the path integral as providing a generating functional from which all quantities of interest (especially correlations) can, in principle, be derived by simple differentiation.
 
  • #10
It is widely believed that QFT hinges on perturbation theory. But a more general viewpoint is that of the path integral as providing a generating functional from which all quantities of interest (especially correlations) can, in principle, be derived by simple differentiation.
In general, the correlation functions of observables at different times computed this way do not correspond to correlations observed by measuring observables at different times. This is because such a computation does not take into account a change of the state induced by measurement. This change of the state, known under the names information update, projection or collapse, is not a unitary transformation. How does Schwinger, or anyone else who uses a path integral formalism and claims that there is nothing special about measurements, account for this?
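The distinction can be made explicit in a toy model (my own construction, with an arbitrary choice of qubit Hamiltonian and observable): the "naive" two-time correlator computed from unitary evolution alone differs from the correlation of two actual sequential measurements with a projective (Lüders) state update in between.

```python
import numpy as np

# Toy illustration (my own choice of model): qubit with H = X, observable Z,
# initial state |+>. Compare the naive correlator <Z(t) Z(0)> with the
# correlation of two sequential projective measurements.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())

t = 0.7
U = np.cos(t) * np.eye(2) - 1j * np.sin(t) * X  # exp(-i X t), since X^2 = 1

# Naive two-time correlator Tr(Z(t) Z rho), with no state update anywhere
naive = np.trace(U.conj().T @ Z @ U @ Z @ rho)

# Sequential measurements: project after the first outcome (Lüders update),
# evolve the (unnormalized) post-measurement state, then measure again
proj = {+1: np.diag([1.0, 0.0]).astype(complex),
        -1: np.diag([0.0, 1.0]).astype(complex)}
measured = 0.0
for a, Pa in proj.items():
    rho_a = Pa @ rho @ Pa
    for b, Pb in proj.items():
        measured += a * b * np.trace(Pb @ U @ rho_a @ U.conj().T).real

print(naive)     # complex in general (here exp(2it))
print(measured)  # real: cos(2t), what sequential measurements actually yield
```

For this state the measured correlation happens to be the real part of the naive one, but in general the two calculations answer different questions, which is the gap being pointed out above.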
 
  • #11
I think it is not quantum theory that is weird, but the way it is phrased / taught:
Yes, though I think one could also say that it's nature that is weird. When a corroborated theory is seen from the outdated stance that nature is NOT weird, it really does get weird. We can resolve this mismatch if we adopt a weird take on nature :)

/Fredrik
 
  • #12
[...] such a computation does not take into account a change of the state induced by measurement. This change of the state, known under the names information update, projection or collapse, is not a unitary transformation.
I believe you have this backwards. What you are talking about is a theoretician's idealized measurement that has little to do with real experiments. Applied to a harmonic oscillator, a position measurement would imply adding an infinite amount of kinetic energy if the imagined collapsed state really were an eigenstate of the position operator. Contrariwise, there is little reason to doubt that the correlation function ## \langle x(t)x(0) \rangle ## computed for an unperturbed oscillator is a useful first approximation to measurement results. And of course this first approximation can be improved by including perturbing effects (as done by Schwinger).
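For reference, the unperturbed correlator mentioned here can be checked directly in a truncated number basis (a quick sketch of my own; in units ## \hbar = m = \omega = 1 ## the ground-state result is ## \langle x(t)x(0)\rangle = e^{-i t}/2 ##).

```python
import numpy as np

# Sketch (my construction): ground-state correlator <x(t) x(0)> of the
# harmonic oscillator in a truncated number basis. Units hbar = m = omega = 1,
# where the analytic result is exp(-1j * t) / 2.
N = 40                                      # truncation dimension
a = np.diag(np.sqrt(np.arange(1, N)), 1)    # annihilation operator
x = (a + a.conj().T) / np.sqrt(2)
H = a.conj().T @ a + 0.5 * np.eye(N)

t = 1.3
E, V = np.linalg.eigh(H)
Ut = V @ np.diag(np.exp(-1j * E * t)) @ V.conj().T  # exp(-i H t)
x_t = Ut.conj().T @ x @ Ut                          # Heisenberg-picture x(t)

ground = np.zeros(N)
ground[0] = 1.0
corr = ground @ x_t @ x @ ground
print(corr)  # close to exp(-1j * t) / 2
```

Note that the correlator is complex, so it is indeed not itself a directly measured quantity; the claim above is only that it is a useful first approximation to what detectors record.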
 
  • #13
I'd say, "measurement" is mere heuristics to justify how we compute quantum mechanical correlation functions. It shouldn't be thought of as a separate process. I think trying to "solve the measurement problem" is akin to the search for a "mechanical model" of the ether.
 
  • #14
Wave-particle duality is no phenomenon but a theoretical concept that's outdated for about 100 years.
I am not attempting to contribute to this discussion (it is above my competence), but could I perhaps ask where I might read up on this particular point?

I was under the impression that the wave-particle duality was indeed a very strong mainstream scientific explanation and had never come across such a statement as yours up till now.

Maybe you could give me some kind of a pointer so that I could perhaps understand the point you are making?

(quite probably I have misunderstood ...)
 
  • #15
I was under the impression that the wave-particle duality was indeed a very strong mainstream scientific explanation and had never come across such a statement as yours up till now.
That's also my view (that it's mainstream), although I wouldn't call it an explanation but rather a mere statement of the fact that the so-called quantum objects are neither particles nor waves, but share properties of both.

@vanhees71 is tired of explaining that photons are not "little bullets" (and he's right!)
I think his statement is a little exaggerated. :smile:
 
  • #16
What you are talking about is a theoretician's idealized measurement that has little to do with real experiments.
It has absolutely nothing to do with idealization. Even realistic POVM measurements involve a non-unitary change of state upon measurement. Or do you disagree?
 
  • #17
It has absolutely nothing to do with idealization. Even realistic POVM measurements involve a non-unitary change of state upon measurement. Or do you disagree?
POVM measurements are also an idealization. I think it's not useful to talk about the "state" of a "system" at a particular instant of time. (We integrate over all possible states when we use Schrödinger's equation.) What quantum theory predicts is the probabilities of particular sequences of events (or "histories", if you like).
 
  • #18
(If my question makes any sense) Is it impossible for any two elements of a quantum system to be at rest with respect to each other?

If so, would that be a consequential observation?
 
  • #19
I am not attempting to contribute to this discussion (it is above my competence), but could I perhaps ask where I might read up on this particular point?

I was under the impression that the wave-particle duality was indeed a very strong mainstream scientific explanation and had never come across such a statement as yours up till now.

Maybe you could give me some kind of a pointer so that I could perhaps understand the point you are making?

(quite probably I have misunderstood ...)
The wave-particle duality was an important heuristic idea before modern quantum mechanics was discovered in 1925/26, and the main protagonists (among them Einstein and de Broglie) were well aware of the problems with this idea. The problem was solved by Born's probabilistic interpretation of the quantum state, i.e., that the modulus squared of the Schrödinger wave function is a probability distribution for the position of the particle described by this wave function.
 
  • #20
That's also my view (that it's mainstream), although I wouldn't call it an explanation but rather a mere statement of the fact that the so-called quantum objects are neither particles nor waves, but share properties of both.

@vanhees71 is tired of explaining that photons are not "little bullets" (and he's right!)
I think his statement is a little exaggerated. :smile:
Photons are of course another complication. For photons you cannot even define a position observable to begin with, and the only (very) successful formulation of relativistic QT is local (microcausal) relativistic QFT. For a very clear discussion, see

B. Gin-ge Chen et al (ed), Quantum field theory lectures of Sidney Coleman, World Scientific (2019)
 
  • #21
What quantum theory predicts is the probabilities of particular sequences of events (or "histories", if you like).
Are you suggesting that consistent histories interpretation is the way to go, is it how you avoid a reference to measurement?

Note that the usual formula for the probability of a history involves products of unitary evolution operators and non-unitary projectors, which in the standard interpretation is understood as a series of projections induced by measurements at different times.
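For concreteness, the formula being referred to can be sketched as follows (a toy qubit model of my own choosing): the probability of a history ## (z_1, z_2) ## is ## \mathrm{Tr}(C \rho C^\dagger) ## with class operator ## C = P_{z_2} U P_{z_1} U ##, mixing unitary steps with non-unitary projectors.

```python
import numpy as np

# Toy sketch (my own choice of model): history probabilities for a qubit,
# H = X, projective Z outcomes at two times, maximally mixed initial state.
X = np.array([[0, 1], [1, 0]], dtype=complex)
t = 0.5
U = np.cos(t) * np.eye(2) - 1j * np.sin(t) * X  # exp(-i X t), since X^2 = 1

proj = {+1: np.diag([1.0, 0.0]).astype(complex),
        -1: np.diag([0.0, 1.0]).astype(complex)}  # projectors onto Z eigenstates
rho = np.eye(2, dtype=complex) / 2                # maximally mixed initial state

total = 0.0
for z1, P1 in proj.items():
    for z2, P2 in proj.items():
        C = P2 @ U @ P1 @ U  # unitary evolution interleaved with projections
        prob = np.trace(C @ rho @ C.conj().T).real
        total += prob
        print(z1, z2, round(prob, 4))

print(total)  # the four history probabilities sum to 1
```

In the standard reading each ## P_{z_i} ## is a measurement-induced projection; consistent histories keeps the same arithmetic but reinterprets what the projectors mean.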
 
  • #22
Are you suggesting that consistent histories interpretation is the way to go, is it how you avoid a reference to measurement?
I'm not a fan of consistent/decoherent histories. My preference is a blend of the statistical and transactional interpretations, and ... GRW. At least the original GRW is (to my mind) rather ad hoc, but I like the idea that only events are real. A short coordinated "wiggling" of electrons in a detector, for example, constitutes what we could call the measurement of the polarization of a photon.
Note that the usual formula for the probability of a history involves products of unitary evolution operators and non-unitary projectors, which in the standard interpretation is understood as a series of projections induced by measurements at different times.
That's what I dislike about consistent histories: preferred variables / frameworks. It is too non-committal about the actual ontology, about what really happens. In the Schwinger-Keldysh formalism the "series of projections" arises naturally through creation and annihilation operators at adjacent points of the forward and backward time branch. For Schwinger the closed time path (with a "backward" branch) may have been a purely formal device, but I think that the backwards running time has physical significance, that physical events do occur in close pairs, that there are two world-sheets with opposite sense of time tacked together. The forward evolution of a ## \ket{\text{ket}} ## according to ## e^{-iHt} ## is only half the story. The Schwinger-Keldysh formalism includes the other half and the Born rule as well.
 
  • #23
I'm not a fan of consistent/decoherent histories.
Thanks, I had already feared that Morbert or I would now have to clarify which of the things you mentioned must be interpreted differently from the histories perspective.

At least the original GRW is (to my mind) rather ad hoc, but I like the idea that only events are real.
... preferred variables / frameworks. It is too non-committal about the actual ontology, about what really happens.
For Schwinger the closed time path (with a "backward" branch) may have been a purely formal device, but I think that the backwards running time has physical significance, that physical events do occur in close pairs, that there are two world-sheets with opposite sense of time tacked together.
Sure, events are nice as ontology, because they provide a clear connection to spacetime, and are somewhat minimal. However, coherent states could be nice too, because they provide a clear connection to classical mechanics, and are "more compatible" with decoherence than purely spacetime based ontologies.

However, the reason why I comment on your choice of events as the preferred ontology is that I recently read the SEP article on Relational Quantum Mechanics (from Winter 2021). I was surprised that it contained statements like "Facts as this one (“the particle is at x at time t”) are called “events”, or “quantum events”. Quantum theory is about events." I vaguely remembered having read Rovelli's paper from 1996, and that he was much more non-committal about the actual ontology back then. And after reading the section on the Frauchiger-Renner thought experiment, I decided that Rovelli had probably accidentally overcommitted to a specific ontology by now (no idea when he changed his mind), and that this "overcommitted" ontology will most probably turn out to be inconsistent.

Recently I read a comment on an article attacking that "new" ontology:
This paper refers to the version of RQM that existed before the introduction of "cross-perspective links" in arXiv:2203.13342, a change that amounts to saying, "Well, we didn't want all those 'relative facts' anyway."
I have no idea whether my "interpretation of the events" has anything to do with what really happened.

But those events might still provide a "slight warning" about being too committal to your ontology, just because it makes sense to you and you intuitively like it.
 
  • Like
Likes physika and PeroK
  • #24
However, the reason why I comment on your choice of events as the preferred ontology is that I recently read the SEP article on Relational Quantum Mechanics (from Winter 2021). I was surprised that it contained statements like "Facts as this one (“the particle is at x at time t”) are called “events”, or “quantum events”. Quantum theory is about events."
RQM never had much appeal to me. I find "the particle is at x at time t” too unspecific a characterization of an event. The events that I have in mind are interactions of electrons and photons, for example. Through the fluctuation/dissipation theorem the photon emission rate in a medium can be expressed in terms of a Fourier integral over the current density fluctuations, hinting (to me, at least) at the possibility that the emission of a photon relates in the real world to two closely spaced, short-lived currents.

But those events might still provide a "slight warning" about being too committal to your ontology, just because it makes sense to you and you intuitively like it.
However, coherent states could be nice too, because they provide a clear connection to classical mechanics, and are "more compatible" with decoherence than purely spacetime based ontologies.
Thanks for your warning. But I can't make sense of coherent states as something "real" - it's more a piece of mathematics (holomorphic representation) to me. As to the connection to the classical world I think that the continuous world lines of classical particles have to be replaced in the quantum picture by dotted lines, the incessant interactions of an electron with the electromagnetic field. The world looks classical when you reduce the time resolution.
 
  • #25
Thanks for your warning. But I can't make sense of coherent states as something "real"
Oh, sorry. I should have made clearer what I meant by the warning, and what I meant by an "overcommitted" ontology. Events or flashes (like in GRW) are fine as ontology, as long as there are not "too many" (or as long as they are not "too precise"). Basically, Bohmian mechanics is the only interpretation known to me that has managed to have a maximally refined microscopic ontology without getting inconsistent or making different predictions from standard QM.

It is probably easier to see for GRW: The flashes are OK as ontology, but you can only have very few of them. But if you believe that you found "the right ontology", and you are unaware of the problems other interpretations encountered before, then you might believe that this is enough, so you go to a maximally refined ontology (because that removes vagueness), and thereby run into trouble.
 
  • #26
The flashes are OK as ontology, but you can only have very few of them.
Why is that? I would expect them to occur at the zeptosecond scale (## 10^{-21} {\rm s} ##), corresponding to the electron mass. I looked at two papers by Tumulka, but they were quite different from what I have in mind. What kind of trouble do you anticipate?
 
  • #27
Sure, events are nice as ontology, because they provide a clear connection to spacetime, and are somewhat minimal.
I am not sure I understand Werner's line of thinking. I assume you both, by "event", refer to a 4D spacetime event?

I just ask because I myself entertain the concept of an abstract detection event, in the sense that an "observer/agent/IGUS" registers a distinguishable event. But here the structure of this abstract set of events (such as the dimensionality of the set) is not necessarily primary or assumed, but instead emergent (say, as chaos, dimension, complexity, etc.). Those types of "events" are in fact information updates, or elements of measurements, and are the type of ontology I often think of.

/Fredrik
 
  • #28
Why is that? I would expect them to occur at the zeptosecond scale (## 10^{-21} {\rm s} ##), corresponding to the electron mass. I looked at two papers by Tumulka, but they were quite different from what I have in mind. What kind of trouble do you anticipate?
If you add a "non-emergent" ontology, then you risk getting a different theory, making "slightly" different predictions. Why? Probably because an ontology risks making everything too exact and removing too much vagueness. Bohmian mechanics has its equilibrium distribution to bring back the vagueness. Without something similar, you are often forced to "thin out" your ontology to avoid making experimentally verifiable predictions that differ from standard QM.
 
  • Like
Likes mattt and PeroK
  • #29
Probably because an ontology risks making everything too exact and removing too much vagueness.
We are probably talking past each other. Why should vagueness be an important ingredient? For me, QED is a fundamentally statistical theory. Does randomness constitute enough "vagueness"? I don't want to create a new theory. I think QED is perfect, and I only aim to see it more clearly as a theory of a special kind of point process in spacetime (actually two five-dimensional manifolds glued together - I'm lacking the proper mathematical term).
 
  • #30
The editors of high impact journal Nature Physics explain why the field of quantum foundations is important for physics.
https://www.nature.com/articles/s41567-022-01766-x
So much, said so well, and yet, with very little in the way of a bottom line or an action item. Mostly, what keeps it a clean and elegant, but ultimately useless discussion, is that it doesn't engage with competing proposed resolutions of the open questions and what is at stake if one or another of them is adopted.
 
  • Like
Likes mattt, PeroK, Morbert and 1 other person
  • #31
Quantum theory is NOT weird but the most comprehensive theory about Nature we have today.
Not really a fair criticism.

The fact that a theory is a comprehensive and accurate description of Nature doesn't preclude it from being weird.

"Weird" is a word generally used in reference to "common sense" and a typical person's intuition about how the world works. Weird is a close synonym of "counterintuitive."

Every shred of physical intuition gained from daily life, some of it hard wired into our brains, is based on a classical worldview that works for Newtonian mechanics and Maxwell's equations, but fails miserably in the domain of quantum theory in half a dozen or so distinct respects. Yet many people aren't even seriously exposed to quantum theory for the first time until they are college students with STEM majors (assuming that they ever go to college and that they ever major in STEM).

Calling quantum theory "unnatural" would indeed have been inappropriate, but "weird" is right on target.
 
  • #32
"It is easy to dismiss research into the foundations of quantum mechanics as irrelevant to physicists in other areas. Adopting this attitude misses opportunities to appreciate the richness of quantum mechanics."
I'm ok with foundations work being irrelevant to most physicists. My interest in foundations wasn't because I thought foundations work could invigorate other research projects. Almost the opposite: I became interested in foundations in the hope that it could better articulate how physicists and chemists already understand quantum mechanics.

When QM is introduced axiomatically in undergrad, it's usually in terms of some procedure of unitary evolution + measurement + state reduction. But huge areas of physics and chemistry make use of QM without this procedural approach, and alternative interpretations can better overlap with these areas. (E.g. I find consistent histories interesting because it nicely complements the way many of us already think about electron transport calculations in transistors)
 
  • #33
We are probably talking past each other.
That is quite possible. My reaction to your "intended interpretation" was dominated by your reference to GRW (paired with my limited knowledge of why such objective collapse theories differ from standard QM):
My preference is a blend of the statistical and transactional interpretations, and ... GRW. At least the original GRW is (to my mind) rather ad hoc, but I like the idea that only events are real. A short coordinated "wiggling" of electrons in a detector, for example, constitutes what we could call the measurement of the polarization of a photon.
GRW is known to allow both a "flash ontology" (or "flashes ontology") and a "mass density ontology". (I guess that some objective collapse theories are committed to a "mass density ontology", i.e. theories like the Diósi–Penrose model.) So I guessed that your "events" would be similar to the "flashes" of GRW.

Why should vagueness be an important ingredient? ... Does randomness constitute enough "vagueness"?
The "flashes" for GRW have infinitely accurate spacetime coordinates. For GRW, even their randomness seems to be not enough to get rid of that excess accuracy again. But for Bohmian mechanics, the randomness is sufficient, so an unconditionally true answer to that question seems impossible.

The trouble with excess accuracy is that the information content of a system with finite energy in a finite spacetime region had better not be infinite. It is convenient to work with real numbers in mathematical models, but their infinite accuracy forces you to have some mechanism (like "vagueness") to prevent that infinite accuracy from having experimentally observable consequences.

For me, QED is a fundamentally statistical theory. ... I don't want to create a new theory. I think QED is perfect, and I only aim to see it more clearly as a theory of a special kind of point process in spacetime
I guess that the mechanism for QFT to get rid of the excess accuracy (related to "point process in spacetime") is renormalization. You find "points of view" like the following in modern QFT1 lecture notes:
Many points of view; one is that it is our own fault: QFT is somewhat idealised; it assumes infinitely extended fields (IR) with infinite spatial resolution (UV); there is no wonder that the theory produces infinities. Still, it is better to stick to idealised assumptions and live with infinities than use some enormous discrete systems (actual solid state systems).

There is also a physics reason why these infinities can be absorbed somehow: Our observable everyday physics should neither depend on how large the universe actually is (IR) nor on how smooth or coarse-grained space is (UV).
 
  • #34
Every shred of physical intuition gained from daily life, some of it hard wired into our brains, is based on a classical worldview that works for Newtonian mechanics and Maxwell's equations
Hmmm... I don't think that's how my brain is wired. My brain, and I think yours, makes use of inferences, abduction, lossy retention and actions influenced by subjective expectations that have been tuned by evolution, even though we may not think of it. These things are IMO in excellent harmony with quantum weirdness if you only embrace the inside-observer view 🙄 So I see good reasons why we WILL ultimately see how natural QM is, and we will look back and wonder how Newton's mechanics ever made sense 😬

/Fredrik
 
  • #35
The trouble with excess accuracy is that the information content of a system with finite energy in a finite spacetime region had better not be infinite. It is convenient to work with real numbers in mathematical models, but their infinite accuracy forces you to have some mechanism (like "vagueness") to prevent that infinite accuracy from having experimentally observable consequences.
I don't subscribe to the view that information is physical. It belongs to our theories and models. And we can safely ignore excess digits.
I guess that the mechanism for QFT to get rid of the excess accuracy (related to "point process in spacetime") is renormalization.
Yes, it is useful (no, necessary!) to ignore the scales that are not relevant. We couldn't do physics otherwise.
You find "points of view" like the following in modern QTF1 lecture notes:
Thank you. Your posts always contain interesting pointers.
 
