- #1
- Thread starter Demystifier

https://www.nature.com/articles/s41567-022-01766-x

- #2


Just 2 simple points:

Quantum theory is NOT weird but the most comprehensive theory about Nature we have today.

Wave-particle duality is not a phenomenon but a theoretical concept that has been outdated for about 100 years.

- #3


How about just holding back the judgement and the dismissal of foundational research until we have included gravity properly in our theory? It remains to be seen whether the QFT framework (and, indirectly, the foundations of QM) comes out on top or not.

/Fredrik

- #4


I agree that there are loose ends. But I fail to see much progress:

https://www.nature.com/articles/s41567-022-01766-x

> Although a fresh view can invigorate any field, much of this work also manifests a disregard for the progress that has been made since quantum mechanics was established. The quantum foundations literature is the product of decades of careful thought about the issues involved in understanding and interpreting the physical world. As with any topic, a failure to constructively engage with existing work runs the risk of repeating earlier mistakes and misunderstandings.

Nothing fundamental has changed in the way we do quantum-mechanical calculations. We now just have more no-go theorems, peculiar inequalities, and a proliferation of interpretations, rather than the one and only natural interpretation of quantum theory. Most of the work on quantum foundations seems to have explored blind alleys, much like the search for a mechanical model of the ether, which once was an obvious and important field of research. Maxwell himself might have realized that there is no need for an ether if he had had more time. For his contemporaries, electrodynamics (or rather the ether) had some "weird" features that took four decades to get rid of. In the case of quantum theory, it seems to be taking significantly longer to remove the superfluous metaphysical baggage.

I think it is not quantum theory that is weird, but the way it is phrased / taught:

The particle concept is problematic, and so is even talk like the following:

> But the particle can at the same time be entangled with another particle located elsewhere such that the outcome of measuring one particle determines the state of the other.

- #5


> My view is that quantum theory makes much more sense if it is formulated without reference to "particles" and "measurements".

How do you formulate the Born rule in different bases (say the position basis and the momentum basis) without measurements?

- #6


> How do you formulate the Born rule in different bases (say the position basis and the momentum basis) without measurements?

Quantum theory makes predictions about events. It is customary to compute a probability …

> [...] if a system is suitably perturbed in a manner that depends upon the time sense, a knowledge of the transformation function referring to a closed time path determines the expectation value of any desired physical quantity [...]

[J. Math. Phys. 2, 407–432 (1960)]
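For reference, the Born rule in the two bases mentioned in the question takes the standard textbook form (the point being that each expression is a probability density for the outcome of a *measurement* in that basis):

```latex
w(x) = |\langle x|\psi\rangle|^{2} = |\psi(x)|^{2}, \qquad
\tilde w(p) = |\langle p|\psi\rangle|^{2} = |\tilde\psi(p)|^{2}, \qquad
\tilde\psi(p) = \frac{1}{\sqrt{2\pi\hbar}} \int \mathrm{d}x \; e^{-ipx/\hbar}\, \psi(x).
```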

- #7


> But this dichotomy of unitary evolution and "measurements" is artificial.

Agreed, but when Schwinger talks about a "perturbed" system in your quote above, doesn't he introduce a similar artificial dichotomy between perturbed and unperturbed?

- #8


- #9


> Agreed, but when Schwinger talks about a "perturbed" system in your quote above, doesn't he introduce a similar artificial dichotomy between perturbed and unperturbed?

It is widely believed that QFT hinges on perturbation theory. But a more general viewpoint is that of the path integral as providing a generating functional from which all quantities of interest (especially correlations) can, in principle, be derived by simple differentiation.
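In symbols (a standard sketch, written here for a real scalar field ## \varphi ## with source ## J ## as a generic example):

```latex
Z[J] = \int \mathcal{D}\varphi \; \exp\!\left( iS[\varphi] + i\!\int \mathrm{d}^4x \, J(x)\,\varphi(x) \right), \qquad
\langle T\,\varphi(x_1)\varphi(x_2) \rangle
= \frac{(-i)^2}{Z[0]} \left. \frac{\delta^2 Z[J]}{\delta J(x_1)\,\delta J(x_2)} \right|_{J=0}.
```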

- #10


> It is widely believed that QFT hinges on perturbation theory. But a more general viewpoint is that of the path integral as providing a generating functional from which all quantities of interest (especially correlations) can, in principle, be derived by simple differentiation.

In general, the correlation functions of observables at different times computed this way do not correspond to the correlations observed by measuring observables at different times. This is because such a computation does not take into account the change of the state induced by measurement. This change of the state, known under the names information update, projection, or collapse, is not a unitary transformation. How does Schwinger, or anyone else who uses a path-integral formalism and claims that there is nothing special about measurements, account for this?
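A minimal toy illustration of the point being made (a single qubit in NumPy, nothing to do with Schwinger's formalism; the state and observables are chosen purely for the example): preparing ## |+\rangle ## and asking for ## \langle X \rangle ## gives 1 if nothing happens in between, but 0 if a projective Z measurement (with Lüders update, outcome discarded) happens first. The intermediate measurement is a non-unitary map on the density matrix.

```python
import numpy as np

# Pauli matrices and the |+> state (eigenstate of X with eigenvalue +1)
X = np.array([[0, 1], [1, 0]], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())

def expect(op, state):
    """Expectation value Tr[op @ state] for a density matrix."""
    return np.trace(op @ state).real

# Without an intermediate measurement: <X> = 1
x_unitary = expect(X, rho)

# With an intermediate projective Z measurement (Lüders update):
# rho -> P0 rho P0 + P1 rho P1, which is not a unitary map
P0 = np.array([[1, 0], [0, 0]], dtype=complex)
P1 = np.array([[0, 0], [0, 1]], dtype=complex)
rho_measured = P0 @ rho @ P0 + P1 @ rho @ P1

x_after_measurement = expect(X, rho_measured)

print(x_unitary, x_after_measurement)  # 1.0 vs 0.0
```

The off-diagonal elements of ## \rho ## are erased by the update, which no unitary conjugation could do for all input states.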

- #11


> I think it is not quantum theory that is weird, but the way it is phrased / taught:

Yes, I think one could also say that it's nature that is weird. When corroborated theory is seen from the outdated stance that nature is NOT weird, it really gets weird. We can resolve this mismatch if we adopt a weird take on nature :)

/Fredrik

- #12


> [...] such a computation does not take into account a change of the state induced by measurement. This change of the state, known under the names information update, projection or collapse, is not a unitary transformation.

I believe you have this backwards. What you are talking about is a theoretician's idealized measurement that has little to do with real experiments. Applied to a harmonic oscillator, a position measurement would imply adding an infinite amount of kinetic energy if the imagined collapsed state really were an eigenstate of the position operator. Contrariwise, there is little reason to doubt that the correlation function ## \langle x(t)x(0) \rangle ## computed for an unperturbed oscillator is a useful first approximation to measurement results. And of course this first approximation can be improved by including perturbing effects (as done by Schwinger).
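For the unperturbed oscillator in its ground state, the correlation function in question evaluates to the standard result (included here for reference):

```latex
\langle 0 |\, \hat x(t)\, \hat x(0) \,| 0 \rangle = \frac{\hbar}{2m\omega}\, e^{-i\omega t},
\qquad \text{with} \quad
\hat x(t) = \sqrt{\tfrac{\hbar}{2m\omega}}\left( \hat a\, e^{-i\omega t} + \hat a^{\dagger} e^{i\omega t} \right).
```

Its nonzero imaginary part already signals that it is not, as it stands, a classical correlation of measured outcomes.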

- #13


- #14


> Wave-particle duality is not a phenomenon but a theoretical concept that has been outdated for about 100 years.

I am not attempting to contribute to this discussion (above my competence), but could I perhaps ask where I might read up on this particular point?

I was under the impression that the wave-particle duality was indeed a very strong mainstream scientific explanation, and had never come across such a statement as yours up till now.

Maybe you could give me some kind of a pointer so that I could perhaps understand the point you are making?

(quite probably I have misunderstood ...)

- #15


> I was under the impression that the wave-particle duality was indeed a very strong mainstream scientific explanation, and had never come across such a statement as yours up till now.

That's also my view (that it's mainstream), although I wouldn't call it an explanation but rather a mere statement of the fact that the so-called quantum objects are neither particles nor waves, but share properties of both.

@vanhees71 is tired of explaining that photons are not "little bullets" (and he's right!)

I think his statement is a little exaggerated.

- #16


> What you are talking about is a theoretician's idealized measurement that has little to do with real experiments.

It has absolutely nothing to do with idealization. Even realistic POVM measurements involve a non-unitary change of state upon measurement. Or do you disagree?

- #17


> It has absolutely nothing to do with idealization. Even realistic POVM measurements involve a non-unitary change of state upon measurement. Or do you disagree?

POVM measurements are also an idealization. I think it's not useful to talk about the "state" of a "system" at a particular instant of time. (We integrate over all possible states when we use Schrödinger's equation.) What quantum theory predicts is the probabilities of particular sequences of events (or "histories", if you like).
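For reference, the generalized-measurement formalism being debated here, in its standard form with Kraus operators ## K_k ##: the outcome probabilities come from a POVM, and the accompanying state update is again non-unitary:

```latex
p(k) = \mathrm{Tr}\!\left[ E_k\, \rho \right], \qquad
E_k = K_k^{\dagger} K_k, \qquad \sum_k E_k = \mathbb{1}, \qquad
\rho \;\longmapsto\; \frac{K_k\, \rho\, K_k^{\dagger}}{\mathrm{Tr}\!\left[ K_k\, \rho\, K_k^{\dagger} \right]}
\quad \text{(outcome } k \text{)}.
```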

- #18


If so, would that be a consequential observation?

- #19


> I am not attempting to contribute to this discussion (above my competence), but could I perhaps ask where I might read up on this particular point?
>
> I was under the impression that the wave-particle duality was indeed a very strong mainstream scientific explanation, and had never come across such a statement as yours up till now.
>
> Maybe you could give me some kind of a pointer so that I could perhaps understand the point you are making?
>
> (quite probably I have misunderstood ...)

The wave-particle duality was an important heuristic idea before modern quantum mechanics was discovered in 1925/26, and the main protagonists (among them Einstein and de Broglie) were well aware of the problems with this idea. It was resolved by Born's probabilistic interpretation of the quantum state, i.e., that the modulus squared of the Schrödinger wave function is a probability distribution for the position of the particle described by this wave function.

- #20


> That's also my view (that it's mainstream), although I wouldn't call it an explanation but rather a mere statement of the fact that the so-called quantum objects are neither particles nor waves, but share properties of both.
>
> @vanhees71 is tired of explaining that photons are not "little bullets" (and he's right!)
>
> I think his statement is a little exaggerated.

Photons are of course another complication. For photons you cannot even define a position observable to begin with, and the only (very) successful formulation of relativistic QT is local (microcausal) relativistic QFT. For a very clear discussion, see

B. Gin-ge Chen et al. (eds.), Quantum Field Theory Lectures of Sidney Coleman, World Scientific (2019)

- #21


> What quantum theory predicts is the probabilities of particular sequences of events (or "histories", if you like).

Are you suggesting that the consistent-histories interpretation is the way to go? Is that how you avoid a reference to measurement?

Note that the usual formula for the probability of a history involves products of unitary evolution operators and non-unitary projectors, which in the standard interpretation is read as a series of projections induced by measurements at different times.
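The "usual formula" referred to here is, in standard notation with projectors ## P^{(i)}_{\alpha_i} ## at times ## t_i ##:

```latex
p(\alpha_1,\dots,\alpha_n) = \mathrm{Tr}\!\left[ C_\alpha\, \rho\, C_\alpha^{\dagger} \right], \qquad
C_\alpha = P^{(n)}_{\alpha_n}\, U(t_n, t_{n-1}) \cdots P^{(1)}_{\alpha_1}\, U(t_1, t_0),
```

where the ## U ## are unitary evolution operators and the ## P ## are the non-unitary projectors in question.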


- #22


> Are you suggesting that the consistent-histories interpretation is the way to go? Is that how you avoid a reference to measurement?

I'm not a fan of consistent/decoherent histories. My preference is a blend of the statistical and transactional interpretations, and ... GRW. At least the original GRW is (to my mind) rather ad hoc, but I like the idea that only events are real. A short coordinated "wiggling" of electrons in a detector, for example, constitutes what we could call the measurement of the polarization of a photon.

That's what I dislike about consistent histories: preferred variables / frameworks. It is too non-committal about the actual ontology, about what really happens.

> Note that the usual formula for the probability of a history involves products of unitary evolution operators and non-unitary projectors, which in the standard interpretation is read as a series of projections induced by measurements at different times.

In the Schwinger-Keldysh formalism the "series of projections" arises naturally through creation and annihilation operators at adjacent points of the forward and backward time branches. For Schwinger the closed time path (with a "backward" branch) may have been a purely formal device, but I think that the backwards-running time has physical significance: physical events do occur in close pairs, and there are two world-sheets with opposite senses of time tacked together. The forward evolution of a ## \ket{\text{ket}} ## according to ## e^{-iHt} ## is only half the story. The Schwinger-Keldysh formalism includes the other half, and the Born rule as well.
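Schematically, one common way to write the closed-time-path generating functional mentioned above, with independent sources ## J_\pm ## on the forward and backward branches (conventions vary between references):

```latex
Z[J_+, J_-] = \mathrm{Tr}\!\left[\, U_{J_-}(t_0, t_f)\; U_{J_+}(t_f, t_0)\; \rho_0 \,\right],
```

where ## U_{J_\pm} ## is the evolution operator in the presence of the source ## J_\pm ##. Setting ## J_+ = J_- ## gives ## Z = \mathrm{Tr}\,\rho_0 = 1 ## (the two branches cancel by unitarity), and functional derivatives with respect to ## J_\pm ## generate in-in expectation values.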

- #23


> I'm not a fan of consistent/decoherent histories.
>
> At least the original GRW is (to my mind) rather ad hoc, but I like the idea that only events are real.
>
> ... preferred variables / frameworks. It is too non-committal about the actual ontology, about what really happens.

Thanks, I already feared that Morbert or I would now have to clarify which of the things you mentioned must be interpreted differently from the histories perspective.

> For Schwinger the closed time path (with a "backward" branch) may have been a purely formal device, but I think that the backwards-running time has physical significance, that physical events do occur in close pairs, that there are two world-sheets with opposite senses of time tacked together.

Sure, events are nice as ontology, because they provide a clear connection to spacetime, and are somewhat minimal. However, coherent states could be nice too, because they provide a clear connection to classical mechanics, and are "more compatible" with decoherence than purely spacetime-based ontologies.

However, the reason why I comment on your choice of events as the preferred ontology is that I recently read the SEP article on Relational Quantum Mechanics (from Winter 2021). I was surprised that it contained statements like "Facts as this one ('the particle is at x at time t') are called 'events', or 'quantum events'. Quantum theory is about events." I vaguely remembered having read Rovelli's paper from 1996, and that he was much more non-committal about the actual ontology back then. And after reading the section on the Frauchiger-Renner thought experiment, I decided that Rovelli probably had accidentally overcommitted to a specific ontology now (no idea when he changed his mind), and that this "overcommitted" ontology will most probably be inconsistent.

Recently I read a comment on an article attacking that "new" ontology:

> This paper refers to the version of RQM that existed before the introduction of "cross-perspective links" in arXiv:2203.13342, a change that amounts to saying, "Well, we didn't want all those 'relative facts' anyway."

I have no idea whether my "interpretation of the events" has anything to do with what really happened. But those events might still provide a "slight warning" against being too committal to your ontology just because it makes sense to you and you intuitively like it.

- #24


> However, the reason why I comment on your choice of events as the preferred ontology is that I recently read the SEP article on Relational Quantum Mechanics (from Winter 2021). I was surprised that it contained statements like "Facts as this one ('the particle is at x at time t') are called 'events', or 'quantum events'. Quantum theory is about events."

RQM never had much appeal to me. I find "the particle is at x at time t" too unspecific a characterization of an event. The events that I have in mind are …

> But those events might still provide a "slight warning" against being too committal to your ontology just because it makes sense to you and you intuitively like it.

Thanks for your warning. But I can't make sense of coherent states as something "real"; they are more a piece of mathematics (the holomorphic representation) to me.

> However, coherent states could be nice too, because they provide a clear connection to classical mechanics, and are "more compatible" with decoherence than purely spacetime-based ontologies.

As to the connection to the classical world, I think that the continuous world lines of classical particles have to be replaced in the quantum picture by …

- #25


> Thanks for your warning. But I can't make sense of coherent states as something "real"

Oh, sorry. I should have made clearer what I meant by the warning, and what I meant by an "overcommitted" ontology. Events or flashes (like in GRW) are fine as ontology, as long as there are not "too many" (or as long as they are not "too precise"). Basically, Bohmian mechanics is the only interpretation known to me that has managed to have a maximally refined microscopic ontology without becoming inconsistent or making different predictions from standard QM.

It is probably easier to see for GRW: the flashes are OK as ontology, but you can only have very few of them. But if you believe that you have found "the right ontology", and you are unaware of the problems other interpretations encountered before, then you might believe that this is enough, so you go to a maximally refined ontology (because that removes vagueness), and thereby run into trouble.

- #26


> The flashes are OK as ontology, but you can only have very few of them.

Why is that? I would expect them to occur at the zeptosecond scale (## 10^{-21} {\rm s} ##), corresponding to the electron mass. I looked at two papers by Tumulka, but they were quite different from what I have in mind. What kind of trouble do you anticipate?

- #27


> Sure, events are nice as ontology, because they provide a clear connection to spacetime, and are somewhat minimal.

I am not sure I understand Werner's line of thinking. I assume you both, by "event", refer to a "4D spacetime" event?

I just ask because I myself entertain the concept of an abstract detection event, in the sense that an "observer/agent/IGUS" registers a distinguishable event, but where the structure of this abstract set of events (such as the dimensionality of the set) is not necessarily primary or assumed, but instead emergent (say, via chaos, dimension, complexity, etc.). Those types of "events" are in fact information updates, or elements of measurements, and are the type of ontology I often think of.

/Fredrik

- #28


> Why is that? I would expect them to occur at the zeptosecond scale (## 10^{-21} {\rm s} ##), corresponding to the electron mass. I looked at two papers by Tumulka, but they were quite different from what I have in mind. What kind of trouble do you anticipate?

If you add a "non-emergent" ontology, then you risk getting a different theory, making "slightly" different predictions. Why? Probably because an ontology risks making everything too exact and removing too much vagueness. Bohmian mechanics has its equilibrium distribution to bring back the vagueness. Without something similar, you are often forced to "thin out" your ontology to avoid making experimentally verifiable predictions that differ from standard QM.

- #29


> Probably because an ontology risks making everything too exact and removing too much vagueness.

We are probably talking past each other. Why should vagueness be an important ingredient? For me, QED is a fundamentally statistical theory. Does randomness constitute enough "vagueness"? I don't want to create a new theory. I think QED is perfect, and I only aim to see it more clearly as a theory of a special kind of point process in spacetime (actually two five-dimensional manifolds glued together; I'm lacking the proper mathematical term).

- #30


So much, said so well, and yet with very little in the way of a bottom line or an action item. Mostly, what keeps it a clean and elegant but ultimately useless discussion is that it doesn't engage with competing proposed resolutions of the open questions, or with what is at stake if one or another of them is adopted.

https://www.nature.com/articles/s41567-022-01766-x

- #31


> Quantum theory is NOT weird but the most comprehensive theory about Nature we have today.

Not really a fair criticism. The fact that a theory is a comprehensive and accurate description of Nature doesn't preclude it from being weird.

"Weird" is a word generally used in reference to "common sense" and a typical person's intuition about how the world works. Weird is a close synonym of "counterintuitive."

Every shred of physical intuition gained from daily life, some of it hard-wired into our brains, is based on a classical worldview that works for Newtonian mechanics and Maxwell's equations but fails miserably in the domain of quantum theory in half a dozen or so distinct respects. Yet many people aren't seriously exposed to quantum theory until they are college students with STEM majors (assuming that they ever go to college and that they ever major in STEM).

Calling quantum theory "unnatural" would indeed have been inappropriate, but "weird" is right on target.

- #32


I'm ok with foundations work being irrelevant to most physicists. My interest in foundations wasn't because I thought foundations work could invigorate other research projects. Almost the opposite: I became interested in foundations in the hope that it could better articulate how physicists and chemists already understand quantum mechanics.

When QM is introduced axiomatically in undergrad, it's usually in terms of some procedure of unitary evolution + measurement + state reduction. But huge areas of physics and chemistry make use of QM without this procedural approach, and alternative interpretations can better overlap with these areas. (E.g., I find consistent histories interesting because it nicely complements the way many of us already think about electron-transport calculations in transistors.)

- #33


> We are probably talking past each other.

That is quite possible. My reaction to your "intended interpretation" was dominated by your reference to GRW (paired with my limited knowledge of why such objective collapse theories differ from standard QM):

> My preference is a blend of the statistical and transactional interpretations, and ... GRW. At least the original GRW is (to my mind) rather ad hoc, but I like the idea that only events are real. A short coordinated "wiggling" of electrons in a detector, for example, constitutes what we could call the measurement of the polarization of a photon.

GRW is known to allow both a "flash ontology" (or "flashes ontology") and a "mass density ontology". (I guess that some objective collapse theories are committed to a "mass density ontology", i.e. theories like the Diósi-Penrose model.) So I guessed that your "events" would be similar to the "flashes" of GRW.

> Why should vagueness be an important ingredient? ... Does randomness constitute enough "vagueness"?

The "flashes" of GRW have infinitely accurate spacetime coordinates. For GRW, even their randomness seems not to be enough to get rid of that excess accuracy. But for Bohmian mechanics the randomness is sufficient, so an unconditionally true answer to that question seems impossible.

The trouble with excess accuracy is that the information content of a system with finite energy in a finite spacetime region had better not be infinite. It is convenient to work with real numbers in mathematical models, but their infinite accuracy forces you to have some mechanism (like "vagueness") to prevent that infinite accuracy from having experimentally observable consequences.

> For me, QED is a fundamentally statistical theory. ... I don't want to create a new theory. I think QED is perfect, and I only aim to see it more clearly as a theory of a special kind of point process in spacetime

I guess that the mechanism by which QFT gets rid of the excess accuracy (related to "point process in spacetime") is renormalization. You find "points of view" like the following in modern QFT1 lecture notes:

> Many points of view; one is that it is our own fault: QFT is somewhat idealised; it assumes infinitely extended fields (IR) with infinite spatial resolution (UV); there is no wonder that the theory produces infinities. Still, it is better to stick to idealised assumptions and live with infinities than use some enormous discrete systems (actual solid state systems).
>
> There is also a physics reason why these infinities can be absorbed somehow: Our observable everyday physics should neither depend on how large the universe actually is (IR) nor on how smooth or coarse-grained space is (UV).

- #34


> Every shred of physical intuition gained from daily life, some of it hard-wired into our brains, is based on a classical worldview that works for Newtonian mechanics and Maxwell's equations

Hmmm... I don't think it's how …

/Fredrik


- #35


> The trouble with excess accuracy is that the information content of a system with finite energy in a finite spacetime region had better not be infinite. It is convenient to work with real numbers in mathematical models, but their infinite accuracy forces you to have some mechanism (like "vagueness") to prevent that infinite accuracy from having experimentally observable consequences.

I don't subscribe to the view that information is physical. It belongs to our theories and models. And we can safely ignore excess digits.

> I guess that the mechanism by which QFT gets rid of the excess accuracy (related to "point process in spacetime") is renormalization.

Yes, it is useful (no, necessary!) to ignore the scales that are not relevant. We couldn't do physics otherwise.

> You find "points of view" like the following in modern QFT1 lecture notes:

Thank you. Your posts always contain interesting pointers.
