gentzen said:
To my surprise, Jan Faye seems to indicate that this
contextuality might even have been a crucially important part of Bohr's beloved complementarity:
Jan Faye said:
In general, Bohr considered the demands of complementarity in quantum mechanics to be logically on a par with the requirements of relativity in the theory of relativity. He believed that both theories were a result of novel aspects of the observation problem, namely the fact that observation in physics is context-dependent.
Thanks for this, @gentzen. I found it on the SEP's Copenhagen Interpretation of QM page. What I now find remarkable is that contextuality is a very natural classical concept that we can easily introduce into classical physics. In particular, Koopman's Hilbert space formalism for Classical Mechanics[1] allows us very easily to use the Poisson bracket to generate changes of experimental context, more generally than the canonical transformations of conventional Hamiltonian CM allow. This is a natural extension of Bohr's insistence that we must describe experiments classically, insofar as it allows us to describe the expected statistics of many incompatible experiments within a single probabilistic structure.
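As a minimal sketch of that formalism (my notation, not tied to any one reference): states are vectors in the Hilbert space $L^2$ over phase space, a classical observable $f(q,p)$ acts as a multiplication operator, and any smooth function $g$ generates a one-parameter unitary group through the Poisson bracket,
$$(\hat f\psi)(q,p)=f(q,p)\,\psi(q,p),\qquad \hat L_g=-\mathrm{i}\{g,\cdot\},\qquad U_g(\lambda)=e^{-\mathrm{i}\lambda\hat L_g},$$
with $[\hat L_g,\hat f]=-\mathrm{i}\,\widehat{\{g,f\}}$: the multiplication operators and the generators $\hat L_g$ together give a noncommutative algebra even though we started in CM. For $g=H$, $U_H(t)$ is just the Liouvillian evolution.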
Having introduced contextuality so easily into CM, we have to wonder what quantum fluctuations are in this larger classical world. Clearly they cannot be the same as thermal fluctuations. A first clue is that ℏ has units of action, whereas kT has units of energy, but this does not give much direction. Much more substantively, QFT is clear that the vacuum is Poincaré invariant, whereas thermal fluctuations are not invariant under boost transformations: this difference of symmetry properties under the action of the Poincaré group is perhaps the most natural and most easily stated distinction we could introduce into classical physics.[2]
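For a free scalar field, for example (a textbook computation, with $c=1$), the contrast can be put in a single line for the fluctuation two-point function at temperature $T$:
$$\big\langle\hat\phi(x)\hat\phi(y)\big\rangle_T=\hbar\int\frac{\mathrm{d}^4k}{(2\pi)^3}\,\delta(k{\cdot}k-m^2)\left[\theta(k^0)+\frac{1}{e^{\hbar|k^0|/k_{\rm B}T}-1}\right]e^{-\mathrm{i}k\cdot(x-y)}.$$
The mass-shell delta and the step function are invariant under orthochronous Lorentz transformations, but $|k^0|$ singles out a rest frame, so only the $T\to 0$ vacuum term is boost invariant. Note too that the vacuum term is proportional to ℏ, whereas at high temperature the thermal part approaches $k_{\rm B}T/|k^0|$, independent of ℏ.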
With contextuality and quantum fluctuations added into CM, we have what I call 'CM+' in my AnnPhys 2020 paper. Just these two additions give us a measurement theory for CM+ that is as empirically effective as that of QFT.
We can also take this construction in the other direction, following the example of Tsang & Caves in PRX 2012, in terms of Quantum Non-Demolition (QND) measurement, which allows us to construct isomorphisms between the physically significant Quantum Optics (aka Quantum ElectroMagnetism, QEM) and what I call QND Optics.
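Schematically, the condition that does the work in Tsang & Caves is just that a QND observable commutes with itself at different times,
$$[\hat O(t_1),\hat O(t_2)]=0\quad\text{for all }t_1,t_2,$$
so that the whole measured history of $\hat O$ admits a joint probability distribution and hence a classical, Koopman-style description, even though $\hat O$ is constructed inside QM.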
These constructions have been out in the literature for five years so far, and enough people think that they and their further development have raised the bar significantly that I have given 19 talks to academic and other audiences since then, but I have clearly not yet raised the bar high enough for @Fra's standard:
Fra said:
I wonder if anyone ever read a clear and brilliant writing about the conceptual foundations of QM?
I rather expect that these or similar ideas will make it into the Zeitgeist through channels other than my obscure writing. More significant to me than understanding the relationship between QM and CM+ in terms of isomorphisms as well as of quantization is that CM+, thought of from the engineering perspective of signal analysis, suggests how we can rethink renormalization, but with that I will not test your patience.
[1] Koopman introduced the idea that a Hilbert space formalism for CM is possible already in 1931; it was immediately used by von Neumann and Birkhoff in their proofs of the ergodic theorems, but thereafter it went almost unmentioned until Sudarshan in Pramana 1976. Since then, it has been used extensively in the dynamical systems literature (see "Modern Koopman Theory for Dynamical Systems", for example). The group of unitary operators that act on such a Hilbert space contains the canonical transformations as a subgroup.
[2] Stochastic ElectroDynamics, SED, adopted this distinction already in the 1960s, when it introduced Zero-Point Fluctuations, ZPF, but SED has failed to be compelling for most physicists for ~60 years, I think because it does not have a measurement theory that includes contextuality.
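For reference, requiring Lorentz invariance of a random classical radiation field fixes the ZPF spectrum up to an overall scale, which is where ℏ enters SED (the classic Marshall and Boyer result):
$$\rho(\omega)\,\mathrm{d}\omega=\frac{\hbar\omega^3}{2\pi^2c^3}\,\mathrm{d}\omega,$$
equivalent to an average energy $\tfrac{1}{2}\hbar\omega$ per normal mode.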