Exploring Implicit Assumptions and Foundations of Quantum Mechanics

  • Context: Graduate
  • Thread starter: Anton_A_Lipovka
  • Tags: Foundations of physics

Anton_A_Lipovka
TL;DR
I would like to start a discussion exploring the foundational aspects of quantum mechanics, focusing on implicit assumptions, Planck’s constant, and the structure of Hilbert space.
Question about the role of postulates and implicit assumptions in quantum mechanics

Hi everyone,

I’m trying to better understand the structure of the postulates of quantum mechanics and whether there is a meaningful hierarchy among them.

In the standard presentation, we usually assume:
- states are vectors in a Hilbert space,
- observables are represented by Hermitian operators,
- measurement outcomes follow the Born rule.

At the same time, when looking at the historical development (for example, Schrödinger’s original work), it seems that several additional assumptions are implicitly introduced, even if not always stated explicitly. For instance:

- the existence of a wavefunction ψ describing the system,
- a specific dynamical law (the Schrödinger equation),
- and the appearance of constants like Planck’s constant setting the scale.

This makes me wonder whether it is meaningful to think of these latter ingredients as more “primitive”, more “fundamental”, in the sense that the usual Hilbert space formalism and measurement postulates might emerge from them, or at least be motivated by them.

So my question is:

Is there a well-defined sense in which the standard postulates can be organized hierarchically, or are they generally viewed as independent axioms of the theory?

I would especially appreciate any clarification or references.

Thanks!
 
In axiomatic approaches to Quantum Field Theory, which I think any attempt to understand quantum theory will eventually have to engage with, it's commonly the algebra of operators and the states that are taken to be fundamental. That's approximately what you have in your first set of three assumptions. More is needed, however, to make contact with experiment.
I think your two sets of three assumptions are too vague as they stand, but the way you have stated them is very reminiscent of von Neumann's axioms for quantum theory, which are still the basis of most axiomatic presentations of QM 94 years later, so have a close look at them.
You've made this a graduate-level question, so I'll suggest a much-cited article by Abramsky & Brandenburger, "The sheaf-theoretic structure of non-locality and contextuality", New J. Phys. 2011 (which is Open Access).
An interesting undergraduate-to-graduate-level textbook that compares algebraic and other formalisms, if you can get access to it, is François David's "The Formalisms of Quantum Mechanics" (not OA, preprint on arXiv.)
Klaas Landsman's "Algebraic quantum mechanics" is a nicely succinct handbook account (not OA; almost everything by Klaas Landsman is worthwhile).
A recent preprint that might be worthwhile is Falco & Matthies, "Vistas of Algebraic Probability, Quantum Computation and Information". They include many references: at the graduate level, the literature is enormous even for algebraic QM alone.
 
Anton_A_Lipovka said:
This makes me wonder whether it is meaningful to think of these latter ingredients as more “primitive”, more “fundamental”, in the sense that the usual Hilbert space formalism and measurement postulates might emerge from them
I think it's the other way around.

Assuming a wave function amounts to assuming a particular Hilbert space (the space of square integrable functions over the relevant configuration space).

Assuming the Schrödinger equation amounts to assuming a particular choice of reference frame and using the non-relativistic approximation.

Planck's constant fixing the scale arises when we try to correlate the theoretical equations with actual measurements.

So those things aren't more fundamental, they emerge from the other stuff once you start trying to work with it for solving particular problems.
 
To elaborate further on Peter's excellent answer (he has been posting for over 15 years now, and his answers always are): in my recent posts I have emphasised that treating non-relativistic standard QM as fundamental, as you find it in excellent books like Lenny Susskind's (if you are a layperson with a smattering of calculus) or an advanced text like Ballentine, is wrong. You must go to relativistic Quantum Field Theory (QFT). Most people, naturally including me, thought that standard QM is simply what you get in the non-relativistic limit; it came as a bolt from the blue when I read an advanced paper showing the situation is more complex.

https://arxiv.org/abs/1712.06605

Now I think of ordinary QM as an approximation rather than a limiting case. I won't post it again, but I have shown in other threads, from a mathematical modelling perspective, how the approximation is reasonable, assuming no knowledge of QM, only that in QFT we have interacting fields. Ballentine gives the detailed mathematical consequences of the model, and his advocacy of the statistical interpretation that Einstein also favoured offers a simple way to look at that approximation. (To those new to interpretative issues, it may come as a surprise that Einstein accepted QM as valid, just incomplete, which we know is true: QFT is more complete, though perhaps there is still an even deeper theory.)

Thanks
Bill
 
Doing physics, or science in general, amounts to something like plumbing an old house. As long as you can get the water running in the right places with as little ugliness as possible, it is good enough until the next renovation. Then you can document it. Anybody who has done a PhD or Master's knows that.
 
Peter Morgan said:
In axiomatic approaches to Quantum Field Theory, which I think any attempt to understand quantum theory will eventually have to engage with, it's commonly the algebra of operators and the states that are taken to be fundamental. That's approximately what you have in your first set of three assumptions. More is needed, however, to make contact with experiment.
Thank you for your detailed and thoughtful reply. I particularly appreciate the references you provided, as well as the time you have taken to engage with my questions. It is a pleasure to discuss these issues with you.

You are certainly correct in noting that “more is needed, however, to make contact with experiment”, and in my opinion the main point here is “experiment”. Indeed, quantum theory involves a broader set of axioms. In my earlier post, I singled out a second group of assumptions—primarily for illustrative, historical reasons—namely:
  • the existence of a wavefunction ψ describing the system,
  • a specific dynamical law (the Schrödinger equation),
  • and the appearance of dimensional constants such as Planck’s constant.
These were taken in the spirit of Erwin Schrödinger’s original formulation, rather than as a complete axiomatization.

I also fully agree with your statement that in any attempt to understand quantum theory, we will ultimately face the need to understand the origins of the axioms in the axiomatic approach to quantum field theory. This is precisely the motivation behind my interest in a possible hierarchy of axioms. In particular, I have in mind programs often described as a geometrization of physics, where physical structures—fields and equations—emerge from properties of the metric (which, in general, may be dynamical).

It is quite possible that my original formulation was not sufficiently clear, so let me restate the point more carefully.

There appear to be (at least heuristically) two distinct approaches to QM and QFT, roughly corresponding to the two groups of axioms mentioned above.

The first is what I referred to as a “top” approach: one postulates abstract mathematical structures—Hilbert spaces, or in QFT Fock spaces—and specifies operators and their algebraic properties. In this way, the formalism is constructed axiomatically at a structural level.

The second, “bottom” approach (historically earlier), is based on experiment and is associated with the original formulations of quantum mechanics. Here, the starting point is a set of empirically motivated postulates, such as those listed above.

I would note that most areas of physics — electrostatics, heat fluxes, diffusion, and many others — are constructed in this “bottom-up” manner, starting from the postulates which are ultimately grounded in experiment. By contrast, in QM and QFT, the “top-down” axiomatic structure has proven extremely powerful as a computational and organizational framework, which may explain its prominence.

However, if the goal is to understand the physical meaning of the axioms themselves, I don't think this algebraic (“top”) approach will be successful.

On the one hand, algebraic formulations of probability and operator structures are extremely useful for organizing the theory, but they do not, by themselves, explain the origin of probability or the role of constants such as ℏ. For this reason, the “bottom” approach—being more directly tied to experimental structure—may be more informative when addressing foundational questions.

On the other hand, if one aims at a unification of different branches of physics, it seems unavoidable that the starting point must again be the experimentally grounded level, rather than purely formal algebraic constructions.

To illustrate this point, consider the following example.

The Sturm–Liouville problem, which leads to discrete spectra, appears throughout classical physics (electrostatics, heat conduction, diffusion, etc.). A particularly relevant case is the Fokker–Planck equation, whose formal similarity to the Schrödinger equation is well known.

Consider the diffusion equation:
(1)  g² ∂²N/∂x² − ∂N/∂t = 0, where g² = D (D is the diffusion coefficient),
together with appropriate initial and boundary conditions defining a Sturm–Liouville problem.

The Schrödinger equation,
(ℏ²/2m) ∂²N/∂x² + iℏ ∂N/∂t = 0,
can be written in the same formal form:
(2)  g² ∂²N/∂x² − ∂N/∂t = 0, where g² = D′ = iℏ/2m is a modified diffusion coefficient.
(Here I have replaced ψ with N; this does not affect the structure of the solution.)

With, for example, Neumann boundary conditions, the solution takes the familiar form:
N(x,t) = Σ Aₙ cos(nπx/a) exp[−(nπg/a)² t],
where the coefficients Aₙ are determined from the initial condition:
N(x,0) = f(x), Aₙ = ⟨f(x) | Nₙ(x,0)⟩.

The formal difference between the diffusion and quantum cases is entirely contained in the coefficient:
  • Classical diffusion: g² = D,
  • Quantum case: g² = iℏ/2m.
Substituting this into the solution yields:
(3)  N(x,t) = Σ Aₙ cos(nπx/a) exp[−(i/ℏ) Eₙ t],
with Eₙ = (ℏ²/2m)(nπ/a)².

Thus, essentially the same mathematical structure (Hilbert space, operators, Sturm–Liouville theory) appears in both contexts. However, in diffusion theory the model is clearly constructed from experimentally grounded considerations, whereas in QM the axiomatic algebraic structure is often taken as primary.
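To make the comparison concrete, here is a minimal numerical sketch (not from the thread; units ℏ = m = a = 1, with the diffusion coefficient and initial profile chosen purely for illustration). The same cosine series (1)/(2) is evolved with the two choices of g²: the real diffusion coefficient damps every non-constant mode, while the imaginary quantum coefficient only rotates phases, so ∫|N|² dx is conserved.

```python
import numpy as np

# Sketch of the diffusion/Schrodinger correspondence discussed above.
# Units hbar = m = a = 1; D and the initial profile f(x) are illustrative.
a, hbar, m, D = 1.0, 1.0, 1.0, 0.1
x = np.linspace(0.0, a, 2001)
f = 1.0 + 0.5 * np.cos(np.pi * x / a) + 0.2 * np.cos(3 * np.pi * x / a)

def integrate(y):
    """Trapezoidal rule on the fixed grid x."""
    return np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0

def evolve(g2, t, nmax=8):
    """N(x,t) = sum_n A_n cos(n pi x/a) exp[-(n pi g/a)^2 t], Neumann BC."""
    N = np.zeros_like(x, dtype=complex)
    for n in range(nmax + 1):
        mode = np.cos(n * np.pi * x / a)
        A = integrate(f * mode) / integrate(mode * mode)  # cosine coefficient
        N += A * mode * np.exp(-(n * np.pi / a) ** 2 * g2 * t)
    return N

t = 0.3
N_diff = evolve(D, t)                  # classical diffusion: g^2 = D (real)
N_qm = evolve(1j * hbar / (2 * m), t)  # quantum case: g^2 = i hbar / 2m

# Diffusion damps the non-constant modes; quantum evolution is unitary.
print(np.abs(N_diff).max() < np.abs(f).max())
print(np.isclose(integrate(np.abs(N_qm) ** 2), integrate(np.abs(f) ** 2)))
```

The only line that distinguishes the two physical situations is the value passed in as g², exactly as in equations (1)–(3).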

Given this, I would like to restate my question more precisely:

Could the “bottom” approach—i.e., constructing QM starting from experimentally motivated axioms—be more effective in clarifying the origin and meaning of those axioms (in particular, the second group mentioned earlier), and perhaps even reducing their number?

If so, then understanding the nature of these fundamental assumptions becomes especially important, since they provide the link to experiment. This includes questions such as the origin of Planck’s constant, the interpretation of the wavefunction, and the implicit assumption that the system under consideration is closed.

The latter point raises additional concerns. On the one hand, the spacetime metric is not static (e.g. cosmological redshift), whereas the standard formulations of QM and QFT are typically constructed in inertial frames without incorporating such effects into the Hamiltonian. On the other hand, the absence of explicit transverse electromagnetic degrees of freedom in the most basic formulations (e.g. the Schrödinger, Klein-Gordon, and Dirac equations) may suggest that the system under consideration is, in some sense, incomplete.
 
selfsimilar said:
Doing physics, or science in general, amounts to something like plumbing an old house. As long as you can get the water running in the right places with as little ugliness as possible, it is good enough until the next renovation. Then you can document it. Anybody who has done a PhD or Master's knows that.
What you've described is, it seems to me, difficult to classify as science or physics; it is more like engineering. You may be right, and indeed many who earn a master's degree are interested in engineering. However, thirty years after earning a doctorate, someone interested in physics begins to see things differently, and their interests shift. And after forty years, many begin to realize that installing plumbing in a house that is already dilapidated is of little use. For this reason, engineering holds little interest for me. But in any case, thank you for your comment.
 
PeterDonis said:
So those things aren't more fundamental, they emerge from the other stuff once you start trying to work with it for solving particular problems.
Thank you for your comment.
You are absolutely right that those things are not more fundamental but emerge from the other components. I completely agree with this statement. My question concerns precisely the foundation and the easiest way to access it. Above, in my response to Peter Morgan, I clarified my question and provided a comparison of two different branches of physics. From this comparison, it seems clear that an abstract axiomatic approach is unlikely to give us an understanding of the physics of what is happening. Therefore, in my opinion, if we want to move beyond interpretations and understand the foundations of quantum mechanics, we will be forced to return to the early 20th century and retrace our steps.
It seems to me that the key to this is understanding the nature of Planck's constant. As one of many examples, I can cite the work of Calogero (F. Calogero, "Cosmic origin of quantization", Physics Letters A 228 (1997) 335–346). Unfortunately, that work was based on dimensional considerations. Such an approach is sometimes acceptable in hydrodynamics, but it is completely inapplicable when one needs to combine two fundamentally different fields (QM and GR). Therefore, Calogero's work can hardly be called satisfactory.

As I wrote above to Peter Morgan, the entire Standard Model is built on the axiom of the superposition principle, which, in turn, rests on the inertial nature of the reference frame. In other words, the very language in which QM and QFT are constructed is the language of inertial reference frames and cannot be applied outside its domain of definition. At the same time, GR is an essentially non-inertial theory, in which the superposition principle is clearly violated. Therefore, the conclusion follows that QFT and GR are fundamentally incompatible, even at the level of their initial axioms.

Thus, we once again run into a closed door, with a lock called Planck's constant.
It seems to me that this lock can be opened with a key called an "adiabatically variable metric," which takes into account the expansion of the Universe.
 
Anton_A_Lipovka said:
the entire Standard Model is built on the axiom of the superposition principle, which, in turn, is based on the inertial nature of the reference frame.

How? Superposition means that equations are linear.
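That point admits a one-line numerical check (a sketch with hypothetical plane-wave solutions, ℏ = m = 1): if ψ₁ and ψ₂ solve the free Schrödinger equation, any combination aψ₁ + bψ₂ solves it too, precisely because the equation is linear in ψ.

```python
import numpy as np

# Sketch: linearity of the free Schrodinger equation (hbar = m = 1).
# Plane waves psi_k = exp(i(kx - k^2 t/2)) solve i d_t psi = -1/2 d_xx psi;
# any linear combination must solve it too.
x = np.linspace(-5.0, 5.0, 4001)
dx, dt, t = x[1] - x[0], 1e-5, 0.7

def psi(k, s):
    return np.exp(1j * (k * x - 0.5 * k ** 2 * s))

def residual(field):
    """i d_t psi + 1/2 d_xx psi via central differences (interior points)."""
    dpsi_dt = (field(t + dt) - field(t - dt)) / (2 * dt)
    f = field(t)
    d2 = (f[2:] - 2 * f[1:-1] + f[:-2]) / dx ** 2
    return 1j * dpsi_dt[1:-1] + 0.5 * d2

# An arbitrary superposition of two solutions, coefficients chosen freely.
combo = lambda s: 0.8 * psi(1.0, s) + 0.3j * psi(2.5, s)
print(np.abs(residual(combo)).max())  # small: the superposition still solves it
```

The residual is limited only by finite-difference error, not by any failure of superposition; the same check fails immediately for a nonlinear equation.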

Anton_A_Lipovka said:
In other words, the very language in which QM and QFT are constructed is the language of inertial reference frames and cannot be applied outside its domain of definition.

You can do both QM and QFT in non-inertial frames, it's just tedious and gives almost nothing (besides e.g. Unruh radiation).
 
Anton_A_Lipovka said:
It seems to me that this lock can be opened with a key called an "adiabatically variable metric," which takes into account the expansion of the Universe.
Do you have a reference for this? Please bear in mind that personal theories and personal research are out of bounds for discussion here at PF.
 
