I Question about discussions around quantum interpretations

ojitojuntos
TL;DR Summary
Question: If experimentally quantum mechanics has yielded more and more evidence of probabilistic behavior, why is it that there are so many interpretations looking to reintroduce fundamental determinism?
I understand that the world of quantum-mechanics interpretations is very complex, as experimental data hasn't conclusively falsified the main deterministic interpretations (such as Everett's) against the non-deterministic ones. However, I have read in online sources that Objective Collapse theories are being increasingly challenged. Does this mean that deterministic interpretations are more likely to be true?

I always understood that the "collapse" or "measurement problem" was how we phrased the fact that there is fundamental randomness in the universe, and that the Bell tests and double-slit experiments support this, but if Objective Collapse theories are being tested and refuted, does this mean that the universe is more likely deterministic and the observed randomness is epistemic?
 
Any interpretation could be true. The reason there are so many is that QM is outside everyday experience, so we have no intuition to fall back on.

That said, my suggestion, and I know beginners and those with an interest in what QM is telling us don't like it, is not to worry about interpretation until you understand QM at the level of a comprehensive textbook, such as Ballentine's Quantum Mechanics: A Modern Development. Ballentine develops QM from just two axioms, and the second axiom actually follows from the first by what is called Gleason's Theorem. QM from just one axiom? Well, strictly speaking, there are seven rules (a post on this forum discusses them), but they can be presented in a way that makes the others seem natural, which is what Ballentine does. In fact, an interesting exercise for advanced students is to find where in the textbook he brings those rules in by the back door.
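For reference, Gleason's Theorem can be stated roughly as follows (this is my paraphrase of the standard statement, not Ballentine's exact wording): on a separable Hilbert space of dimension at least 3, any probability measure on the lattice of projection operators that is additive over orthogonal projections must take the Born-rule form for some density operator:

```latex
% Gleason's Theorem (dim H >= 3): every countably additive probability
% measure \mu on the projections P of a separable Hilbert space arises
% from a density operator \rho via the Born rule:
\mu(P) = \operatorname{Tr}(\rho P),
\qquad \rho \ge 0, \quad \operatorname{Tr}\rho = 1 .
```

This is why, once observables are modelled as operators on a Hilbert space, the Born probability rule is essentially forced rather than an independent postulate.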

However, to understand interpretations it is essential to realise that QM, at its core, is essentially just one rule (or axiom), and to know what that axiom is. In fact, you may, as I now do, say that QM is that one rule plus some almost inevitable assumptions, and then invoke Newton's "hypotheses non fingo": I frame no hypotheses. This becomes more attractive once you realise that every theory makes assumptions the theory itself does not explain. I can give a heuristic justification for that rule in the mathematical-modelling sense (when deciding the assumptions of a model, this sort of thing is often done), but I can't derive it; it must be assumed. Whether that is satisfactory is yours to decide.

The second important thing is that QM is wrong. We know this because it predicts that the excited states of the hydrogen atom are stationary, meaning the atom should never spontaneously emit photons. But we know it does. This is addressed by an extension of QM called Quantum Field Theory (QFT). The modern view of QFT is that it is the inevitable low-energy, large-distance approximation of any theory that obeys well-established principles such as Special Relativity; we have no choice. This is called Weinberg's Folk Theorem, and his advanced three-volume tome on QFT develops the subject from this view, which goes by the name Effective Field Theory (EFT). It even resolves, at energies we can currently probe, the problem of combining QM and GR, which, as you may have read, is a big one. It is, but since all we know are EFTs, all our theories have the same problem.

Where does this leave us in terms of interpretation? Basically, our most powerful theory is close to inevitable. To go beyond it, we need information from regimes we can't currently probe (directly, anyway). Sure, we can hypothesise (String Theory is one such attempt), but as of now it looks like Einstein was right: QM is incomplete, though not for the reasons he thought. Still, one never knows; the great man may yet have the last laugh.

The answer to your final question is that, since we do not know the theory at the rock bottom of QFT (or even whether there is one; it may be turtles all the way down, a depressing thought, but nature is as nature is), we do not know whether the randomness is fundamental. Gleason's Theorem suggests it is, but we do not know for sure.

Thanks
Bill
 
bhobba said:
The second important thing is that QM is wrong.
This is phrased rather confusingly. What you mean is that non-relativistic QM is wrong. But I don't think the OP means to restrict discussion of "quantum mechanics" to just non-relativistic QM.
 
Well picked up, Peter.

Of course, you are correct.

Thanks
Bill
 
ojitojuntos said:
I always understood that the "collapse" or "measurement problem" was how we phrased the fact that there is fundamental randomness in the universe, and that the Bell tests and double-slit experiments support this, but if Objective Collapse theories are being tested and refuted, does this mean that the universe is more likely deterministic and the observed randomness is epistemic?
There are established probabilistic interpretations distinct from objective collapse, and far more mainstream (to the extent that interpretations can be mainstream). Asher Peres's treatise Quantum Theory: Concepts and Methods, Julian Schwinger's opening essay in Quantum Mechanics: Symbolism of Atomic Measurements, and Roland Omnès's Understanding Quantum Mechanics are some of my recommended reading for probabilistic accounts of quantum theory.

There are presumably philosophical motivations for pursuing a deterministic understanding over a probabilistic one.
 
Thanks a lot for your replies! Sorry for the delayed response, work has been quite busy.
As Peter said, I didn’t mean to limit the discussion to non-relativistic QM.
I read some articles here and there (pop science, you'd call them), and I understand that deterministic interpretations tend to be preferred by people who take the mathematical formalism at face value (Everettians), or by those who support non-local hidden variables.
Is this correct? If so, is there any experimental support for any of these interpretations?

From what I understand, experiments have tended to reaffirm the inherently probabilistic nature of quantum reality, so reintroducing determinism in those ways feels counter-intuitive. Apologies if my questions are too misguided.
 
ojitojuntos said:
From what I understand, experiments have tended to reaffirm the inherently probabilistic nature of quantum reality, so reintroducing determinism in those ways feels counter-intuitive. Apologies if my questions are too misguided.
That is the issue. Many people, for whatever reasons, believe that the universe must be fundamentally deterministic. QM challenges that belief. Many physicists accept that QM is evidence that a belief in determinism is not required for physics to make sense. Others will try to find a way to make QM fit into a deterministic model.

Then the question is: who is being counterintuitive here?
 
PeroK said:
That is the issue. Many people, for whatever reasons, believe that the universe must be fundamentally deterministic. QM challenges that belief. Many physicists accept that QM is evidence that a belief in determinism is not required for physics to make sense. Others will try to find a way to make QM fit into a deterministic model.

Then the question is: who is being counterintuitive here?
QM in its usual form seems to be saying that the universe is non-deterministic only when measurements are performed, while the rest of the time it behaves deterministically. For example, a photon isolated from the environment behaves deterministically, until it interacts with a detector. And yet, QM doesn't give a precise definition of "measurement" and "detector". That doesn't seem right from a fundamental point of view. This suggests that non-determinism might be just an effective description emerging from the lack of knowledge of details of complex environments and detectors containing a large number of degrees of freedom, while at the fundamental microscopic level the dynamics is deterministic. That's very intuitive to me.

A truly fundamentally non-deterministic formulation of QM should talk about non-determinism without referring to measurements. There are such formulations, e.g. objective collapse theories (such as GRW), the consistent histories interpretation and the Nelson stochastic interpretation, but they are not standard formulations of QM. Objective collapse theories seem very ad hoc, and they are largely ruled out by experiments. The consistent histories interpretation replaces dependence on the measurement with dependence on the framework; very roughly, it says that reality exists even if we don't measure it, but that this reality depends on our arbitrary choice of how we decide to think of it. (For an analogy, that would be like an interpretation of electromagnetism claiming that gauge potentials are real, but that this reality is a matter of our arbitrary choice of gauge condition.) The Nelson interpretation is very much like the Bohmian interpretation, except that particles have additional stochastic jiggling.
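The split described above, deterministic unitary evolution punctuated by probabilistic measurement, can be sketched in a few lines of Python. This is a toy qubit model using NumPy; the function names `evolve` and `measure` are mine, not from any library:

```python
import numpy as np

SX = np.array([[0, 1], [1, 0]], dtype=complex)  # Pauli-x matrix

def evolve(state, t):
    """Deterministic Schrodinger evolution under H = sigma_x:
    U(t) = exp(-i t sigma_x) = cos(t) I - i sin(t) sigma_x."""
    U = np.cos(t) * np.eye(2) - 1j * np.sin(t) * SX
    return U @ state

def measure(state, rng):
    """The Born rule enters only here: sample an outcome with
    probability |amplitude|^2, then collapse onto that basis state."""
    probs = np.abs(state) ** 2
    outcome = rng.choice(len(state), p=probs)
    collapsed = np.zeros_like(state)
    collapsed[outcome] = 1.0
    return outcome, collapsed

psi = np.array([1.0, 0.0], dtype=complex)  # start in |0>
psi = evolve(psi, np.pi / 4)               # deterministic: (|0> - i|1>)/sqrt(2)
rng = np.random.default_rng(0)
outcome, psi = measure(psi, rng)           # probabilistic: 50/50 by the Born rule
```

Note that nothing in the formalism itself says when `measure` rather than `evolve` applies; that decision is put in by hand, which is exactly the ambiguity the post above is pointing at.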
 
Demystifier said:
QM in its usual form seems to be saying that the universe is non-deterministic only when measurements are performed, while the rest of the time it behaves deterministically. For example, a photon isolated from the environment behaves deterministically, until it interacts with a detector. And yet, QM doesn't give a precise definition of "measurement" and "detector". That doesn't seem right from a fundamental point of view. This suggests that non-determinism might be just an effective description emerging from the lack of knowledge of details of complex environments and detectors containing a large number of degrees of freedom, while at the fundamental microscopic level the dynamics is deterministic. That's very intuitive to me.

A truly fundamentally non-deterministic formulation of QM should talk about non-determinism without referring to measurements. There are such formulations, e.g. objective collapse theories (such as GRW), the consistent histories interpretation and the Nelson stochastic interpretation, but they are not standard formulations of QM. Objective collapse theories seem very ad hoc, and they are largely ruled out by experiments. The consistent histories interpretation replaces dependence on the measurement with dependence on the framework; very roughly, it says that reality exists even if we don't measure it, but that this reality depends on our arbitrary choice of how we decide to think of it. (For an analogy, that would be like an interpretation of electromagnetism claiming that gauge potentials are real, but that this reality is a matter of our arbitrary choice of gauge condition.) The Nelson interpretation is very much like the Bohmian interpretation, except that particles have additional stochastic jiggling.
Thanks a lot for replying!
I believe that the fact that QM hasn't provided a satisfactory (in the sense of canonical) definition of what a measurement or a detector is doesn't necessarily suggest that the underlying mechanics must be deterministic, with our lack of information to blame.
We see non-determinism at a fundamental level, and our incomplete understanding of that non-determinism is not by itself an argument for determinism, I think.
Of course, I'm aware that, as my background is in the social sciences, I'm more inclined to think that nature has a mix of probabilistic and deterministic aspects (the former not being due exclusively to our lack of knowledge).

I think you've provided a very interesting argument for why it is not obvious that a probabilistic quantum theory is the way to go, regardless of my inclination. Thanks a lot!
 