Question about discussions around quantum interpretations

ojitojuntos
TL;DR Summary
Question: If quantum mechanics has experimentally yielded more and more evidence of probabilistic behavior, why are there so many interpretations looking to reintroduce fundamental determinism?
I understand that the world of interpretations of quantum mechanics is very complex, as experimental data hasn't completely falsified the main deterministic interpretations (such as Everett) versus the non-deterministic ones. However, I have read in online sources that Objective Collapse theories are being increasingly challenged. Does this mean that deterministic interpretations are more likely to be true?

I always understood that the "collapse" or "measurement problem" was how we phrased the fact that there is fundamental randomness in the universe, and that the Bell tests and double-slit experiments support this, but if Objective Collapse theories are being tested and refuted, does this mean that the universe is more likely deterministic and the observed randomness is epistemic?
 
Any interpretation could be true. The reason there are so many is that QM is outside everyday experience, so we have no intuition to fall back on.

That said, my suggestion, and I know beginners and those with an interest in what QM is telling us don't like it, is not to worry about interpretation until you understand QM at the level of a comprehensive textbook, such as Ballentine's QM: A Modern Development. Ballentine develops QM from just two axioms, and the second axiom actually follows from the first by what is called Gleason's Theorem. QM from just one axiom? Well, strictly speaking, there are seven rules (a post on this forum discusses them), but they can be presented in a way that makes the others seem natural, which is what Ballentine does. In fact, an interesting exercise for advanced students is to find where in the textbook he slips those rules in through the back door.
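
Roughly, and paraphrasing rather than quoting Ballentine's exact wording, the two axioms are: (1) to each observable corresponds a Hermitian operator, whose eigenvalues are the possible measurement outcomes; (2) a state is a density operator rho, and the outcome probabilities are given by the Born rule
$$ P(a_i) = \mathrm{Tr}(\rho\, \hat P_i), $$
where P_i projects onto the eigenspace of the outcome a_i. Gleason's Theorem then says that, in Hilbert spaces of dimension 3 or more, any assignment of probabilities to projectors that is additive over mutually orthogonal projectors must already have this trace form for some density operator rho - which is the sense in which the second axiom follows from the first.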

However, to understand interpretations it is essential to realise that QM, at its core, is essentially just one rule (or axiom), and to know what that axiom is. In fact, you may, like I now do, say that QM is that one rule plus some almost inevitable assumptions, and then invoke Newton's 'I make no hypotheses'. This becomes more attractive once you realise that every single theory makes assumptions not explained by the theory. I can provide a heuristic justification for that rule in the mathematical-modelling sense (when deciding the assumptions of a model, this sort of thing is often done), but I can't derive it; it must be assumed. Whether that is satisfactory is for you to decide.

The second important thing is that QM is wrong. We know this because it predicts that the hydrogen atom is in a stationary state, meaning it should not absorb or emit photons. But we know it does. This is addressed by applying an extension of QM called Quantum Field Theory (QFT). However, the modern view of QFT is that it is the inevitable low-energy, large-distance approximation of any theory that obeys well-established principles such as Special Relativity. We have no choice. This is called Weinberg's Folk Theorem, and his advanced three-volume tome on QFT develops the subject from this view, which goes by the name Effective Field Theory (EFT).
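
(To spell out the 'stationary state' point: an excited level of hydrogen is an energy eigenstate of the non-relativistic Hamiltonian, and for an energy eigenstate nothing observable ever changes,
$$ \hat H\,|\psi_n\rangle = E_n |\psi_n\rangle \;\Rightarrow\; |\psi_n(t)\rangle = e^{-iE_n t/\hbar}\,|\psi_n\rangle , $$
so all expectation values are time-independent and the theory by itself offers no mechanism for the atom to decay; the quantised electromagnetic field of QFT supplies that mechanism.)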

Added Later:
For technical details of this view, I found a nice paper:
https://research.engineering.nyu.edu/~jbain/papers/Weinberg.pdf

It even resolves, for energies we can currently probe, the problem of combining QM and GR, which, as you may have read, is a big problem. It is, but since all we know are EFTs, all our theories have the same problem.

Where does this leave us in terms of interpretation? Basically, our most powerful theory is close to inevitable. To go beyond it, we need information from regimes we can't currently probe (directly, anyway). Sure, we can hypothesise (String Theory is one such attempt), but as of now it looks like Einstein was right that QM is incomplete - just not for the reasons he thought. Still, one never knows - it may be that the great man has the last laugh.

The answer to your final question is that, since we do not know the theory at the rock bottom of QFT (or even if there is one - it may be turtles all the way down - what a depressing thought, but nature is as nature is), we do not know. Gleason's Theorem suggests the randomness is fundamental rather than epistemic, but we do not know for sure.

Thanks
Bill
 
bhobba said:
The second important thing is that QM is wrong.
This is phrased rather confusingly. What you mean is that non-relativistic QM is wrong. But I don't think the OP means to restrict discussion of "quantum mechanics" to just non-relativistic QM.
 
Well picked up, Peter.

Of course, you are correct.

Thanks
Bill
 
ojitojuntos said:
I always understood that the "collapse" or "measurement problem" was how we phrased the fact that there is fundamental randomness in the universe, and that the Bell tests and double-slit experiments support this, but if Objective Collapse theories are being tested and refuted, does this mean that the universe is more likely deterministic and the observed randomness is epistemic?
There are established probabilistic interpretations distinct from objective collapse, and far more mainstream (to the extent that interpretations can be mainstream). Asher Peres's modern treatise "Quantum Theory: Concepts and Methods", Julian Schwinger's opening essay in "Symbolism of Atomic Measurements", and Roland Omnes's "Understanding Quantum Mechanics" are some of my recommended reading for probabilistic accounts of quantum theories.

There are presumably philosophical motivations for pursuing a deterministic understanding over a probabilistic one.
 
Thanks a lot for your replies! Sorry for the delayed response, work has been quite busy.
As Peter said, I didn’t mean to limit the discussion to non-relativistic QM.
I read some articles here and there (pop science, you’d call them), and I understand that deterministic interpretations tend to be preferred by people who take the mathematical formalism at face value (Everettian), or those who support non-local hidden variables.
Is this correct? If so, is there any experimental support for any of these interpretations?

From what I understand, experiments have tended to reaffirm the inherently probabilistic nature of quantum reality, so reincorporating determinism in those ways feels counter-intuitive. Apologies if my questions are too misguided.
 
ojitojuntos said:
From what I understand, experiments have tended to reaffirm the inherently probabilistic nature of quantum reality, so reincorporating determinism in those ways feels counter-intuitive. Apologies if my questions are too misguided.
That is the issue. Many people, for whatever reasons, believe that the universe must be fundamentally deterministic. QM challenges that belief. Many physicists accept that QM is evidence that a belief in determinism is not required for physics to make sense. Others will try to find a way to make QM fit into a deterministic model.

Then the question is: who is being counterintuitive here?
 
PeroK said:
That is the issue. Many people, for whatever reasons, believe that the universe must be fundamentally deterministic. QM challenges that belief. Many physicists accept that QM is evidence that a belief in determinism is not required for physics to make sense. Others will try to find a way to make QM fit into a deterministic model.

Then the question is: who is being counterintuitive here?
QM in its usual form seems to be saying that the universe is non-deterministic only when measurements are performed, while the rest of the time it behaves deterministically. For example, a photon isolated from the environment behaves deterministically, until it interacts with a detector. And yet, QM doesn't give a precise definition of "measurement" and "detector". That doesn't seem right from a fundamental point of view. This suggests that non-determinism might be just an effective description emerging from the lack of knowledge of details of complex environments and detectors containing a large number of degrees of freedom, while at the fundamental microscopic level the dynamics is deterministic. That's very intuitive to me.

A truly fundamentally non-deterministic formulation of QM should talk about non-determinism without referring to measurements. There are such formulations, e.g. objective collapse theories (such as GRW), the consistent histories interpretation, and the Nelson stochastic interpretation, but they are not standard formulations of QM. Objective collapse theories seem to be very ad hoc, and they are largely ruled out by experiments. The consistent histories interpretation replaces the dependence on the measurement with a dependence on the framework; very roughly, it says that reality exists even if we don't measure it, but this reality depends on our arbitrary choice of how we decide to think of it. (For an analogy, that would be like an interpretation of electromagnetism claiming that gauge potentials are real, but that this reality is a matter of our arbitrary choice of the gauge condition.) The Nelson interpretation is very much like the Bohmian interpretation, except that particles have additional stochastic jiggling.
 
Demystifier said:
QM in its usual form seems to be saying that the universe is non-deterministic only when measurements are performed, while the rest of the time it behaves deterministically. For example, a photon isolated from the environment behaves deterministically, until it interacts with a detector. And yet, QM doesn't give a precise definition of "measurement" and "detector". That doesn't seem right from a fundamental point of view. This suggests that non-determinism might be just an effective description emerging from the lack of knowledge of details of complex environments and detectors containing a large number of degrees of freedom, while at the fundamental microscopic level the dynamics is deterministic. That's very intuitive to me.

A truly fundamentally non-deterministic formulation of QM should talk about non-determinism without referring to measurements. There are such formulations, e.g. objective collapse theories (such as GRW), the consistent histories interpretation, and the Nelson stochastic interpretation, but they are not standard formulations of QM. Objective collapse theories seem to be very ad hoc, and they are largely ruled out by experiments. The consistent histories interpretation replaces the dependence on the measurement with a dependence on the framework; very roughly, it says that reality exists even if we don't measure it, but this reality depends on our arbitrary choice of how we decide to think of it. (For an analogy, that would be like an interpretation of electromagnetism claiming that gauge potentials are real, but that this reality is a matter of our arbitrary choice of the gauge condition.) The Nelson interpretation is very much like the Bohmian interpretation, except that particles have additional stochastic jiggling.
Thanks a lot for replying!
I believe that the fact that QM hasn't provided a satisfactory explanation (in the sense of a canonical, agreed-upon one) of what a measurement or a detector is doesn't necessarily suggest that the underlying mechanics must be deterministic, with the randomness due to our lack of information.
We see non-determinism at a fundamental level, but our lack of understanding of this non-determinism is not by itself an argument that suggests determinism, I think.
Of course, I’m aware that as my formation is in social sciences, Im more inclined to think that nature has a mix of probabilistic and deterministic aspects (the former not being exclusively due to our lack of knowledge).

I think that you’ve provided a very interesting argument on how it is not obvious that probabilistic quantum is the-way-to go, regardless of my inclination. Thanks a lot!
 
  • #10
Take a number of radioactive atoms as an example. If each atom is identical, then they should all decay at the same time - if the decay process were deterministic. The detector isn't causing the decay. The detector cannot be causing the uncertainty in whether an atom decays in a certain time.

Even though the state of the isolated atom evolves deterministically, that evolution inherently introduces probabilities that cannot be attributed to the detection process. The detection process is a part of nature where the inherent probabilities manifest themselves.

Interaction, measurement, and detection are part of nature. Even if QM struggles to describe what happens in a satisfactory manner, it's stretching a point to say that radioactive decay is inherently deterministic.
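
A minimal numerical illustration of that point, a sketch only, with an arbitrary made-up half-life and using nothing beyond the standard exponential-decay statistics:

import numpy as np

rng = np.random.default_rng(seed=0)

half_life = 1.0                     # arbitrary illustrative value (any time unit)
tau = half_life / np.log(2)         # mean lifetime
n_atoms = 100_000                   # identically prepared atoms

# Identical preparation, yet each decay time is an independent draw from the
# same exponential distribution p(t) = exp(-t/tau)/tau given by the decay law.
decay_times = rng.exponential(tau, size=n_atoms)

print("identical atoms, spread of decay times: min %.3f, median %.3f, max %.3f"
      % (decay_times.min(), np.median(decay_times), decay_times.max()))

# Fraction decayed after one and two half-lives: about 50% and 75%.
for k in (1, 2):
    print("fraction decayed after %d half-life(s): %.3f"
          % (k, np.mean(decay_times < k * half_life)))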
 
  • #11
PS we could illustrate the point by borrowing Schrodinger's cat. After each run of the experiment, the cat is either alive or dead. There is no way to determine this from the start by knowing everything there is to know about the cat and the apparatus. It doesn't depend on that. It depends only on the probabilistic decay process.

So, unless the radioactive atom has hidden properties that determine its time of decay, we have an inherently non-deterministic experiment.

That is not to say that a deterministic explanation cannot be found by sufficiently clever means, e.g. many worlds or Bohmian mechanics.
 
  • #12
PeroK said:
Take a number of radioactive atoms as an example. If each atom is identical, then they should all decay at the same time - if the decay process were deterministic. The detector isn't causing the decay. The detector cannot be causing the uncertainty in whether an atom decays in a certain time.

Even though the state of the isolated atom evolves deterministically, that evolution inherently introduces probabilities that cannot be attributed to the detection process. The detection process is a part of nature where the inherent probabilities manifest themselves.

Interaction, measurement, and detection are part of nature. Even if QM struggles to describe what happens in a satisfactory manner, it's stretching a point to say that radioactive decay is inherently deterministic.
If you agree that an isolated atom evolves deterministically, then non-determinism must be associated with the interaction with the environment, would you agree? So even if each atom is identical, the same cannot be said for each environment. Hence you cannot rule out the possibility that the time of decay is somehow determined by the details of the environment, and looks random to us only because we don't know these details in practice.
 
  • #13
PeroK said:
There is no way to determine this from the start by knowing everything there is to know about the cat and the apparatus. It doesn't depend on that.
I don't understand your logic. Sure, a human cannot know everything there is to know about the cat and the apparatus. But just because a human cannot know something doesn't imply that nature doesn't depend on it. Take a classical coin toss, for example: a human cannot know all the fine details of the initial data that determine the result of the toss, which is why the toss looks like a random event to us; and yet, in classical physics, the result of a coin toss is fully deterministic.
 
  • #14
Demystifier said:
If you agree that an isolated atom evolves deterministically, then non-determinism must be associated with the interaction with the environment, would you agree? So even if each atom is identical, the same cannot be said for each environment. Hence you cannot rule out the possibility that the time of decay is somehow determined by the details of the environment, and looks random to us only because we don't know these details in practice.
The precise details of the uncertainty are encoded in the atomic state - in the weighting/probability amplitudes associated with the decayed and not-yet-decayed substates. You don't need uncertainty in the environment. If it was uncertainty in the environment, then you could create an environment where the atom had a different half life. Where the environment could override the probability amplitudes of the state.

It is simpler and more intuitive to me to assume that the environment manifests the probabilities inherent in the atomic state.

Ultimately, the weakness of your argument is that you postulate a mechanism that doesn't add anything in terms of experimental results. Moreover, if that mechanism were genuinely part of the model, there should be experiments that defy the simple model that all information about the probability of decay is contained within the atomic state.
 
  • #15
Demystifier said:
I don't understand your logic. Sure, a human cannot know everything there is to know about the cat and the apparatus. But just because a human cannot know something doesn't imply that nature doesn't depend on that. Take a classical coin toss for example, a human cannot know all the fine details of initial data that determine the result of coin toss, that's why the coin toss looks like a random event to us, and yet the result of coin toss in classical physics is inherently deterministic.
The difference is that we do know everything about the state of the atom and its decay rate. And that is sufficient to explain the quantitative probabilities. Whereas, there is nothing inherent in the coin that produces the probabilities. The probabilities are created by what you do with the coin.

The analogy fails because the state of the atom evolves spontaneously without any environment and the quantities associated with probabilities emerge spontaneously from nothing but time evolution of the state.

Although the time evolution itself is deterministic, the probability amplitudes evolve, which means the probabilities are predictable. That's how nature works: probability amplitudes emerge and evolve deterministically. There is no contradiction there.
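
(Concretely, in the standard Wigner-Weisskopf treatment the amplitude to remain undecayed evolves deterministically,
$$ c(t) \approx e^{-iE_0 t/\hbar}\, e^{-\Gamma t/2}, \qquad P_{\text{survive}}(t) = |c(t)|^2 \approx e^{-\Gamma t} = (1/2)^{\,t/T_{1/2}} , $$
so the probabilities, and hence the half-life, are fully predictable even though each individual decay time is not.)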
 
  • #16
PeroK said:
If it was uncertainty in the environment, then you could create an environment where the atom had a different half life. Where the environment could override the probability amplitudes of the state.
You can--by creating an environment where the quantum states of the decay products are already occupied, or partially occupied. (And for the case of electron capture, you can change the half-life of the reaction by ionizing or partly ionizing the atom, so the electron availability for capture is changed.)
 
  • #17
PeterDonis said:
You can--by creating an environment where the quantum states of the decay products are already occupied, or partially occupied. (And for the case of electron capture, you can change the half-life of the reaction by ionizing or partly ionizing the atom, so the electron availability for capture is changed.)
It's true, those ideas complicate the picture. But, even in that case, the probabilities can be understood through the evolution of the state, and don't need an additional, unspecified process to produce the (expected) probabilities.
 
  • #18
PeroK said:
even in that case, the probabilities can be understood through the evolution of the state
Yes, but now it's the state of atom plus environment, not just the state of the atom.
 
  • #19
To change the subject slightly: I remember learning about neutrino oscillations and thinking what a beautiful, simple model it is. The neutrino state evolves independently of any detector. The freely propagating neutrino is not, in general, an eigenstate of flavour, so the probability of detecting a given flavour varies with time (independent of the detector).

To reject that as a sufficient explanation for the probabilities seems unintuitive to me. Also, rejecting the fundamental probability amplitudes associated with QM potentially makes it difficult to make progress in particle physics.

This is the obvious criticism of BM. Instead of accepting nature as fundamentally probabilistic, it gets bogged down in an unnecessary additional layer of detail that inhibits further progress.
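
For anyone who hasn't seen the formula, the two-flavour vacuum case makes the point explicit: the detection probability oscillates with baseline purely through the free evolution of the state, with no reference to the detector. A sketch, with round illustrative mixing parameters rather than fitted values:

import numpy as np

def oscillation_probability(L_km, E_GeV, theta=0.6, dm2_eV2=2.5e-3):
    """Two-flavour vacuum appearance probability
    P = sin^2(2*theta) * sin^2(1.267 * dm2[eV^2] * L[km] / E[GeV]).
    theta and dm2_eV2 here are illustrative round numbers, not a fit."""
    return np.sin(2 * theta) ** 2 * np.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

# The probability of detecting the other flavour depends only on the freely
# evolving state (theta, dm2, L, E) - nothing about the detector enters.
for L in (0, 100, 300, 500, 1000):                 # baselines in km
    print("L = %4d km:  P = %.3f" % (L, oscillation_probability(L, E_GeV=0.6)))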
 
  • #20
@PeroK

1. In the paper https://arxiv.org/abs/2010.07575 Sec. 4.3 I have explained why the decay does not depend on details of the environment, creating an illusion that the environment is not needed for the decay, and yet why there is no decay without the environment.

2. You said yourself that an isolated atom evolves deterministically. Then how can you think that the environment is not needed for the non-deterministic decay?

3. The analogy with coin tossing is useful again. From the symmetric shape of the coin itself we can conclude that p(head)=p(tail)=1/2. We don't need any other variables describing the environment or position and velocity of the coin. And yet, to understand why the outcome of coin tossing appears random we need exactly that. We need the idea that the outcome depends on many fine details that we cannot know in practice to explain the apparent randomness, even though we don't need any of this to determine the probabilities of specific outcomes.
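
To make item 3 concrete, here is a toy deterministic coin model in the spirit of Keller's analysis - a sketch with made-up numbers. Each individual toss is completely determined by the spin rate and flight time, yet averaging over small, practically unknowable variations in those initial conditions gives p(head) close to 1/2:

import numpy as np

rng = np.random.default_rng(seed=1)
n = 100_000

# Deterministic toy model: the face showing at landing is fixed entirely by the
# spin rate omega and the flight time t. The small spreads below (illustrative
# numbers) stand for the fine details we cannot know or control in practice.
omega = rng.normal(250.0, 10.0, size=n)   # rad/s
t = rng.normal(0.50, 0.02, size=n)        # s

half_turns = np.floor(omega * t / np.pi).astype(int)
heads = (half_turns % 2 == 0)             # even number of half-turns: same face up

print("p(head) =", heads.mean())          # ~0.5: symmetry + ignorance of (omega, t)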
 
  • #21
PeroK said:
If it was uncertainty in the environment, then you could create an environment where the atom had a different half life. Where the environment could override the probability amplitudes of the state.
And this is exactly what you can do, by the quantum Zeno effect.
 
  • #22
PeroK said:
This is the obvious criticism of BM. Instead of accepting nature as fundamentally probabilistic, it gets bogged down in an unnecessary additional layer of detail that inhibits further progress.
But that's how the whole of physics, and indeed the whole of science, is organized: into layers. Just one example: fluids are made of atoms, and atomic physicists explore that structure. But fluid mechanics and its applications in the airplane industry do not depend on this atomic structure; fluids are modelled as continuous, and worrying about the atomic structure would slow down progress in fluid mechanics. Nobody says that fluid mechanists should worry about the atomic structure. But that doesn't mean that fluid mechanists should deny that atoms exist. Likewise, a particle physicist does not need to worry about quantum foundations - that would slow him down - but that doesn't mean that a particle physicist should deny that there are unsolved problems in quantum foundations, which some physicists explore.
 
  • #23
My obligatory defense of consistent histories
Demystifier said:
A truly fundamentally non-deterministic formulation of QM should talk about non-determinism without referring to measurements. There are such formulations, e.g. objective collapse theories (such as GRW), the consistent histories interpretation, and the Nelson stochastic interpretation, but they are not standard formulations of QM.
The consistent histories formalism is nonstandard in the sense that the construction of history operators is somewhat novel. But all the underlying machinery is standard QM (Hilbert spaces, projectors, density operators etc).
Demystifier said:
The consistent histories interpretation replaces the dependence on the measurement with a dependence on the framework; very roughly, it says that reality exists even if we don't measure it, but this reality depends on our arbitrary choice of how we decide to think of it. (For an analogy, that would be like an interpretation of electromagnetism claiming that gauge potentials are real, but that this reality is a matter of our arbitrary choice of the gauge condition.)
A choice of framework is a choice of a description of reality, and different descriptions are appropriate for different purposes. But reality isn't contingent on a physicist's choice of description.
 
  • #24
Morbert said:
A choice of framework is a choice of a description of reality, and different descriptions are appropriate for different purposes. But reality isn't contingent on a physicist's choice of description.
Sometimes I think of consistent histories not as one interpretation, but as an infinite class of interpretations, where each framework corresponds to one interpretation. Since all interpretations are compatible with measurable facts, one is free to choose any interpretation (framework) one likes. Does it make sense to you?
 
  • #25
Demystifier said:
But that's how the whole of physics, and indeed the whole of science, is organized: into layers. Just one example: fluids are made of atoms, and atomic physicists explore that structure. But fluid mechanics and its applications in the airplane industry do not depend on this atomic structure; fluids are modelled as continuous, and worrying about the atomic structure would slow down progress in fluid mechanics. Nobody says that fluid mechanists should worry about the atomic structure. But that doesn't mean that fluid mechanists should deny that atoms exist. Likewise, a particle physicist does not need to worry about quantum foundations - that would slow him down - but that doesn't mean that a particle physicist should deny that there are unsolved problems in quantum foundations, which some physicists explore.
I understand your point. Let's assume you are correct.

The detector has trillions of possible states, but these neatly divide into two approximately equal sets: half that inevitably detect a decay and half that do not.

However, if we wait two half-lives, then somehow the detector's states split 75-25. And so on. Somehow these random detector states are attuned to the probability amplitudes of the atom?

I would imagine that macroscopic devices would suffer from problems in manifesting the simple Born rule as per the state of the atom.

Once you have the state of the detector being relevant, why do we get the Born rule based only on the amplitudes of the atom? Why does your average detector not screw up these probabilities? Surely an average detector would have some influence on the probabilities?

The detector would have to be a curious mixture of a huge number of random degrees of freedom which, jointly with the simple state of the atom, determine a definite result. Yet the probability that the detector is in a state such that decay is observed is tuned perfectly to the simple state of the atom.
 
  • #26
I'm not saying it's impossible. But, if the state of the detector is irrelevant, then the Born rule makes sense. Only the state of the atom determines the probabilities.

That is simpler than requiring the probability that the detector is in a certain state to be aligned with the probability amplitudes in the state of the atom.

PS the special cases, like the quantum Zeno effect, are cases where you do tune the environment deliberately to affect the outcome.
 
  • #27
PeroK said:
Once you have the state of the detector being relevant, why do we get the Born rule based only on the amplitudes of the atom? Why does your average detector not screw up these probabilities? Surely an average detector would have some influence on the probabilities?
I have addressed this in post #20. In item 1 I gave a reference where your question is answered in more detail, and in item 3 I gave an analogy with coin tossing.
 
  • #28
PeroK said:
Only the state of the atom determines the probabilities.
One should distinguish the determination of probability from the explanation of randomness. In pure math, the Kolmogorov axioms say how to determine probability without any reference to randomness. More physically, as I explained in post #20, item 3, coin tossing is a very simple example where probability is determined by intrinsic properties of the coin itself, while the explanation of randomness depends also on the existence of the environment.
 
  • #29
Demystifier said:
I have addressed this in post #20. In item 1 I gave a reference where your question is answered in more detail, and in item 3 I gave an analogy with coin tossing.
Okay. Let me read your paper. I'm just up the road from you, climbing mountains in Slovenia for the past couple of weeks.
 
  • #30
Demystifier said:
coin tossing is a very simple example where probability is determined by intrinsic properties of the coin itself
I don't think this is true. I believe robotic machines have been developed which can flip coins such that one side is more likely to land up than the other. So at the very least, the process by which the coin is flipped has something to do with the probability. And that process is not intrinsic to the coin.
 
  • #31
Demystifier said:
3. The analogy with coin tossing is useful again. From the symmetric shape of the coin itself we can conclude that p(head)=p(tail)=1/2.
But that's only true before the coin is tossed, right?

Before tossing an unbiased coin the probability is 1/2. When we toss it, that probability changes. The environment during the toss continues to change that probability. Finally, the surface where the coin lands also changes the probability completely.

The coin toss is an example where information exists that determines the probability and the outcome, unlike QM. Even if the UP holds and QM is probabilistic, the evolution of the probability and the state in QM is deterministic.
 
  • #32
Thread closed for moderation.
 
  • #33
After some cleanup, the thread is reopened.
 
  • #34
javisot said:
But that's only true before the coin is tossed, right?

Before tossing an unbiased coin the probability is 1/2. When we toss it, that probability changes. The environment during the toss continues to change that probability. Finally, the surface where the coin lands also changes the probability completely.

The coin toss is an example where information exists that determines the probability and the outcome, unlike QM. Even if the UP holds and QM is probabilistic, the evolution of the probability and the state in QM is deterministic.
PeterDonis said:
I don't think this is true. I believe robotic machines have been developed which can flip coins such that one side is more likely to land up than the other. So at the very least, the process by which the coin is flipped has something to do with the probability. And that process is not intrinsic to the coin.
Yes, but my point was that if you don't know or control the details of the environment, then the probability remains 1/2. And this probability is explained by the symmetric shape of the coin itself, even though there is nothing random about the coin itself. Probability and randomness are different things. To determine a probability you have to know something, e.g. that the coin has a symmetric shape (i.e. that it is unbiased), or the wave function of the atom. But this by itself does not yet lead to randomness; randomness requires something unpredictable. In the case of the coin, it is quite clear that the unpredictability is related to unknown data about the environment and the initial conditions of the coin (the initial conditions are not encoded in its shape). In the quantum case it is not so obvious where the randomness comes from, but the empirical fact that simple quantum systems isolated from the environment do not show random behavior (instead, they obey deterministic unitary evolution) strongly suggests that randomness in QM might have its origin in unknown details of the environment. We do not have a proof that this is true, so one might be skeptical, but given that such an explanation of randomness is similar to that in classical physics, it surprises me that more physicists don't find it at least intuitive.

Or to summarize:
- Computing probability is one thing, explaining randomness is another.
- Computing probability requires knowledge of something, explaining randomness requires unpredictability of something.
- The empirical fact that isolated quantum systems do not show randomness indicates that randomness in QM could be related to unpredictability of the environment.
 
  • #35
Demystifier said:
- The empirical fact that isolated quantum systems do not show randomness indicates that randomness in QM could be related to unpredictability of the environment.
I'm back in London, so I'll read your paper today. I'm not convinced that a coin is a good analogy for a quantum system, like a spin 1/2 particle. The difference is state preparation. It's not really the environment that determines a coin toss, but the toss itself. If the coin could be reliably prepared in the same initial state (i.e. tossed in the same way every time), then much of the randomness could disappear. You'll still need a simple environment, like a vacuum and a flat surface perhaps. There's no theoretical reason why a coin toss could not be entirely predictable with a reliable coin-tossing machine.

Whereas the essence of QM is that we cannot prepare a state of a spin 1/2 particle in which the spin about two axes is fully determined. This is fundamental to the mathematics of quantum states and, IMO, an almost entirely different claim from anything that can be said about a coin. There is no analogy, IMO.
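
In symbols, for the standard spin-1/2 formalism:
$$ [\hat S_z, \hat S_x] = i\hbar\, \hat S_y \neq 0, \qquad |{+z}\rangle = \tfrac{1}{\sqrt 2}\big(|{+x}\rangle + |{-x}\rangle\big), $$
so even a perfectly prepared z-spin-up particle gives each x-spin outcome with probability 1/2.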

Then the competing claims for the spin 1/2 particle prepared in z-spin-up state are:

1) The randomness of x-spin is inherent in the state preparation (and this is what we mean by inherent randomness). The detection event manifests this randomness - by producing a definite outcome, by some means that is not fully understood.

2) The randomness of x-spin is inherent in the detector. In principle, if we could prepare two detectors reliably in the same state each time, then we could remove the randomness entirely. The first detector would always record z-spin-up for every particle so prepared (the easy bit). The second detector would always record x-spin-up for every particle (the hard/impossible bit).

And, thereby, the UP would have been outflanked! This is starting to remind me of the Einstein-Bohr debates, where each specific attempt by Einstein to outflank the UP with a cleverly designed experiment would be countered by Bohr. Whereas you have a non-specific detector design that outflanks the UP. But, given there is no design, it's not possible for me, or Bohr, or anyone else to point out why it wouldn't work!

I think everyone accepts that we cannot prove that every possible attempt to outflank the UP must fail. It's that it looks that way to such an extent that we accept the UP as valid - unless and until someone does an experiment that disproves it.
 
  • #36
PeroK said:
Take a number of radioactive atoms as an example. If each atom is identical, then they should all decay at the same time - if the decay process were deterministic.
This is simply wrong.

A sample of a pure, weakly radioactive, crystalline substance consists, in the traditional models, of a huge number N of atoms, of which M have not yet decayed, with both M and N only roughly known. The M radioactive atoms are indistinguishable, and so are the N-M decayed atoms. The state of the crystal (the only state that matters) is an N-particle state that is impossible to prepare exactly.

Whatever radiates from the crystal is a deterministic function of the whole N-particle state. Every now and then an atom decays, decreasing M by one. According to the accepted picture of decay, a spherical wave is produced, centered at the position of one of the radioactive atoms in the crystal, with details determined by the whole N-particle state. In particular, the details depend on M. Since M decreased, the next decay has different initial conditions, hence results in different details about this spherical wave, including a different center.

Thus different decays of two indistinguishable radioactive atoms cause distinguishable spherical waves - independent of whether a deterministic or a probabilistic view is taken!
PeroK said:
The detector isn't causing the decay. The detector cannot be causing the uncertainty in whether an atom decays in a certain time.
Yes, but it does determine where the decay is registered.

The detector is responsible for translating the spherical wave into particle tracks or Geiger counts, and again, this is a complex process depending on the state of the detector - a macroscopic state that is impossible to prepare exactly.
PeroK said:
Only the state of the atom determines the probabilities.
No. The state of "the atom" doesn't exist, since the atoms are part of an N-particle state of two kinds of indistinguishable atoms (radioactive and decayed).

In the ensemble interpretation, the state of the crystal determines (by Born's rule) the probability of decay. In deterministic interpretations, it determines each particular decay.

The only 1-atom information that exists about the radioactive substance is the reduced density operator obtained by the standard methods of statistical mechanics, and it describes the distribution of the ensemble of all radioactive atoms in the crystal together. Nothing about a single atom, and far too little to tell how the crystal behaves!
Demystifier said:
In the paper https://arxiv.org/abs/2010.07575 Sec. 4.3 I have explained why the decay does not depend on details of the environment, creating an illusion that the environment is not needed for the decay, and yet why there is no decay without the environment.
The decay happens at the source (in the crystal), while its manifestation as a particle (track or count) happens in the detector (in your argument part of the environment).

Thus both parts have their share in producing the observed phenomenology.
Demystifier said:
The empirical fact that isolated quantum systems do not show randomness indicates that randomness in QM could be related to unpredictability of the environment.
Or to unpredictability of the source, or both.
 
  • #37
PeroK said:
It's not really the environment that determines a coin toss, but the toss itself.
Fine, but you can think of the toss as an internal environment. (For the notion of an internal environment in QM see e.g. my https://arxiv.org/abs/1406.3221 and references therein.)
It's internal because it's a property of the coin itself (initial positions and velocities of a rigid body), but it's an environment because it describes variables that are not determined by the shape of the coin (recall that it is the symmetric shape that determines the probability p=1/2) and cannot be easily controlled (with a precision sufficient for predictability of the outcome).
 
  • #38
A. Neumaier said:
This is simply wrong.
I didn't realise it was impossible to prepare a single radioactive atom and measure its decay time. If so, then I'd need a different example where a sequence of identical one-particle systems can be studied one particle at a time
 
  • #39
A. Neumaier said:
The decay happens at the source (in the crystal), while its manifestation as a particle (track or count) happens in the detector (in your argument part of the environment).
Fine, but what if we isolate one atom out of the crystal? It should not be too difficult to perform an experiment in which a box contains nothing but one atom, while the detector is placed outside of the box. In that case, where does the decay happen, inside or outside the box? I claim outside, or perhaps at the wall of the box, but not inside where there is no environment that can create decoherence.
 
  • #40
PeroK said:
I didn't realise it was impossible to prepare a single radioactive atom and measure its decay time. If so, then I'd need a different example where a sequence of identical one-particle systems can be studied one particle at a time
See my post #39.
 
  • #41
PeroK said:
I didn't realise it was impossible to prepare a single radioactive atom and measure its decay time. If so, then I'd need a different example where a sequence of identical one-particle systems can be studied one particle at a time
My arguments are not specific to a particular setting.

In an ion trap, one can prepare a single ion in an excited state and wait until it is decayed (and this can be ascertained and measured - with some allowance for fidelity issues). From the quantum ensemble mathematics, this is essentially the same process as the decay of a single radioactive atom. However, now the system that determines the details of the decay is the whole ion trap - whose state cannot be prepared exactly. Thus two identical ions prepared sequentially in the same trap still behave differently.

In general, every preparation procedure is macroscopic, and the details of what the preparation produces in each single case depend on the state of the macroscopic source - which cannot be known exactly.

This provides enough opportunity for stochastic features to creep in due to lack of knowledge (for epistemic interpretations) or chaotic sensitivity to initial conditions (for deterministic interpretations).
 
  • #42
A. Neumaier said:
My arguments are not specific to a particular setting.

In an ion trap, one can prepare a single ion in an excited state and wait until it is decayed (and this can be ascertained and measured - with some allowance for fidelity issues). From the quantum ensemble mathematics, this is essentially the same process as the decay of a single radioactive atom. However, now the system that determines the details of the decay is the whole ion trap - whose state cannot be prepared exactly. Thus two identical ions prepared sequentially in the same trap still behave differently.

In general, every preparation procedure is macroscopic, and the details of what the preparation produces in each single case depend on the state of the macroscopic source - which cannot be known exactly.

This provides enough opportunity for stochastic features to creep in due to lack of knowledge (for epistemic interpretations) or chaotic sensitivity to initial conditions (for deterministic interpretations).

If I understand what you're saying, that sounds like the complement of @Demystifier's argument. We start with a macroscopic system that produces, in some sense, an isolated, simple system (single electron, single silver atom, single radioactive atom). Standard QM tells us how the state of that simple system evolves. The system is destroyed by detection by another macroscopic system. And we appear to have a single, definite, random outcome.

Now we have three candidate sources of the randomness. The state of the initial macroscopic system (preparation procedure). The evolution of the quantum state. The state of the detector.

Or, as an alternative to the "measurement problem", we have the complementary "preparation problem"?
 
  • #43
Demystifier said:
It should not be too difficult to perform an experiment
not be too difficult???

To hold up your claim, please propose how to set up such an experiment.
Demystifier said:
in which a box contains nothing but one atom, while the detector is placed outside of the box.
Note that you need to make sure in such an experiment that the atom is the only radioactive atom in the box, that it does not touch the boundary, that the radiation produced (a spherical electron wave, say) can reach the detector unperturbed, and that the detector responds with near certainty to this wave (producing a detection event) and to nothing but this wave. This seems to require a detector that surrounds the whole box and has an extremely high sensitivity over its whole surface.
Demystifier said:
In that case, where does the decay happen, inside or outside the box?
If the experiment were possible, I think the decay happens in the box, and the detector converts the wave somewhere into a recorded event.
Demystifier said:
I claim outside, or perhaps at the wall of the box itself, but not inside where there is no environment that can create decoherence.
There are still the fields needed to ensure that the boundary is not touched.
 
  • #44
If we want to explain the entire set of experimental data that make up QM, and we try to do so by violating the UP, the result is a hidden-variable theory. A hidden-variable theory cannot reproduce all of QM's predictions; therefore, in principle, the entire data set cannot be explained without the UP.

The violation of Bell's inequalities tells us that entanglement exists; certain results cannot be explained by denying UP (although this is not guaranteed, due to possible loopholes).
 
  • #45
PeroK said:
We start with a macroscopic system that produces, in some sense, an isolated, simple system (single electron, single silver atom, single radioactive atom). Standard QM tells us how the state of that simple system evolves. The system is destroyed by detection by another macroscopic system. And we appear to have a single, definite, random outcome.
Yes.
PeroK said:
Now we have three candidate sources of the randomness. The state of the initial macroscopic system (preparation procedure). The evolution of the quantum state. The state of the detector.
The evolution of the radiated system in a homogeneous medium is generally assumed to be unitary, with an effective dynamics depending on the medium. Thus it does not produce any randomness - given the exact dynamics and the exact initial conditions.
PeroK said:
Or, as an alternative to the "measurement problem", we have the complementary "preparation problem"?
Indeed, we have both a
  • quantum preparation problem: "How does a source produce a single realization of a tiny quantum system?" This problem is hardly discussed in the foundational literature. It is left to pragmatic instrument builders without strong foundational interests,
and a
  • quantum measurement problem: "How does a detector produce a definite outcome when hit by a tiny quantum system?" This problem is better discussed in the foundational literature, but usually with far too simple models to reveal what happens. Realistic models are again left to pragmatic instrument builders without strong foundational interests.
In both cases, stochastic features creep in due to lack of knowledge (for epistemic interpretations) or chaotic sensitivity to initial conditions (for deterministic interpretations).
 
  • #46
A. Neumaier said:
In both cases, stochastic features creep in due to lack of knowledge (for epistemic interpretations) or chaotic sensitivity to initial conditions (for deterministic interpretations).
The question is whether those uncontrollable stochastic features are fundamental to the statistics of the outcomes? Standard QM allows us to ignore those, focus on the evolution of the isolated quantum state and by those calculations alone obtain the statistical outcomes that match experiment. That's the standard theory as I understand it. E.g. the Stern-Gerlach or double-slit or Bell tests.
 
  • #47
A. Neumaier said:
not be too difficult???

To hold up your claim, please propose how to set up such an experiment.
I'm not an experimentalist, but here is a brief sketch. I wouldn't use a trap with forces; I would use a free radioactive particle in an empty box. More precisely, I would first create a good vacuum in a big chamber, and then I would insert a single slow radioactive atom with a very short half-life, so that it should decay well before it collides with the wall of the chamber. Don't ask me about the experimental details, but something like that seems doable to me.
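
A rough back-of-the-envelope of what 'very short' would have to mean - all numbers below are illustrative guesses, not a design: a heavy atom at room temperature moves at a couple of hundred m/s, so it crosses a metre-scale chamber in a few milliseconds, and the half-life would have to be well below that.

import numpy as np

# Illustrative numbers only: a heavy atom at room temperature in a 1 m chamber.
k_B = 1.380649e-23        # J/K
T = 300.0                 # K
m = 220 * 1.66054e-27     # kg (mass ~220 u, a heavy atom)
L = 1.0                   # m, chamber size

v_thermal = np.sqrt(3 * k_B * T / m)    # typical thermal speed
t_transit = L / v_thermal               # time to reach the wall

print("thermal speed ~ %.0f m/s" % v_thermal)
print("transit time  ~ %.1f ms, so the half-life must be well below this"
      % (1e3 * t_transit))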
 
  • #48
PeroK said:
The question is whether those uncontrollable stochastic features are fundamental to the statistics of the outcomes?
Yes. That's the part where interpretations differ.
PeroK said:
Standard QM allows us to ignore those, focus on the evolution of the isolated quantum state and by those calculations alone obtain the statistical outcomes that match experiment. That's the standard theory as I understand it. E.g. the Stern-Gerlach or double-slit or Bell tests.
This is because standard QM ignores all preparation and detection issues, except for the fact that a desired system is prepared and later measured. How this happens is not a problem of quantum mechanics. In the Copenhagen interpretation, it is taken to be a matter of classical or informal description, while in other interpretations it is handled in other, fundamentally just as unacceptable, ways.

This is true even when you want to study how to build a source or a detector! For in this case, the system of interest is the source or the detector (or a piece of it), and again the preparation and measurement of this system are given only in terms of a classical or informal description.
 
  • #49
javisot said:
A hidden-variable theory cannot reproduce all of QM's predictions
You mean a local hidden-variable theory cannot reproduce all of QM's predictions. A non-local hidden-variable theory can.
 
  • #50
Demystifier said:
I'm not an experimentalist,
Then you should leave assessments of difficulty to them! Even tests of Bell inequality violations are hard to realize, in the sense that they cannot be done in a student lab but require very sophisticated setups.
Demystifier said:
but here is a brief sketch. I wouldn't use a trap with forces; I would use a free radioactive particle in an empty box. More precisely, I would first create a good vacuum in a big chamber, and then I would insert
... how, without destroying the good vacuum?
Demystifier said:
a single slow radioactive atom with a very short half-life,
... and how do you ensure that the atom hasn't decayed during the procedure of insertion?
Demystifier said:
so that it should decay well before it collides with the wall of the chamber.
Even should you have achieved this, how do you know the exact state of the atom that figures in the dynamics? You still have a many-body problem with unknown initial state, though now with a smaller number of particles.
 