Question about discussions around quantum interpretations

  • #31
Demystifier said:
3. The analogy with coin tossing is useful again. From the symmetric shape of the coin itself we can conclude that p(head)=p(tail)=1/2.
But that's only true before the coin is tossed, right?

Before tossing an unbiased coin, the probability is 1/2. When we toss it, that probability changes. The environment during the toss continues to change that probability. Finally, the surface where the coin lands also changes the probability completely.

The coin toss is an example where information exists that determines the probability and the outcome, unlike QM. Even if the UP holds and QM is probabilistic, the evolution of the probability and of the state in QM is deterministic.
 
  • #32
Thread closed for moderation.
 
  • #33
After some cleanup, the thread is reopened.
 
  • #34
javisot said:
But that's only true before the coin is tossed, right?

Before tossing an unbiased coin, the probability is 1/2. When we toss it, that probability changes. The environment during the toss continues to change that probability. Finally, the surface where the coin lands also changes the probability completely.

The coin toss is an example where information exists that determines the probability and the outcome, unlike QM. Even if the UP holds and QM is probabilistic, the evolution of the probability and of the state in QM is deterministic.
PeterDonis said:
I don't think this is true. I believe robotic machines have been developed which can flip coins such that one side is more likely to land up than the other. So at the very least, the process by which the coin is flipped has something to do with the probability. And that process is not intrinsic to the coin.
Yes, but my point was that if you don't know or control the details of the environment, then the probability remains 1/2. And this probability is explained by the symmetric shape of the coin itself, even though there is nothing random about the coin itself. Probability and randomness are different things. To determine probability you have to know something, e.g., that the coin has a symmetric shape (i.e. that it is unbiased), or the wave function of the atom. But this by itself does not yet lead to randomness; the randomness requires something unpredictable. In the case of the coin, it is quite clear that the unpredictability is related to unknown data about the environment and the initial conditions of the coin (the coin's initial conditions are not encoded in its shape).

In the quantum case it is not so obvious where the randomness comes from, but given the empirical fact that simple quantum systems isolated from the environment do not show random behavior (instead, they obey deterministic unitary evolution), this strongly suggests that randomness in QM might have its origin in unknown details of the environment. We do not have a proof that this is true, so one might be skeptical, but given that such an explanation of randomness is similar to that in classical physics, it surprises me that more physicists don't find it at least intuitive.

Or to summarize:
- Computing probability is one thing, explaining randomness is another.
- Computing probability requires knowledge of something, explaining randomness requires unpredictability of something.
- The empirical fact that isolated quantum systems do not show randomness indicates that randomness in QM could be related to unpredictability of the environment.
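To make the classical part of this concrete, here is a toy deterministic coin-toss model in the spirit of Keller and Diaconis (only a rough sketch; the numerical ranges are made up). Every single flip is fully determined by the launch speed and the spin rate, yet sloppy knowledge of those initial conditions already reproduces p(head) ≈ 1/2:

Python:
import numpy as np

g = 9.81                                # gravitational acceleration (m/s^2)
rng = np.random.default_rng(1)
n = 100_000

# "Sloppy" initial conditions: uncontrolled, but every individual flip is deterministic.
v = rng.uniform(1.0, 3.0, n)            # launch speed (m/s), assumed range
omega = rng.uniform(150.0, 250.0, n)    # spin rate (rad/s), assumed range

t = 2 * v / g                           # ballistic time of flight
half_turns = np.floor(omega * t / np.pi)
heads = (half_turns % 2 == 0)           # even number of half-turns: the initial face is up again

print(heads.mean())                     # close to 0.5, although no single flip is random

The 1/2 comes from the symmetry of the setup (even and odd numbers of half-turns get equal weight), while the unpredictability comes entirely from the uncontrolled initial conditions - exactly the separation made above.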
 
  • #35
Demystifier said:
- The empirical fact that isolated quantum systems do not show randomness indicates that randomness in QM could be related to unpredictability of the environment.
I'm back in London, so I'll read your paper today. I'm not convinced that a coin is a good analogy for a quantum system, like a spin 1/2 particle. The difference is state preparation. It's not really the environment that determines a coin toss, but the toss itself. If the coin could be reliably prepared in the same initial state (i.e. tossed in the same way every time), then much of the randomness could disappear. You'll still need a simple environment, like a vacuum and a flat surface perhaps. There's no theoretical reason why a coin toss could not be entirely predictable with a reliable coin-tossing machine.

Whereas, the essence of QM is that we cannot prepare the state of a spin 1/2 particle where the spin about two axes is fully determined. This is fundamental to the mathematics of quantum states. And, IMO, an almost entirely different claim from anything that can be said about a coin. There is no analogy, IMO.
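For concreteness, the standard textbook statement of that mathematical fact (nothing here depends on interpretation):
$$|{+z}\rangle = \tfrac{1}{\sqrt{2}}\big(|{+x}\rangle + |{-x}\rangle\big), \qquad P(\pm x \mid {+z}) = |\langle {\pm x}|{+z}\rangle|^2 = \tfrac{1}{2}, \qquad [\sigma_x, \sigma_z] = -2i\sigma_y \neq 0,$$
so a state with definite z-spin is necessarily maximally uncertain in x-spin; no preparation can sharpen both.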

Then the competing claims for the spin 1/2 particle prepared in z-spin-up state are:

1) The randomness of x-spin is inherent in the state preparation (and this is what we mean by inherent randomness). The detection event manifests this randomness - by producing a definite outcome, by some means that is not fully understood.

2) The randomness of x-spin is inherent in the detector. In principle, if we could prepare two detectors reliably in the same state each time, then we could remove the randomness entirely. The first detector would always record z-spin-up for every particle so prepared (the easy bit). The second detector would always record x-spin-up for every particle (the hard/impossible bit).

And, thereby, the UP would have been outflanked! This is starting to remind me of the Einstein-Bohr debates, where a specific attempt by Einstein to outflank the UP with a cleverly designed experiment would be countered by Bohr. Whereas, you have a non-specific detector design that outflanks the UP. But, given there is no design, it's not possible for me or Bohr or anyone else to highlight why it wouldn't work!

I think everyone accepts that we cannot prove that every possible attempt to outflank the UP must fail. It's just that it looks that way to such an extent that we accept the UP as valid - unless and until someone does an experiment that disproves it.
 
  • #36
PeroK said:
Take a number of radioactive atoms as an example. If each atom is identical, then they should all decay at the same time - if the decay process were deterministic.
This is simply wrong.

A sample of a pure, weakly radioactive, crystalline substance consists, in the traditional models, of a huge number N of atoms, of which M have not yet decayed, with both M and N only roughly known. The M radioactive atoms are indistinguishable from each other, and so are the N-M decayed atoms. The state of the crystal (the only state that matters) is an N-particle state that is impossible to prepare exactly.

Whatever radiates from the crystal is a deterministic function of the whole N-particle state. Every now and then an atom decays, decreasing M by one. According to the accepted picture of decay, a spherical wave is produced, centered at the position of one of the radioactive atoms in the crystal, with details determined by the whole N-particle state. In particular, the details depend on M. Since M decreased, the next decay has different initial conditions, hence results in different details of this spherical wave, including a different center.

Thus different decays of two indistinguishable radioactive atoms cause distinguishable spherical waves - independent of whether a deterministic or a probabilistic view is taken!
PeroK said:
The detector isn't causing the decay. The detector cannot be causing the uncertainty in whether an atom decays in a certain time.
Yes, but it is causing where the decay is registered.

The detector is responsible for translating the spherical wave into particle tracks or Geiger counts, and again, this is a complex process depending on the state of the detector - a macroscopic state that is impossible to prepare exactly.
PeroK said:
Only the state of the atom determines the probabilities.
No. The state of ''the atom" doesn't exist since the atoms are part of an N-particle state of two kinds of indistinguishable atoms (radioactive and decayed).

In the ensemble interpretation, the state of the crystal determines (by Born's rule) the probability of decay. In deterministic interpretations, it determines each particular decay.

The only 1-atom information that exists about the radioactive substance is the reduced density operator obtained by the standard methods of statistical mechanics, and it describes the distribution of the ensemble of all radioactive atoms in the crystal together. Nothing about a single atom, and far too little to tell how the crystal behaves!
Demystifier said:
In the paper https://arxiv.org/abs/2010.07575 Sec. 4.3 I have explained why the decay does not depend on details of the environment, creating an illusion that the environment is not needed for the decay, and yet why there is no decay without the environment.
The decay happens at the source (in the crystal), while its manifestation as a particle (track or count) happens in the detector (in your argument, part of the environment).

Thus both parts have their share in producing the observed phenomenology.
Demystifier said:
The empirical fact that isolated quantum systems do not show randomness indicates that randomness in QM could be related to unpredictability of the environment.
Or to unpredictability of the source, or both.
 
  • #37
PeroK said:
It's not really the environment that determines a coin toss, but the toss itself.
Fine, but you can think of the toss as an internal environment. (For the notion of internal environment in QM see e.g. my https://arxiv.org/abs/1406.3221 and references therein.)
It's internal because it's a property of the coin itself (initial positions and velocities of a rigid body), but it's an environment because it describes variables that are not determined by the shape of the coin (recall that it is the symmetric shape that determines the probability p=1/2) and cannot be easily controlled (with a precision sufficient for predictability of the outcome).
 
  • #38
A. Neumaier said:
This is simply wrong.
I didn't realise it was impossible to prepare a single radioactive atom and measure its decay time. If so, then I'd need a different example where a sequence of identical one-particle systems can be studied one particle at a time.
 
  • #39
A. Neumaier said:
The decay happens at the source (in the crystal), while its manifestation as a particle (track or count) happens in the detector (in your argument, part of the environment).
Fine, but what if we isolate one atom out of the crystal? It should not be too difficult to perform an experiment in which a box contains nothing but one atom, while the detector is placed outside of the box. In that case, where does the decay happen, inside or outside the box? I claim outside, or perhaps at the wall of the box, but not inside where there is no environment that can create decoherence.
 
  • #40
PeroK said:
I didn't realise it was impossible to prepare a single radioactive atom and measure its decay time. If so, then I'd need a different example where a sequence of identical one-particle systems can be studied one particle at a time.
See my post #39.
 
  • #41
PeroK said:
I didn't realise it was impossible to prepare a single radioactive atom and measure its decay time. If so, then I'd need a different example where a sequence of identical one-particle systems can be studied one particle at a time.
My arguments are not specific to a particular setting.

In an ion trap, one can prepare a single ion in an excited state and wait until it is decayed (and this can be ascertained and measured - with some allowance for fidelity issues). From the quantum ensemble mathematics, this is essentially the same process as the decay of a single radioactive atom. However, now the system that determines the details of the decay is the whole ion trap - whose state cannot be prepared exactly. Thus two identical ions prepared sequentially in the same trap still behave differently.

In general, every preparation procedure is macroscopic, and the details of what the preparation produces in each single case depend on the state of the macroscopic source - which cannot be known exactly.

This provides enough opportunity for stochastic features to creep in due to lack of knowledge (for epistemic interpretations) or chaotic sensitivity to initial conditions (for deterministic interpretations).
 
  • #42
A. Neumaier said:
My arguments are not specific to a particular setting.

In an ion trap, one can prepare a single ion in an excited state and wait until it is decayed (and this can be ascertained and measured - with some allowance for fidelity issues). From the quantum ensemble mathematics, this is essentially the same process as the decay of a single radioactive atom. However, now the system that determines the details of the decay is the whole ion trap - whose state cannot be prepared exactly. Thus two identical ions prepared sequentially in the same trap still behave differently.

In general, every preparation procedure is macroscopic, and the details of what the preparation produces in each single case depend on the state of the macroscopic source - which cannot be known exactly.

This provides enough opportunity for stochastic features to creep in due to lack of knowledge (for epistemic interpretations) or chaotic sensitivity to initial conditions (for deterministic interpretations).

If I understand what you're saying, that sounds like the complement of @Demystifier's argument. We start with a macroscopic system that produces, in some sense, an isolated, simple system (single electron, single silver atom, single radioactive atom). Standard QM tells us how the state of that simple system evolves. The system is destroyed by detection by another macroscopic system. And we appear to have a single, definite, random outcome.

Now we have three candidate sources of the randomness. The state of the initial macroscopic system (preparation procedure). The evolution of the quantum state. The state of the detector.

Or, as an alternative to the "measurement problem", we have the complementary "preparation problem"?
 
  • #43
Demystifier said:
It should not be too difficult to perform an experiment
not be too difficult???

To hold up your claim, please propose how to set up such an experiment.
Demystifier said:
in which a box contains nothing but one atom, while the detector is placed outside of the box.
Note that you need to make sure in such an experiment that the atom is the only radioactive atom in the box, that it does not touch the boundary, that the radiation produced (a spherical electron wave, say) can reach the detector unperturbed, and that the detector responds with near certainty to this wave (producing a detection event) and to nothing but this wave. This seems to require a detector that surrounds the whole box and has an extremely high sensitivity over its whole surface.
Demystifier said:
In that case, where does the decay happen, inside or outside the box?
If the experiment were possible, I think the decay happens in the box, and the detector converts the wave somewhere into a recorded event.
Demystifier said:
I claim outside, or perhaps at the wall of the box itself, but not inside where there is no environment that can create decoherence.
There are the fields needed to ensure that the boundary is not touched.
 
  • #44
If we want to explain the entire set of experimental data that make up QM and we try to do so by violating the UP, the result is a hidden-variable theory. A hidden-variable theory cannot reproduce all of QM's predictions; therefore, in principle, the entire data set cannot be explained without the UP.

The violation of Bell's inequalities tells us that entanglement exists; certain results cannot be explained by denying UP (although this is not guaranteed, due to possible loopholes).
 
  • #45
PeroK said:
We start with a macroscopic system that produces, in some sense, an isolated, simple system (single electron, single silver atom, single radioactive atom). Standard QM tells us how the state of that simple system evolves. The system is destroyed by detection by another macroscopic system. And we appear to have a single, definite, random outcome.
Yes.
PeroK said:
Now we have three candidate sources of the randomness. The state of the initial macroscopic system (preparation procedure). The evolution of the quantum state. The state of the detector.
The evolution of the radiated system in a homogeneous medium is generally assumed to be unitary, with an effective dynamics depending on the medium. Thus it does not produce any randomness - given the exact dynamics and the exact initial conditions.
PeroK said:
Or, as an alternative to the "measurement problem", we have the complementary "preparation problem"?
Indeed, we have both a
  • quantum preparation problem: ''How does a source produce a single realization of a tiny quantum system?" This problem is hardly discussed in the foundational literature. It is left to pragmatic instrument builders without strong foundational interests,
and a
  • quantum measurement problem: ''How does a detector produce a definite outcome when hit by a tiny quantum system?" This problem is better discussed in the foundational literature, but usually with far too simple models to reveal what happens. Realistic models are again left to pragmatic instrument builders without strong foundational interests.
In both cases, stochastic features creep in due to lack of knowledge (for epistemic interpretations) or chaotic sensitivity to initial conditions (for deterministic interpretations).
 
  • #46
A. Neumaier said:
In both cases, stochastic features creep in due to lack of knowledge (for epistemic interpretations) or chaotic sensitivity to initial conditions (for deterministic interpretations).
The question is whether those uncontrollable stochastic features are fundamental to the statistics of the outcomes? Standard QM allows us to ignore those, focus on the evolution of the isolated quantum state and by those calculations alone obtain the statistical outcomes that match experiment. That's the standard theory as I understand it. E.g. the Stern-Gerlach or double-slit or Bell tests.
 
  • #47
A. Neumaier said:
not be too difficult???

To hold up your claim, please propose how to set up such an experiment.
I'm not an experimentalist, but here is a brief sketch. I wouldn't use a trap with forces, I would use a free radioactive particle in an empty box. More precisely, I would first create a good vacuum in a big chamber, and then I would insert a single slow radioactive atom with a very short half-life, so that it should decay well before it collides with the wall of the chamber. Don't ask me about the experimental details, but something like that seems doable to me.
 
  • #48
PeroK said:
The question is whether those uncontrollable stochastic features are fundamental to the statistics of the outcomes?
Yes. That's the part where interpretations differ.
PeroK said:
Standard QM allows us to ignore those, focus on the evolution of the isolated quantum state and by those calculations alone obtain the statistical outcomes that match experiment. That's the standard theory as I understand it. E.g. the Stern-Gerlach or double-slit or Bell tests.
This is because standard QM ignores all preparation and detection issues, except for the fact that a desired system is prepared and later measured. How this is done is not a problem of quantum mechanics. In the Copenhagen interpretation, it is taken to be a matter of classical or informal description, while in other interpretations it is handled in other, fundamentally just as unacceptable, ways.

This is so even when you want to study how to build a source or a detector! For in this case, the system of interest is the source or the detector (or a piece of it), and again, the preparation and measurement of this system is given only in terms of a classical or informal description.
 
  • #49
javisot said:
A hidden-variable theory cannot reproduce all of QM's predictions
You mean a local hidden-variable theory cannot reproduce all of QM's predictions. A non-local hidden-variable theory can.
 
  • #50
Demystifier said:
I'm not an experimentalist,
Then you should leave assessments of difficulty to them! Even tests of Bell inequality violations are hard to realize, in the sense that they cannot be done in a student lab but require very sophisticated setups.
Demystifier said:
but here is a brief sketch. I wouldn't use a trap with forces, I would use a free radioactive particle in an empty box. More precisely, I would first create a good vacuum in a big chamber, and then I would insert
... how, without destroying the good vacuum?
Demystifier said:
a single slow radioactive atom with a very short half-life,
... and how do you ensure that the atom hasn't decayed during the procedure of insertion?
Demystifier said:
so that it should decay well before it collides with the wall of the chamber.
Even should you have achieved this, how do you know the exact state of the atom that figures in the dynamics? You still have a many-body problem with unknown initial state, though now with a smaller number of particles.
 
  • #51
PeroK said:
The question is whether those uncontrollable stochastic features are fundamental to the statistics of the outcomes? Standard QM allows us to ignore those, focus on the evolution of the isolated quantum state and by those calculations alone obtain the statistical outcomes that match experiment. That's the standard theory as I understand it.
I have already said this several times, but I will repeat it. Those uncontrollable stochastic features are not important for computing the probabilities of the outcomes. Nevertheless, they may be important for explaining randomness, for otherwise it is hard to explain why simple isolated systems don't show randomness. It is a part of the standard theory that random outcomes only appear when there is decoherence caused by the environment.
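To see the last sentence in miniature, here is a toy two-qubit decoherence model (a minimal sketch of my own, not taken from any of the papers cited above). A single interaction with an "environment" qubit wipes out the system's coherences, leaving only the diagonal, classical-looking probabilities:

Python:
import numpy as np

# System qubit in a coherent superposition, environment qubit in |0>.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
env0 = np.array([1, 0], dtype=complex)
psi = np.kron(plus, env0)

# A CNOT (system controls environment) is a toy model of the environment
# "recording" the state of the system.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
psi_after = CNOT @ psi

def reduced_system(psi4):
    # Partial trace over the environment qubit (second tensor factor).
    rho = np.outer(psi4, psi4.conj()).reshape(2, 2, 2, 2)
    return np.trace(rho, axis1=1, axis2=3)

print(reduced_system(psi))        # off-diagonals 0.5: coherent superposition
print(reduced_system(psi_after))  # off-diagonals 0:   decohered mixture

Evolving the qubit alone unitarily would keep it in a pure state; it is the coupling to the environment that turns the superposition into a mixture with definite-looking alternatives.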
 
  • #52
Demystifier said:
It is a part of the standard theory
In this case you should quote (in words) a standard reference for the questionable 'only' part of this claim!
Demystifier said:
that random outcomes only appear when there is decoherence caused by the environment.
 
  • #53
Demystifier said:
I have already said this several times, but I will repeat it. Those uncontrollable stochastic features are not important for computing the probabilities of the outcomes. Nevertheless, they may be important for explaining randomness, for otherwise it is hard to explain why simple isolated systems don't show randomness. It is a part of the standard theory that random outcomes only appear when there is decoherence caused by the environment.
I understand the argument. We possibly risk going round in circles and it becomes a debate about the definition of randomness. Here's an analogy. We put an item on a supermarket shelf with a definite price of $2. (We leave aside the preparation problem in this analogy!) The price evolves so that after one day it is some known distribution of prices from $1 to $3. We take it to the checkout and the price is resolved into $2.50, say.

Your argument is that it must have been randomness in the checkout process that selected a price from the given distribution. The evolution from a fixed price into a probability distribution does not count as randomness. With that definition, I'm compelled to agree.

But, for me, it's not a satisfactory answer to say that it was all determined until we got to the checkout. I say there already was a bona fide probability distribution in the system before we got to the checkout. The probability distribution evolved - and that is non-determinism. If determinism produces a probability distribution, then it is no longer determinism in the way I would understand it.

And, if we allow the checkout machine to evolve in the same way - into a probability distribution of possible checkout machines, then we cannot tell from the start - by knowing everything about the item on the shelf and everything about the checkout machine - what price will appear at the checkout.

What we can say is that the evolution of the checkout machine doesn't seem to matter, in terms of the specific probabilities of prices that we get. It appears that we only need the probability distribution of the item on the shelf. That's the analogy for standard QM.

Perhaps that is down to interpretation. You can make the maths work either way.
 
  • #54
PeroK said:
If determinism produces a probability distribution, then it is no longer determinism in the way I would understand it.
But mathematicians talk, e.g., about the probability with which a particular digit appears in the deterministic sequence of digits of pi. This is only one of many examples of the use of probabilities in deterministic systems. Whenever one has a sensible measure normalized to 1, one has a probability distribution - this has nothing to do with not being deterministic!
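One can check this directly by counting digit frequencies (a small sketch using the mpmath library; the number of digits is an arbitrary choice):

Python:
from collections import Counter
from mpmath import mp, nstr

mp.dps = 10_010                      # working precision, slightly more than we print
pi_val = +mp.pi                      # evaluate pi at the current precision
digits = nstr(pi_val, 10_000)[2:]    # 10_000 significant digits, drop the leading "3."

freq = Counter(digits)
total = len(digits)
print({d: round(freq[d] / total, 4) for d in sorted(freq)})
# Each relative frequency comes out close to 0.1, although every digit is
# completely determined.

Of course this only samples finitely many digits; whether the limiting frequencies are all exactly 0.1 is not known.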
 
  • #55
A. Neumaier said:
But mathematicians talk, e.g., about the probability with which a particular digit appears in the deterministic sequence of digits of pi. This is only one of many examples of the use of probabilities in deterministic systems. Whenever one has a sensible measure normalized to 1, one has a probability distribution - this has nothing to do with not being deterministic!
Yes, but those are probabilities that can be resolved by more knowledge. Determinism means you can get rid of the probabilities with enough knowledge. The trillionth digit of pi is definitely one of the ten digits, but without calculation it's equally likely to be any of them.
 
  • #56
PeroK said:
I understand the argument.
I'm not sure you do.

PeroK said:
Your argument is that it must have been randomness in the checkout process that selected a price from the given distribution.
No, that's not my argument. I'm saying that a complex environment is needed to explain randomness, but this environment does not necessarily need to be related to the checkout process. In this analogy, the price is a result of complex processes on the whole market (e.g. changes of supply and demand in the whole country), and not of random changes of the item on the shelf.

PeroK said:
It appears that we only need the probability distribution of the item on the shelf.
If you are an economist who wants to understand it, then you also want to know something about the processes on the whole market.
 
  • #57
PeroK said:
Yes, but those are probabilities that can be resolved by more knowledge.
Not necessarily - it might be that the question whether these probabilities are all equal (and hence equal 0.1) is undecidable in ZFC.
PeroK said:
The trillionth digit of pi is definitely one of the ten digits,
Note that these probabilities are independent of the knowledge of the first trillion digits!
PeroK said:
but without calculation it's equally likely to be any of them.
So you take ignorance to mean equally likely? This means that your probabilities are subjective probabilities.

But the probability of the digit 0 in the ensemble of all digits of pi is not a matter of guesswork but a matter of mathematical proof - it is either objectively determined or undecidable in ZFC.
 
  • #58
A. Neumaier said:
In this case you should quote (in words) a standard reference for the questionable 'only' part of this claim!
The "only" part can be derived from the 7 basic rules of QM that you yourself wrote here
https://www.physicsforums.com/insights/the-7-basic-rules-of-quantum-mechanics/
Rule 3 (isolated system evolves deterministically) implies that isolated systems don't behave randomly, so it follows that only open systems (if any) can behave randomly.
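For concreteness, a minimal sketch of what that deterministic evolution looks like (toy Hamiltonian and numbers are my own choice): the state of the isolated qubit is reproduced exactly on every run, and probabilities enter only when the Born rule is applied to measurement outcomes.

Python:
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)             # Pauli x
theta = 0.65
U = np.cos(theta) * np.eye(2) - 1j * np.sin(theta) * sx    # U = exp(-i*theta*sigma_x)

psi0 = np.array([1, 0], dtype=complex)                     # |+z>
psi_t = U @ psi0           # Rule 3: the same psi0 always gives the same psi_t

p_up = abs(psi_t[0])**2    # Born rule for a subsequent z-measurement
rng = np.random.default_rng(0)
outcomes = rng.choice([+1, -1], size=20, p=[p_up, 1 - p_up])

print(psi_t)               # deterministic
print(outcomes)            # randomness appears only at this (measurement) step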
 
  • #59
PeroK said:
Yes, but those are probabilities that can be resolved by more knowledge. Determinism means you can get rid of the probabilities with enough knowledge. The trillionth digit of pi is definitely one of the ten digits, but without calculation it's equally likely to be any of them.
I think I see now where the problem is. Many people don't understand very well how randomness arises in classical deterministic mechanics; they find it confusing. So when someone tells them that quantum randomness is a true randomness that does not arise from a classical-like determinism, they see it as a relief: they see quantum randomness as something simpler and more intuitive than classical randomness. That's why they prefer interpretations of QM in which randomness is intrinsic and fundamental. For such people, I would like to note that even classical mechanics can be interpreted as a fundamentally probabilistic theory, very similar to quantum mechanics, except that the corresponding "Schrodinger" equation is not linear: https://arxiv.org/abs/quant-ph/0505143
 
  • #60
Demystifier said:
The "only" part can be derived from the 7 basic rules of QM that you yourself wrote here
https://www.physicsforums.com/insights/the-7-basic-rules-of-quantum-mechanics/
Rule 3 (isolated system evolves deterministically) implies that isolated systems don't behave randomly,
Rule 3 only states that the state of an isolated system evolves deterministically.
Demystifier said:
so it follows that only open systems (if any) can behave randomly.
It only follows that only the state of an open system (if any) can behave randomly.
But this is quite different from your much stronger claim
Demystifier said:
that random outcomes only appear when there is decoherence caused by the environment.
This rule says nothing at all about outcomes, nor does it imply that decoherence is necessary for random outcomes (which your statement implies).

Rule 3 only states that the state of an isolated system evolves deterministically.

Thus you need to provide better references, or weaken your claim - but in the latter case the conclusions you draw from it are no longer cogent.
 
