Question about discussions around quantum interpretations

  • #51
PeroK said:
The question is whether those uncontrollable stochastic features are fundamental to the statistics of the outcomes. Standard QM allows us to ignore those, focus on the evolution of the isolated quantum state and by those calculations alone obtain the statistical outcomes that match experiment. That's the standard theory as I understand it.
I already said several times, but I will repeat. Those uncontrollable stochastic features are not important for computing the probabilities of the outcomes. Nevertheless, they may be important for explaining randomness, for otherwise it is hard to explain why simple isolated systems don't show randomness. It is a part of the standard theory that random outcomes only appear when there is decoherence caused by the environment.
 
  • #52
Demystifier said:
It is a part of the standard theory
In that case you should quote (in words) a standard reference for the questionable 'only' part of this claim!
Demystifier said:
that random outcomes only appear when there is decoherence caused by the environment.
 
  • #53
Demystifier said:
I already said several times, but I will repeat. Those uncontrollable stochastic features are not important for computing the probabilities of the outcomes. Nevertheless, they may be important for explaining randomness, for otherwise it is hard to explain why simple isolated systems don't show randomness. It is a part of the standard theory that random outcomes only appear when there is decoherence caused by the environment.
I understand the argument. We possibly risk going round in circles and it becomes a debate about the definition of randomness. Here's an analogy. We put an item on a supermarket shelf with a definite price of $2. (We leave aside the preparation problem in this analogy!) The price evolves so that after one day it is some known distribution of prices from $1 to $3. We take it to the checkout and the price is resolved, into $2.50, say.

Your argument is that it must have been randomness in the checkout process that selected a price from the given distribution. The evolution from a fixed price into a probability distribution does not count as randomness. With that definition, I'm compelled to agree.

But, for me, it's not a satisfactory answer to say that it was all determined until we got to the checkout. I say there already was a bona fide probability distribution in the system before we got to the checkout. The probability distribution evolved - and that is non-determinism. If determinism produces a probability distribution, then it is no longer determinism in the way I would understand it.

And, if we allow the checkout machine to evolve in the same way - into a probability distribution of possible checkout machines, then we cannot tell from the start - by knowing everything about the item on the shelf and everything about the checkout machine - what price will appear at the checkout.

What we can say is that the evolution of the checkout machine doesn't seem to matter, in terms of the specific probabilities of prices that we get. It appears that we only need the probability distribution of the item on the shelf. That's the analogy for standard QM.
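The last point of the analogy can be made concrete in a small sketch (all prices and weights here are invented for illustration): the price evolves into a known distribution, the checkout resolves it into one value, and knowing the distribution alone suffices to reproduce the checkout statistics — no detail of the checkout machine enters the calculation.

```python
import random

random.seed(0)

# Day 0: a definite price.
price_day0 = 2.00

# Day 1: the price has evolved into a known distribution over $1..$3
# (invented weights, peaked at the original price).
prices = [1.00, 1.50, 2.00, 2.50, 3.00]
weights = [0.10, 0.20, 0.40, 0.20, 0.10]

# The checkout resolves the distribution into one definite price.
resolved = random.choices(prices, weights=weights, k=1)[0]

# Over many checkouts, observed frequencies match the distribution --
# nothing about the checkout machine itself was needed.
samples = random.choices(prices, weights=weights, k=100_000)
freq = {p: samples.count(p) / len(samples) for p in prices}
```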

Perhaps that is down to interpretation. You can make the maths work either way.
 
  • #54
PeroK said:
If determinism produces a probability distribution, then it is no longer determinism in the way I would understand it.
But mathematicians talk, e.g., about the probability with which a particular digit appears in the deterministic sequence of digits of pi. This is just one of many examples of the use of probabilities in deterministic systems. Whenever one has a sensible measure normalized to 1, one has a probability distribution - this has nothing to do with not being deterministic!
 
  • #55
A. Neumaier said:
But mathematicians talk, e.g., about the probability with which a particular digit appears in the deterministic sequence of digits of pi. This is only one example of many of the use of probabilities in deterministic systems. Whenever one has a sensible measure normalized to 1, one has a probability distribution - this has nothing to do with not being deterministic!
Yes, but those are probabilities that can be resolved by more knowledge. Determinism means you can get rid of the probabilities with enough knowledge. The trillionth digit of pi is definitely one of the ten digits, but without calculation it's equally likely to be any of them.
 
  • #56
PeroK said:
I understand the argument.
I'm not sure you do.

PeroK said:
Your argument is that it must have been randomness in the checkout process that selected a price from the given distribution.
No, that's not my argument. I'm saying that complex environment is needed to explain randomness, but this environment does not necessarily need to be related to the checkout process. In this analogy, the price is a result of complex processes on the whole market (e.g. changes of supply and demand in the whole country), and not of random changes of the item on the shelf.

PeroK said:
It appears that we only need the probability distribution of the item on the shelf.
If you are an economist who wants to understand it, then you also want to know something about the processes on the whole market.
 
  • #57
PeroK said:
Yes, but those are probabilities that can be resolved by more knowledge.
Not necessarily - it might be that the question whether these probabilities are all equal (and hence each equal 0.1) is undecidable in ZFC.
PeroK said:
The trillionth digit of pi is definitely one of the ten digits,
Note that these probabilities are independent of the knowledge of the first trillion digits!
PeroK said:
but without calculation it's equally likely to be any of them.
So you take ignorance to mean equally likely? This means that your probabilities are subjective probabilities.

But the probability of the digit 0 in the ensemble of all digits of pi is not a matter of guesswork but a matter of mathematical proof - it is either objectively determined or undecidable in ZFC.
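As an aside, the near-equality of these digit frequencies can be checked empirically for as many digits as one cares to compute (the uniformity itself, of course, remains unproven). A minimal sketch using Machin's formula with exact integer arithmetic:

```python
from collections import Counter

def arctan_inv(x, one):
    """arctan(1/x), scaled by the big integer `one`, via the Taylor series."""
    power = total = one // x
    x_squared = x * x
    divisor, sign = 3, -1
    while power > 0:
        power //= x_squared
        total += sign * (power // divisor)
        sign = -sign
        divisor += 2
    return total

def pi_digits(n):
    """First n decimal digits of pi (including the leading 3), as a string."""
    one = 10 ** (n + 10)  # 10 guard digits
    # Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239)
    pi_scaled = 4 * (4 * arctan_inv(5, one) - arctan_inv(239, one))
    return str(pi_scaled)[:n]

digits = pi_digits(1001)[1:]   # 1000 digits after the decimal point
freq = Counter(digits)         # counts cluster loosely around 100 per digit
```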
 
  • #58
A. Neumaier said:
In this case you'd quote (in words) a standard reference for the questionable 'only' part of this claim!
The "only" part can be derived from the 7 basic rules of QM that you yourself wrote here
https://www.physicsforums.com/insights/the-7-basic-rules-of-quantum-mechanics/
Rule 3 (isolated system evolves deterministically) implies that isolated systems don't behave randomly, so it follows that only open systems (if any) can behave randomly.
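Rule 3's determinism is easy to exhibit for the simplest isolated system: for a two-level system the state at time t is fixed exactly by the state at time 0, even though that state encodes a (changing) probability distribution over measurement outcomes. A minimal sketch, assuming the Hamiltonian H = sigma_x in units where hbar = 1 (an invented example, not tied to any post above):

```python
import math

# Isolated qubit with H = sigma_x, so U(t) = exp(-iHt) = cos(t) I - i sin(t) sigma_x.
# The evolution is completely deterministic: psi(t) is fixed by psi(0) and t.
def evolve(psi0, t):
    a, b = psi0
    c, s = math.cos(t), math.sin(t)
    return (c * a - 1j * s * b, -1j * s * a + c * b)

psi0 = (1.0 + 0j, 0.0 + 0j)        # start in |0>
psi = evolve(psi0, math.pi / 4)

# The deterministically evolved state still defines outcome
# probabilities |<k|psi>|^2 -- here 50/50 at t = pi/4.
p0, p1 = abs(psi[0]) ** 2, abs(psi[1]) ** 2
norm = p0 + p1                     # unitarity: stays 1
```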
 
  • #59
PeroK said:
Yes, but those are probabilities that can be resolved by more knowledge. Determinism means you can get rid of the probabilities with enough knowledge. The trillionth digit of pi is definitely one of the ten digits, but without calculation it's equally likely to be any of them.
I think I see now where the problem is. Many people don't understand very well how randomness arises in classical deterministic mechanics; they find it confusing. So when someone tells them that quantum randomness is a true randomness that does not arise from a classical-like determinism, they see it as a relief: quantum randomness seems simpler and more intuitive than classical randomness. That's why they prefer interpretations of QM in which randomness is intrinsic and fundamental. For such people, I would like to note that even classical mechanics can be interpreted as a fundamentally probabilistic theory, very similar to quantum mechanics, except that the corresponding "Schrödinger" equation is not linear: https://arxiv.org/abs/quant-ph/0505143#
 
  • #60
Demystifier said:
The "only" part can be derived from the 7 basic rules of QM that you yourself wrote here
https://www.physicsforums.com/insights/the-7-basic-rules-of-quantum-mechanics/
Rule 3 (isolated system evolves deterministically) implies that isolated systems don't behave randomly,
Rule 3 only states that the state of an isolated system evolves deterministically.
Demystifier said:
so it follows that only open systems (if any) can behave randomly.
It only follows that the state of an open system (if any) can behave randomly.
But this is quite different from your much stronger claim
Demystifier said:
that random outcomes only appear when there is decoherence caused by the environment.
This rule says nothing at all about outcomes, nor does it imply that decoherence is necessary for random outcomes (which your statement asserts).

Thus you need to provide better references, or weaken your claim - but in the latter case the conclusions you draw from it are no longer cogent.
 
  • #61
A. Neumaier said:
Rule 3 only states that the state of an isolated system evolves deterministically.
But according to standard QM the state represents a complete description of the system, there is nothing else except the state. Hence the deterministic description of an isolated system by the state is a complete description of an isolated system, at least according to standard QM.
 
  • #62
Demystifier said:
But according to standard QM the state represents a complete description of the system, there is nothing else except the state.
This is not part of the 7 rules - the part common to all interpretations.
It is only part of the Copenhagen interpretation.
 
  • #63
A. Neumaier said:
This is not part of the 7 rules - the part common to all interpretations.
It is only part of the Copenhagen interpretation.
Which is the standard interpretation.

Of course, I am not advocating the standard (Copenhagen) interpretation, but I am pointing out that someone who does should find very plausible the idea that randomness is somehow related to the influence of the environment.
 
  • #64
A. Neumaier said:
Demystifier said:
But according to standard QM the state represents a complete description of the system, there is nothing else except the state.
This is not part of the 7 rules - the part common to all interpretations.
It is only part of the Copenhagen interpretation.
Actually only the first half is part of the Copenhagen interpretation.

The second half is not, since an integral part of the Copenhagen interpretation (not shared by most other interpretations) is that every quantum system must be interpreted in a classical experimental context.

Thus your statement involving outcomes and decoherence has no established source.
Demystifier said:
Which is the standard interpretation.
No. It is the Copenhagen interpretation. It was the standard interpretation until around 1970. Today there is no longer a standard interpretation since a large majority of quantum physicists does not accept one or the other part of the Copenhagen interpretation - in particular the part on the necessity of the classical context.

Only the seven rules are standard.
 
  • #65
A. Neumaier said:
No. It is the Copenhagen interpretation. It was the standard interpretation until around 1970. Today there is no longer a standard interpretation since a large majority of quantum physicists does not accept one or the other part of the Copenhagen interpretation - in particular the part on the necessity of the classical context.
Fair enough. But in that case people like @PeroK should specify which interpretation they have in mind when they argue that environment is not needed for randomness. Because it looks as if he has some Copenhagen-like interpretation in mind, and yet environment-independent randomness does not seem compatible with Copenhagen.
 
  • #66
Demystifier said:
Fair enough. But in that case people like @PeroK should specify which interpretation they have in mind when they argue
All have their own interpretation in mind, unless they specifically mention a particular interpretation. Even then, they have their own interpretation of that particular interpretation in mind, since no interpretation (not even the Copenhagen interpretation) has a standard version accepted by all its adherents.

That's the sad state of affairs....
Demystifier said:
Because it looks as if he has some Copenhagen-like interpretation in mind, and yet environment-independent randomness does not seem compatible with Copenhagen.
In the Copenhagen interpretation, both source and detector are part of the classical environment, each of which gives enough scope for hidden randomness. As I understand him, @PeroK only claims detector-independent randomness of decays, not source-independent randomness.
 
  • #67
Demystifier said:
Fair enough. But in that case people like @PeroK should specify which interpretation they have in mind when they argue that environment is not needed for randomness. Because it looks as if he has some Copenhagen-like interpretation in mind, and yet environment-independent randomness does not seem compatible with Copenhagen.
I don't have a specific interpretation in mind. My point is that, generally, the evolution of a superposition can be considered the natural evolution of different, probabilistic outcomes. Take, for example, the idealised electron single-slit experiment.

Before the slit, the electron has a state that resembles a classical particle, with low uncertainty in lateral momentum. The interaction with the slit forces the state to evolve into a superposition of states with varying quantities of lateral momentum. It has evolved (deterministically, in the key respects) into a state with considerable uncertainty in lateral momentum.

Finally, the electron interacts with the detector in a position that appears not to be predictable from the initial state. Despite the uncertainties in the initial state, they do not appear to be sufficient to predetermine the electron to impact at a specific lateral position on the detector. Nor does the configuration of the detector appear to determine which cell is illuminated. The variable interaction with the detector appears to be determined, in standard QM fashion, by the intermediate evolution associated with the interaction with the single slit.

My point is simply that the interaction with the slit alone appears sufficient to produce different, probabilistic outcomes. That is sufficient to class the experiment as non-deterministic.

It may be the case that the states of the source and detector determine the outcome, but it does not seem necessary to consider those. Nor the precise state of the intermediate slit.

Especially, and I labour the point, since the probabilities appear to respect the intermediate evolution of the electron state. No additional mathematics is needed to calculate the required probabilities!
 
  • #68
A. Neumaier said:
As I understand him, @PeroK only claims detector-independent randomness of decays, not source-independent randomness.
I thought I was taking more an experimental view of establishing which components appear to influence the probabilities of the outcomes. In the single-slit it appears that the calculation associated with the intermediate interaction with the slit is sufficient to produce the probabilities that describe the outcome.

I acknowledge we can't have an experiment without a source and detector. But, we can calculate the relevant probabilities without reference to the precise state of either.

I also understand that a deeper analysis (especially in trying to make sense of QM fundamentally) might lead one to consider the precise states of the source and/or detector and/or intermediate slit. And that an interpretation might be constructed in which the probabilities of the outcomes (although prima facie independent of the experimental apparatus itself) might indeed be alternatively explained through these considerations.
 
  • #69
PeroK said:
Take, for example, the idealised electron single-slit experiment.

Before the slit, the electron has a state that resembles a classical particle, with low uncertainty in lateral momentum.
It is generally assumed to be in a plane wave state with low uncertainty in momentum. This is very far from a classical particle.
PeroK said:
The interaction with the slit forces the state to evolve into a superposition of states with varying quantities of lateral momentum.
No. The filter containing the slit turns most of the plane wave into heat, with the exception of the little part that passes through the slit.
PeroK said:
It has evolved (deterministically in the key criteria) into a state with considerable uncertainty in lateral momentum.
This part is generally modeled classically (like a wave in classical optics), hence the determinism. Probability is not conserved.

After the slit, the surviving wave is still treated classically, resulting in a spherical wave. At best, the spin degrees of freedom receive a quantum treatment.
PeroK said:
Finally, the electron interacts with the detector in a position that appears not to be predictable from the initial state. Despite the uncertainties in the initial state, they do not appear to be sufficient to predetermine the electron to impact at a specific lateral position on the detector.
This is because a spherical wave has no particle character.
PeroK said:
Nor does the configuration of the detector appear to determine which cell is illuminated.
It appears so to you, without any stringent argument. This is your most questionable assumption.
PeroK said:
The variable interaction with the detector appears to be determined, in standard QM fashion, by the intermediate evolution associated with the interaction with the single slit.
The spatial probability distribution is determined by the squared amplitude at each detector position.
PeroK said:
My point is simply that the interaction with the slit alone appears sufficient to produce different, probabilistic outcomes.
The slit only produces the spherical wave, nothing more, which defines a mathematical probability distribution, but no actual outcomes, hence no randomness - the spherical symmetry is preserved before meeting the detector.

But the outcomes are produced by the detector, according to this distribution. The observed emerging detection pattern is fully consistent with the assumption that each local neighborhood of the detector responds independently, with a tiny probability proportional to the tiny strength of the impacting wave at that neighborhood. Thus it is natural to assign the randomness to the unknown details of each neighborhood.
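This detector picture is easy to simulate: take the normalized single-slit intensity as the per-neighborhood response probability and sample detection events from it; the familiar diffraction histogram emerges from the independent local responses. A minimal sketch (the detector grid and units are invented for illustration, with the slit geometry absorbed into the coordinate x):

```python
import math
import random

random.seed(1)

# Single-slit Fraunhofer intensity, sinc^2(x), as the response weight.
def intensity(x):
    return 1.0 if x == 0 else (math.sin(x) / x) ** 2

cells = [i * 0.1 for i in range(-100, 101)]    # 201 detector cells
weights = [intensity(x) for x in cells]

# Each neighborhood responds independently with a tiny probability
# proportional to the impinging intensity; aggregated over many weak
# pulses, this amounts to weighted sampling from the distribution.
counts = {x: 0 for x in cells}
for _ in range(20_000):
    x = random.choices(cells, weights=weights, k=1)[0]
    counts[x] += 1

central = counts[0.0]       # central maximum dominates
edge = counts[cells[0]]     # far wing is nearly empty
```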
PeroK said:
That is sufficient to class the experiment as non-deterministic.
But not sufficient to pinpoint the reason.
PeroK said:
It may be the case that the states of the source and detector determine the outcome, but it does not seem necessary to consider those. Nor the precise state of the intermediate slit.
So how do the mere probabilities create the outcomes?
PeroK said:
Especially, and I labour the point, since the probabilities appear to respect the intermediate evolution of the electron state.
Only after the slit and before the detector. There the probabilities (or rather probability amplitudes) evolve deterministically.
PeroK said:
No additional mathematics is needed to calculate the required probabilities!
But without additional mathematics you don't get any outcomes, hence no randomness.
PeroK said:
I thought I was taking more an experimental view of establishing which components appear to influence the probabilities of the outcomes. In the single-slit it appears that the calculation associated with the intermediate interaction with the slit is sufficient to produce the probabilities that describe the outcome.
There is a difference between producing probabilities and producing outcomes - this is what @Demystifier repeatedly pointed out. An outcome is an actual change in the detector, but a probability distribution is just an idea in our mind, and cannot cause such an outcome. While the interaction with the detector can and does!
PeroK said:
I acknowledge we can't have an experiment without a source and detector. But, we can calculate the relevant probabilities without reference to the precise state of either.
This is just shut-up-and-calculate, about which there was never any dispute. But calculations have no causal power, only interactions have!
 
  • #70
A. Neumaier said:
There is a difference between producing probabilities and producing outcomes - this is what @Demystifier repeatedly pointed out. An outcome is an actual change in the detector, but a probability distribution is just an idea in our mind, and cannot cause such an outcome. While the interaction with the detector can and does!
I don't agree with that at all. And I don't know how many physicists would. In all of physics you do calculations based on certain criteria (whether that's wave mechanics or Feynman diagrams or simple parabolic motion). These calculations produce the predictions of experiments. The predictions are generally not produced by an analysis of the interaction at the detection event.

Feynman diagrams are just abstract calculations. But, that's how scattering cross sections are calculated. Not from a detailed analysis of the state of the detector.

You're a heavyweight physicist and I'm a rank amateur. But, I know enough to be skeptical of what you are saying.
 
  • #71
PeroK said:
In all of physics you do calculations based on certain criteria (whether that's wave mechanics or Feynman diagrams or simple parabolic motion). These calculations produce the predictions of experiments. The predictions are generally not produced by an analysis of the interaction at the detection event.

Feynman diagrams are just abstract calculations. But, that's how scattering cross sections are calculated.
I fully agree with that. But it has nothing to do with what I wrote.

Calculations produce the predictions. But they have no causal physical power.

What happens is not a result of calculations with models of physics, but of interactions of actual physical systems!

To produce a decay (and hence a system to be measured) you need to have a real source whose interactions produce the decay. This is what @Demystifier tries to deny.

And to produce an outcome (and hence a set of random numbers to be compared with probabilistic predictions) you need to have a real detector whose interactions produce the outcome. This is what you try to deny.

No amount of calculated probabilities can change these two general observations.
 
  • #72
PeroK said:
My point is that, generally, the evolution of a superposition can be considered the natural evolution of different, probabilistic outcomes.
Even this is interpretation dependent. In the MWI, for example, the unitary evolution of the state is completely deterministic, even including measuring devices, since those just get included in entangled superpositions as measurement interactions occur. There is no randomness or probabilistic outcome anywhere.
 