Question about discussions around quantum interpretations

  • #121
ojitojuntos said:
If randomness arises at measurement, and we can’t pinpoint how the collapse occurs, why is it that most physicists (according to polls I’ve seen online) consider that reality is fundamentally probabilistic, instead of deterministic, but with epistemic uncertainty?
Is this correct? Or, in a simpler sense, do experimental results and the math seem to lean more towards fundamental probabilities?
Yes, that is the basic message when experiment and QM violate Bell's inequality.

Given some assumptions (that follow from a basic classical picture), if the results are only epistemic (in the sense of being due to physicists' ignorance), then the inequality must hold - but it doesn't! That is the problem with the idea "deterministic, but with epistemic uncertainty".

/Fredrik
 
  • #122
Fra said:
Given some assumptions (that follow from a basic classical picture), if the results are only epistemic (in the sense of being due to physicists' ignorance), then the inequality must hold
But there are QM interpretations where the results are epistemic and the assumptions you refer to are violated. For example, the Bohmian interpretation, in which the probabilities are purely due to our ignorance of the actual particle positions.
 
  • #123
Fra said:
that is the basic message when experiment and QM violate Bell's inequality.
Not quite. It's true that "deterministic, but with epistemic uncertainty" forces you to accept something like the Bohmian interpretation.

But "fundamentally probabilistic" forces you to accept that even though there is no way even in principle to predict in advance what the experimental results will be, the results for entangled particles measured at distant locations still have to obey the constraints imposed by the overall quantum state of the system. For example, measurements of spin around the same axis on two entangled qubits in the singlet state will always give opposite results. That always is what makes it very hard to see how a "fundamentally probabilistic" underlying physics could work--how could it possibly guarantee such a result every time?

In short, the real "basic message when experiment and QM violate Bell's inequality" is that nobody has a good intuitive picture of what's going on. There is no interpretation that doesn't force you to accept something that seems deeply problematic.
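To make the singlet prediction above concrete, here is a minimal numpy sketch (my own illustration, not anything from the thread): it builds the singlet state, forms spin observables along arbitrary axes in the x-z plane, and computes the correlation E(a, b), which the formalism fixes at -cos(a - b), hence exactly -1 whenever the two axes coincide.

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def spin_op(theta):
    """Spin observable along an axis at angle theta in the x-z plane."""
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet state (|01> - |10>) / sqrt(2)
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def correlation(a, b):
    """E(a, b) = <singlet| A(a) x B(b) |singlet>; QM gives -cos(a - b)."""
    return np.real(singlet.conj() @ np.kron(spin_op(a), spin_op(b)) @ singlet)

print(correlation(0.0, 0.0))        # ≈ -1.0: same axis, always opposite
print(correlation(0.0, np.pi / 3))  # ≈ -0.5 = -cos(60 degrees)
```

The "always" in the text is the first line of output: perfect anticorrelation on every run, not just on average.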
 
  • #124
ojitojuntos said:
Hello guys. OP again. I appreciate the discussion and thorough explanations for a layman. I have one more question about the measurement problem:
If randomness arises at measurement, and we can’t pinpoint how the collapse occurs, why is it that most physicists (according to polls I’ve seen online) consider that reality is fundamentally probabilistic, instead of deterministic, but with epistemic uncertainty?
Is this correct? Or, in a simpler sense, do experimental results and the math seem to lean more towards fundamental probabilities?
This thread has highlighted that ultimately it is perhaps a matter of personal taste whether "nature is fundamentally probabilistic" or not. Let's go back to the simple example of a single radioactive atom. Taken at face value, there is nothing in the description of the atomic state that determines when the atom will decay. That is nature being probabilistic at the fundamental level.

However, saying that the atom decays entails the complication of measurement and a suitable measurement apparatus. And, you could claim that if the state of everything in the experiment was known, then you would know in advance when the atom was measured to decay. And, no one can disprove this claim.

Moreover, given the complexity of a macroscopic measurement device, it's practically (and perhaps even theoretically) impossible to know its precise state. You would need to start by measuring the measuring device - entailing a much more extensive measurement problem.

I can't speak for professional physicists, but my instinct is to accept the first (probabilistic) analysis. It feels closer to what nature is telling us. The second analysis seems to impose our thinking on nature: no matter how loudly nature appears to be telling us that it's fundamentally probabilistic, we appeal to an innately human demand for determinism to explain away the apparent probabilities, and insist that under it all there is actually pure determinism at work.
 
  • #125
PeterDonis said:
Not quite. It's true that "deterministic, but with epistemic uncertainty" forces you to accept something like the Bohmian interpretation.
PeterDonis said:
In short, the real "basic message of when experiment and QM violates Bells inqeuality" is that nobody has a good intuitive picture of what's going on. There is no interpretation that doesn't force you to accept something that seems deeply problematic.
Fair enough, but I didn't even count Bohmian mechanics, as it introduces so many new issues that are far more unpalatable to me than the original problem, seeking increasingly more "improbable" loopholes, etc.

I only pay attention to it when Demystifier has a "bad day" and presents it like this.

/Fredrik
 
  • #126
Fra said:
I didn't even count Bohmian mechanics, as it introduces so many new issues that are far more unpalatable to me than the original problem
As I said, every QM interpretation has features that are unpalatable. It's just a question of what kinds of unpalatability you prefer to accept.
 
  • #127
ojitojuntos said:
Hello guys. OP again. I appreciate the discussion and thorough explanations for a layman. I have one more question about the measurement problem:
If randomness arises at measurement, and we can’t pinpoint how the collapse occurs, why is it that most physicists (according to polls I’ve seen online) consider that reality is fundamentally probabilistic, instead of deterministic, but with epistemic uncertainty?
Is this correct? Or, in a simpler sense, do experimental results and the math seem to lean more towards fundamental probabilities?
The following text excerpt may be helpful (from: Rudolphina - Universität Wien https://rudolphina.univie.ac.at/en/quantum-physics-demands-a-new-understanding-of-reality):

What is the difference between probabilistic and deterministic descriptions?

Classical physics describes the world in a deterministic way—which means that we can predict outcomes by thinking about how events would certainly unfold under ideal conditions. Probabilistic descriptions, on the other hand, are only able to say how probable a given measured result is. Classical physics also works with probabilistic descriptions, "but these are merely an expression of the fact that we do not know the true circumstances," says Časlav Brukner. In other words, probabilities in classical physics merely reflect our ignorance. "Yet this is not the case in quantum physics, where probabilities occur in a fundamental, non-reducible way—there is no deterministic cause behind them." This means that the world, at its core, is indeterminate.
 
  • #128
Fra said:
Fair enough, but I didn't even count Bohmian mechanics, as it introduces so many new issues that are far more unpalatable to me than the original problem, seeking increasingly more "improbable" loopholes, etc.

I only pay attention to it when Demystifier has a "bad day" and presents it like this.

/Fredrik
Which features of Bohm do you find unpalatable, out of interest?
 
  • #129
PeterDonis said:
There is no interpretation that doesn't force you to accept something that seems deeply problematic.

And many, like me, think that is because we have no DIRECT experience with the quantum world. It is tough to apply how we think about the world, shaped by experience in the macro world, to something we have no experience of.

Even further, the Effective Field Theory approach to QFT seems to be saying that, if you accept principles such as cluster decomposition (concepts rooted in direct experience of the everyday world), then for the regions we can currently probe (i.e. including the part we have no direct experience of), things can't be other than what QM says:

https://en.wikipedia.org/wiki/Effective_field_theory
Steven Weinberg's "folk theorem" stipulates how to build an effective field theory that is well behaved. The "theorem" states that the most general Lagrangian that is consistent with the symmetries of the low energy theory can be rendered into an effective field theory at low energies that respects the symmetries and respects unitarity, analyticity, and cluster decomposition.

Its physical content seems to be in abstract mathematical concepts like symmetry, and in constants that must be put in by hand - we have zero idea where they come from (as of now).

What was the title of that crazy 1963 movie - It's a Mad, Mad, Mad, Mad World.

As far as meaning goes, even the so-called Measurement problem is still debated after all these years:


Thanks
Bill
 
  • #130
bhobba said:
many, like me, think that is because we have no DIRECT experience with the quantum world
It's not just that we have no direct experience with the quantum world. It's that we do have tons of direct experience of a world that is not quantum--that behaves classically. Yet there doesn't seem to be any way to get that classical world out of QM without having to believe something very unpalatable.

bhobba said:
the Effective Field Theory approach to QFT seems to be saying that, if you accept principles such as cluster decomposition (concepts rooted in direct experience of the everyday world), then for the regions we can currently probe (i.e. including the part we have no direct experience of), things can't be other than what QM says
Only if you interpret those principles very loosely.

For example, take "cluster decomposition". There are different statements of it in the literature, but basically it boils down to, distant experiments can't influence each other. Which sounds fine until you run into Bell inequality violations and other counterintuitive results of experiments on entangled particles--for example, if you and I each measure one of a pair of entangled qubits in the singlet state, you're on Earth and I'm on a planet circling Alpha Centauri, and our measurements are both along the same spin axis, we must get opposite results. How in tarnation can that happen if the measurements can't influence each other?

The usual answer involves words like "nonlocality", but that's not actually an answer, it's just a restatement of the problem. Nobody has a good answer to how it can be like that. We have a very good answer as to what theoretical framework to use to make predictions--yes, at sufficiently low energies, that's going to be a QFT of one form or another, as the effective field theory approach says. But as for what's going on "under the hood" that makes those QFT predictions work out? Nobody has a good answer.
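For readers who want to see the tension quantitatively, here is a short numpy sketch (my own, using the conventional CHSH measurement angles): it evaluates the CHSH combination S from the singlet correlations. Any factorizable local model is bounded by |S| <= 2, while the quantum prediction reaches 2*sqrt(2).

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    """Singlet correlation for spin measurements along angles a and b (x-z plane)."""
    A = np.cos(a) * sz + np.sin(a) * sx
    B = np.cos(b) * sz + np.sin(b) * sx
    return np.real(singlet.conj() @ np.kron(A, B) @ singlet)

# Standard CHSH angle choices; any factorizable (local hidden variable)
# model must satisfy |S| <= 2.
a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(abs(S))  # ≈ 2.828, i.e. 2*sqrt(2) > 2
```

The number itself is uncontroversial; the whole interpretational debate is about what is going on "under the hood" to produce it.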
 
  • #131
iste said:
Which features of Bohm do you find impalatable, out of interest?
One issue is going from basic non-relativistic QM to QFT.

Our own Demystifier examines some of the issues:
https://arxiv.org/abs/2205.05986

Thanks
Bill
 
  • #132
PeterDonis said:
The usual answer involves words like "nonlocality", but that's not actually an answer, it's just a restatement of the problem. Nobody has a good answer to how it can be like that.

Aside from the math, of course—that explains it no problem—but 'shut up and calculate' seems to many just a copout (I am one of those many).

As usual, excellent post, Peter—kudos.

I only want to mention I prefer factorizability to locality, as QM is Bell non-local rather than non-local in the usual sense:
https://plato.stanford.edu/entries/bell-theorem/
'The principal condition used to derive Bell inequalities is a condition that may be called Bell locality, or factorizability. It is, roughly, the condition that any correlations between distant events be explicable in local terms, as due to states of affairs at the common source of the particles upon which the experiments are performed.'

I wince at some of my past posts on this, before the distinction penetrated my thick skull (your posts were part of my finally understanding this).

Thanks
Bill
 
  • #133
bhobba said:
I prefer factorizability to locality
Note, though, that factorizability is how we would intuitively express the cluster decomposition principle. So this doesn't really help as far as our intuitions are concerned.

In terms of mathematical precision, of course, factorizability wins since "locality" is vague and can be interpreted different ways.
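A quick numerical way to see the failure of factorizability (my own sketch, not from the posts): for the singlet measured along the same axis, the joint probability that both spins come out "up" is 0, while the product of the two marginal probabilities is 0.25, so the joint distribution cannot be a product of the marginals.

```python
import numpy as np

# Singlet state (|01> - |10>) / sqrt(2) in the z basis |AB>: |00>, |01>, |10>, |11>
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

# Probabilities of the four same-axis outcomes (++, +-, -+, --)
p_joint = np.abs(singlet) ** 2      # [0, 0.5, 0.5, 0]
p_A_up = p_joint[0] + p_joint[1]    # marginal P(A = +) = 0.5
p_B_up = p_joint[0] + p_joint[2]    # marginal P(B = +) = 0.5

print(p_joint[0], p_A_up * p_B_up)  # 0.0 vs 0.25: not factorizable
```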
 
  • #134
  • #135
bhobba said:
From what I can see, the article (and its first part--what you linked to is the second part) talks about QBism and consistent histories.

QBism is fine as far as it goes--but it doesn't go very far.

QBism says that the quantum state isn't physically real; it's not a direct representation of the actual, physical state of the system. QBism says that the probabilities in QM are Bayesian--they're descriptions of our state of knowledge about whatever physical system we're trying to make predictions about.

But QBism doesn't say anything about what the actual, physical state of the system is--and that's what most people seem to want from a QM interpretation. QBism doesn't give it.

Consistent histories, from what I can tell, is just restating decoherence theory, and then waffling about whether that amounts to there really being just one history (meaning there's an actual collapse somewhere) or not (meaning no collapse and many worlds).
 
  • #136
PeterDonis said:
In terms of mathematical precision, of course, factorizability wins since "locality" is vague and can be interpreted different ways.

For those following along, you may be wondering how Weinberg, who without doubt was aware of Bell, accepted the cluster decomposition property.

I can't give the details (page number, etc.). Still, if I recall correctly, Weinberg addresses this early on in his justly famous Quantum Theory of Fields (great as a reference but not good for a first exposure - for that, if you are mathematically advanced enough to know some functional analysis, I suggest 'What Is a Quantum Field Theory? A First Introduction for Mathematicians' by Michel Talagrand, which I am currently studying - just basic HS QM required, believe it or not).

Anyway, here is a recent take on it:
https://arxiv.org/html/2501.12018v1

Thanks
Bill
 
  • #137
PeterDonis said:
Consistent histories, from what I can tell, is just restating decoherence theory, and then waffling about whether that amounts to there really being just one history (meaning there's an actual collapse somewhere) or not (meaning no collapse and many worlds).

For those who want to investigate consistent histories further:
https://quantum.phys.cmu.edu/CHS/histories.html

Griffiths (no, not the Griffiths of the standard EM textbook fame) has kindly made his textbook available for free.

Thanks
Bill
 
  • #138
bhobba said:
here is a recent take on it
This paper basically seems to be saying that the correlations that break factorizability don't vanish when you look at the correct measurement operators--the ones that describe measurements at the spatially separated locations where they're actually made.
 
  • #139
PeterDonis said:
This paper basically seems to be saying that the correlations that break factorizability don't vanish when you look at the correct measurement operators--the ones that describe measurements at the spatially separated locations where they're actually made.

Yes.

But it says that the further separated they are, the more observations are needed to detect that they cannot be factored. I take cluster decomposition to mean experiments can always be separated far enough that, for all practical purposes, it is true.

'Nevertheless, the larger the spatial separation, the greater the amount of needed experimental data might become in order to make a violation of the Bell’s inequality visible.'

Thanks
Bill
 
  • #140
bhobba said:
it says that the further separated they are, the more observations are needed to detect that they cannot be factored
This would need to be experimentally tested. They don't give any specific numbers, but there are experiments showing Bell inequality violations in measurements at, IIRC, kilometer distances now.
 
  • #141
bhobba said:
I take cluster decomposition to mean experiments can always be separated far enough that, for all practical purposes, it is true.
That isn't the way cluster decomposition is presented in the literature, though. There's no claim that it breaks down for measurements that are spacelike separated, but not far enough.

I'm also not sure that cluster decomposition is really a necessary principle. The really necessary principle, I think, is that spacelike separated measurements have to commute--because their time ordering is not invariant, so it can't matter which one is done first. QFT obeys that principle exactly; Bell inequality violations don't violate it.
 
  • #142
PeterDonis said:
How in tarnation can that happen if the measurements can't influence each other?

The usual answer involves words like "nonlocality", but that's not actually an answer, it's just a restatement of the problem. Nobody has a good answer to how it can be like that. We have a very good answer as to what theoretical framework to use to make predictions--yes, at sufficiently low energies, that's going to be a QFT of one form or another, as the effective field theory approach says. But as for what's going on "under the hood" that makes those QFT predictions work out? Nobody has a good answer.
I once asked about this, but the question wasn't understood. For example, in the case you present, does anything prevent one particle from being in one state and the other in the opposite state by pure chance?

Suppose the above happens every time we measure—that is, every time we measure one particle it has spin down, and when we measure the other particle it always has spin up—can't that just happen by chance, without any intervening interaction? Does something in QM prevent that from happening, so that there must be an interaction (local or non-local)?
 
  • #143
javisot said:
does anything prevent one particle from being in one state and the other in the opposite state by pure chance?
"Pure chance" wouldn't make it happen every single time.

javisot said:
can't it just happen by chance
No, because "chance" would mean you would get different results on different runs. That's the definition of "chance". If you get opposite results every single time, that's not "chance".
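The point can be illustrated with a toy simulation (hypothetical, my own): if each particle's outcome really were independent "pure chance", the two results would disagree only about half the time, nowhere near the every-single-time anticorrelation the singlet state predicts.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible
trials = 10_000

# Two independent "pure chance" spins per trial; count how often they disagree.
opposite = sum(
    random.choice([-1, 1]) != random.choice([-1, 1])
    for _ in range(trials)
)

print(opposite / trials)  # ≈ 0.5, not 1.0
```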
 
  • #144
javisot said:
without any intervening interaction?
QM doesn't say there is an "interaction" between the two entangled particles. Certain QM interpretations do, but not all of them.
 
  • #145
PeterDonis said:
"Pure chance" wouldn't make it happen every single time.


No, because "chance" would mean you would get different results on different runs. That's the definition of "chance". If you get opposite results every single time, that's not "chance".
Does any theorem or principle prevent it, or is it your intuition that it's not possible? I also believe it's not possible, but I was wondering if something in the QM formalism prevents it.
 
  • #146
javisot said:
Does any theorem or principle prevent it
The definition of "chance" prevents it. That isn't something specific to QM.
 
  • #147
javisot said:
Does any theorem or principle prevent it, or is it your intuition that it's not possible? I also believe it's not possible, but I was wondering if something in the QM formalism prevents it.
How about Cournot’s principle?

See here for some books, presentations, and articles where it is discussed:
gentzen said:
And also “Scientific Reasoning : The Bayesian Approach” by Colin Howson and Peter Urbach (2006) ...
One other interesting discussion point in that book was that Cournot’s principle is inconsistent (or at least wrong), because in some situations any event which can happen has a very small probability. Glenn Shafer proposes to fix this by replacing “practical certainty” with “prediction”. He may be right. After all, I mostly learned about Cournot’s principle from his Why did Cournot’s principle disappear? and “That’s what all the old guys said.” The many faces of Cournot’s principle. Another possible fix could be to evaluate smallness of probabilities relative to the entropy of the given situations.
 
  • #148
PeterDonis said:
The definition of "chance" prevents it. That isn't something specific to QM.
The definition of chance in the Spanish dictionary is: "A combination of circumstances that cannot be predicted or avoided." The definition doesn't prevent a result from always occurring; it simply dictates that it's something that couldn't be predicted or avoided.

Could you be referring to another definition of chance? A mathematical definition perhaps? (I think your answer is correct; it's not something specific to QM, but rather to our understanding of probability)


So let's say that:

-There is entanglement produced by local interaction

- There is entanglement produced by non-local interaction

-There is no entanglement without local or non-local interaction

Right?
 
  • #149
javisot said:
The definition of chance
In physics, it is that you are using a model that predicts a random distribution of outcomes. If your model predicts the same outcome every single time, it's not "chance" as far as physics is concerned.

javisot said:
Could you be referring to another definition of chance?
I'm referring to the definition in physics. See above.

javisot said:
So let's say that:

-There is entanglement produced by local interaction

- There is entanglement produced by non-local interaction

-There is no entanglement without local or non-local interaction

Right?
I'm not sure because I'm not sure what you're trying to say with the above.

Particles become entangled by being prepared in an entangled state. The preparation process is local. But after being prepared in an entangled state, the entangled particles can be separated, to arbitrary distances, and measurements on them will still show the correlations predicted by the entangled quantum state. In the particular case of two spin-1/2 particles prepared in the singlet state, measurements of their spin in the same direction will always give opposite results.

That's the physics.
 
  • #150
PeterDonis said:
The definition of "chance" prevents it. That isn't something specific to QM.

Indeed.
https://math.ucr.edu/home/baez/bayes.html

'It turns out that a lot of arguments about the interpretation of quantum theory are at least partially arguments about the meaning of the probability!'

Thanks
Bill
 
