Bell's Inequalities and the issue of non-locality

  • #51
The problem with quantum theory is that it is not only a major part of physics, i.e., natural science, but also a subject of philosophy and esoterics/religion.

Quantum theory as a physical theory is the most successful description of Nature we have. There are no scientific problems of principle with it, although relativistic quantum field theory is only defined as an effective theory, i.e., in terms of renormalized (and sometimes resummed) perturbation theory, while a rigorous mathematical foundation of interacting QFTs in (1+3) spacetime dimensions has not yet been achieved. From the physical point of view, a satisfactory quantum formulation of the gravitational interaction is also still lacking.
 
  • #52
vanhees71 said:
"impossibility to prepare a state where all observables take determined values."

(emphasis mine)

Does SPDC occur naturally (i.e. in Nature)? Have we observed any?

Also, if we use the term attributes rather than observables, would that change our perception of "realism"?
 
  • #53
Interested_observer said:
(emphasis mine)

Does SPDC occur naturally (i.e. in Nature)? Have we observed any?

Also, if we use the term attributes rather than observables, would that change our perception of "realism"?
SPDC stands for Spontaneous Parametric Down Conversion. It requires a specially prepared nonlinear crystal, often made of beta barium borate (BBO). You often see SPDC abbreviated as simply PDC. I wouldn't call it "naturally occurring" in the usual meaning of that term; it requires a laser to drive it.
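In rough outline, the pump photon splits inside the nonlinear crystal into a "signal" and an "idler" photon, subject to energy conservation and the phase-matching (momentum) condition,
$$\omega_p = \omega_s + \omega_i , \qquad \vec{k}_p \approx \vec{k}_s + \vec{k}_i ,$$
which is why the process needs both the crystal and a driving pump laser.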

Of course, PDC is now a common tool for creating entangled photon pairs for experiments. It is used even in some undergraduate courses. For example, see the following paper which has useful background on PDC:

Entangled photons, nonlocality and Bell inequalities in the undergraduate laboratory
 
  • #54
I'm new around here, so please go kind of easy on me. This quantum uncertainty stuff seems needlessly complicated. We know photons are created from electrons jumping to a lower valence shell, or a lower energy level. And we've had numerous experiments show that measured characteristics of a photon match the characteristics upon emission. This tells me two things: a structure has sacrificed a piece of itself, call it entropy, and that piece, the photon, will match the characteristics of the atom from whence it came.

Additionally, in practical terms I think of an electron releasing energy via a photon as roughly equivalent to water becoming steam. Electrons serve as an atom's energy buffer. Photons are essentially disembodied electrons seeking to offload excess energy. And the process works in reverse. That's why we get warm from sunlight.

If this is true, then is it a stretch to say all photons are encoded with set characteristics upon emission, then those characteristics are decoded upon measurement? Maybe indeterminism and probabilistic behavior are only our lack of understanding, not what's actually happening?
 
  • #55
Tec_Pirate said:
We know photons are created from electrons jumping to a lower valence shell, or a lower energy level
That's one way of creating photons, but by no means the only one. It is not what happens, for example, in parametric down conversion.

Tec_Pirate said:
we've had numerous experiments show that measured characteristics of a photon match the characteristics upon emission.
Please give specific references. I suspect you are misinterpreting whatever references you have read, since in many cases we do not even measure "the characteristics upon emission" of photons.

Tec_Pirate said:
This tells me two things: a structure has sacrificed a piece of itself, call it entropy, and that piece, the photon, will match the characteristics of the atom from whence it came.

Additionally, in practical terms I think of an electron releasing energy via a photon as roughly equivalent to water becoming steam. Electrons serve as an atom's energy buffer. Photons are essentially disembodied electrons seeking to offload excess energy.
This is personal speculation and is off limits here. You have already received two warnings for personal speculation posts in other threads.

Tec_Pirate said:
the process works in reverse. That's why we get warm from sunlight.
It's true that we get warm from sunlight because our bodies absorb photons from sunlight. But the way that process works is not what you appear to think.

Tec_Pirate said:
is it a stretch to say all photons are encoded with set characteristics upon emission, then those characteristics are decoded upon measurement? Maybe indeterminism and probabilistic behavior are only our lack of understanding, not what's actually happening?
No, this can't be correct; we know that because if what you say were true, we would not observe violations of the Bell inequalities in experiments. But we do.
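To state the bound concretely, in its standard CHSH form: if outcomes were fixed by properties $\lambda$ carried from the source, then the correlations $E$ for two setting choices per side would have to satisfy
$$S = E(a,b) - E(a,b') + E(a',b) + E(a',b') , \qquad |S| \le 2 ,$$
whereas quantum mechanics predicts, and experiments with entangled photon pairs confirm, values up to $|S| = 2\sqrt{2} \approx 2.83$.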
 
  • #56
I get that PF has rules against speculation, and I’m not trying to push a personal model — just trying to engage honestly with the limits of what we do know.

Quantum mechanics, for all its success, is descriptive, not predictive in the ontological sense. It gives us a statistical framework to guess measurement outcomes, not to explain underlying causes. It doesn’t tell us what a photon is or how an entangled state “knows” what result to yield. It just gives us probabilities that match experiments — and that’s fine, but let’s not confuse a working model with a complete one.

If quantum theory fully explained what’s going on, we wouldn’t still be arguing over interpretations, realism, or whether nonlocality is real or just apparent. That’s not a fringe position — even prominent physicists like Fuchs, Rovelli, or ‘t Hooft have acknowledged deep cracks in our foundational understanding.

My goal isn’t to rewrite QM, but to better understand where its explanatory boundaries are — and maybe why those boundaries exist. If anyone has references that explore this divide — the gap between successful prediction and lack of explanation — I’m all ears.
 
  • #57
Tec_Pirate said:
My goal isn’t to rewrite QM, but to better understand where its explanatory boundaries are — and maybe why those boundaries exist. If anyone has references that explore this divide — the gap between successful prediction and lack of explanation — I’m all ears.
This is way off topic for this thread. It's also way too vague and general for a PF thread. There are already plenty of threads--in the QM interpretations subforum--on this general subject. Please take the time to read them. They include references to the literature on the subject, which is also fairly extensive--and some of which, at least, you already appear to have read. When you have a specific question based on something specific in the literature, then you can start a new thread of your own based on that (and the best place to do that would probably be in the QM interpretations subforum, not this one). Please don't hijack someone else's thread with off topic posts.
 
  • #58
Tec_Pirate said:
I get that PF has rules against speculation, and I’m not trying to push a personal model — just trying to engage honestly with the limits of what we do know.

Quantum mechanics, for all its success, is descriptive, not predictive in the ontological sense. It gives us a statistical framework to guess measurement outcomes, not to explain underlying causes. It doesn’t tell us what a photon is or how an entangled state “knows” what result to yield. It just gives us probabilities that match experiments — and that’s fine, but let’s not confuse a working model with a complete one.

If quantum theory fully explained what’s going on, we wouldn’t still be arguing over interpretations, realism, or whether nonlocality is real or just apparent. That’s not a fringe position — even prominent physicists like Fuchs, Rovelli, or ‘t Hooft have acknowledged deep cracks in our foundational understanding.

My goal isn’t to rewrite QM, but to better understand where its explanatory boundaries are — and maybe why those boundaries exist. If anyone has references that explore this divide — the gap between successful prediction and lack of explanation — I’m all ears.
This thread is not about the “incompleteness” of QM. That is actually an issue belonging in the Interpretations/Foundations subforum. And in fact there are many existing threads there discussing this issue in “depth” lol.

You didn’t mention the 1935 EPR paper, which is an important starting point for any such discussion. If you are not familiar with it, give it a read.
 
  • #59
Tolga T said:
As for Bell's assumptions, my primary concern is localism, not realism.

Just as a comment here. The two are connected in some sense. The principal condition used to derive Bell inequalities is a condition called Bell locality, or factorisability. It is, roughly, the condition that any correlations between distant events be explicable in local terms.

Factorisability is the better term, but the two are often used interchangeably. Even recognised experts like Tim Maudlin, who fundamentally believes in realism (as I do, actually), sometimes get confused and say things that are not strictly true. Bell locality is NOT the same as locality as it is usually used in physics, i.e., that an object is influenced directly only by its immediate surroundings.
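Written in the standard notation: for setting choices $x, y$, outcomes $a, b$, and shared variables $\lambda$, factorisability is the requirement that
$$P(a,b \mid x,y,\lambda) = P(a \mid x,\lambda)\,P(b \mid y,\lambda) ,$$
and it is correlations of this factorised form, averaged over any distribution $\rho(\lambda)$, that must obey the Bell inequalities.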

Suppose two particles are entangled. If the state were factorisable, we could still regard them as separate particles A and B. However, when entangled, you cannot assign a state to either particle on its own; they are no longer two distinct particles but a single quantum state, so the idea becomes meaningless, at least superficially. What Bell proved is that you can't do that (i.e., still treat them as separate particles A and B) and keep locality. But as I said, basic QM suggests something funny is going on anyway (it is tied up with the principle of superposition). Bell proved QM is not locally factorisable, which is the humour of Feynman's famous remark.
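The standard spin-singlet pair is the simplest illustration: the two particles are described by the single state
$$|\psi\rangle = \tfrac{1}{\sqrt{2}}\bigl(|{\uparrow}\rangle_A |{\downarrow}\rangle_B - |{\downarrow}\rangle_A |{\uparrow}\rangle_B\bigr) ,$$
which cannot be written as a product $|\phi\rangle_A \otimes |\chi\rangle_B$ of separate one-particle states.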



If you believe in 'naive' realism, you think it is still two separate particles when entangled. Some get confused, but what Bell showed is that QM is not Bell local.

So in QM, what is real? QM is merely an approximation to a more comprehensive theory, Quantum Field Theory (QFT). Most physicists believe the quantum field is real - it possesses energy, among other properties - and, as a result of E = mc^2, mass is considered a form of energy. According to the criterion of Victor Stenger (an interesting individual; look him up), reality is what "kicks back." Mass is undoubtedly thought of as real (it certainly kicks back), so quantum fields are considered real. Particles are sort of like 'knots' in the quantum field. Hence, they are real. But separate? When two particles are entangled, they can be thought of as a sort of single double-knot; each particle has lost its individuality. The 'double knot' is real, but each particle on its own is not. It's possible to have an observation (an interaction) that turns the double knot back into two single knots. The entanglement is then broken. An analogy would be a single string that is cut into two strings. Nothing non-local or weird in that analogy. Indeed, QFT combines Special Relativity and QM, so it must be local in the ordinary sense.

Thanks
Bill
 
  • #60
Tec_Pirate said:
It doesn’t tell us what a photon is

See, the fact that you wrote this is concerning, since we have QED and it tells us quite clearly what a photon is.

Tec_Pirate said:
This quantum uncertainty stuff seems needlessly complicated.

Maybe to someone who does not know the math and relies on pop-sci, lay-person descriptions. But if one does not know the math, then in my opinion one has no basis for an opinion on the matter.
 