Bell's Inequalities and the issue of non-locality

Summary:
Quantum mechanics (QM) is incompatible with Bell's inequalities: experiments have confirmed that QM predictions violate these inequalities, indicating that no local hidden-variable theory can replicate QM outcomes. The discussion highlights that while entangled particles exhibit correlations, these cannot be explained by local deterministic processes, as QM suggests inherent randomness in measurement outcomes. The concept of realism, which posits that all observables have predetermined values, is challenged by QM, which asserts that such values cannot be defined prior to measurement. Theories attempting to maintain locality while adhering to realism fail to account for the observed violations of Bell's inequalities. Ultimately, QM's predictions have been consistently validated by experiment, reinforcing its foundational principles.
  • #31
Nugatory said:
Furthermore these experiments are routinely done with the two measurements spacelike-separated so that not only is the time interval between A’s measurement and B’s measurement not defined, there is no way of saying which came first.
Of course, the experiments are performed with the two measurements spacelike separated, so one cannot speak of the temporal order of the measurements. However, my hypothetical experiment requires a different setting: perform the measurement events timelike separated (while keeping the particles in a coherent state until the measurements, probably a delay of a tiny fraction of a second would suffice).
 
  • #32
Tolga T said:
However, my hypothetical experiment requires a different setting: perform the measurement events timelike separated (while keeping the particles in a coherent state until the measurements, probably a delay of a tiny fraction of a second would suffice).
If you mean “a tiny fraction of a second” (as opposed to “5 minutes or 1 day”, which is what you said), then many experiments will qualify.
 
  • #33
Nugatory said:
1. You may be overlooking the relativity of simultaneity here. If the time interval between measurements at the two sites affects whether the inequality is violated, we would find that some observers watching a single run of the experiment will find a violation of the inequality in the results of that run while others looking at the same sequence of measurement results do not. This possibility is difficult to take seriously.

Furthermore these experiments are routinely done with the two measurements spacelike-separated so that not only is the time interval between A’s measurement and B’s measurement not defined, there is no way of saying which came first.

2. But with that said, there are practical limits to how long we can maintain a pair of particles in a coherent state, and hence to the experiments that have actually been done.
Tolga T said:
3. However, my hypothetical experiment requires a different setting: perform the measurement events timelike separated (while keeping the particles in a coherent state until the measurements, probably a delay of a tiny fraction of a second would suffice).

1. For spin/polarization tests on entangled pairs: I don't believe there are any frame (relativistic) considerations for the QM predictions. So I would say the Bell inequalities would be violated for all observers. Do you know of any situations otherwise?

2. As an interesting aside: It is hard for me to imagine, but coherent storage of light has been performed for as long as 1 hour. This was not an entangled pair though.

https://www.nature.com/articles/s41467-021-22706-y

3. Yes, you can certainly perform Bell tests in which temporal order is certain (keeping in mind that it makes no difference to the statistical results whether Alice or Bob measures first).

And another experiment which is mind blowing to me: you can entangle 2 photons that have never existed at the same time. One photon is measured by Alice before the Bob photon is even created. (The entanglement itself is created after Alice's measurement, via entanglement swapping.)

https://arxiv.org/abs/1209.4191
 
  • #34
DrChinese said:
1. For spin/polarization tests on entangled pairs: I don't believe there are any frame (relativistic) considerations for the QM predictions. So I would say the Bell inequalities would be violated for all observers. Do you know of any situations otherwise?
Among the many great achievements by Zeilinger et al. are experiments where the choice of the measurement setups and the registration of two entangled photons were made at space-like separated events (I think in the late 1990s; I'd have to google to find the corresponding paper(s)).
DrChinese said:
2. As an interesting aside: It is hard for me to imagine, but coherent storage of light has been performed for as long as 1 hour. This was not an entangled pair though.

https://www.nature.com/articles/s41467-021-22706-y
DrChinese said:
3. Yes, you can certainly perform Bell tests in which temporal order is certain (keeping in mind that it makes no difference to the statistical results whether Alice or Bob measures first).

And another experiment which is mind blowing to me: you can entangle 2 photons that have never existed at the same time. One photon is measured by Alice before the Bob photon is even created. (The entanglement itself is created after Alice's measurement, via entanglement swapping.)

https://arxiv.org/abs/1209.4191
All this is only mind-blowing if you think that the observed correlations between far-distant parts are due to "spooky actions at a distance", i.e., caused by influences of the different local measurements on these parts. You have to think in terms of microcausal relativistic QFT to realize that such causal influences are not to be expected. As discussed many times before in this forum, entanglement swapping is possible because you use entangled pairs prepared before manipulating and measuring the various photons involved in these experiments.

The strong correlations between far-distant entangled parts of systems, for which the measured properties of each single part are even maximally uncertain, are a property of the quantum states, i.e., due to the preparation of the systems in these states and not due to "spooky actions at a distance". Of course, you have to give up the idea that there is more to know about the observables than the probabilistic properties described by Q(F)T.
 
  • #35
Tolga T said:
Of course, the experiments are performed with the two measurements spacelike separated, so one cannot speak of the temporal order of the measurements. However, my hypothetical experiment requires a different setting: perform the measurement events timelike separated (while keeping the particles in a coherent state until the measurements, probably a delay of a tiny fraction of a second would suffice).
The fundamental point you are missing is that there is a limit to anti-correlations in classical probability theory. Everything can't be perfectly anti-correlated with everything else. Suppose you have a game for three people. Each person tosses a coin. You can have a perfect correlation, where either a) they all get heads or b) they all get tails. If one of those things always happens, that's a perfect correlation.

But, you can't have a perfect anti-correlation. If person A gets a Head and person B gets a Tail, then person C must get the same as one of them. Bell's inequality takes this idea further and looks at the maximal possible anti-correlation in the case of electron spin about different axes. If you disagree with Bell's inequality, then you disagree with classical probability theory. Bell's inequality has nothing directly to do with QM.

To get round Bell's inequality is as impossible as having three people each toss a coin and all get something different, assuming the only possibilities are heads and tails. You can have any stochastic process you like, you can have time delays (or not), you can have coins in a "coherent" state or not.
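The three-coin impossibility can be checked exhaustively. A minimal Python sketch (my own illustration, not part of the original post) enumerates all eight outcomes and confirms that at least one pair of coins always matches:

```python
from itertools import product

# Enumerate all 2^3 outcomes of three coin tosses (H/T).
outcomes = list(product("HT", repeat=3))
assert len(outcomes) == 8

# Perfect correlation IS possible: HHH and TTT both occur.
assert ("H", "H", "H") in outcomes and ("T", "T", "T") in outcomes

# Perfect pairwise anti-correlation is IMPOSSIBLE: in every outcome,
# at least one of the pairs (A,B), (A,C), (B,C) shows equal faces,
# because with only two faces some pair must match (pigeonhole).
for a, b, c in outcomes:
    assert a == b or a == c or b == c
print("every outcome has at least one matching pair")
```

No stochastic process for producing the tosses can change this, since the constraint holds outcome by outcome.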

Now, Bell's inequality is not as elementary as the three-coin game, but it is not difficult to prove mathematically.

If you apply Bell's inequality to electron spin, then you assume "realism" (in some sense), locality (in some sense) and classical probability theory. This is Bell's Theorem. If the inequality is violated experimentally (as it is), then you have to give up either realism, locality or classical probability theory (or all three). There is no way round that. Trying to get round that simply means you have not understood Bell's inequality in the first place.

The violation of Bell's inequality is not a problem for QM, because QM predicted the violation in the first place. Don't underestimate the significance of this and why people who trust QM can't see the issue that others raise. Everything is as predicted by QM, so where's the problem?

The thing you cannot do is pretend that somehow you can get round Bell's inequality and thereby recover realism, locality, and classical variables. That paradigm is gone.
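For readers who want to see the numbers, here is a short Python sketch (my own illustration, using the standard CHSH form of Bell's inequality) that brute-forces every local deterministic assignment of ±1 outcomes and compares it with the quantum singlet prediction E(α, β) = −cos(α − β):

```python
import math
from itertools import product

# Local deterministic hidden variables: each run pre-assigns +/-1 outcomes
# to both of Alice's settings (a, a') and both of Bob's (b, b').
def S_local(a, ap, b, bp):
    return a*b - a*bp + ap*b + ap*bp

# Brute force over all 16 assignments: the CHSH bound |S| <= 2.
classical_max = max(abs(S_local(*v)) for v in product([+1, -1], repeat=4))
print(classical_max)  # 2

# Quantum prediction for the spin singlet: E(alpha, beta) = -cos(alpha - beta).
def E(alpha, beta):
    return -math.cos(alpha - beta)

a, ap = 0.0, math.pi / 2            # Alice's two measurement angles
b, bp = math.pi / 4, 3 * math.pi / 4  # Bob's two measurement angles
S_qm = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(abs(S_qm))  # ~2.828, i.e. 2*sqrt(2) > 2: the inequality is violated
```

The classical bound of 2 holds for every possible pre-assignment, so no probabilistic mixture of such assignments can exceed it; that is exactly the content of the inequality.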
 
  • Like
Likes vanhees71 and gentzen
  • #36
vanhees71 said:
You have to think in terms of microcausal relativistic QFT to
Hmmm, since when does thinking in those terms actually enlighten anyone? :oldbiggrin:

/Fredrik
 
  • #37
PeroK said:
The thing you cannot do is pretend that somehow you can get round Bell's inequality and thereby recover realism, locality, and classical variables. That paradigm is gone.
I fully agree, the below is just for reflection and thought provocation...
PeroK said:
Assuming the only possibilities are heads and tails.
To keep explaining my point from post #19: the problem with the premises of the theorem is that it assumes the actual causal action is randomly chosen from among the possible "classical" interactions. This is the "ignorance" of the HVs that Bell's theorem assumes and shows does not work (and it is what I keep returning to).

But a more evolved (= more fit) strategy could probably employ a more advanced action that accounts for all possibilities at once, optimizing gain against risk, rather than "randomly" choosing one classical action that would be perfect if the unknown HV were known. Nature would likely not settle for anything but the optimal strategy. Such an advanced "action" simply cannot be explained as randomly chosen from (or an average over) the simple actions. This is why a "quantum action" cannot be reproduced by an average of classical actions.

I think this is, conceptually and simply put, what is going on in Bell's theorem, and it's not hard to understand if you give it some thought. But that in itself, I think, has little to do with the HVs themselves. I think the fallacy is to assume that we can partition a more advanced strategy as a sum of simpler strategies (as per a partitioning of the sample space). I.e., it is the "preparation" that produces the entangled pair, and this preparation is what goes into the interaction, not randomly chosen definite states.

/Fredrik
 
  • #38
Fra said:
a more evolved(=more fit) strategy
Is this your own personal theory, or is there a reference?

Please be aware that personal theories are off limits here.
 
  • #39
Fra said:
Hmmm, since when does thinking in those terms actually enlighten anyone? :oldbiggrin:

/Fredrik
If you want to get enlightened you need QED, the best description of light physicists have come up with, and that's a local relativistic QFT!
 
  • #40
PeroK said:
The failure of Bell's theorem is not a problem for QM, because QM predicted it would fail in the first place. Don't underestimate the significance of this and why people who trust QM can't see the issue that others raise. Everything is as predicted by QM, so where's the problem?

The thing you cannot do is pretend that somehow you can get round Bell's inequality and thereby recover realism, locality, and classical variables. That paradigm is gone.
I am not sure everyone is 100% convinced that locality has gone. QM works perfectly experimentally; Bell's inequalities are violated, so at least one of Bell's assumptions must be dropped, and those are just facts. But I think we have every right and reason to suspect that QM is most likely not the final theory. Entanglement and the measurement problem are just two fascinating topics that could lead us into uncharted waters.
 
  • #41
DrChinese said:
And another experiment which is mind blowing to me: you can entangle 2 photons that have never existed at the same time. One photon is measured by Alice before the Bob photon is even created. (The entanglement itself is created after Alice's measurement, via entanglement swapping.)

https://arxiv.org/abs/1209.4191
This experiment is amazing. I was just thinking of an experiment on a circle with points A, S, and B equidistant (with the distance between A and B less than half the circumference), in the order given. S is the source preparing the entangled particles, one of which is sent to A and the other (in the opposite direction) to B. If Bell's inequalities are violated this time as well, it would imply that one part of the wave function (with the measurement at A) collapses while the other part (of the particle travelling to B) survives a little longer. Or it would mean that particle B knows with certainty before the measurement what will happen at B (in the case of measurements along the same axis).
 
  • #42
Tolga T said:
I am not sure everyone is 100% convinced that locality has gone. QM works perfectly experimentally; Bell's inequalities are violated, so at least one of Bell's assumptions must be dropped, and those are just facts. But I think we have every right and reason to suspect that QM is most likely not the final theory. Entanglement and the measurement problem are just two fascinating topics that could lead us into uncharted waters.
It depends on what you define as "locality". For me, working in the HEP/nuclear-physics community, locality means that relativistic QT is exclusively described by local relativistic QFTs, where "local" means that by assumption the microcausality constraint holds, i.e., that local observables commute at space-like separation of their arguments. This holds in particular for the energy density and all other local observables. It implies that there are no causal connections between space-like separated (measurement) events; i.e., relativistic QT is realized as a local QFT, and thus locality holds. What doesn't hold, of course, is "realism": as in any QT, in relativistic local QFT not all observables necessarily take determined values.
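For reference, the microcausality constraint mentioned here can be written compactly (standard QFT notation; the choice of the mostly-minus metric signature below is my assumption):

```latex
% Microcausality: local observables commute at space-like separation
\left[\hat{O}_1(x),\,\hat{O}_2(y)\right] = 0
\quad\text{for}\quad (x-y)^2 < 0,
% where, in the (+,-,-,-) signature, (x-y)^2 < 0 means x and y
% are space-like separated.
```

This condition is what guarantees that measurement events at space-like separation cannot causally influence one another within the theory.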
 
  • #43
PeterDonis said:
Is this your own personal theory, or is there a reference?

Please be aware that personal theories are off limits here.
It's not a theory, it's just my understanding/stance on the matter, attempting to add perspective in the process of seeking a future better understanding (which is what we all want, I think). But the key concepts of quantum vs. classical strategies as being more or less fit are not my personal ideas. I didn't have any specific paper in mind when writing the post, but see for example

Quantum Games and Quantum Strategies

"We investigate the quantization of non-zero sum games. For the particular case of the Prisoners’ Dilemma we show that this game ceases to pose a dilemma if quantum strategies are allowed for. We also construct a particular quantum strategy which always gives reward if played against any classical strategy
...
Summarizing, we have demonstrated that novel features emerge if classical games like the Prisoners’ Dilemma are extended into the quantum domain. We have introduced a correspondence principle which guarantees that the performance of a classical game and its quantum extension can be compared in an unbiased manner. Very much like in quantum cryptography and computation, we have found superior performance of the quantum strategies if entanglement is present"
-- https://arxiv.org/abs/quant-ph/9806088
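For concreteness, the Eisert–Wilkens–Lewenstein scheme described in that abstract can be reproduced in a few lines of numpy. This is my own sketch of the maximally entangled (γ = π/2) case with the usual Prisoners' Dilemma payoffs 3, 0, 5, 1; the particular matrix conventions are assumptions chosen to match the paper:

```python
import numpy as np

# Prisoners' Dilemma payoffs for (Alice, Bob), basis order |CC>,|CD>,|DC>,|DD>.
PAYOFF_A = np.array([3, 0, 5, 1])
PAYOFF_B = np.array([3, 5, 0, 1])

C = np.eye(2)                             # "cooperate" = identity
D = np.array([[0, 1], [-1, 0]])           # "defect"    = a (sign-flipped) bit flip
Q = np.array([[1j, 0], [0, -1j]])         # the quantum move Q = i * sigma_z

# Maximally entangling gate J = exp(i*(pi/4) * D(x)D); note (D(x)D)^2 = I.
DD = np.kron(D, D)
J = (np.eye(4) + 1j * DD) / np.sqrt(2)

def payoffs(UA, UB):
    """Final state J^dag (UA (x) UB) J |CC>; return expected payoffs."""
    psi0 = np.array([1, 0, 0, 0], dtype=complex)  # start in |CC>
    psi = J.conj().T @ np.kron(UA, UB) @ J @ psi0
    p = np.abs(psi) ** 2
    return float(p @ PAYOFF_A), float(p @ PAYOFF_B)

print(payoffs(Q, Q))  # ~ (3, 3): (Q, Q) recovers the cooperative payoff
print(payoffs(D, Q))  # ~ (0, 5): defecting against Q only hurts the defector
print(payoffs(D, D))  # ~ (1, 1): the classical dilemma outcome is still embedded
```

Played against the quantum move Q, deviating to classical defection strictly lowers the deviator's payoff, which is the sense in which the dilemma dissolves once quantum strategies are allowed.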

The idea from the above is that two interacting systems are better off (from an evolutionary perspective) if their strategy is not based on a simple either/or that always goes for the maximum short-term benefit, but also accounts for the expected back-reaction from the environment. There may be a strategy that is better; it's the insight that you also depend on the environment, that self-preservation at the evolved level is not just about fighting the environment but about cooperating with it, as the environment helps stabilize oneself.

To parry a common critique, that "decision" involves humans (which I do not think it does), the same paper also says:

"One might wonder what games and physics could possibly have in common. After all, games like chess or poker seem to heavily rely on bluffing, guessing and other activities of unphysical character. Yet, as was shown by von Neumann and Morgenstern [1], conscious choice is not essential for a theory of games. At the most abstract level, game theory is about numbers that entities are efficiently acting to maximize or minimize [2]. For a quantum physicist it is then legitimate to ask what happens if linear superpositions of these actions are allowed for, that is if games are generalized into the quantum domain"
-- https://arxiv.org/abs/quant-ph/9806088

John von Neumann himself wrote a book on game theory as well, though as far as I know he didn't draw the parallels to physical interactions very far. Sometimes the mathematics of QM seems to be used as a tool elsewhere, rather than the other way around, i.e., using those applications for insight into QM.
https://www.amazon.com/dp/0691130612/?tag=pfamazon01-20

/Fredrik
 
  • #44
Tolga T said:
But I think we have every right and reason to suspect that QM is most likely not the final theory.
Quantum mechanics is doing just fine, having performed flawlessly since inception. What more can one expect? What matters for a physical theory is that it fits experiential reality.
 
  • Like
Likes bhobba, PeroK, vanhees71 and 2 others
  • #45
Tolga T said:
I am not sure everyone is 100% convinced that locality has gone.
Not everyone is convinced the Earth is a sphere and not flat. You'll never convince everyone of anything.
 
  • Like
Likes bhobba and DrChinese
  • #46
Fra said:
The idea from the above is that two interacting systems are better off (from an evolutionary perspective) if their strategy is not based on a simple either/or that always goes for the maximum short-term benefit, but also accounts for the expected back-reaction from the environment
I don't see how this relates to the thread topic. (I'm also not sure your description of what the paper says is correct.) If you want to discuss this paper, you need to start a separate thread in the appropriate forum.
 
  • Like
Likes bhobba and vanhees71
  • #47
PeroK said:
Not everyone is convinced the Earth is a sphere and not flat. You'll never convince everyone of anything.
Very enlightening indeed!

"That one body may act upon another at a distance through a vacuum without the mediation of anything else, by and through which their action and force may be conveyed from one another, is to me so great an absurdity that, I believe, no man who has in philosophic matters a competent faculty of thinking could ever fall into it."

One of the greatest thinkers of mankind could not help writing this down. He was suspicious, and he was proven right once again, because it turned out that his law of gravitation, although more than adequate for most applications, had to be replaced by a new theory, after more than 200 years, by an insurgent.
 
  • Like
Likes martinbn, gentzen and vanhees71
  • #48
Tolga T said:
Very enlightening indeed!

"That one body may act upon another at a distance through a vacuum without the mediation of anything else, by and through which their action and force may be conveyed from one another, is to me so great an absurdity that, I believe, no man who has in philosophic matters a competent faculty of thinking could ever fall into it."

One of the greatest thinkers of mankind could not help writing this down. He was suspicious, and he was proven right once again, because it turned out that his law of gravitation, although more than adequate for most applications, had to be replaced by a new theory, after more than 200 years, by an insurgent.
Keep in mind that when it comes to the foundations of quantum mechanics there is more than one meaning to the terms "locality" and "non-locality". I don't think that anyone (aside from crackpots) has any doubt about Bell's theorems, that QM predictions violate Bell's inequality, or that experiments confirm QM predictions. But then there are people who are convinced about one or another of the possible interpretations of all that, ranging from "everything is local" (in some sense of the word) to "QM is definitely non-local" (in some sense of the word).
 
  • Like
Likes Nugatory, DrChinese and Demystifier
  • #50
Tolga T said:
Very enlightening indeed!

"That one body may act upon another at a distance through a vacuum without the mediation of anything else, by and through which their action and force may be conveyed from one another, is to me so great an absurdity that, I believe, no man who has in philosophic matters a competent faculty of thinking could ever fall into it."

One of the greatest thinkers of mankind could not help writing this down. He was suspicious, and he was proven right once again, because it turned out that his law of gravitation, although more than adequate for most applications, had to be replaced by a new theory, after more than 200 years, by an insurgent.
Neither of us can claim to know what Newton would have said about QM, had he lived in the late 20th century. Nor can either of us say whether the experimental violation of Bell's inequality would have changed Einstein's mind. The remaining alternatives to orthodox QM might have offended him more than QM itself.

That said, not accepting a mainstream theory because it might be corrected or replaced 200 years from now, although perhaps a valid philosophical viewpoint, doesn't leave much room for practical progress now.

You've spent much of this thread trying to resurrect local hidden variable theories, which is at least an eccentric point of view and many, including myself, would say is a lost cause.
 
  • Like
Likes vanhees71, Lord Jestocost and DrChinese
  • #51
The problem with quantum theory is that it is not only a major part of physics, i.e., natural science, but also a subject of interest to philosophy and esoterics/religion.

Quantum theory as a physical theory is the most successful description of Nature we have. There are no scientific problems with it in principle, although relativistic quantum field theory is only defined as an effective theory, i.e., in terms of renormalized (and sometimes resummed) perturbation theory, while a rigorous mathematical foundation of interacting QFTs in (1+3) spacetime dimensions has still not been achieved. From the physical point of view, a satisfactory quantum formulation of the gravitational interaction is also still lacking.
 
  • Like
Likes bhobba, LittleSchwinger, PeroK and 2 others
  • #52
vanhees71 said:
"impossibility to prepare a state where all observables take determined values."

(emphasis mine)

Does SPDC occur naturally (i.e. in Nature)? Have we observed any?

Also, if we use the term attributes rather than observables, would that change our perception of "realism"?
 
Last edited by a moderator:
  • #53
Interested_observer said:
(emphasis mine)

Does SPDC occur naturally (i.e. in Nature)? Have we observed any?

Also, if we use the term attributes rather than observables, would that change our perception of "realism"?
SPDC stands for Spontaneous Parametric Down-Conversion. It requires a specially prepared crystal, often made of barium borate (BBO). You often see SPDC abbreviated as simply PDC. I wouldn't call it "naturally occurring" in the usual meaning of that term; it requires a laser to drive it.

Of course, PDC is now a common tool for creating entangled photon pairs for experiments. It is used even in some undergraduate courses. For example, see the following paper which has useful background on PDC:

Entangled photons, nonlocality and Bell inequalities in the undergraduate laboratory
 
  • Like
Likes bhobba, Lord Jestocost and PeroK
  • #54
I'm new around here, so please go kind of easy on me. This quantum uncertainty stuff seems needlessly complicated. We know photons are created from electrons jumping to a lower valence shell, or a lower energy level. And we've had numerous experiments show that measured characteristics of a photon match the characteristics upon emission. This tells me two things: a structure has sacrificed a piece of itself (call it entropy), and that piece, the photon, will match the characteristics of the atom from whence it came.

Additionally, in practical terms I equate an electron releasing energy via a photon as roughly equal to water becoming steam. Electrons serve as an atom's energy buffer. Photons are essentially disembodied electrons seeking to offload excess energy. And the process works in reverse. That's why we get warm from sunlight.

If this is true, then is it a stretch to say all photons are encoded with set characteristics upon emission, then those characteristics are decoded upon measurement? Maybe indeterminism and probabilistic behavior are only our lack of understanding, not what's actually happening?
 
  • #55
Tec_Pirate said:
We know photons are created from electrons jumping to a lower valence shell, or a lower energy level
That's one way of creating photons, but by no means the only one. It is not what happens, for example, in parametric down conversion.

Tec_Pirate said:
we've had numerous experiments show that measured characteristics of a photon match the characteristics upon emission.
Please give specific references. I suspect you are misinterpreting whatever references you have read, since in many cases we do not even measure "the characteristics upon emission" of photons.

Tec_Pirate said:
This tells me two things - A structure has sacrificed a piece of itself, call it entropy, and that piece, the photon will match the characteristics of the atom from whence it came.

Additionally, in practical terms I equate an electron releasing energy via a photon as roughly equal to water becoming steam. Electrons serve as an atom's energy buffer. Photons are essentially disembodied electrons seeking to offload excess energy.
This is personal speculation and is off limits here. You have already received two warnings for personal speculation posts in other threads.

Tec_Pirate said:
the process works in reverse. That's why we get warm from sunlight.
It's true that we get warm from sunlight because our bodies absorb photons from sunlight. But the way that process works is not what you appear to think.

Tec_Pirate said:
Is it a stretch to say all photons are encoded with set characteristics upon emission, then those characteristics are decoded upon measurement? Maybe indeterminism and probabilistic behavior are only our lack of understanding, not what's actually happening?
No, this can't be correct; we know that because if what you say were true, we would not observe violations of the Bell inequalities in experiments. But we do.
 
  • #56
I get that PF has rules against speculation, and I’m not trying to push a personal model — just trying to engage honestly with the limits of what we do know.

Quantum mechanics, for all its success, is descriptive, not predictive in the ontological sense. It gives us a statistical framework to guess measurement outcomes, not to explain underlying causes. It doesn’t tell us what a photon is or how an entangled state “knows” what result to yield. It just gives us probabilities that match experiments — and that’s fine, but let’s not confuse a working model with a complete one.

If quantum theory fully explained what’s going on, we wouldn’t still be arguing over interpretations, realism, or whether nonlocality is real or just apparent. That’s not a fringe position — even prominent physicists like Fuchs, Rovelli, or ‘t Hooft have acknowledged deep cracks in our foundational understanding.

My goal isn’t to rewrite QM, but to better understand where its explanatory boundaries are — and maybe why those boundaries exist. If anyone has references that explore this divide — the gap between successful prediction and lack of explanation — I’m all ears.
 
  • #57
Tec_Pirate said:
My goal isn’t to rewrite QM, but to better understand where its explanatory boundaries are — and maybe why those boundaries exist. If anyone has references that explore this divide — the gap between successful prediction and lack of explanation — I’m all ears.
This is way off topic for this thread. It's also way too vague and general for a PF thread. There are already plenty of threads--in the QM interpretations subforum--on this general subject. Please take the time to read them. They include references to the literature on the subject, which is also fairly extensive--and some of which, at least, you already appear to have read. When you have a specific question based on something specific in the literature, then you can start a new thread of your own based on that (and the best place to do that would probably be in the QM interpretations subforum, not this one). Please don't hijack someone else's thread with off topic posts.
 
  • #58
Tec_Pirate said:
I get that PF has rules against speculation, and I’m not trying to push a personal model — just trying to engage honestly with the limits of what we do know.

Quantum mechanics, for all its success, is descriptive, not predictive in the ontological sense. It gives us a statistical framework to guess measurement outcomes, not to explain underlying causes. It doesn’t tell us what a photon is or how an entangled state “knows” what result to yield. It just gives us probabilities that match experiments — and that’s fine, but let’s not confuse a working model with a complete one.

If quantum theory fully explained what’s going on, we wouldn’t still be arguing over interpretations, realism, or whether nonlocality is real or just apparent. That’s not a fringe position — even prominent physicists like Fuchs, Rovelli, or ‘t Hooft have acknowledged deep cracks in our foundational understanding.

My goal isn’t to rewrite QM, but to better understand where its explanatory boundaries are — and maybe why those boundaries exist. If anyone has references that explore this divide — the gap between successful prediction and lack of explanation — I’m all ears.
This thread is not about the “incompleteness” of QM. That is actually an issue belonging in the Interpretations/Foundations sub forum. And in fact there are many existing threads there discussing this issue in “depth” lol.

You didn’t mention the 1935 EPR paper, which is an important starting point for any such discussion. If you are not familiar with it, give it a read.
 
  • #59
Tolga T said:
As for Bell's assumptions, my primary concern is localism, not realism.

Just as a comment here. The two are connected in some sense. The principal condition used to derive Bell inequalities is a condition called Bell locality, or factorisability. It is, roughly, the condition that any correlations between distant events be explicable in local terms.

Factorisability is the better term, but the two are often used interchangeably. Even recognised experts like Tim Maudlin, who fundamentally believes in realism (as I do, actually), get confused and say things that are not strictly true. Bell locality is NOT the same as locality as it is usually used in physics, i.e., that an object is influenced directly only by its immediate surroundings.

Suppose two particles are entangled. If they were factorisable, then we could consider them as still being particles A and B. However, when entangled, you cannot tell the state of either particle: they are no longer two distinct particles but a single quantum state; the idea of two separate particles becomes meaningless, at least superficially. What Bell proved is that you can't do that (i.e., consider them still to be particles A and B) and have locality. But as I said, basic QM suggests something funny is going on anyway (it is tied up with the principle of superposition). Bell proved it was not locally factorisable, which is the humour of Feynman's famous remark:



If you believe in 'naive' reality, you think it is still two particles when entangled. Some get confused, but the thing Bell showed is that QM is not Bell local.

So in QM, what is real? QM is merely an approximation to a more comprehensive theory, Quantum Field Theory (QFT). Most physicists believe the quantum field is real: it possesses energy, among other properties, and, as a result of E=mc^2, mass is considered a form of energy. According to the criteria of Victor Stenger (an interesting individual; look him up), reality is what "kicks back". Mass is undoubtedly thought of as real (it certainly kicks back), so quantum fields are considered real. Particles are sort of like 'knots' in the quantum field; hence, they are real. But separate? When two particles are entangled, they can be thought of as a sort of single double knot; each particle has lost its individuality. The 'double knot' is real, but each particle separately is not. It's possible to have an observation (an interaction) that turns the double knot back into two single knots; the entanglement is broken. An analogy would be a single string that is cut into two strings. Nothing weirdly non-local in that analogy. Indeed, QFT combines Special Relativity and QM, so it must be local in the ordinary sense.

Thanks
Bill
 
Last edited:
  • #60
Tec_Pirate said:
It doesn’t tell us what a photon is

See, the fact that you wrote this is concerning, since we have QED, and it tells us quite clearly what a photon is.

Tec_Pirate said:
This quantum uncertainty stuff seems needlessly complicated.

Maybe to someone who does not know the math and relies on pop-sci descriptions aimed at lay people. But if one does not know the math then, in my opinion, one has no basis for any opinion on the matter.
 
  • Like
Likes DrChinese, bhobba and PeterDonis
