Realism from Locality? Bell's Theorem & Nonlocality in QM

In summary, Bricmont, Goldstein, and Hemmick wrote two papers presenting a Bell-like theorem that involves only perfect correlations and no inequalities. They claim that this version proves nonlocality and that the theorem cannot be interpreted as a disproof of realism. The authors define realism as the existence of non-contextual value maps, and they show that such value maps cannot exist. Therefore it is not a choice between locality and realism, since realism in that sense is untenable regardless. The authors are Bohmians and accept contextual realism. This view fits with the anti-philosophical attitude, as the minimal interpretation is not complete enough for those inclined towards philosophy. However, it is not new, and it has been discussed on forums like PF many times.
  • #141
PeterDonis said:
if measurements at such events have correlations that violate the Bell inequalities, that certainly suggests that there is some kind of causal connection between them.
It makes it plausible but does not certainly suggest it. It may just mean that quantum causality is different from classical causality.
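For reference, the Bell-inequality violation at issue can be stated in its standard CHSH form (a textbook statement, not part of the exchange above). Writing ##E(a,b)## for the correlation of outcomes with setting ##a## on one side and ##b## on the other, any local hidden-variable account obeys
##|E(a,b) - E(a,b') + E(a',b) + E(a',b')| \le 2,##
while quantum mechanics predicts values up to ##2\sqrt{2}## (the Tsirelson bound) for suitably chosen settings on an entangled pair.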
 
  • #142
A. Neumaier said:
It makes it plausible but does not certainly suggest it. It may just mean that quantum causality is different from classical causality.
This might be a foreign/native speaker thing, but "certainly suggests" and "makes plausible" are synonymous FAPP.
 
  • #143
Auto-Didact said:
This might be a foreign/native speaker thing, but "certainly suggests" and "makes plausible" are synonymous FAPP.
I'd say that "suggests" and "makes plausible" are synonymous. But there is nothing certain in a plausibility argument. The appropriate wording in the sentence would have been ''suggests to me'',
since plausibility is in the eye of the beholder.
 
  • #144
Demystifier said:
...there is nothing more about a particle than a click in a detector.

Reminds me somehow of the thoughts of Aage Bohr, Ben R. Mottelson and Ole Ulfbeck (The Principle Underlying Quantum Mechanics, Foundations of Physics, Vol. 34, 2004)

“A click entirely without a cause, and thus coming by itself, has the novel property of an onset, a beginning from which it develops. The onset, having no precursor, comes as a discontinuity in space and time and is, therefore, unanalyzable.”
 
  • #145
vanhees71 said:
The state (any state, i.e., pure or mixed, all described by statistical operators) describes the statistics about the outcome of measurements of all possible observables.

A pure state allows one to predict the “statistics” of the outcomes of measurements of the observables in question, but the “statistics” itself cannot be predicted by statistical physical models. The motivation behind the desire for an “ensemble interpretation” of quantum probabilities is a yearning for statistical physical models (not always acknowledged), where “classical randomness” is thought to be the cause of the statistics. Bell has shown that this yearning can never be fulfilled. Thus, one should never mix up the "minimal statistical interpretation" with "ensemble interpretations"; doing so is a fundamental misconception.

vanhees71 said:
...as soon as you accept this minimal probabilistic/ensemble interpretation

Astonishing that "minimal statistical" now reads "minimal probabilistic".
 
Last edited:
  • Like
Likes Auto-Didact
  • #146
A. Neumaier said:
Then I don't know what he means by my ''own account''.
Again, maybe I am misunderstanding something, but I am just thinking about your TI for spin as in this link

https://www.physicsforums.com/threa...tation-of-quantum-physics.967116/post-6152810
So I am just wondering what the "actual" spin of both entangled particles could have been, and how it is then that, when the MEASUREMENT was done, they were perfectly anti-correlated.
 
  • #147
PeterDonis said:
No, that's why measurements at spacelike separated events must commute. But "must commute" is not the same as "cannot be causally connected". It means that if such events are causally connected, you can't specify which one comes first. That violates many people's intuition that causally connected events should have a definite ordering, with the "cause" coming before the "effect", but there is nothing in the physics that requires that to be the case. And if measurements at such events have correlations that violate the Bell inequalities, that certainly suggests that there is some kind of causal connection between them.

This is one of many examples where ordinary language is inadequate to describe what our physical theories and the math involved in them actually say.
Hm, in my understanding, causality implies a specific time-ordering. In fact, it's the only sense one can give to a specific time ordering, and thus causally connected events cannot be space-like separated. In other words event A can only be the cause of event B if it is time- or light-like separated from B and B is within or on the future light cone of A. It's not clear to me how you define causality to begin with.

That said, the very point is that due to microcausality there is no cause-effect relation between space-like separated measurement events (like the clicks of photo detectors in the experiments with entangled photons discussed here). The correlations are not caused by the measurements but follow from the preparation in an entangled state.
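For concreteness, the microcausality condition referred to here is the standard local-commutativity requirement on observable field operators (a textbook statement, not vanhees71's own notation):
##[\hat{O}_1(x), \hat{O}_2(y)] = 0 \quad \text{for spacelike separated } x, y,##
which is what guarantees that the choice and outcome of a measurement at one detector cannot causally influence the statistics at a spacelike-separated detector.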

This doesn't necessarily imply that entanglement can only occur when the entangled parts of a system are causally connected, as the example of entanglement swapping shows, as in the elsewhere discussed PRL 88, 017903 (2002) by Jennewein et al. In this case the entanglement is due to selection (or even post-selection!) of a subensemble out of a system, created before (in the correct relativistic sense!), of two photon pairs, each pair entangled within itself but not entangled with the other pair. Note however that each of these pairs has been created in an entangled state by a causal local interaction (SPDC of a laser photon in a BBO crystal).
 
  • #148
Lord Jestocost said:
A pure state allows one to predict the “statistics” of the outcomes of measurements of the observables in question, but the “statistics” itself cannot be predicted by statistical physical models. The motivation behind the desire for an “ensemble interpretation” of quantum probabilities is a yearning for statistical physical models (not always acknowledged), where “classical randomness” is thought to be the cause of the statistics. Bell has shown that this yearning can never be fulfilled. Thus, one should never mix up the "minimal statistical interpretation" with "ensemble interpretations"; doing so is a fundamental misconception.
Astonishing that "minimal statistical" now reads "minimal probabilistic".
Oh come on, statistical and probabilistic are really synonymous when it comes to the application of probability theory to real-world problems, and QT is also a kind of probability theory.

Also the assignment of a mixed state to a situation (formally speaking, a "preparation procedure") leads to the prediction of probabilities. E.g., the equilibrium state of a thermalized gas leads to the prediction of the velocity distribution of the particles. BTW, the corresponding Maxwell-Boltzmann distribution was first empirically verified by Otto Stern in Frankfurt in the 1920s.
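For reference, the distribution in question: for a gas in thermal equilibrium at temperature ##T##, the Maxwell-Boltzmann speed distribution predicted by the mixed (thermal) state is
##f(v)\,\mathrm{d}v = 4\pi \left(\frac{m}{2\pi k_B T}\right)^{3/2} v^2\, e^{-m v^2/(2 k_B T)}\,\mathrm{d}v,##
i.e. the probability of finding a particle with speed between ##v## and ##v + \mathrm{d}v##.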

I always thought that "minimal statistical" and "ensemble" interpretations are just names for the same interpretation. If not, what's the difference? Is this again one of these unnecessary confusions due to (unnecessary?) philosophical sophistry?
 
  • #149
vanhees71, aren't you shooting superposition between the eyes?:smile:
 
  • #150
ftr said:
So I am just wondering what the "actual" spin of both entangled particles could have been, and how it is then that, when the MEASUREMENT was done, they were perfectly anti-correlated.
Good question. According to the usual (minimal) interpretation, an "actual spin" (to be more precise, an "actual spin component") either has a determined value or not, depending on the preparation of the corresponding system.

You cannot in all generality say in which state a system might be after a measurement. Maybe it's even destroyed (like photons hitting a photon detector). This depends on the specific measurement apparatus.

For the example obviously being discussed here, the Stern-Gerlach experiment: the spin component along the direction of the magnetic field is prepared in an (almost exactly) pure state ##|\text{up} \rangle## or ##|\text{down} \rangle##, due to the strict entanglement between position and this spin component after the silver atom has passed through the magnet. This is the preparation process: you have an (almost) completely determined value of the spin component just by looking at particles at the corresponding positions. That's the paradigmatic example of preparation through an (almost ideal) von Neumann filter procedure (I prefer not to call it a measurement).
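Schematically (my notation, not vanhees71's), the state after the magnet has the form
##|\psi\rangle \approx c_+\,|\phi_+\rangle \otimes |\text{up}\rangle + c_-\,|\phi_-\rangle \otimes |\text{down}\rangle,##
with ##|\phi_\pm\rangle## (nearly) non-overlapping wave packets deflected up and down; selecting atoms at one of the two positions then prepares the corresponding spin state.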

Of course, as soon as you measure the particle's position by letting it hit a photo plate or a CCD cam, it's gone. You cannot easily describe its (spin) state after that, but that's a trivial thought, I'd say.
 
  • #151
ftr said:
vanhees71, aren't you shooting superposition between the eyes?:smile:
?
 
  • #152
vanhees71 said:
?
Ok, I will elaborate (I guess I am summarizing while multitasking my life:smile: ). I thought the whole enigma of entangled particles is that the spins (or whatever) are in superposition before the measurement. I have heard about the "socks" model, but it seems so unconvincing, and likewise the momentum-conservation camp, etc.
 
  • #153
vanhees71 said:
I always thought that "minimal statistical" and "ensemble" interpretations are just names for the same interpretation. If not, what's the difference? Is this again one of these unnecessary confusions due to (unnecessary?) philosophical sophistry?

Hilary Putnam in “Philosophical Papers: Volume 1, Mathematics, Matter and Method”, Second Edition, 1979, p. 147:
“To put it another way, it is a part of quantum mechanics itself as it stands today that the proper interpretation of the wave is statistical in this sense: the square amplitude of the wave is the probability that the particle will be found in the appropriate place if a measurement is made (and analogously for representations other than position representation). We might call this much the minimal statistical interpretation of quantum mechanics, and what I am saying is that the minimal statistical interpretation is a contribution of the great founders of the CI— Bohr and Heisenberg, building, in the way we have seen, on the earlier idea of Born — and a part of quantum mechanical theory itself. However, the minimal statistical interpretation is much less daring than the full CI. It leaves it completely open whether there are any observables for which the principle ND is correct, and whether or not hidden variables exist. The full CI, to put it another way, is the minimal statistical interpretation plus the statement that hidden variables do not exist and that the wave representation gives a complete description of the physical system.”
[CI means “Copenhagen Interpretation”, italic in original, principle ND: see footnote **, LJ] [bold by LJ]

** Hilary Putnam in “Philosophical Papers: Volume 1, Mathematics, Matter and Method”, Second Edition, 1979, p. 140:
“Principle ND says that an observable has the same value (approximately) just before the measurement as is obtained by the measurement; the CI denies that an observable has any value before the measurement.”

vanhees71 said:
Oh come on, statistical and probabilistic are really synonymous when it comes to the application of probability theory

When reasoning about random outcomes of measurements, one can now ask how this randomness emerges: in a statistical way (“classical randomness”, which is what the ensemble interpretation is yearning for) or in a probabilistic way (“quantum randomness”).

Richard D. Gill in “Statistics, Causality and Bell’s Theorem”:
“In classical physics, randomness is merely the result of dependence on uncontrollable initial conditions. Variation in those conditions, or uncertainty about them, leads to variation, or uncertainty, in the final result. However, there is no such explanation for quantum randomness. Quantum randomness is intrinsic, nonclassical, irreducible. It is not an emergent phenomenon. It is the bottom line. It is a fundamental feature of the fabric of reality.”
[italic in original, LJ]
 
Last edited:
  • Like
Likes Auto-Didact
  • #154
vanhees71 said:
The correlations are not caused by the measurements but follow from the preparation in an entangled state.

You make it sound like entanglement is some rare artificially induced thing, and therefore non-locality ("a-causality" is a term I have heard applied) between entangled elements just an exotic oddity - philosophically curious.

But I've always been confused about where entanglement is naturally found. Is it natural and ubiquitous in addition to being an isolated laboratory phenomenon? I mean isn't it natural and ubiquitous?

Curious if this paper is credible. To my mind, entanglement must be ubiquitous, uniform even, the Cauchy surface of causality everywhere. What machinery is there anywhere in nature that is not microscopically (i.e. fundamentally) evolving according to the phenomena of QM and/or QFT?

Except maybe stuff in the... distant past? Even with that I keep wondering, how far back? If the pilot wave, or QFT wave function is here (now) and extends out over some space-time region where else did, does, do, will it go? how far? why that far? and to what, and why that?

https://arxiv.org/abs/1106.2264v3

Entanglement thresholds for random induced states
Guillaume Aubrun, Stanislaw J. Szarek, Deping Ye
(Submitted on 11 Jun 2011 (v1), last revised 15 Oct 2012 (this version, v3))
For a random quantum state on ##H = \mathbb{C}^d \otimes \mathbb{C}^d## obtained by partial tracing a random pure state on ##H \otimes \mathbb{C}^s##, we consider whether it is typically separable or typically entangled. For this problem, we show the existence of a sharp threshold ##s_0 = s_0(d)## of order roughly ##d^3##. More precisely, for any ##a > 0## and for ##d## large enough, such a random state is entangled with very large probability when ##s < (1-a)s_0##, and separable with very large probability when ##s > (1+a)s_0##. One consequence of this result is as follows: for a system of ##N## identical particles in a random pure state, there is a threshold ##k_0 = k_0(N) \sim N/5## such that two subsystems of ##k## particles each typically share entanglement if ##k > k_0##, and typically do not share entanglement if ##k < k_0##. Our methods work also for multipartite systems and for "unbalanced" systems such as ##\mathbb{C}^d \otimes \mathbb{C}^{d'}##, ##d \neq d'##. The arguments rely on random matrices, classical convexity, high-dimensional probability and geometry of Banach spaces; some of the auxiliary results may be of reference value. A high-level non-technical overview of the results of this paper and of a related article arXiv:1011.0275 can be found in arXiv:1112.4582.
 
  • #155
Demystifier said:
It's pretty tough only if one does not accept that there is nothing more about a particle than a click in a detector.
Nothing more about a particle than what is brought to our consciousness. We human beings become aware of physical phenomena through our senses: f(r,t), g(sound, t), ...

Demystifier said:
And guess what, many experts in the field do not accept it.
It's not surprising: QBism and the Greeks.

/Patrick
 
  • #156
A. Neumaier said:
I'd say that "suggests" and "makes plausible" are synonymous. But there is nothing certain in a plausibility argument. The appropriate wording in the sentence would have been ''suggests to me'',
since plausibility is in the eye of the beholder.
As I said, this definitely seems to be a foreign language speaker issue: the confusion arises from the stem 'certain-'; the word 'certainly' in this context has nothing directly to do with 'certainty', i.e. "certainly suggests" means "it is true that it suggests", which clearly is in contrast to "suggests (with) certainty", which means "suggests that it is true".

The statements "it is true that it suggests" and "suggests that it is true" are converse to each other, i.e. certainly not synonymous (pun intended). I know, this is literally arguing semantics and linguistics, but with you being a mathematician with a strong urge for absolute precision in reasoning, I suspect that you (secretly) enjoy such subtleties.
vanhees71 said:
Oh come on, statistical and probabilistic are really synonymous when it comes to the application of probability theory to real-world problems, and QT is also a kind of probability theory.
Statistical and probabilistic are not synonymous. Statistics is an empirical methodology based largely on (certain forms of) probability theory, while probability theory is a field of mathematics, like geometry.

In the simplest applications of probability theory in the form of mathematical models - i.e. how statistics is mostly used in the practice of physics - the two tend to coincide, but this is purely contingent on the simplicity of the phenomena of physics and their idealizable nature, in strong contrast to the mathematical descriptions of the phenomena studied by the other sciences.
vanhees71 said:
I always thought that "minimal statistical" and "ensemble" interpretations are just names for the same interpretation. If not, what's the difference? Is this again one of these unnecessary confusions due to (unnecessary?) philosophical sophistry?
This is not confusion due to philosophical sophistry, but more confusion due to a wrongly assumed equivalence relation between two different classes/sets: the relation between the elements of the set of statistics and the elements of the set of applications of probability theory is not bijective; the latter set is far larger than the former and moreover, the former set has relations with other formal domains as well, e.g. logic.

In other words, statistics (the textbook subject) based on probability theory in fact is based on a very small subset of probability theory, while the rest of probability theory specifically does not feature in it. Applications coming from the rest of probability theory which have the same form and intended use as statistics are models of non-standard statistics, usually invented and studied by mathematical statisticians.

In practice, applied statisticians and non-physical scientists do not acknowledge such non-standard models as statistics, but see them more as alternate theories, such as all alternative speculative theories without any experimental validation from theoretical physics (e.g. string theory) are seen by most physicists. This is also similar to how physicists view falsified physical theories from the history of physics, i.e. as "theories made and used by physicists historically which are today not anymore part of physics".

To conclude, this is partly because probability theory is a far bigger subject than just the textbook subject, in exactly the same way that mechanics is in actuality a far bigger subject than 'just classical mechanics' and also a far bigger subject than 'classical mechanics plus quantum mechanics'.

The pragmatic restriction of mechanics to mean 'only CM and QM' is actually not named mechanics, but canonical mechanics; but most physicists don't use or respect this terminology anymore because they lack the adequate philosophical training, making them terrible at the reasoning required for foundational research in stark contrast to mathematicians, logicians and philosophers.
 
Last edited:
  • #157
Jimster41 said:
You make it sound like entanglement is some rare artificially induced thing, and therefore non-locality ("a-causality" is a term I have heard applied) between entangled elements just an exotic oddity - philosophically curious.

But I've always been confused about where entanglement is naturally found. Is it natural and ubiquitous in addition to being an isolated laboratory phenomenon? I mean isn't it natural and ubiquitous?
Your suspicions are of course warranted: entanglement is ubiquitous, almost all quantum states in Nature are entangled states, but decoherence of course breaks these entanglements, which of course is why building a quantum computer is such an engineering challenge.

But to make the argument even stronger, in textbook QM the description of ##\psi## is non-local whether or not entanglement is involved, i.e. even for a single particle wavefunction non-locality is already present in the following example given by Penrose about a decade ago or earlier:

Imagine a photon source and a screen some distance away and single photons are detected (or measured) as single points on the screen; in between source and screen is where the wavefunction is. Now imagine that there is a detector at each point of the screen; once the photon is detected at a single point on the screen we can call it a detection event.

Each single detection event on the screen instantaneously prohibits the photon from being seen anywhere else on the screen i.e. once a detection event takes place by a single detector, all other detectors are effectively instantaneously prohibited from detecting the photon; if the detector had to communicate this detection event to all the other detectors it would need to convey that information faster than light.

In other words, detection i.e. measurement itself breaks the non-locality of the wavefunction; this can be mathematically described in detail as measurements effectively removing the first cohomology element of single photon wavefunctions (NB: these photon wavefunctions are usually Fourier transformed wavefunctions, and together with their Fourier transforms reside in a larger abstract complex analytic mathematical space).
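A minimal numerical sketch of this point (my own illustration with a made-up Gaussian amplitude profile, not anything from Penrose or the thread): each run samples exactly one detector click from the Born-rule probabilities ##|\psi_i|^2##, so a click at one position excludes clicks everywhere else by construction, while the histogram over many runs reproduces ##|\psi|^2##.

Python:
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-photon amplitude across 100 screen detectors (Gaussian profile).
n_detectors = 100
x = np.linspace(-5.0, 5.0, n_detectors)
psi = np.exp(-x**2 / 2).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2))   # normalize so the probabilities sum to 1

probs = np.abs(psi)**2                   # Born-rule probability for each detector

clicks = np.zeros(n_detectors, dtype=int)
for _ in range(10_000):                  # one photon per run
    hit = rng.choice(n_detectors, p=probs)   # exactly one detector fires ...
    clicks[hit] += 1                         # ... and all the others stay silent

# Over many runs the click histogram approximates |psi|^2,
# even though each individual run has a single, mutually exclusive detection.
print(clicks.sum(), clicks.argmax())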
 
  • Like
Likes Jimster41
  • #158
vanhees71 said:
event A can only be the cause of event B if it is time- or light-like separated from B and B is within or on the future light cone of A
but wouldn't that make it possible to have a frame in which the two events occur simultaneously? even though the two frames can't be Lorentz connected?
 
  • #159
Demystifier said:
They can accept its truth, but not its completeness. They want to know what happens behind the curtain.

On the other hand, those who are satisfied with the purely epistemic interpretation either
(i) don't care about things behind the curtain, or
(ii) care a little bit but don't think that it is a scientific question, or
(iii) claim that there is nothing behind the curtain at all.
Those in category (i) have the mind of an engineer, which would be OK if they didn't claim that they are not engineers but scientists. Those in category (ii) often hold double standards, because in other matters (unrelated to quantum foundations) they often think that questions about things behind the curtain are scientific. Those in category (iii) are simply dogmatic, which contradicts the very essence of the scientific way of thinking.

I believe there is nothing behind the curtain, not because it's irrelevant, but because there really is no thing behind the curtain and there is no dynamical story to tell. QM is providing an adynamical constraint on the distribution of momentum exchange in spacetime without a dynamical counterpart. How is that unscientific? Must all scientific explanation be dynamical?
 
  • Informative
Likes Auto-Didact
  • #160
RUTA said:
I believe there is nothing behind the curtain, not because it's irrelevant, but because there really is no thing behind the curtain and there is no dynamical story to tell. QM is providing an adynamical constraint on the distribution of momentum exchange in spacetime without a dynamical counterpart. How is that unscientific? Must all scientific explanation be dynamical?
All other scientific accounts of phenomena ever given so far have ultimately turned out to be dynamical. Because the question is still mathematically wide open, i.e. the correct mathematical description to fully describe the problem has not yet been found or proven to not exist, the question is currently an open question in mathematical physics.

The very fact that Bohmian mechanics even exists at all and may even be made relativistic, is proof that this is a distinctly scientific open problem of theoretical physics within the foundations of QM, let alone the existence of alternate theories waiting to be falsified experimentally and the existence of the open problem of quantum gravity.

It is therefore by all accounts vehemently shortsighted and extremely premature to decide based upon our best experimental knowledge that a dynamical account is a priori impossible; the experimental knowledge itself literally indicates no such thing, instead this is a cognitive bias coming from direct extrapolation of our effective models to arbitrary precision.
 
  • #161
Auto-Didact said:
All other scientific accounts of phenomena ever given so far have ultimately turned out to be dynamical. Because the question is still mathematically wide open, i.e. the correct mathematical description to fully describe the problem has not yet been found or proven to not exist, the question is currently an open question in mathematical physics.

The very fact that Bohmian mechanics even exists at all and may even be made relativistic, is proof that this is a distinctly scientific open problem of theoretical physics within the foundations of QM, let alone the existence of alternate theories waiting to be falsified experimentally and the existence of the open problem of quantum gravity.

It is therefore by all accounts vehemently shortsighted and extremely premature to decide based upon our best experimental knowledge that a dynamical account is a priori impossible; the experimental knowledge itself literally indicates no such thing, instead this is a cognitive bias coming from direct extrapolation of our effective models to arbitrary precision.

If constraint-based, adynamical explanation resolved only the mysteries of QM, then it might be crazy to consider it. But it also does so for GR, as we show in our book (Beyond the Dynamical Universe). I think this is precisely why “Einstein’s double revolution” remains unfinished (Smolin’s lingo). Modern physics is complete (minus QG) and self-consistent; it’s us physicists who haven’t realized the Kuhnian revolution for what it is, i.e., “ascending from the ant’s-eye view to the God’s-eye view of physical reality is the most profound challenge for fundamental physics in the next 100 years” (Wilczek).
 
  • Like
Likes Auto-Didact
  • #162
vanhees71 said:
in my understanding, causality implies a specific time-ordering

Yes, but that's a statement about your preferred use of ordinary language, not about physics. We all agree on the physics: we all agree that spacelike separated measurements commute and that such measurements on entangled quantum systems can produce results that violate the Bell inequalities. It would be nice if the discussion could just stop there, but everyone insists on dragging in vague ordinary language terms like "causality" and "locality" and arguing about whether they are appropriate terms to use in describing the physics that we all agree on.

vanhees71 said:
In other words event A can only be the cause of event B if it is time- or light-like separated from B and B is within or on the future light cone of A.

Where does this requirement show up in QFT? QFT is time symmetric.

vanhees71 said:
It's not clear to me how you define causality to begin with.

The same way you've been defining it (you sometimes use the term "microcausality", but sometimes not): that spacelike separated measurements commute. But now you seem to be shifting your ground and giving a different definition of "causality" (the one I quoted above), the basis of which in QFT I don't understand (which is why I asked about it).

vanhees71 said:
due to microcausality there is no cause-effect relation between space-like separated measurement events

So microcausality means no causal relationship? That seems like an odd use of language.

vanhees71 said:
In this case the entanglement is due to selection (or even post-selection!) of a subensemble out of a system, created before (in the correct relativistic sense!), of two photon pairs, each pair entangled within itself but not entangled with the other pair. Note however that each of these pairs has been created in an entangled state by a causal local interaction (SPDC of a laser photon in a BBO crystal).

The way I would describe all this is not that entangled pairs do not have to be causally connected. The way I would describe it is that QFT, strictly speaking, does not admit the concept of "an entangled pair", because it does not admit the concept of the "state" of an extended system at an instant of "time". It only admits measurement events and correlations between them, and it predicts the statistics of such correlations using quantum field operators that obey certain commutation relations. Each individual such operator is tied to a specific single event in spacetime: that's what makes it "local". Any talk about "entangled systems" measured at spacelike separated events is just an approximation and breaks down when you try to look too closely.

Again, this is all about how to describe things in ordinary language, not about physics. Basically many people do not like the extreme viewpoint I just described, which is IMO the proper consistent way to describe what QFT is saying. Many people do not want to give up the notion of "entangled systems" containing multiple spatially separated particles. But that notion IMO is a holdover from non-relativistic physics and needs to be given up in a proper account of what QFT says, if we are going to talk about how best to describe the physics in ordinary language and we aren't willing just to stop at the point of describing the physics in its most basic terms (which I gave at the start of this post).
 
  • Like
Likes Mentz114 and Auto-Didact
  • #163
kent davidge said:
wouldn't that make it possible to have a frame in which the two events occur simultaneously?

No. If two events are timelike or null (lightlike) separated, there is no frame in which they are simultaneous.

kent davidge said:
even though the two frames can't be Lorentz connected?

I have no idea what you mean by this.
 
  • #164
RUTA said:
I believe there is nothing behind the curtain, not because it's irrelevant, but because there really is no thing behind the curtain and there is no dynamical story to tell. QM is providing an adynamical constraint on the distribution of momentum exchange in spacetime without a dynamical counterpart. How is that unscientific? Must all scientific explanation be dynamical?
There are more than one point on which your position is unscientific.
First, you believe in one true explanation. Just because you have an explanation that fits observations does not mean that there can't be other explanations.
Second, the process of gaining scientific knowledge is ... well, a process, a dynamical story as you call it. What is the point of denying the value of the dynamical approach and then seeking justification for that denial from the perspective of the dynamical approach? It's the stolen concept fallacy.
So answering your question, "Must all scientific explanations be dynamical?": yes, all scientific explanations must be dynamical, because only testable explanations are scientific and the process of testing is dynamical: you have initial conditions, then you observe what happens and check whether your observations agree with the predictions.
 
  • #165
RUTA said:
I believe there is nothing behind the curtain, not because it's irrelevant, but because there really is no thing behind the curtain and there is no dynamical story to tell. QM is providing an adynamical constraint on the distribution of momentum exchange in spacetime without a dynamical counterpart. How is that unscientific? Must all scientific explanation be dynamical?
It's scientific to say: Maybe there is nothing behind the curtain, it seems very likely to me that it is so.
But it's not scientific to say: There is nothing behind the curtain, period.
 
  • Like
Likes Auto-Didact and Lord Jestocost
  • #166
Jimster41 said:
You make it sound like entanglement is some rare artificially induced thing, and therefore non-locality ("a-causality" is a term I have heard applied) between entangled elements just an exotic oddity - philosophically curious.
Where did I say this? Entanglement is the rule rather than the exception. The mere fact that we have indistinguishable particles, and thus bosonic or fermionic (symmetric/antisymmetric) Fock spaces, already leads to a lot of entanglement.
 
  • #167
kent davidge said:
but wouldn't that make it possible to have a frame in which the two events occur simultaneously? even though the two frames can't be Lorentz connected?
No, for time- or lightlike separated events the time ordering is the same in any frame (of course in SRT; in GR it's more complicated and it holds only in a local sense).
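Explicitly (a standard special-relativity check, added for clarity): for a boost with velocity ##v## along ##x##,
##\Delta t' = \gamma\left(\Delta t - \frac{v\,\Delta x}{c^2}\right),##
and time- or lightlike separation means ##|\Delta x| \le c\,|\Delta t|##, so for any ##|v| < c## the sign of ##\Delta t'## equals the sign of ##\Delta t##: the ordering cannot be flipped, and two events can only be simultaneous in some frame if they are spacelike separated.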
 
  • #168
PeterDonis said:
Yes, but that's a statement about your preferred use of ordinary language, not about physics. We all agree on the physics: we all agree that spacelike separated measurements commute and that such measurements on entangled quantum systems can produce results that violate the Bell inequalities. It would be nice if the discussion could just stop there, but everyone insists on dragging in vague ordinary language terms like "causality" and "locality" and arguing about whether they are appropriate terms to use in describing the physics that we all agree on.
Causality is not vague but a fundamental assumption underlying all of physics. Locality is another matter: there, a lot of confusion arises from the fact that too often people don't distinguish between causal effects and (predetermined) correlations. This becomes particularly problematic when it comes to long-range correlations between entangled parts of a quantum system.
PeterDonis said:
Where does this requirement show up in QFT? QFT is time symmetric.
Indeed, ignoring the weak interaction the Standard Model is T-invariant. Nevertheless the S-matrix provides a time ordering. You define an initial state (usually two asymptotically free particles) and then look for the transition probability rate to a given final state. This reflects how we can do experiments, and there's always this time ordering: preparation of a state and then measuring something. T invariance then just means that (at least in principle) the "time-reversed process" is also possible and leads to the same S-matrix elements.
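Schematically (the textbook Dyson form, not a quotation from the post), the time ordering enters through
##S = \mathcal{T} \exp\left(-i \int \mathrm{d}^4x\, \mathcal{H}_{\text{int}}(x)\right),##
where ##\mathcal{T}## orders the interaction Hamiltonian densities in time: the "in" state is prepared in the far past and the "out" state is registered in the far future, even though the underlying dynamics is T-invariant (up to the weak interaction).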
PeterDonis said:
The same way you've been defining it (you sometimes use the term "microcausality", but sometimes not): that spacelike separated measurements commute. But now you seem to be shifting your ground and giving a different definition of "causality" (the one I quoted above), the basis of which in QFT I don't understand (which is why I asked about it).
I don't understand what's unclear about this. To motivate the microcausality constraint, which is usually also called locality of interactions, you need the causality principle, and in Q(F)T it's even a weak one: you only need to know the state at one instant of time in order to determine it, given the full dynamics or Hamiltonian of the system, at any later time, i.e., causality is local in time. You don't need to know the entire history before one "initial point" in time.

PeterDonis said:
So microcausality means no causal relationship? That seems like an odd use of language.
Microcausality ensures that there are no faster-than-light causal connections. Given the general causality assumption that's a necessary consequence: Two space-like separated events do not define a specific time order and thus one event cannot be the cause of the other.

PeterDonis said:
The way I would describe all this is not that entangled pairs do not have to be causally connected. The way I would describe it is that QFT, strictly speaking, does not admit the concept of "an entangled pair", because it does not admit the concept of the "state" of an extended system at an instant of "time". It only admits measurement events and correlations between them, and it predicts the statistics of such correlations using quantum field operators that obey certain commutation relations. Each individual such operator is tied to a specific single event in spacetime: that's what makes it "local". Any talk about "entangled systems" measured at spacelike separated events is just an approximation and breaks down when you try to look too closely.
Of course QFT admits entangled states. We write them down all the time when discussing photons. Measurements are just usual interactions between entities described by the fields, and due to microcausality they are local, i.e., there cannot be any causal influence of one measurement event on another measurement event that is space-like separated. I.e., if A's detector clicks, this measurement event cannot be the cause of anything outside of the future light cone of this event.
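For instance (a standard example in my own notation, not quoted from the post), the polarization-entangled two-photon state produced in type-II SPDC can be written in terms of photon creation operators acting on the vacuum,
##|\psi^-\rangle = \frac{1}{\sqrt{2}}\left(\hat{a}^\dagger_{H,A}\hat{a}^\dagger_{V,B} - \hat{a}^\dagger_{V,A}\hat{a}^\dagger_{H,B}\right)|0\rangle,##
which is a perfectly legitimate state of the quantized electromagnetic field; the point about microcausality concerns the measurement interactions, not whether such states exist.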
PeterDonis said:
Again, this is all about how to describe things in ordinary language, not about physics. Basically many people do not like the extreme viewpoint I just described, which is IMO the proper consistent way to describe what QFT is saying. Many people do not want to give up the notion of "entangled systems" containing multiple spatially separated particles. But that notion IMO is a holdover from non-relativistic physics and needs to be given up in a proper account of what QFT says, if we are going to talk about how best to describe the physics in ordinary language and we aren't willing just to stop at the point of describing the physics in its most basic terms (which I gave at the start of this post).
Of course, there are entangled states and there are the correspondingly observed strong correlations between far-distant measurements, and all of that is describable by relativistic QFT. It's also clear that the localizability even of massive particles, which have a position observable, is much more constrained in relativistic QFT than in non-relativistic QM, since rather than localizing a particle better and better by "squeezing" it into an ever smaller region of space, you tend to create new particles.

Note that in general the location of an entity described by QFT is determined by the location of a measurement device with a finite spatial resolution; it's not necessary that the measured systems have position observables, as the example of photons shows: all that is observable is whether a detector located in some spatial region registers a photon or not.

If QFT couldn't describe the observed entanglement between, e.g., photons and the corresponding violation of Bell and other related inequalities, it wouldn't be complete. A theory must describe all known observational facts, and entanglement is obviously an observational fact.
 
  • Like
Likes akvadrako
  • #169
zonde said:
So answering your question, "Must all scientific explanations be dynamical?": yes, all scientific explanations must be dynamical, because only testable explanations are scientific and the process of testing is dynamical: you have initial conditions, then you observe what happens and check whether your observations agree with the predictions.
Of course, QT provides a description of the dynamics of the system and the measurable quantities related to it. That's what QT is all about. This may be buried in many introductory courses, because students tend to get the impression that all there is are stationary states (i.e., eigenstates of the Hamiltonian), but that's only "statics" in a sense. Also in hydrodynamics or classical electrodynamics you can stick with static or stationary special cases, but still hydro as well as Maxwell theory are indeed descriptions of the dynamics of the described systems (fluids and charges and the em. field, respectively).
 
  • Like
Likes zonde
  • #170
vanhees71 said:
Note that in general the location of an entity described by QFT is determined by the location of a measurement device with a finite spatial resolution; it's not necessary that the measured systems have position observables, as the example of photons shows: all that is observable is whether a detector located in some spatial region registers a photon or not.

Take a photon, for instance. In visible-UV spectroscopy, a photon is an object that has a definite frequency ##\nu## and a definite energy ##h\nu##; however, its size and position are unknown or undefined, even when it is absorbed and emitted by a molecule. OTOH, for a quantum-optics experimenter doing detection-correlation studies, a photon has no definite frequency but has a somewhat defined position and size, and it looks like a localized particle when it gets detected in a light detector. What the high-energy experimenter talks about is a small particle that cannot be seen in photos of particle tracks and their scattering events, but which makes it easy to explain the curvature of tracks of matter particles with a common point of origin within the framework of energy and momentum conservation (e.g. the appearance of a pair of oppositely charged particles, or Compton scattering). This photon usually has definite momentum and energy (hence also definite frequency) and a fairly definite position, since it participates in fairly localized scattering events. At the end of the day, the only common denominator is a mathematical description of the EM field and its interactions, or some version of Fock states: countable things. One measurable dynamic of reality is that time passes at different rates from place to place. Locality is always an approximation in the dynamical sense. What we have is a somewhat frozen image/depiction/description/detection of things that are always formless and dynamic in nature, unless they interact. That is exactly the view of Rovelli.

Scientific Realism

https://arxiv.org/pdf/1508.05533.pdf
"The observed dynamics of the world is time-reversal invariant: a given un-oriented sequence of quantum events does not determine a time arrow. Charging the wave function, (more in general, the quantum state) with a 4 realistic ontological interpretation, leads to a picture of the world where this invariance is broken. It is not broken in the sense that the full theory breaks T-reversal invariance (it does not), but in the sense that the wave function we associate to observed events depends on a choice of orientation of time. "
 
  • Like
Likes *now*
  • #171
julcab12 said:
What we have is a somewhat frozen image/depiction/description/detection of things that are always formless and dynamic in nature, unless they interact. That is exactly the view of Rovelli.
I think it doesn't make sense. To see why, suppose that the universe contains only a hydrogen atom and nothing else. In the hydrogen atom, the electron interacts with the proton. Does it create the "form" of the electron?
 
  • #172
I think the word "causality", or the phrase "A causes B", is informal, heuristic language.

If there were something called "free will", we could interpret that if I choose to do A, then B necessarily happens. But no one in the past 2,500 years has been able to give a sensible definition of free will.

A more exact phrase is that "from A and other initial conditions we can calculate B and other end results". It is a mathematical problem, and in mathematics we do not use the word "causes".

There was recently a thread where the word "cause" is central:
https://www.physicsforums.com/threads/quantum-interpretations-of-this-optical-effect.974340/

In the above paper, Aharonov et al. claim that photons detected at D2 mystically "cause" a mirror to be pushed to the left. I think they calculated a correlation: observation of photons at D2 correlates with an observed force to the left on the mirror.
 
  • #173
Demystifier said:
I think it doesn't make sense. To see why, suppose that the universe contains only a hydrogen atom and nothing else. In the hydrogen atom, the electron interacts with the proton. Does it create the "form" of the electron?
It doesn't say that. Well, according to him, the locality of quantum mechanics comes from postulating that events and facts are relative to the observer, instead of an absolute “view from nowhere”. The main ontology is one of “observers”, measurement interactions and relative events. And it doesn't say the electron has any form; that notion becomes meaningless otherwise. Besides, the only way to detect/picture an electron (seen as local) is through the electron interacting.
 
  • #174
Auto-Didact said:
Your suspicions are of course warranted: entanglement is ubiquitous, almost all quantum states in Nature are entangled states, but decoherence of course breaks these entanglements, which of course is why building a quantum computer is such an engineering challenge.

But to make the argument even stronger, in textbook QM the description of ##\psi## is non-local whether or not entanglement is involved, i.e. even for a single particle wavefunction non-locality is already present in the following example given by Penrose about a decade ago or earlier:

Imagine a photon source and a screen some distance away and single photons are detected (or measured) as single points on the screen; in between source and screen is where the wavefunction is. Now imagine that there is a detector at each point of the screen; once the photon is detected at a single point on the screen we can call it a detection event.

Each single detection event on the screen instantaneously prohibits the photon from being seen anywhere else on the screen i.e. once a detection event takes place by a single detector, all other detectors are effectively instantaneously prohibited from detecting the photon; if the detector had to communicate this detection event to all the other detectors it would need to convey that information faster than light.

In other words, detection i.e. measurement itself breaks the non-locality of the wavefunction; this can be mathematically described in detail as measurements effectively removing the first cohomology element of single photon wavefunctions (NB: these photon wavefunctions are usually Fourier transformed wavefunctions, and together with their Fourier transforms reside in a larger abstract complex analytic mathematical space).
I’m familiar with the two slit experiment etc. But to me it seems just as sensible to interpret the wave-like interference pattern seen by the detector(s) as “confirming” or “realizing” the non-locality of the quantum field, especially if new entangled states ensue - as opposed to describing it as “decoherence”. But that may be what you were getting at.

I mean, is the Cauchy surface reduced? Or is the lab just moved along with it? Even if the detectors are saying “we just decohered that thing, didn't we”, isn't it (the entangled surface of “now”) just sitting there all up in them?

I get that the “prepared” entanglement is decohered. But what I’m curious about is the dynamics of natural, uncontrolled, un-prepared entanglement. If the former realizes non-locality and faster-than-light superposition-rule enforcement, how are those manifest in the natural evolution of the Cauchy surface?

Is there a specific notion of entanglement conservation?
 
Last edited:
  • #175
The thought experiment of Roger Penrose, as mentioned above, shows that a single photon makes the photon detectors D1,... in its neighborhood "entangled" in the sense that if one detector clicked for the photon, then the others do not click for that same photon. Nature is obviously full of entanglement. In the Penrose example, entanglement is a result of "the photon A could have interacted with detectors D1,...".

How can one erase the Penrose entanglement? The detectors D1,... should forget what they might have measured. It is impossible with macroscopic objects.
 
  • Like
Likes Jimster41
