What is the mechanism behind Quantum Entanglement?

In summary: Realism means that measurement outcomes reflect properties the system has prior to measurement. Locality means that the effect and the cause have to be within the same vicinity. Both of these assumptions hold true for all other aspects of physics. Yet at least one of them must not be universally true, or quantum entanglement would not give rise to the phenomena that we observe. There are a variety of speculative hypotheses for the mechanism of quantum entanglement, but none of them can be singled out as correct with existing experiments.
  • #246
WernerQH said:
Quantum theory provides a description that makes these objects hard to recognize. :smile:
One solution is to recognize the description as primary, and to ask how the dynamics of the description itself is expected to change, and what happens when two such descriptions are compared. After some time the description and what it describes are indistinguishable 😮

/Fredrik
 
  • #247
Hornbein said:
Susskind supposes entanglement could be a wormhole.

ER=EPR
Quantum entanglement from the holographic principle.
 
  • #248
vanhees71 said:
There is no cut between a classical and a quantum world within QT, and there's no empirical evidence for one to exist in nature. This is, however, under investigation, i.e., there are experiments going on testing the (im)possibility to demonstrate "quantum behavior" of ever larger objects.
A small addition to your point. I think it should be remembered that classical quantities can be handled within a quantum theory.

For example, if we consider electric charge, it has an associated observable ##\hat{C}##. However, electric charge behaves purely classically, since all observables commute with it and so it has no interference effects. This is handled perfectly well within quantum electrodynamics.

So even if certain macroscopic quantities don't display interference effects, there's no issue with quantum theory modelling them. You'd just need to prove that operators not commuting with them are unphysical, as is the case with charge. This is in fact how older textbooks handled the classicality of macroscopic quantities without decoherence.
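
To make the mechanism concrete, here is a minimal toy sketch (my own illustration, not a quote from any textbook): if every allowed observable commutes with the charge ##\hat{C}##, then no measurement can distinguish a coherent superposition of charge sectors from the corresponding classical mixture.

```python
# A toy sketch (my own, purely illustrative) of a superselection rule: if every
# allowed observable commutes with the charge Q, then no measurement can tell a
# coherent superposition of charge sectors from the corresponding classical mixture.
import numpy as np

Q = np.diag([0.0, 1.0])                    # two charge sectors, q = 0 and q = 1

psi = np.array([1.0, 1.0]) / np.sqrt(2)    # superposition across the two sectors
rho_sup = np.outer(psi, psi)               # coherent superposition
rho_mix = np.diag([0.5, 0.5])              # incoherent mixture of the same sectors

rng = np.random.default_rng(0)
for _ in range(5):
    O = np.diag(rng.normal(size=2))        # generic observable with [O, Q] = 0
    assert np.allclose(O @ Q, Q @ O)       # it commutes with the charge
    assert np.isclose(np.trace(rho_sup @ O), np.trace(rho_mix @ O))
print("No charge-commuting observable distinguishes superposition from mixture.")
```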
 
  • #249
That's equivalent to the assumption of a charge superselection rule. The one thing which is "quantum" about charges is that they come in multiples of ##e## (or ##e/3## if you count unobservable quarks). Why the elementary particles show the observed pattern of charges is only vaguely understood. It is at least a pattern that admits a chiral local gauge symmetry for the weak interaction, i.e., it makes the model anomaly free.
 
  • #250
vanhees71 said:
That's equivalent to the assumption of a charge superselection rule
There's recent work showing more quantities than one would think are superselected in QFT.

For instance, for a bosonic field one can show that although ##\phi(f)## is an observable, quantities like ##\phi(f)^{n}## are not, despite being self-adjoint. This shrinks the algebra of observables from ##\mathcal{B}(\mathcal{H})## to a smaller subset ##\mathcal{A}##, with the result that several operators become classical, since the operators that fail to commute with them are no longer observables.

This is an example:
https://arxiv.org/abs/2106.09027

The work of Sewell cited in that paper has more examples.
 
  • #251
Regarding the thread title I would say the following.

Quantum theory generalises classical probability theory by not assuming the existence of states in which all quantities take well-defined values. When you generalise probability theory like this, it turns out one can prepare systems with stronger correlations than in classical theory, these stronger correlations being permitted precisely because not all quantities have well-defined values.

This can be seen directly in the CHSH experiment, where we have two particles ##A## and ##B## and two spin observables on each, i.e. ##A_{1}, A_{2}## on the first and ##B_{1}, B_{2}## on the second, each with eigenvalues ##\pm 1##.

Then consider the CHSH operator:
##C = \frac{1}{2}\left(A_{1}\left(B_{1}+B_{2}\right) + A_{2}\left(B_{1}-B_{2}\right)\right)##

The CHSH inequality is directly equivalent to the statement that:
##C^{2} \leq 1##

This is true in classical probability, where the four observables commute and each takes values ##\pm 1##. However, in quantum theory we have:
##C^{2} = 1 - \frac{1}{4}[A_{1},A_{2}][B_{1},B_{2}]##

So the violations of the hidden-variable bound are due purely to the local incompatibility of the observables. That's the explanation.
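
As a quick sanity check of this operator identity, here is a minimal numerical sketch (my own, with a Tsirelson-optimal choice of spin observables assumed purely for illustration):

```python
# Numerical check of C^2 = 1 - (1/4)[A1,A2][B1,B2] for the CHSH operator above,
# using a Tsirelson-optimal choice of spin observables (assumed for illustration).
import numpy as np

I2 = np.eye(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

A1, A2 = np.kron(sz, I2), np.kron(sx, I2)        # observables on particle A
B1 = np.kron(I2, (sz + sx) / np.sqrt(2))         # observables on particle B
B2 = np.kron(I2, (sz - sx) / np.sqrt(2))

C = 0.5 * (A1 @ (B1 + B2) + A2 @ (B1 - B2))      # the CHSH operator

comm = lambda X, Y: X @ Y - Y @ X
rhs = np.eye(4) - 0.25 * comm(A1, A2) @ comm(B1, B2)

print(np.allclose(C @ C, rhs))                   # True: the identity holds
print(np.linalg.eigvalsh(C).max())               # ~1.414, so C^2 reaches 2 > 1
```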
 
  • #252
LittleSchwinger said:
So the violations of the hidden-variable bound are due purely to the local incompatibility of the observables. That's the explanation.
As I see it, this is a description of the original problem, not the explanation? But I think we are seeking different forms of explanation.

I would think the question is then: why does nature make use of such incompatible observables? We know it does, but is there a deeper way to understand why? Or are we satisfied with it being a coincidence? (Fine tuning strikes again.)

Some may say we do not know, as it's like asking why nature obeys QM. But I think there may be evolutionary answers; it's like asking, in biology, why the cell membrane composition is what it is. We cannot find an explanation for that in the form of an initial value problem, but we can perhaps see the survival value of certain membrane properties. For complex systems, Newtonian-paradigm "explanations" based on ultimate reductionism are typically useless, but that does not mean there is nothing to do.

There are associations to game theory, where one can often see that quantum strategies are simply superior to classical strategies. That essentially refers to the inference or predictive logic that the agents make use of, and agents encoding the superior strategies will likely dominate. This is the sort of explanation I seek, at least.

An example of this line of thought is here.

Quantum Strategies Win in a Defector-Dominated Population

"Quantum strategies are introduced into evolutionary games. The agents using quantum strategies are regarded as invaders whose fraction generally is 1% of a population in contrast to the 50% defectors. In this paper, the evolution of strategies on networks is investigated in a defector-dominated population, when three networks (Regular Lattice, Newman-Watts small world network, scale-free network) are constructed and three games (Prisoners' Dilemma, Snowdrift, Stag-Hunt) are employed. As far as these three games are concerned, the results show that quantum strategies can always invade the population successfully. Comparing the three networks, we find that the regular lattice is most easily invaded by agents that adopt quantum strategies. However, for a scale-free network it can be invaded by agents adopting quantum strategies only if a hub is occupied by an agent with a quantum strategy or if the fraction of agents with quantum strategies in the population is significant. "
-- https://arxiv.org/abs/1112.0713

Of course, how such games relate to reconstructing the foundations or unification of law is more complicated to imagine, but I think it is a good direction for looking further.

As I see it, agents using quantum logic (meaning entertaining non-commutative information structures) are more efficient than their classical counterparts. Setting aside abstract proofs, I think it's not hard to see intuitively why this makes sense; a toy quantification is sketched below.
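
For instance, in the CHSH game a classical strategy wins at most 75% of rounds, while sharing a Bell pair lifts this to about 85%. Here is a minimal sketch of that computation (my own illustration; the angles are the standard optimal settings, not numbers from the paper above):

```python
# The CHSH game: classical strategies win at most 75% of rounds, while a quantum
# strategy on a shared Bell pair wins cos^2(pi/8) ~ 85.4%. Purely illustrative.
import itertools
import numpy as np

phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)      # shared state (|00> + |11>)/sqrt(2)

def basis(theta):
    """Projectors for a measurement in the real basis rotated by theta."""
    v0 = np.array([np.cos(theta), np.sin(theta)])
    v1 = np.array([-np.sin(theta), np.cos(theta)])
    return [np.outer(v, v) for v in (v0, v1)]

alice = [basis(0), basis(np.pi / 4)]                 # settings for input x = 0, 1
bob = [basis(np.pi / 8), basis(-np.pi / 8)]          # settings for input y = 0, 1

win = 0.0
for x, y in itertools.product([0, 1], repeat=2):     # inputs drawn uniformly
    for a, b in itertools.product([0, 1], repeat=2):
        if (a ^ b) == (x & y):                       # win condition: a XOR b = x AND y
            P = np.kron(alice[x][a], bob[y][b])
            win += 0.25 * (phi_plus @ P @ phi_plus)
print(win)                                           # ~0.8536 > 0.75 classical bound
```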

/Fredrik
 
  • #253
vanhees71 said:
That's equivalent to the assumption of a charge superselection rule
Sorry, I only considered this now. In QED, charge superselection can be proven as a theorem rather than assumed. Of course, you may have meant "assumption" in a casual sense.
 
  • #254
Hornbein said:
Susskind supposes entanglement could be a wormhole.
This may actually tie in quite naturally with the paper I linked earlier: https://iopscience.iop.org/article/10.1088/1475-7516/2016/10/022/meta

In particular, the proof of the no big crunch theorem shows that, for any sufficiently large (i.e. approximately flat or open) initially expanding universe, there is an interaction between expansion, which depends on the relative rate of local time evolution set by spacetime curvature in any given region of space, and a nontrivial inhomogeneous distribution of matter. This emerges naturally in the context of conservation of information, since the initial conditions of such an expanding universe mean that information about those initial conditions in any given region of space will long since have been carried beyond the local observable universe by expansion.

But for the entanglement picture, what looks to be important is that a limit analysis tells you the off-diagonal components of the metric tensor are everywhere both nonzero and asymmetric, or else the Einstein field equations cannot be internally self-consistent for all possible choices of initial conditions, which is notably equivalent to information conservation in the context of any system of differential equations. There is actually a pretty profound inference in this proof: thanks to the effect of curvature on expansion, the fact that the off-diagonal components of the metric tensor are both nonzero and asymmetric everywhere in such an expanding universe lets us identify some properties of the inhomogeneous and anisotropic Einstein field equations.

(As an aside, I should note the metamathematical meaning: the proof effectively shows that there are no valid solutions that can be constructed for such a sufficiently large, initially expanding universe which will ever permit a collapse, or even a large-scale slowing of expansion, because such a solution, if it existed, would require mutually incompatible and thus inconsistent properties within its metric tensor. This has many consequences beyond the scope of this discussion; in light of what we observe of our universe, it can drastically reduce the complexity of cosmological models, since an overall acceleration of expansion, like what cosmologists currently attribute to "dark energy", becomes a trivial, unavoidable property of any valid metric.)

In particular for this discussion, you may remember that an integral sum over antisymmetric wavefunctions in quantum mechanics is the fundamental origin of what we call Fermi-Dirac statistics, or Dirac spinors. Thus, if the metric of space itself has this property, there should be a spacetime equivalent of the Pauli exclusion principle in the limit of a metric element becoming sufficiently small (i.e. approaching any quantum limit, should one exist); a toy illustration of this standard fact follows below.
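
For reference, here is a minimal sketch (my own, not from the proof being discussed) of the textbook fact invoked above: an antisymmetrized two-particle state vanishes when both particles occupy the same single-particle state, which is the Pauli exclusion principle.

```python
# Toy check of the standard fact invoked above: an antisymmetric (Slater-determinant)
# two-particle wavefunction vanishes when both particles occupy the same state,
# i.e. the Pauli exclusion principle. Purely illustrative.
import numpy as np

def slater(phi_a, phi_b):
    """Antisymmetrized two-particle amplitude matrix psi(x1, x2)."""
    return (np.outer(phi_a, phi_b) - np.outer(phi_b, phi_a)) / np.sqrt(2)

rng = np.random.default_rng(1)
phi1, phi2 = rng.normal(size=4), rng.normal(size=4)

print(np.allclose(slater(phi1, phi1), 0))   # True: identical states are excluded
print(np.allclose(slater(phi1, phi2), 0))   # False: distinct states are allowed
```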

Such spinor states would naturally, based on how they were derived, constitute informational pairs of the initial conditions, representing the causal correlations of past events and the past light cone. If the metric at all points in spacetime is uniquely defined (which it must be for an expanding universe in the limit where the size of said universe approaches the infinitely large, i.e. volume much larger than the rate of expansion), then this also implies the arrow of time as a trivial consequence. It also accommodates violations of Bell's inequality, in that the system is now nonlocally correlated: information conservation, or rather information theory together with the past light cone, naturally constitutes hidden variables for a "real" but nonlocal informational system.

These two properties required for internal consistency, nonlocality and uniqueness (of spinor pairs), are both properties we associate with what we define as quantum mechanics, and here they emerge automatically as consequences of information conservation within a large, initially expanding, inhomogeneous and anisotropic universe. In the context of the gravitational path integral, it is then naturally apparent that, since the metric represents a sum over such elements, some variation of ER=EPR, provided that the wormholes in question are imaginary, is the only way to ensure the Einstein field equations remain internally consistent for all possible choices of initial conditions, i.e. mathematically valid. Any other metric will, as shown by the proof of the no big crunch theorem, always have irreducible internal inconsistencies. Note that the interaction with expansion means that even an approximately homogeneous and isotropic metric with only small deviations cannot stay homogeneous and isotropic without violating conservation of information and/or causality, since the proof shows that the homogeneous and isotropic solution, while valid, is unstable: all possible deviations from perfect homogeneity and isotropy lead to mathematically divergent behavior away from this inflection point, making it very much a mathematical peak in the solution space where the Einstein field equations live.

While it is hard, if not impossible, to prove such a relationship outright, this may come as close as possible, since the inhomogeneous and anisotropic large-scale limit represents the general behavior of the full, unconstrained Einstein field equations. A fuller derivation is likely needed for rigor, but what particularly deserves attention is that this criterion comes from the mathematical formalism of the Einstein field equations themselves.

This matters especially in light of the current "crises in cosmology", since all of these would be relatively trivially resolved by dropping the so-called cosmological principle (which is, and has always been, assumed as a starting lemma without observational proof). Remember that the relationship between redshift and distance is model dependent, and thus so are inferred distance relationships, as is the extremely common practice of treating all locations on the sky as equivalent, which would only be true if the universe were homogeneous and isotropic.

There is, however, an observational test of the model assumption that the CMB dipole is purely kinematic, which any form of the cosmological principle requires in order not to be outright falsified for the entire observable universe by the existence of a nonzero CMB dipole. The test was proposed back in 1984 by Ellis & Baldwin and involves constructing a dipole from cosmologically distant sources; should the CMB dipole be kinematic, and thus purely due to our local frame of reference, the two dipoles should be indistinguishable in both magnitude and direction. This was first tested with a statistically significant sample size by Nathan J. Secrest et al. 2021 ApJL 908 L51, using 1.36 million quasars from the CatWISE catalog; their results showed a deviation from the CMB dipole at 4.9 sigma significance, or roughly a 1 in 2 million chance of being a coincidence.
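
For reference, the expected kinematic amplitude follows from the Ellis & Baldwin formula ##d = [2 + x(1+\alpha)]\,\beta##. A minimal sketch of the arithmetic (the index values are illustrative assumptions of roughly the magnitude used in the quasar analyses, not numbers from this thread):

```python
# Sketch of the Ellis & Baldwin (1984) expectation for the kinematic number-count
# dipole. The index values below are illustrative assumptions, not from this thread.
x = 1.7                       # integral source-count slope, N(>S) ~ S^-x   (assumed)
alpha = 1.26                  # mean spectral index of the sources          (assumed)
beta = 370e3 / 299_792_458    # v/c for the solar velocity inferred from the CMB dipole

d_kinematic = (2 + x * (1 + alpha)) * beta
print(f"expected kinematic dipole amplitude ~ {d_kinematic:.4f}")   # ~0.007
```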

Moreover, an independent team has now posted their own follow-up analysis on arXiv and found an even higher statistical significance of 5.7 sigma, which brings this above the 5-sigma threshold if their work holds up. This finding that the dipole has a significant nonzero cosmological component also ties in with the previously unresolved puzzle of the alignment of the CMB dipole with the higher multipoles of the CMB, and it puts our observable universe well within the bounds where the no big crunch theorem applies.

Therefore, if this holds up, it looks to be as close to an answer as we could ever hope to get, given the limits of knowledge imposed by Gödel's incompleteness theorems on any self-referential system.

It may also happen to establish a natural trajectory towards quantum gravity, in which gravity is synonymous with the cumulative effect of all possible entanglement and decoherence states, and what has traditionally been labeled "dark energy" is the asymmetric component of the metric.
 
