Hornbein said:
Susskind supposes entanglement could be a wormhole.
This may actually tie in quite naturally with the paper I linked earlier
https://iopscience.iop.org/article/10.1088/1475-7516/2016/10/022/meta.
In particular, the proof of the no-big-crunch theorem shows that in any sufficiently large (i.e. approximately flat or open) initially expanding universe there is an unavoidable interaction between expansion, whose local rate depends on the relative rate of time evolution set by spacetime curvature in any given region, and any nontrivial inhomogeneous distribution of matter. This emerges naturally in the context of conservation of information, since the initial conditions of such an expanding universe mean that the information about those initial conditions in any given region of space will have long since been carried beyond the local observable universe by expansion.
But for the entanglement connection, what looks to be important is that a limit analysis tells you the off-diagonal components of the metric tensor must be both nonzero and asymmetric everywhere, or else the Einstein field equations cannot be internally self-consistent for all possible choices of initial conditions, which is notably equivalent to requiring information conservation in the context of any system of differential equations. There is a pretty profound inference in this proof: thanks to the effect of curvature on expansion, the fact that the off-diagonal components of the metric tensor are nonzero and asymmetric everywhere in such an expanding universe lets us identify some properties of the inhomogeneous and anisotropic Einstein field equations.
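To pin down what "asymmetric" would have to mean notationally (standard GR takes $g_{\mu\nu}$ symmetric by definition, so this is the convention used in nonsymmetric gravitational theories): any rank-2 tensor splits uniquely into symmetric and antisymmetric parts,

$$g_{\mu\nu} = g_{(\mu\nu)} + g_{[\mu\nu]}, \qquad g_{(\mu\nu)} = \tfrac{1}{2}\left(g_{\mu\nu} + g_{\nu\mu}\right), \qquad g_{[\mu\nu]} = \tfrac{1}{2}\left(g_{\mu\nu} - g_{\nu\mu}\right),$$

so the claim amounts to $g_{[\mu\nu]} \neq 0$ everywhere in such a universe.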
(As an aside, I should note the metamathematical meaning: the proof effectively shows that no valid solution can be constructed for such a sufficiently large, initially expanding universe that would ever permit a collapse, or even a large-scale slowing of expansion, because such a solution, if it existed, would require mutually incompatible and thus inconsistent properties within its metric tensor. This has many consequences beyond the scope of this discussion; in light of what we observe of our universe, it can drastically reduce the complexity of cosmological models, since an overall acceleration of expansion, like what cosmologists currently attribute to "dark energy", becomes a trivial, unavoidable property of any valid metric.)
In particular for this discussion, you may remember that a sum over antisymmetric wavefunctions in quantum mechanics is the fundamental origin of what we call Fermi-Dirac statistics and Dirac spinors. Thus if the metric of space itself has this property, there should be a spacetime equivalent of the Pauli exclusion principle in the limit where the size of a metric element becomes sufficiently small (i.e. approaching any quantum limit, should one exist).
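As a reminder of where that statistics claim comes from: for two identical fermions the exchange-antisymmetrized state is

$$\psi(x_1, x_2) = \tfrac{1}{\sqrt{2}}\left[\phi_a(x_1)\,\phi_b(x_2) - \phi_b(x_1)\,\phi_a(x_2)\right],$$

which vanishes identically when $a = b$; that vanishing is the Pauli exclusion principle. The spacetime analogue being suggested here would require an equivalent antisymmetry in the metric elements themselves.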
Such spinor states would, by construction, constitute informational pairs of the initial conditions, representing the causal correlations of past events and the past light cone. Naturally, if the metric at all points in spacetime is uniquely defined (which it must be for an expanding universe in the limit where the size of said universe approaches the infinitely large, i.e. volume much larger than the rate of expansion), then this also implies the arrow of time as a trivial consequence. It is also compatible with observed violations of Bell's inequality, since Bell's theorem only rules out local hidden variables: the correlations here are nonlocal, so information conservation, or rather information theory and the past light cone, can naturally constitute hidden variables for a "real" but nonlocal informational system.
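For reference, the CHSH form of Bell's inequality is the relevant quantitative bound: any local hidden-variable model satisfies

$$\left|E(a,b) - E(a,b') + E(a',b) + E(a',b')\right| \le 2,$$

while quantum mechanics allows this combination to reach $2\sqrt{2}$ (the Tsirelson bound). A hidden-variable model can only reproduce the quantum value by being nonlocal, which is exactly the loophole being invoked here.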
These two properties required for internal consistency, nonlocality and uniqueness (of spinor pairs), are both properties we associate with quantum mechanics, and here they emerge automatically as consequences of information conservation within a large, initially expanding, inhomogeneous and anisotropic universe. In the context of the gravitational path integral concept, it is naturally apparent that, since the metric represents a sum over such elements, some variation of ER=EPR, provided the wormholes in question are imaginary, is the only way to ensure the Einstein field equations remain internally consistent for all possible choices of initial conditions, i.e. mathematically valid. Any other metric will, as shown by the proof of the no-big-crunch theorem, always have irreducible internal inconsistencies.

Note that the interaction with expansion means that even an approximately homogeneous and isotropic metric with only small deviations cannot stay homogeneous and isotropic without violating conservation of information and/or causality, since the proof in question shows that the homogeneous and isotropic solution, while valid, is unstable: all possible deviations from perfect homogeneity and isotropy lead to mathematically divergent behavior away from this inflection point, making it very much a mathematical peak in the space of solutions to the Einstein field equations on 4D spacetime.
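For concreteness, the gravitational path integral referred to here is, schematically, a sum over metrics weighted by the Einstein-Hilbert action:

$$Z = \int \mathcal{D}g_{\mu\nu}\; e^{\,i S_{\rm EH}[g]/\hbar}, \qquad S_{\rm EH}[g] = \frac{1}{16\pi G}\int d^4x\,\sqrt{-g}\,\left(R - 2\Lambda\right),$$

so a statement about which metrics are internally consistent is a statement about which configurations can contribute to this sum. On this reading, the "imaginary wormholes" suggestion would live among the complexified saddle points of the integral, though that identification is an interpretation rather than anything derived here.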
While it is hard, if not impossible, to prove such a relationship outright, this may come as close as possible, since the inhomogeneous and anisotropic large-scale limit represents the general behavior of the full unconstrained Einstein field equations in this regime. A more complete derivation is likely needed for rigor, but what particularly deserves more attention is that this criterion comes from the mathematical formalism of the Einstein field equations themselves.
This matters especially in light of the current "crises in cosmology", since all of these would be relatively trivially resolved by dropping the so-called cosmological principle assumption (as it is, and has always been, a starting lemma assumed without observational proof). Remember that the relationship between redshift and distance is model dependent, and thus so is every inferred distance relationship, as is the extremely common practice of treating all locations on the sky as equivalent, which would only be true if the universe were homogeneous and isotropic.
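As a minimal illustration of that model dependence (standard FLRW formulas with illustrative parameter values, not tied to the paper above): the same observed redshift maps to quite different comoving distances depending on the assumed expansion history.

```python
# Minimal sketch: redshift -> distance is model dependent.
# Comoving distance D_C = c * integral_0^z dz' / H(z') for an FLRW model.
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458  # speed of light, km/s

def comoving_distance_mpc(z, H0=70.0, Om=0.3, OL=0.7):
    """Comoving distance in Mpc; curvature Ok is fixed by Om + OL + Ok = 1."""
    Ok = 1.0 - Om - OL
    integrand = lambda zp: 1.0 / (H0 * np.sqrt(Om*(1+zp)**3 + Ok*(1+zp)**2 + OL))
    integral, _ = quad(integrand, 0.0, z)
    return C_KM_S * integral

z = 1.0
print(comoving_distance_mpc(z, Om=0.3, OL=0.7))  # ~3300 Mpc, LambdaCDM-like
print(comoving_distance_mpc(z, Om=1.0, OL=0.0))  # ~2500 Mpc, Einstein-de Sitter
```

Same redshift, roughly a 30% difference in inferred distance between the two assumed models, which is the sense in which distance relationships are model-dependent parameters.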
There is, however, an observational test in light of the CMB dipole, which can check the model assumption that this dipole is purely kinematic, as any form of the cosmological principle requires in order not to be outright falsified for the entire observable universe by the existence of a nonzero CMB dipole. The test was proposed back in 1984 by Ellis & Baldwin and involves constructing a dipole from cosmologically distant sources; if the CMB dipole is kinematic, and thus purely due to our local frame of reference, the two dipoles should be indistinguishable in both magnitude and direction. This was first tested with a statistically significant sample size by Nathan J. Secrest et al. (2021, ApJL 908 L51) using 1.36 million quasars from the CatWISE catalog; their results showed a large deviation of this dipole from the CMB dipole at 4.9 sigma significance, or about a 1 in 2 million chance of being a coincidence.
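For reference, the standard Ellis & Baldwin expectation for the kinematic dipole amplitude of a flux-limited source population, with integral source counts $N(>S) \propto S^{-x}$, mean spectral index $\alpha$ (defined by $S_\nu \propto \nu^{-\alpha}$), and observer velocity $\beta = v/c$, is

$$\mathcal{D}_{\rm kin} = \left[\,2 + x(1+\alpha)\,\right]\beta,$$

so with the CMB-inferred $\beta \approx 1.23\times10^{-3}$ and typical quasar values of $x$ and $\alpha$, the expected amplitude is below a percent. The test is then whether the measured source dipole matches this prediction in both magnitude and direction.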
Moreover, an independent team has now posted a follow-up analysis on arXiv and found an even higher statistical significance of 5.7 sigma, which brings this above the 5-sigma threshold if their work holds up. The finding that the dipole has a significant nonzero cosmological component also trivially resolves, or is at least supported by, a previously unresolved anomaly in cosmology, the alignment of the CMB dipole with the higher multipoles of the CMB, and it puts our observable universe well within the bounds where the no-big-crunch theorem applies.
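As a quick sanity check on the quoted odds (using one-sided Gaussian tail probabilities, the usual convention for these sigma levels):

```python
# Sketch: convert a sigma level to a one-sided Gaussian p-value.
from scipy.stats import norm

for sigma in (4.9, 5.0, 5.7):
    p = norm.sf(sigma)  # survival function = one-sided tail probability
    print(f"{sigma} sigma -> p = {p:.1e} (about 1 in {1/p:,.0f})")
# 4.9 sigma -> ~4.8e-07, i.e. roughly 1 in 2 million, matching the figure above.
```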
Therefore, if this holds up, it looks to be as close to an answer as we could ever hope to get, given the limits of knowledge imposed by Gödel's incompleteness theorems on any self-referential system.
It may also happen to establish a natural trajectory toward quantum gravity, in which gravity is synonymous with the cumulative effect of all possible entanglement and decoherence states, and what has traditionally been labeled "dark energy" is the asymmetric component of the metric.