Could entanglement arise from the continuity of the initial universal singularity?
Initial universal singularity is sort of physically meaningless, isn't it? Suppose the universe began as a more or less spherical volume encompassing tens or hundreds of cubic light years. Suppose ... well, the speculative possibilities are seemingly infinite (even if the universe isn't).
There is some rather more direct physical evidence regarding the apparent isotropic expansion. So, suppose you take that as the fundamental physical fact. Can you connect entanglement to it?
Would it be possible for you to describe your conception of what entanglement is ... qualitatively?
Suppose entanglement operates at the speed of inflation. Guth's hypothesis would then also account for quantum statistical correlation.
Is there a reason why you keep throwing out weird things like this without bothering to DEFINE what you mean?
The EPR experiment was claimed to show quantum mechanics' "spooky action at a distance." The first mechanism to be thrown out in explaining this phenomenon is usually superluminal signaling. Can inflationary velocities, though, be modeled to corroborate quantum statistics?
According to the Big Bang theory, spacetime is supposed to have originated from a singularity. Couldn't the fact that all spacetime has a common origin tend to put all events in phase, evoking the observed correlations of entanglement?
Or you can opt for the incompleteness of qm as a description of physical reality, which iirc is what EPR did.
If "thrown out" means adopted or posited, then it's adopted only because there's no precise description, in terms of local interactions, of how quantum correlations happen, and also because the qm formalism is interpreted by some to imply FTL or non-locality.
If "thrown out" means thrown out, then that's what should, imo, be done because the statistical relationships can be understood within the limits of the principle of locality, and because the qm formalism isn't required to be interpreted as implying FTL or non-locality.
Well, we're not living in an inflationary epoch, so it's hard to imagine how these velocities would be pertinent now.
What do you mean by "all events to be in phase" ?
iirc, one of the reasons for the inflation idea was the question of how the cosmic microwave background radiation could be so apparently nearly isotropic if the regions from which the radiation was originally emitted were spacelike separated (i.e., how could what looks like the thermal equilibrium of the cmbr have happened between regions that couldn't have communicated with each other, given the assumption that the speed of light is a universal limit?).
This seems sort of like the same problem as how quantum entanglement statistics are produced by spacelike separated events at two detectors, A and B, if, assuming the principle of locality holds, A and B aren't communicating with each other via FTL transmissions of some sort.
While a detailed, qualitative description of the precise physical mechanisms/interactions isn't possible, it is nevertheless possible to comprehend how the quantum correlations could happen without positing FTL by focusing on what the disturbances incident on, say, polarizers during a given coincidence interval have in common. Polarization itself might have less to do with it than, say, the relationship between the phase and amplitude of disturbances that are analyzed by the polarizers during a given interval. I don't know. Anyway, A and B apparently aren't affecting each other because the rate of detection at A doesn't vary with changes at B, and vice versa. So, until there is some better reason than just some interpretation of the formalism to support the idea that FTL effects are actually happening, the FTL problem wrt EPR, Bell test, etc. scenarios is a pseudo-problem, and the assumption of locality in nature can be retained.
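The point that the rate of detection at A doesn't vary with changes at B, while the joint statistics still can't be reproduced locally, can be checked against the textbook quantum predictions for polarization-entangled photon pairs. A minimal sketch (Python; the function names are mine, but the 1/2·cos² coincidence rule and the CHSH combination are the standard ones):

```python
import math

def p_coincidence(a, b):
    """Quantum prediction that both photons pass polarizers set at angles a, b."""
    return 0.5 * math.cos(a - b) ** 2

def p_detect_A(a, b):
    """Marginal probability that A's photon passes, summed over B's outcomes."""
    p_pass_pass = 0.5 * math.cos(a - b) ** 2   # A passes, B passes
    p_pass_block = 0.5 * math.sin(a - b) ** 2  # A passes, B blocks
    return p_pass_pass + p_pass_block          # always 0.5, whatever b is

def correlation(a, b):
    """E(a, b) = cos(2(a - b)) for polarization-entangled photon pairs."""
    return math.cos(2 * (a - b))

# CHSH combination at the standard angles: any local model gives |S| <= 2,
# but the quantum prediction is 2*sqrt(2) ~ 2.83.
a1, a2 = 0.0, math.pi / 4
b1, b2 = math.pi / 8, 3 * math.pi / 8
S = (correlation(a1, b1) - correlation(a1, b2)
     + correlation(a2, b1) + correlation(a2, b2))
```

The marginal `p_detect_A(a, b)` comes out 0.5 for every remote setting b, which is the no-signaling point made above; the value of S above the local bound of 2 is where the puzzle lives.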
The isotropy of the cmbr might be understood, without positing an inflationary epoch, as being due to a common universal expansion that more closely approximates what we measure as the speed of light than a temporary inflationary expansion many orders of magnitude greater than the speed of light. The universe didn't have to begin as a singularity, or as an extremely small (subatomic-scale), finite volume. The observable matter in the universe, even at the largest scales, isn't precisely uniformly distributed, and yet there is a sort of pattern to it: string-like radiant structures (supergalactic ensembles?) connected in a lattice-like overall structure, with vast, apparent voids in between the radiant structures. This suggests to me that the beginning was a really big, irregularly distributed disturbance. That is, it was a vast, lumpy, and finite volume to begin with, and the energy of the expansion has been stretching the large substructures apart ever since.
Another isomorphism (in addition to the expansion) relating the spacelike separated evolutions of spatially separated regions might be a common, fundamental medium wrt which the entire universe is evolving toward equilibrium.
Anyway, right now, and wrt the observable universe, there seem to be good reasons to suppose that natural processes evolve at the speed of light, or less --- and it doesn't seem that we need Guth's hypothesis or anything FTL to understand how quantum correlations can happen (even if exactly how they happen can't be precisely described in any sort of visualizable way).
Answer me this: how EASY do you think it is for us to maintain the entanglement of a SIMPLE, bipartite system, before all traces of the entanglement are lost via decoherence?
I would guess that decoherence implies interference within a system more complex than bipartite. A bipartite system itself still remains entangled when measured once by an outside observer.
I posit that even myriad events evolving in totality from a mutual singularity maintain a measurable semblance of entanglement.
Er... "interference"? Are you using a layman's term, or a physics term, when you use such a word?
You did not answer my question. How EASY is it to maintain the entanglement in a SIMPLE bipartite system? Show me experimental evidence to support whatever it is you "posit".
It is relatively easy to maintain the entanglement of a bipartite system until it is disturbed by a macroscopic measuring device or the corresponding environment (i.e., until its wavefunction physically interferes with theirs).
Several banks are purportedly using entanglement to transmit simple codes without reasonable possibility of eavesdropping. If intercepted, such a code (bipartite-based, if you wish) would collapse into a random sequence.
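For what it's worth, the commercial systems are quantum key distribution, and the "collapses into a random sequence" intuition can be made concrete with a toy intercept-resend simulation (BB84-style; this is my own sketch, not a description of any deployed system). An eavesdropper who measures each photon in a randomly chosen basis and resends it shows up as roughly a 25% error rate in the bits the legitimate parties keep:

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

def intercept_resend_error_rate(n_photons=100_000):
    """Toy BB84-style run with an eavesdropper who measures and resends.

    Bases: 0 = rectilinear, 1 = diagonal. Measuring in the wrong basis
    yields a random bit (50/50) -- that randomness is what betrays Eve.
    """
    errors = kept = 0
    for _ in range(n_photons):
        bit = random.randint(0, 1)
        basis_a = random.randint(0, 1)          # sender's basis
        basis_e = random.randint(0, 1)          # Eve intercepts in a random basis
        bit_e = bit if basis_e == basis_a else random.randint(0, 1)
        basis_b = random.randint(0, 1)          # receiver measures Eve's resent photon
        bit_b = bit_e if basis_b == basis_e else random.randint(0, 1)
        if basis_a == basis_b:                  # only matched-basis rounds are kept
            kept += 1
            if bit_b != bit:
                errors += 1
    return errors / kept  # ~0.25 with Eve present; ~0.0 without her
```

Half the time Eve guesses the wrong basis, and then the receiver's matched-basis measurement comes out random, giving an error half of that time: 1/2 × 1/2 = 1/4.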
Then tell me this: how EASY is it for a macroscopic system to interact with the entangled system? If entanglement is THAT easy to maintain, how come we don't see entanglement occurring all around us? Someone looking at you should easily see an entangled observation elsewhere. Where is it? Even with a system that interacts as weakly as photons, we can hardly detect its entanglement. Electrons? How far/long do you think we can maintain the entanglement of a pair of electrons?
Then LEARN from that. Decoherence destroys any traces of entanglement! And decoherence happens EXTREMELY EASILY. Our observation tells us entanglement is difficult to maintain, and it can be easily destroyed. Now go back to your original question and ask yourself why you have ignored this fact.
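The "decoherence happens EXTREMELY EASILY" point can be illustrated with a toy dephasing model (my own construction, not a real open-quantum-systems calculation): each uncontrolled environmental interaction gives the qubit's off-diagonal density-matrix element a random phase kick, and averaging over the environment wipes the coherence out exponentially fast with the number of interactions:

```python
import cmath
import random

random.seed(1)  # fixed seed so the toy run is reproducible

def coherence_after_kicks(n_kicks, n_trials=2000, kick=0.5):
    """Magnitude of the average off-diagonal density-matrix element.

    Each environmental interaction multiplies the coherence by exp(i*phi),
    with phi drawn uniformly from [-kick, kick]; averaging over many trials
    models tracing out the unobserved environment.
    """
    total = 0j
    for _ in range(n_trials):
        phase = sum(random.uniform(-kick, kick) for _ in range(n_kicks))
        total += cmath.exp(1j * phase)
    return abs(total / n_trials)
```

After a single weak kick the coherence is nearly intact (~0.96 for these parameters), but after a hundred such interactions it is gone to within statistical noise, which is the sense in which a macroscopic environment, with its astronomical number of uncontrolled interactions, destroys entanglement essentially instantly.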