Do entanglement networks encode a unique, local historical record?

  • Thread starter: asklepian
  • Tags: Decoherence
Summary
The discussion centers on the paper by Al-Khalili and Chen, which suggests that the early universe had low entanglement entropy that increases over time, leading to a decoherent arrow of time. Decoherence is described as the process that transforms an indeterminate quantum system into a determined state through environmental interactions, thereby encoding a unique local historical record. The conversation also explores how light cones may represent networks of quantum entanglements, reflecting historical interactions within their causal regions, aligning with concepts from general relativity. Additionally, the potential implications of superluminal travel on decoherence and causality violations are examined, questioning how modern theoretical frameworks address these scenarios. The overall theme emphasizes the relationship between quantum mechanics, entropy, and the nature of time in understanding historical records in quantum systems.
asklepian
TL;DR
How does decoherence transition quantum systems to classical states by encoding environmental interactions? Can light cones be seen as local entanglement networks reflecting spacetime's causal history? Is superluminal travel barred by decoherence and causality issues?
In the recent paper "The Decoherent Arrow of Time and the Entanglement Past Hypothesis" by Al-Khalili and Chen, the authors propose that the early universe had very low entanglement entropy, which increases over time and gives rise to the decoherent arrow of time.

1.) Can you elaborate on how decoherence functions as the process by which an initially indeterminate quantum system becomes effectively determined through interactions with its environment? Specifically, how does this process encode information about the system's state into the environment, thus creating a unique local historical record of these interactions? Additionally, what are the quantitative measures, such as von Neumann entropy or entanglement entropy, that can be used to describe this transition from a quantum superposition to a classical mixture?

2.) Considering that light cones define regions of spacetime within which causal interactions can occur, can we interpret these regions as encoding a local network of quantum entanglements that reflect the unique historical interactions within each light cone? How does this interpretation align with our current understanding of quantum entanglement and the causal structure of spacetime in the framework of general relativity? Are there any existing models or quantitative frameworks that describe this relationship?

3.) Is the prohibition of superluminal travel related to increased decoherence in regions of high energy density, potentially disrupting local entanglement correlations and leading to indeterminate states and causality violations? If superluminal travel were hypothetically possible, would superluminal disentanglement and subsequent subluminal reentanglement allow the encoding of a 'new' history, leading to causality violations? How do modern theoretical frameworks, such as quantum field theory and general relativity, quantitatively address these issues?
 
asklepian said:
In the recent paper "The Decoherent Arrow of Time and the Entanglement Past Hypothesis" by Al-Khalili and Chen, the authors propose that the early universe had very low entanglement entropy, which increases over time and gives rise to the decoherent arrow of time.

1.) Can you elaborate on how decoherence functions as the process by which an initially indeterminate quantum system becomes effectively determined through interactions with its environment? Specifically, how does this process encode information about the system's state into the environment, thus creating a unique local historical record of these interactions? Additionally, what are the quantitative measures, such as von Neumann entropy or entanglement entropy, that can be used to describe this transition from a quantum superposition to a classical mixture?
The authors reference Gell-Mann and Hartle, who are a good place to start: https://arxiv.org/pdf/gr-qc/9509054 In that paper they discuss how "classicality" might be quantified.

Indeterminacy is eliminated only in the sense that each possible history from an appropriate set will exhibit time correlations well approximated by classical physics. Which unique history our measurements resolve is still given only probabilistically.
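To make the environmental-record mechanism concrete, here is a toy sketch (my own illustration, not taken from the papers cited): a system qubit in superposition couples to n environment qubits, each of which partially records the system's state. Tracing out the environment leaves a system density matrix whose off-diagonal coherence decays like the overlap ⟨e0|e1⟩ raised to the power n, while the "record" of the system's state spreads into the environment.

```python
import numpy as np

def decohere(n_env, theta=0.3):
    """Reduced density matrix of a system qubit after n_env environment
    qubits have each partially recorded its state.

    The system starts in (|0> + |1>)/sqrt(2). Conditional on the system
    being |0>, each environment qubit ends in |e0> = |0>; conditional on
    |1>, it ends in |e1> = cos(theta)|0> + sin(theta)|1>.
    """
    e0 = np.array([1.0, 0.0])
    e1 = np.array([np.cos(theta), np.sin(theta)])

    # Build the two branches |0>|e0...e0> and |1>|e1...e1> of the global state.
    psi0 = np.array([1.0, 0.0]) / np.sqrt(2)
    psi1 = np.array([0.0, 1.0]) / np.sqrt(2)
    for _ in range(n_env):
        psi0 = np.kron(psi0, e0)
        psi1 = np.kron(psi1, e1)
    psi = (psi0 + psi1).reshape(2, 2 ** n_env)  # rows: system, cols: environment

    return psi @ psi.conj().T                   # partial trace over environment

for n in (0, 5, 20):
    rho = decohere(n)
    print(n, abs(rho[0, 1]))                    # off-diagonal coherence shrinks with n
```

The diagonal entries (the classical probabilities) stay at 1/2 throughout; only the interference terms are suppressed, which is the sense in which the environment's record turns a superposition into an effective classical mixture.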

[edit] Gell-Mann and Hartle have further literature exploring classicality (or "classicity") that might be a better place to start: https://arxiv.org/abs/1803.04605 and https://arxiv.org/abs/gr-qc/9210010 and https://arxiv.org/abs/1905.05859
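On the quantitative measures asked about in question 1: the entanglement entropy of a subsystem is the von Neumann entropy S(ρ) = −Tr(ρ log₂ ρ) of its reduced density matrix. A minimal numerical sketch (my own, using a Bell pair as the standard example):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]               # discard numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

# Bell state |Phi+> = (|00> + |11>)/sqrt(2): globally pure, locally mixed.
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Reduced state of qubit A: partial trace over qubit B.
rho_A = np.einsum('abcb->ac', rho.reshape(2, 2, 2, 2))

print(von_neumann_entropy(rho))    # 0.0 -- the global superposition is pure
print(von_neumann_entropy(rho_A))  # 1.0 -- one bit of entanglement entropy
```

The contrast between the two numbers is the point: zero entropy for the global pure state, maximal entropy for either subsystem alone, which is exactly the signature of entanglement that these measures track.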
 
For a bit of a different perspective, I would also recommend The arrow of time in operational formulations of quantum theory by Di Biagio, Donà, and Rovelli:
https://arxiv.org/abs/2010.05734

The operational formulations of quantum theory are drastically time oriented. However, to the best of our knowledge, microscopic physics is time-symmetric. We address this tension by showing that the asymmetry of the operational formulations does not reflect a fundamental time-orientation of physics. Instead, it stems from built-in assumptions about the users of the theory. In particular, these formalisms are designed for predicting the future based on information about the past, and the main mathematical objects contain implicit assumption about the past, but not about the future. The main asymmetry in quantum theory is the difference between knowns and unknowns.
 
We often see discussions about what QM and QFT mean, but hardly anything on just how fundamental they are to much of physics. To rectify that, see the following: https://www.cambridge.org/engage/api-gateway/coe/assets/orp/resource/item/66a6a6005101a2ffa86cdd48/original/a-derivation-of-maxwell-s-equations-from-first-principles.pdf 'Somewhat magically, if one then applies local gauge invariance to the Dirac Lagrangian, a field appears, and from this field it is possible to derive Maxwell's...