Do entanglement networks encode a unique, local historical record? (Graduate level)

  • Thread starter: asklepian
  • Tags: Decoherence
SUMMARY

The discussion centers on the paper "The Decoherent Arrow of Time and the Entanglement Past Hypothesis" by Al-Khalili and Chen, which posits that the early universe had low entanglement entropy that increases over time, establishing a decoherent arrow of time. Key concepts include decoherence, through which environmental interactions render an initially indeterminate quantum system effectively determined, and the encoding of historical information into the environment, quantified by measures such as von Neumann entropy and entanglement entropy. The thread also explores whether light cones can be read as local networks of quantum entanglement, and the implications of superluminal travel for decoherence and causality, with reference to quantum field theory and general relativity.

PREREQUISITES
  • Understanding of decoherence in quantum mechanics
  • Familiarity with entanglement entropy and von Neumann entropy
  • Knowledge of light cones in the context of spacetime
  • Basic principles of quantum field theory and general relativity
NEXT STEPS
  • Research the role of decoherence in quantum systems and its implications for classicality
  • Study the works of Hartle and Gell-Mann on classicality and quantum history
  • Examine the relationship between light cones and quantum entanglement in general relativity
  • Explore the implications of superluminal travel on quantum entanglement and causality
USEFUL FOR

Physicists, quantum theorists, and researchers interested in the foundations of quantum mechanics, the nature of time, and the interplay between quantum entanglement and spacetime structure.

asklepian
TL;DR
How does decoherence transition quantum systems to classical states by encoding environmental interactions? Can light cones be seen as local entanglement networks reflecting spacetime's causal history? Is superluminal travel barred by decoherence and causality issues?
In the recent paper "The Decoherent Arrow of Time and the Entanglement Past Hypothesis" by Al-Khalili and Chen, the authors propose that the early universe had very low entanglement entropy, which increases over time and gives rise to the decoherent arrow of time.

1.) Can you elaborate on how decoherence functions as the process by which an initially indeterminate quantum system becomes effectively determined through interactions with its environment? Specifically, how does this process encode information about the system's state into the environment, thus creating a unique local historical record of these interactions? Additionally, what are the quantitative measures, such as von Neumann entropy or entanglement entropy, that can be used to describe this transition from a quantum superposition to a classical mixture?

2.) Considering that light cones define regions of spacetime within which causal interactions can occur, can we interpret these regions as encoding a local network of quantum entanglements that reflect the unique historical interactions within each light cone? How does this interpretation align with our current understanding of quantum entanglement and the causal structure of spacetime in the framework of general relativity? Are there any existing models or quantitative frameworks that describe this relationship?

3.) Is the prohibition of superluminal travel related to increased decoherence in regions of high energy density, potentially disrupting local entanglement correlations and leading to indeterminate states and causality violations? If superluminal travel were hypothetically possible, would superluminal disentanglement and subsequent subluminal reentanglement allow the encoding of a 'new' history, leading to causality violations? How do modern theoretical frameworks, such as quantum field theory and general relativity, quantitatively address these issues?
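For concreteness on the quantitative measures mentioned in question 1 (my own minimal example, just to fix notation): if a system qubit ##S## has interacted with a single environment qubit ##E## so that the pair ends up in a Bell state, then
$$|\psi\rangle_{SE}=\tfrac{1}{\sqrt{2}}\big(|0\rangle_S|0\rangle_E+|1\rangle_S|1\rangle_E\big),\qquad \rho_S=\operatorname{Tr}_E|\psi\rangle\langle\psi|=\tfrac{1}{2}\begin{pmatrix}1&0\\0&1\end{pmatrix},\qquad S(\rho_S)=-\operatorname{Tr}\,\rho_S\ln\rho_S=\ln 2.$$
Before the interaction the joint state is a product and ##S(\rho_S)=0##; afterwards the off-diagonal (interference) terms of ##\rho_S## are gone, and the entanglement entropy ##\ln 2## quantifies the which-state information now stored in the environment.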
 
asklepian said:
1.) Can you elaborate on how decoherence functions as the process by which an initially indeterminate quantum system becomes effectively determined through interactions with its environment? [...] Additionally, what are the quantitative measures, such as von Neumann entropy or entanglement entropy, that can be used to describe this transition from a quantum superposition to a classical mixture?
The authors reference Hartle and Gell-Mann, whose work is a good place to start: in https://arxiv.org/pdf/gr-qc/9509054 they discuss how "classicality" might be quantified.

Indeterminacy is only eliminated in the sense that each possible history from an appropriate set will exhibit time correlations well-approximated by classical physics. The resolution of a unique history by our measurements is still only given probabilistically.
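For reference, the quantitative object used in that program (sketched here from memory, so check the paper for their exact conventions) is the decoherence functional over coarse-grained histories ##\alpha=(\alpha_1,\dots,\alpha_n)##,
$$D(\alpha',\alpha)=\operatorname{Tr}\!\big[P^n_{\alpha'_n}(t_n)\cdots P^1_{\alpha'_1}(t_1)\,\rho\,P^1_{\alpha_1}(t_1)\cdots P^n_{\alpha_n}(t_n)\big],$$
where the ##P^k_{\alpha_k}(t_k)## are Heisenberg-picture projectors onto the alternatives at each time. When ##D(\alpha',\alpha)\approx 0## for ##\alpha'\neq\alpha##, the set of histories decoheres and the diagonal elements ##p(\alpha)=D(\alpha,\alpha)## can consistently be treated as probabilities, which is precisely the sense in which a unique history is still only resolved probabilistically.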

[edit] - The above paper by Gell-Mann and Hartle references literature of theirs exploring classicality (or classicity) that might be better to start with: https://arxiv.org/abs/1803.04605 and https://arxiv.org/abs/gr-qc/9210010 and https://arxiv.org/abs/1905.05859
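And to make the entropy growth explicit, here is a minimal numerical toy model (my own sketch, not taken from either paper; the coupling angle and environment size are arbitrary): a system qubit in an equal superposition interacts with N environment qubits, each of which imperfectly records the system's state. The off-diagonal coherence of the system's reduced density matrix decays and its von Neumann entropy climbs toward ##\ln 2##.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), dropping numerically zero eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

def ry(angle):
    """Single-qubit rotation about the y axis by `angle`."""
    c, s = np.cos(angle / 2.0), np.sin(angle / 2.0)
    return np.array([[c, -s], [s, c]])

N_ENV = 8    # number of environment qubits (assumed, illustrative)
THETA = 1.2  # how strongly each environment qubit records the system (assumed)

ket0 = np.array([1.0, 0.0])
rec0 = ket0              # environment qubit's state if the system is |0>
rec1 = ry(THETA) @ ket0  # environment qubit's state if the system is |1>

for n in range(N_ENV + 1):
    # Joint state after n environment qubits have interacted with the system:
    #   (|0>_S |rec0>^{(x)n} + |1>_S |rec1>^{(x)n}) / sqrt(2)
    env_branch0 = np.array([1.0])
    env_branch1 = np.array([1.0])
    for _ in range(n):
        env_branch0 = np.kron(env_branch0, rec0)
        env_branch1 = np.kron(env_branch1, rec1)

    # Amplitude matrix M[s, e] = <s, e | psi>; tracing out the environment
    # gives the system's reduced density matrix rho_S = M M^dagger.
    M = np.vstack([env_branch0, env_branch1]) / np.sqrt(2.0)
    rho_S = M @ M.conj().T

    print(f"n={n}: |coherence| = {abs(rho_S[0, 1]):.4f}, "
          f"S(rho_S) = {von_neumann_entropy(rho_S):.4f}  (ln 2 = {np.log(2.0):.4f})")
```

With these assumed parameters the coherence falls to roughly 0.1 after eight environment qubits and the entropy reaches within a few percent of ##\ln 2##; each environment qubit carries a partial copy of the which-state information, which is the toy version of the "local historical record" the thread asks about.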
 
For a bit of a different perspective, I would also recommend The arrow of time in operational formulations of quantum theory by Di Biagio, Donà, and Rovelli:
https://arxiv.org/abs/2010.05734

The operational formulations of quantum theory are drastically time oriented. However, to the best of our knowledge, microscopic physics is time-symmetric. We address this tension by showing that the asymmetry of the operational formulations does not reflect a fundamental time-orientation of physics. Instead, it stems from built-in assumptions about the users of the theory. In particular, these formalisms are designed for predicting the future based on information about the past, and the main mathematical objects contain implicit assumptions about the past, but not about the future. The main asymmetry in quantum theory is the difference between knowns and unknowns.
 
