Reconciling Determinism, Entropy, and Quantum Statistics

  • Context: Graduate 
  • Thread starter: ahmidi

Discussion Overview

The discussion revolves around the reconciliation of determinism, entropy, and quantum statistics, focusing on deterministic causal-graph models. Participants explore the implications of algorithmic complexity in deterministic systems and the potential for superdeterministic models to explain quantum phenomena without violating locality.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant inquires about modern proofs of a monotone second law of thermodynamics that remain fully deterministic and avoid coarse-graining, referencing works by M. Gács and ’t Hooft.
  • Another participant mentions superdeterministic hidden-variable models that claim to reproduce quantum mechanics without violating locality, while expressing the opinion that such models are non-local in disguise.
  • A participant expresses interest in analyzing algorithmic entropy growth within superdeterministic and cellular automata frameworks.
  • Concerns are raised about how a rise in algorithmic complexity can be compatible with determinism: a compressible initial condition may permit a temporary increase in complexity, but the monotonicity of such growth is questioned.
  • Another participant clarifies that while the global state's complexity may be static, the complexity of local state histories could increase due to the propagation of new information from the surroundings.
  • One participant argues that complexity may not grow indefinitely, as systems are expected to reach equilibrium, while acknowledging that the complexity of the entire history should grow over time.

Areas of Agreement / Disagreement

Participants express differing views on the nature of complexity in deterministic systems, with some suggesting that complexity can rise temporarily while others argue for eventual equilibrium. The discussion remains unresolved regarding the implications of superdeterminism and the behavior of algorithmic complexity.

Contextual Notes

Participants highlight the need for clarity on the definitions of complexity and the assumptions underlying their arguments, particularly in relation to the behavior of deterministic systems over time.

ahmidi
TL;DR
Looking for papers on two things: (1) fully deterministic systems where Kolmogorov-complexity entropy provably rises every step, no coarse-graining, just algorithmic arguments (beyond Gács 2023 or ’t Hooft); (2) deterministic, signal-local models that hit the CHSH Tsirelson limit (2√2) by violating measurement independence rather than locality. Any leads appreciated!
Hi everyone. I’m exploring deterministic causal-graph models and have two literature questions.
(1) Entropy: In a finite automaton an observer restricted to a coarse region sees Kolmogorov-complexity entropy rise each update. M. Gács (“The Algorithmic Second Law of Thermodynamics,” Entropy 25) and ’t Hooft discuss similar ideas. Are there modern proofs of a monotone second law that stay fully deterministic and avoid coarse-graining?
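To make the quantity concrete, here is the kind of crude numerical probe I have in mind (my own toy sketch, not taken from either reference; zlib output length is only an upper bound on Kolmogorov complexity, and rule 30, the lattice size and the observer window are arbitrary choices of mine):

```python
# Toy probe (my own sketch, not taken from Gacs or 't Hooft): evolve an
# elementary cellular automaton deterministically and record the zlib-compressed
# length of what an observer confined to a fixed window sees at each step.
# Compressed length is only an upper bound on Kolmogorov complexity, and
# rule 30, the lattice size and the window are arbitrary choices.
import zlib
import numpy as np

RULE, N, STEPS = 30, 16384, 200
WINDOW = slice(N // 2 - 4096, N // 2 + 4096)   # the observer's "coarse region"

rule_table = np.array([(RULE >> i) & 1 for i in range(8)], dtype=np.uint8)

def step(state):
    # neighbourhood index = 4*left + 2*centre + 1*right
    idx = 4 * np.roll(state, 1) + 2 * state + np.roll(state, -1)
    return rule_table[idx]

state = np.zeros(N, dtype=np.uint8)
state[N // 2] = 1                              # highly compressible initial condition

for t in range(STEPS):
    packed = np.packbits(state[WINDOW]).tobytes()
    if t % 20 == 0:
        # crude proxy for K(window state at time t)
        print(t, len(zlib.compress(packed, 9)))
    state = step(state)
```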
(2) Tsirelson bound: This is the part that really puzzles me. My local, synchronous update model outputs CHSH = 2√2 when averaged over its hidden states. The potential loophole is that the choice of measurement settings is correlated with the system's pre-existing local state, which would amount to violating the measurement-independence assumption in Bell's theorem. Are there known deterministic models that saturate the quantum bound specifically via this kind of loophole?
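For context, here is a stripped-down illustration of the logical structure of that loophole (not my actual causal-graph model, just the bookkeeping): if the hidden variable's distribution is allowed to depend on the settings, a table of predetermined outcomes reproduces the singlet correlator −cos(a−b) and hence CHSH = 2√2, while keeping the local marginals unbiased.

```python
# Stripped-down illustration of the measurement-independence loophole (not my
# actual causal-graph model): let the hidden variable lambda = (A, B) be a pair
# of predetermined outcomes whose distribution rho(lambda | a, b) depends on the
# settings. Each run is then fully deterministic given lambda, the local
# marginals stay unbiased, and the correlator equals -cos(a - b), so CHSH hits
# 2*sqrt(2). This shows only the logical structure, not physical plausibility.
import numpy as np

def rho(a, b):
    """Settings-dependent weights over the four deterministic outcome pairs."""
    p_same = (1 - np.cos(a - b)) / 2          # P(A * B = +1)
    p_diff = (1 + np.cos(a - b)) / 2          # P(A * B = -1)
    # Split each weight symmetrically so that <A> = <B> = 0 (no signalling).
    return {(+1, +1): p_same / 2, (-1, -1): p_same / 2,
            (+1, -1): p_diff / 2, (-1, +1): p_diff / 2}

def E(a, b):
    """Correlator <A B> under rho(lambda | a, b); equals -cos(a - b)."""
    return sum(w * A * B for (A, B), w in rho(a, b).items())

# Standard CHSH setting choices that saturate the Tsirelson bound.
a1, a2 = 0.0, np.pi / 2
b1, b2 = np.pi / 4, 3 * np.pi / 4
S = E(a1, b1) + E(a2, b1) + E(a2, b2) - E(a1, b2)
print(abs(S), 2 * np.sqrt(2))                  # both ~2.8284
```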
Any pointers or critique welcome, thanks!
 
About (2): there are superdeterministic hidden-variable models which claim to reproduce QM without violating locality. My own opinion is that such models are in fact non-local in disguise. Anyway, you can google the work of S. Hossenfelder on superdeterminism; it may be similar to your own ideas.
 
Thanks, I'm aware of her work and am definitely in the same superdeterministic camp. Has anyone already analysed algorithmic (Kolmogorov) entropy growth step-by-step inside these SD/CA frameworks?
 
About (1): since the system is deterministic, the state at any time is determined by the state at the initial time. Hence the same algorithm (namely, the same laws of evolution plus the same initial state) determines the state at any time. How can that be compatible with a rise of algorithmic complexity? Perhaps the initial algorithmic complexity may in fact be lower because the initial condition can be compressed (e.g. the initial condition "01010101010101010101010101010101010101010101" can be compressed in an obvious way), in which case I can imagine that the algorithmic complexity may rise for a while. But I don't see how such a rise could be monotone, because sooner or later I expect that the state at a certain time is so complex that it can no longer be compressed, after which the algorithmic complexity cannot rise any more. Am I missing something?
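To put that more precisely (standard Kolmogorov-complexity bookkeeping, additive constants suppressed): if the update rule U is computable and x_t = U^t(x_0), then

```latex
K(x_t) \;\le\; K(x_0) + K(t) + c_U \;\le\; K(x_0) + \log_2 t + O(\log\log t),
```

so for large t the instantaneous state can exceed the initial complexity by only about log t bits, not by a fixed amount per step.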
 
Yes, of course: for the deterministic system as a whole, the global state's complexity is static. The subtlety is in the distinction between the complexity of that global state and the complexity of the state history recorded locally within any finite region of the system. The idea I'm exploring specifically is whether the algorithmic complexity of any finite region's history must increase at every step, because the deterministic evolution rule continuously propagates new, incompressible information from the surroundings into that region.
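A rough way to see the two quantities side by side (same toy CA and zlib proxy as in my first post; again only a sketch, and compressed length is just an upper bound on K):

```python
# Same toy CA and zlib proxy as in my first post, now comparing, for one fixed
# finite region: (i) the compressed length of its instantaneous state, which is
# bounded by the region size, and (ii) the compressed length of its accumulated
# history, which can keep growing as information flows in from outside.
# Again, zlib length is only an upper bound on K and all parameters are arbitrary.
import zlib
import numpy as np

RULE, N, STEPS = 30, 16384, 2000
WINDOW = slice(N // 2 - 512, N // 2 + 512)     # 1024-cell region
rule_table = np.array([(RULE >> i) & 1 for i in range(8)], dtype=np.uint8)

def step(state):
    idx = 4 * np.roll(state, 1) + 2 * state + np.roll(state, -1)
    return rule_table[idx]

state = np.zeros(N, dtype=np.uint8)
state[N // 2] = 1                              # compressible initial condition
history = bytearray()

for t in range(STEPS):
    snapshot = np.packbits(state[WINDOW]).tobytes()
    history.extend(snapshot)
    if t % 250 == 0:
        print(t,
              len(zlib.compress(snapshot, 9)),          # ~K(region state at time t)
              len(zlib.compress(bytes(history), 9)))    # ~K(region history up to t)
    state = step(state)
```

In this toy run the compressed size of the instantaneous window state is capped by the window size, while the compressed size of the accumulated history keeps growing roughly in proportion to the number of steps.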
 
If you consider the instantaneous states at different times, there is no reason for complexity to grow forever. After a finite time one expects the system to approach an equilibrium, after which the complexity will stay more or less the same. After a very long time one expects something like Poincaré recurrence, so the complexity will even decrease.

But if you consider the complexity of the whole history, including all its states at earlier times, then yes, I think it should grow with time forever.
 
Manuel S Morales said:
The APS presentation reference
The Method of Everything manuscript reference
Neither of these is relevant to the topic of this thread.
 
