Graduate Reconciling Determinism, Entropy, and Quantum Statistics

  • Thread starter: ahmidi
Summary
The discussion revolves around deterministic causal-graph models, focusing on two main questions: the monotonicity of entropy in deterministic systems and the implications of the Tsirelson bound in relation to superdeterministic models. Participants explore whether modern proofs exist for a deterministic second law of thermodynamics without coarse-graining and discuss the potential for deterministic models to saturate quantum bounds while violating the Measurement Independence assumption in Bell's theorem. The conversation highlights the complexity of algorithmic growth in deterministic systems, emphasizing the distinction between global state complexity and local state-history complexity. It concludes with a consideration of how complexity may increase over time, particularly when analyzing the entire history of a system. The exploration of these concepts aims to reconcile determinism with quantum statistical behaviors.
ahmidi
TL;DR
Looking for papers on two things: (1) fully deterministic systems where Kolmogorov-complexity entropy provably rises every step, no coarse-graining, just algorithmic arguments (beyond Gács 2023 or ’t Hooft); (2) deterministic, signal-local models that hit the CHSH Tsirelson limit (2√2) by violating measurement independence rather than locality. Any leads appreciated!
Hi everyone. I’m exploring deterministic causal-graph models and have two literature questions.
(1) Entropy: In a finite automaton, an observer restricted to a coarse region sees Kolmogorov-complexity entropy rise with each update. M. Gács (“The Algorithmic Second Law of Thermodynamics,” Entropy 25) and ’t Hooft discuss similar ideas. Are there modern proofs of a monotone second law that stay fully deterministic and avoid coarse-graining?
(2) Tsirelson bound: This is the part that really puzzles me. My local, synchronous-update model outputs CHSH = 2√2 when averaged over its hidden states. The likely loophole is that the choice of measurement settings is correlated with the system's pre-existing local state information, which would amount to a violation of the Measurement Independence assumption in Bell's theorem. Are there known deterministic models that saturate the quantum bound specifically via this kind of loophole?
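For concreteness, here is a minimal toy sketch of the kind of loophole I mean (not my actual causal-graph model, just an illustration): outcomes are deterministic functions of the local setting and a hidden variable, but the hidden variable carries copies of both settings, so Measurement Independence fails and the singlet correlation E(a,b) = -cos(a-b) comes out.
Python:
import numpy as np

rng = np.random.default_rng(0)

# Toy illustration only.  The hidden variable lam = (a_copy, b_copy, u) carries
# copies of both measurement settings plus a uniform seed u.  Each outcome is a
# deterministic function of the local setting and lam, but the distribution of
# lam is correlated with the settings actually chosen -- i.e. Measurement
# Independence is violated, not locality.

def alice(a, lam):
    a_copy, b_copy, u = lam
    return 1 if u < 0.5 else -1                     # uses only the seed

def bob(b, lam):
    a_copy, b_copy, u = lam
    A = 1 if u < 0.5 else -1                        # Bob can recompute A from lam
    v = (2 * u) % 1.0                               # second deterministic "coin"
    p_same = 0.5 * (1.0 - np.cos(a_copy - b_copy))  # singlet: P(A = B)
    return A if v < p_same else -A

def E(a, b, n=100_000):
    # Average A*B over hidden variables *conditioned* on the chosen settings
    # (a_copy = a, b_copy = b); this conditioning is exactly the MI violation.
    total = 0
    for u in rng.random(n):
        lam = (a, b, u)
        total += alice(a, lam) * bob(b, lam)
    return total / n

# Standard CHSH angles; E(a, b) = -cos(a - b) gives |S| = 2*sqrt(2).
a0, a1 = 0.0, np.pi / 2
b0, b1 = np.pi / 4, -np.pi / 4
S = E(a0, b0) + E(a0, b1) + E(a1, b0) - E(a1, b1)
print(abs(S))   # ≈ 2.83
Of course, this just bakes the settings into the hidden variable by hand, which is exactly why I'm asking whether more principled deterministic models are known to achieve the same thing.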
Any pointers or critique welcome, thanks!
 
About (2): there are superdeterministic hidden-variable models that claim to reproduce QM without violating locality. My own opinion is that such models are in fact non-local in disguise. Anyway, you can look up the work of S. Hossenfelder on superdeterminism; it may be similar to your own ideas.
 
Thanks, I'm aware of her work and am definitely in the same superdeterministic camp. Has anyone already analysed algorithmic (Kolmogorov) entropy growth step-by-step inside these SD/CA frameworks?
 
About (1): since the system is deterministic, the state at any time is determined by the state at the initial time. Hence the same algorithm (namely, the same laws of evolution plus the same initial state, plus the time index, which costs only about log t extra bits to specify) determines the state at any time. How can that be compatible with a rise of algorithmic complexity? Perhaps the initial algorithmic complexity is low because the initial condition can be compressed (e.g. the initial condition "01010101010101010101010101010101010101010101" can be compressed in an obvious way), in which case I can imagine the algorithmic complexity rising for a while. But I don't see how such a rise could be monotone, because sooner or later I expect the state at some time to be so complex that it can no longer be compressed, after which the algorithmic complexity cannot rise any more. Am I missing something?
 
Yes, of course: for the deterministic system as a whole, the global state's complexity is essentially static. The subtlety is the distinction between the complexity of that global state and the complexity of the state-history recorded locally within any finite region of the system. The idea I'm exploring is specifically whether the algorithmic complexity of a finite region's history must increase at every step, because the deterministic evolution rule continuously propagates new, incompressible information from the surroundings into that region.
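As a crude numerical illustration of what I mean (my own toy example, with rule 30 standing in for the deterministic update rule and zlib-compressed length standing in for Kolmogorov complexity, of which it is only a computable upper bound):
Python:
import zlib
import numpy as np

# Deterministic toy system: elementary cellular automaton rule 30 on a ring.
# We record the history of a small fixed window of cells and track the
# zlib-compressed size of that history as a rough upper bound on its
# algorithmic complexity.

RULE = 30
WIDTH = 256                                    # global size (periodic boundary)
WINDOW = slice(WIDTH // 2 - 4, WIDTH // 2 + 4) # the finite region we "observe"
rule_table = np.array([(RULE >> i) & 1 for i in range(8)], dtype=np.uint8)

def step(state):
    left, right = np.roll(state, 1), np.roll(state, -1)
    return rule_table[4 * left + 2 * state + right]

state = np.zeros(WIDTH, dtype=np.uint8)
state[WIDTH // 2] = 1        # simple, highly compressible initial condition

history = bytearray()
for t in range(1, 2001):
    state = step(state)
    history.extend(state[WINDOW].tobytes())
    if t % 500 == 0:
        print(t, len(zlib.compress(bytes(history), 9)))
# The compressed size of the window's history keeps growing, even though the
# global state at any single time is fully determined by the initial data.
Obviously compressed length only upper-bounds K, so this is suggestive rather than a proof; that is why I'm hunting for rigorous versions of the argument.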
 
If you consider the instantaneous states at different times, there is no reason for the complexity to grow forever. After a finite time one expects the system to approach an equilibrium, after which the complexity will stay more or less the same. After a very long time one expects something like Poincaré recurrence, so the complexity will even decrease.

But if you consider the complexity of the whole history, including all its states at earlier times, then yes, I think it should grow with time forever.
 
Manuel S Morales said:
The APS presentation reference
The Method of Everything manuscript reference
Neither of these is relevant to the topic of this thread.
 
