Bayesian Stat Mech reverses Entropy?

AI Thread Summary
Cosma Shalizi's paper presents a proof that using Bayesian probability in statistical mechanics leads to a non-increasing entropy, suggesting a reverse arrow of time. The discussion highlights skepticism about one of Shalizi's key assumptions linking information-theoretic entropy to thermodynamic entropy, which some participants believe is flawed. The conversation also critiques the application of Bayesian philosophy in physics, arguing it may overinterpret Bayes' Theorem and its implications. Additionally, the complexities of thermodynamics and the potential for misinterpretation in quantum mechanics are noted as significant issues. Overall, the thread emphasizes the need for careful consideration of assumptions in the intersection of probability theory and physics.
selfAdjoint
Cosma Shalizi has a brief paper in the arxiv:

http://www.arxiv.org/PS_cache/cond-mat/pdf/0410/0410063.pdf

containing a proof that if you use Bayesian degree of belief oriented probability in forming statistical mechanics a la Jaynes, entropy comes out non-increasing; the arrow of time points backwards! Even if you replace full Bayesian statistics with the maximum entropy formalism, the result still holds. It looks like Shalizi himself is stunned by the result; he has noted on the paper "Comments unusually welcome".
 
A fascinating paper. Although I am not overly familiar with statistical mechanics, the paper isn't that hard to follow. I suspect his suggestion that Assumption 3 is invalid is correct.

It reminds me of the old axiom we had in the control room of a nuclear power plant when something went wrong: "Whatever you just did, undo it."
 
Yes, I agree. For those who can't read the paper, the three assumptions from which Shalizi derives the reverse arrow of time are:
1. The microcanonical probability function is time-symmetric.
2. Physical evolution proceeds according to Bayesian statistics.
3. The information-theoretic entropy of the microcanonical probability distribution is equal to the thermodynamic entropy of the macrocanonical system.
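Shalizi's core observation, given those assumptions, can be illustrated numerically: Bayesian conditioning never increases the *expected* Shannon entropy of a distribution, so if assumption 3 equates that entropy with thermodynamic entropy, entropy cannot increase. A minimal sketch, with an invented prior and likelihoods (all numbers purely illustrative):

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical prior over three coarse-grained states (illustrative numbers).
prior = [0.5, 0.3, 0.2]
# Hypothetical likelihoods P(data | state) for two possible observations.
likelihood = {
    "A": [0.9, 0.5, 0.1],
    "B": [0.1, 0.5, 0.9],
}

h_prior = shannon_entropy(prior)

# Expected posterior entropy, averaged over observations weighted by
# P(data) = sum_i P(data | i) P(i), with posteriors from Bayes' theorem.
h_post_expected = 0.0
for data, lik in likelihood.items():
    p_data = sum(l * p for l, p in zip(lik, prior))
    posterior = [l * p / p_data for l, p in zip(lik, prior)]
    h_post_expected += p_data * shannon_entropy(posterior)

print(f"prior entropy:              {h_prior:.4f} bits")
print(f"expected posterior entropy: {h_post_expected:.4f} bits")
# Conditioning can only lower entropy on average; that is the crux of
# the "reverse arrow of time" if this entropy IS thermodynamic entropy.
assert h_post_expected <= h_prior
```

The inequality holds for any prior and any likelihood, not just these numbers; that generality is what makes the result so stark.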
 
You know, the more I think about it, the more obvious it seems to me that assumption 3 is flawed. The connection between the two entropies seems pretty tenuous at best.
 
Shalizi has now provided a discussion in ordinary language of his result on his blog.
He repeats the preference for statement 3 to go, but it turns out his real target is using Bayesian philosophy in physics. Does water boil because I believe it will? Or because some ideal well informed observer (God?) does? Cutting the link between information entropy and thermodynamic entropy will advance his program to knock this stuff on the head.
 
selfAdjoint said:
Shalizi has now provided a discussion in ordinary language of his result on his blog.
He repeats the preference for statement 3 to go, but it turns out his real target is using Bayesian philosophy in physics. Does water boil because I believe it will? Or because some ideal well informed observer (God?) does? Cutting the link between information entropy and thermodynamic entropy will advance his program to knock this stuff on the head.

At the risk of getting this thread moved to the philosophy section, I'll comment on Bayesian philosophy in physics. In my humble opinion, it is an overinterpretation of what Bayes' Theorem means. Bayes' Theorem allows you to determine the probability of causes. Given an event and several possible causes, you can determine, using Bayes' Theorem, which is the most likely cause. That's it. To read more into it is a mistake.
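That "probability of causes" reading can be made concrete. A minimal sketch with invented priors and likelihoods: given an observed event and several candidate causes, Bayes' Theorem ranks them.

```python
# Hypothetical example: which cause best explains an observed event?
# All priors and likelihoods below are invented for illustration.
priors = {"cause_1": 0.6, "cause_2": 0.3, "cause_3": 0.1}
likelihoods = {"cause_1": 0.05, "cause_2": 0.4, "cause_3": 0.9}  # P(event | cause)

# Bayes' Theorem: P(cause | event) = P(event | cause) * P(cause) / P(event)
evidence = sum(likelihoods[c] * priors[c] for c in priors)
posteriors = {c: likelihoods[c] * priors[c] / evidence for c in priors}

best = max(posteriors, key=posteriors.get)
for c, p in sorted(posteriors.items(), key=lambda kv: -kv[1]):
    print(f"P({c} | event) = {p:.3f}")
print(f"most likely cause: {best}")
```

Note that the winner here is cause_2, even though cause_3 has the highest likelihood: the prior matters. That is the full content of the theorem, with no metaphysics attached.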

I see overinterpretation of results as a big problem in physics today, particularly in quantum mechanics.
 
I agree, geometer. Reification of mathematical objects has e'er been the shortcoming of physicists.
 
Frequentism is IMO too restrictive, however. One often wishes to apply probability theory in situations that occur only once. It's a lot simpler IMO to get rid of the idea of "ensembles", and just go with Bayes rule as the necessary and sufficient condition to be able to apply probability theory to a problem.

Thermodynamics is a lot trickier than it looks. There are other flaws with the idea that thermodynamics arises simply from probability theory, IMO, the main one being Poincaré recurrence.
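The Poincaré-recurrence worry can be made vivid with a toy model (an entirely hypothetical dynamics, not from the paper): any invertible map on a finite "phase space" is a permutation, so every state eventually returns exactly to where it started, and nothing defined along the trajectory can increase forever.

```python
# Toy Poincaré recurrence: invertible dynamics on a finite phase space.
N = 12  # size of the toy phase space (illustrative)

def step(x):
    """Invertible, 'volume-preserving' dynamics: an affine permutation."""
    return (5 * x + 3) % N  # 5 is coprime to 12, so this map is a bijection

x0 = 1
x, t = step(x0), 1
while x != x0:  # must terminate: a permutation's orbits are finite cycles
    x = step(x)
    t += 1
print(f"state {x0} recurs after {t} steps")
```

For a realistic gas the recurrence time is astronomically long, which is why the recurrence objection is about principle rather than practice.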

Anyway, that's my $.02.
 
I agree with pervect. In some respects, this treatment strikes me as a backdoor Maxwell's demon. Information is negentropy and increasing the amount of information collected increases negentropy - which superficially appears to reduce the total entropy of the system [i.e., reverses the arrow of time]. The information, however, is not free. Obtaining it imparts enough entropy to the system to offset the negentropy gains. This suggests the problem is not necessarily with the assumptions.
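The "information is not free" point is usually quantified via Landauer's principle (not cited in this thread, but the standard reference for the Maxwell's-demon exorcism): erasing one bit of information dissipates at least k_B T ln 2 of heat. A quick back-of-envelope:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0           # room temperature, K (illustrative choice)

# Landauer bound: minimum heat dissipated to erase one bit of information.
cost_per_bit = k_B * T * math.log(2)
print(f"Landauer bound at {T:.0f} K: {cost_per_bit:.3e} J per bit")
```

The number is tiny per bit, but it is strictly positive, which is what blocks the demon: the entropy ledger balances once the cost of acquiring and erasing information is counted.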
 