Bayesian Stat Mech reverses Entropy?

In summary, Cosma Shalizi has posted a brief paper on the arXiv discussing the implications of using Bayesian probability in statistical mechanics, specifically with regard to the arrow of time. He argues that Bayesian updating leads to a non-increasing entropy, even if the maximum entropy formalism is used. Shalizi invites comments on his paper and has provided a discussion on his blog. The paper has sparked a debate on the validity of using Bayesian philosophy in physics and on the connection between information entropy and thermodynamic entropy. Some argue that this reading is an overinterpretation of Bayes' Theorem, while others point out the complexities of applying probability theory to thermodynamic systems. Ultimately, the debate centers on the validity of the assumptions made in the derivation, above all the identification of information-theoretic entropy with thermodynamic entropy.
  • #1
selfAdjoint
Cosma Shalizi has a brief paper on the arXiv:

http://www.arxiv.org/PS_cache/cond-mat/pdf/0410/0410063.pdf

containing a proof that if you use Bayesian, degree-of-belief probability in formulating statistical mechanics à la Jaynes, entropy comes out non-increasing; the arrow of time points backwards! Even if you replace full Bayesian conditioning with the maximum entropy formalism, the result still holds. It looks like Shalizi himself is stunned by the result; he has noted on the paper "Comments unusually welcome".
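For readers who want to see the information-theoretic core of the result in a toy setting (my own illustration, not code from the paper): for any joint distribution over a "microstate" X and a "measurement" Y, the expected Shannon entropy of the Bayesian posterior over X after seeing Y is never larger than the prior entropy, H(X|Y) <= H(X). An observer who keeps conditioning can only watch this entropy go down or stay flat.

Code:
import numpy as np

rng = np.random.default_rng(0)

# Toy joint distribution p(x, y): x = "microstate", y = "measurement outcome".
n_states, n_outcomes = 8, 3
joint = rng.random((n_states, n_outcomes))
joint /= joint.sum()

def shannon_entropy(p):
    # Shannon entropy in nats, ignoring zero-probability entries.
    p = p[p > 0]
    return -np.sum(p * np.log(p))

prior = joint.sum(axis=1)      # p(x), the prior over microstates
p_y = joint.sum(axis=0)        # p(y), probability of each measurement outcome
posteriors = joint / p_y       # column j is p(x | y=j), the Bayesian posterior

H_prior = shannon_entropy(prior)
H_post = sum(p_y[j] * shannon_entropy(posteriors[:, j]) for j in range(n_outcomes))

print(f"H(X)      = {H_prior:.4f} nats")
print(f"E[H(X|Y)] = {H_post:.4f} nats")   # always <= H(X)

The thermodynamic punch line only follows once this statistical entropy is identified with the physical entropy, which is where the paper's third assumption comes in.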
 
  • #2
A fascinating paper. Although I am not overly familiar with statistical mechanics, the paper isn't that hard to follow. I suspect his suggestion that Assumption 3 is invalid is correct.

It reminds me of the old axiom we had in the control room of a nuclear power plant when something went wrong: "Whatever you just did, undo it."
 
  • #3
Yes, I agree. For those who can't read the paper, the three assumptions from which Shalizi derives the reversed arrow of time are:
1. The microcanonical probability function is time-symmetric.
2. Physical evolution proceeds according to Bayesian updating.
3. The information-theoretic entropy of the microcanonical probability distribution is equal to the thermodynamic entropy of the macroscopic system.
 
  • #4
You know, the more I think about it, the more obvious it seems to me that assumption 3 is flawed. The connection between the two entropies seems pretty tenuous at best.
 
  • #5
Shalizi has now provided a discussion in ordinary language of his result on his blog.
He repeats his preference for assumption 3 as the one to go, but it turns out his real target is the use of Bayesian philosophy in physics. Does water boil because I believe it will? Or because some ideal, well-informed observer (God?) does? Cutting the link between information entropy and thermodynamic entropy will advance his program to knock this stuff on the head.
 
  • #6
selfAdjoint said:
Shalizi has now provided a discussion in ordinary language of his result on his blog.
He repeats his preference for assumption 3 as the one to go, but it turns out his real target is the use of Bayesian philosophy in physics. Does water boil because I believe it will? Or because some ideal, well-informed observer (God?) does? Cutting the link between information entropy and thermodynamic entropy will advance his program to knock this stuff on the head.

At the risk of getting this thread moved to the philosophy section, I'll comment on Bayesian philosophy in physics. In my humble opinion, it is an overinterpretation of what Bayes' Theorem means. Bayes' Theorem allows you to determine the probability of causes: given an event and several possible causes, you can determine, using Bayes' Theorem, which is the most likely cause. That's it. To read more into it is a mistake.
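As a concrete toy example of the "probability of causes" reading (the alarm scenario and all the numbers are invented purely for illustration):

Code:
# Bayes' Theorem as a probability-of-causes calculation.
# Hypothetical setup: an alarm (the event) can be triggered by one of three
# causes; the priors and likelihoods below are made-up numbers.
priors = {"sensor fault": 0.70, "power surge": 0.25, "intrusion": 0.05}
likelihoods = {"sensor fault": 0.10, "power surge": 0.40, "intrusion": 0.95}

# P(event) = sum over causes of P(event | cause) * P(cause)
p_event = sum(priors[c] * likelihoods[c] for c in priors)

# P(cause | event) = P(event | cause) * P(cause) / P(event)
posteriors = {c: priors[c] * likelihoods[c] / p_event for c in priors}

for cause, p in sorted(posteriors.items(), key=lambda kv: -kv[1]):
    print(f"P({cause} | alarm) = {p:.3f}")

The output ranks the candidate causes by posterior probability, and that ranking is all the theorem delivers; whether the posterior also describes what the physical system itself is doing is the extra interpretive step being debated here.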

I see overinterpretation of results as a big problem in physics today, particularly in quantum mechanics.
 
  • #7
I agree, geometer. Reification of mathematical objects has e'er been the shortcoming of physicists.
 
  • #8
Frequentism is IMO too restrictive, however. One often wishes to apply probability theory to situations that occur only once. It's a lot simpler, IMO, to get rid of the idea of "ensembles" and just go with Bayes' rule as the necessary and sufficient condition for applying probability theory to a problem.

Thermodynamics is a lot trickier than it looks. There are other flaws with the idea that thermodynamics arises simply from probability theory, IMO, the main one being Poincaré recurrence.

Anyway, that's my $.02.
 
  • #9
I agree with pervect. In some respects, this treatment strikes me as a backdoor Maxwell's demon. Information is negentropy and increasing the amount of information collected increases negentropy - which superficially appears to reduce the total entropy of the system [i.e., reverses the arrow of time]. The information, however, is not free. Obtaining it imparts enough entropy to the system to offset the negentropy gains. This suggests the problem is not necessarily with the assumptions.
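As a back-of-the-envelope version of "the information is not free" (a standard Landauer/Szilard-style estimate, not a calculation from Shalizi's paper): acquiring and then erasing one bit of information costs at least k_B ln 2 of entropy, or k_B T ln 2 of dissipated heat at temperature T.

Code:
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K (illustrative choice)

# Minimum entropy cost of erasing one bit, and the corresponding heat at T.
entropy_per_bit = k_B * math.log(2)    # about 9.57e-24 J/K
heat_per_bit = T * entropy_per_bit     # about 2.87e-21 J

print(f"Entropy cost per bit: {entropy_per_bit:.3e} J/K")
print(f"Heat dissipated at {T:.0f} K: {heat_per_bit:.3e} J")

On that accounting, whatever negentropy the observer gains by measuring is paid for by entropy dumped into the environment, which is the usual way the demon gets exorcised.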
 

FAQ: Bayesian Stat Mech reverses Entropy?

1. How does Bayesian Stat Mech reverse entropy?

Bayesian Stat Mech, in the Jaynes tradition, treats the probability distribution over a system's microstates as a description of what the observer knows and updates that distribution by Bayesian conditioning as measurements come in. Shalizi's result is that, on average, conditioning can only shrink the Shannon entropy of this distribution, so if that entropy is identified with thermodynamic entropy the second law appears to run backwards. The thread reads this as a sign that one of the assumptions (most plausibly the identification of the two entropies) is flawed, not as a recipe for actually reversing physical entropy.

2. Is it possible to reverse entropy using Bayesian Stat Mech?

Not in any physical sense. What is non-increasing in Shalizi's derivation is the information-theoretic entropy of the observer's probability distribution, which narrows as observations accumulate. The apparent conflict with the second law arises only when that quantity is equated with thermodynamic entropy, and that equation is exactly the assumption most posters in the thread regard as the weak link.

3. How does Bayesian Stat Mech differ from traditional statistical mechanics?

Traditional, ensemble-based statistical mechanics interprets the probabilities assigned to microstates in terms of frequencies over an ensemble of similarly prepared systems. The Bayesian (Jaynesian) approach interprets them instead as degrees of belief constrained by the known macroscopic data: one assigns the distribution of maximum Shannon entropy consistent with those constraints and then updates it by Bayes' rule as new measurements arrive.
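A minimal sketch of the Jaynes maximum-entropy step mentioned above (the energy levels and target mean energy are arbitrary illustrative choices): maximizing Shannon entropy subject to a fixed mean energy gives Boltzmann weights p_i proportional to exp(-beta * E_i), with beta tuned so the constraint is satisfied.

Code:
import numpy as np

# Hypothetical discrete energy levels and a target mean energy (arbitrary units).
energies = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
target_mean_E = 1.2

def boltzmann(beta):
    # MaxEnt distribution under a mean-energy constraint: p_i ~ exp(-beta * E_i).
    w = np.exp(-beta * energies)
    return w / w.sum()

def mean_energy(beta):
    return float(np.dot(boltzmann(beta), energies))

# Mean energy decreases monotonically in beta, so solve the constraint by bisection.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_energy(mid) > target_mean_E:
        lo = mid
    else:
        hi = mid

beta = 0.5 * (lo + hi)
print(f"beta = {beta:.4f}")
print("p =", np.round(boltzmann(beta), 4), " mean energy =", round(mean_energy(beta), 4))

The distribution found this way is the prior; the Bayesian part enters when new measurements arrive and the prior is updated by conditioning, which is the step Shalizi's derivation puts under the microscope.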

4. Can Bayesian Stat Mech be applied to all systems?

In principle it can be applied to any system whose microstates and macroscopic constraints can be specified, from ideal gases to complex biological systems. The thread, however, is a reminder that the step from information-theoretic entropy to thermodynamic entropy is not automatic, and that issues such as Poincaré recurrence complicate a purely probabilistic reading of thermodynamics.

5. What are the practical applications of Bayesian Stat Mech in the real world?

Bayesian Stat Mech has many practical applications, such as predicting the behavior of complex systems in biology and chemistry, analyzing financial and economic data, and understanding the properties of materials at the molecular level. It can also be used in machine learning and artificial intelligence to make predictions and decisions based on probabilistic models.
