Meaning of formula from statistical physics


Discussion Overview

The discussion revolves around the interpretation of the entropy formula from statistical physics, specifically S = -k∑_r{p_r ln p_r}. Participants explore the meaning of the components of the formula, particularly the relationship between entropy and probabilities of microscopic states.

Discussion Character

  • Exploratory, Technical explanation, Conceptual clarification

Main Points Raised

  • One participant seeks clarification on the meaning of the entropy formula, acknowledging that S represents entropy and p represents probabilities.
  • Another participant explains that the formula indicates a relationship between macroscopic equilibrium states and microscopic configurations, suggesting that entropy increases with molecular randomness.
  • A third participant references a textbook definition of entropy, noting that the probabilities in the formula form a probability mass function and should therefore sum to one.

Areas of Agreement / Disagreement

Participants express varying levels of understanding and interpretation of the formula, with no consensus reached on a singular explanation. Some viewpoints align with textbook definitions, while others introduce nuances regarding the relationship between entropy and molecular states.

Contextual Notes

There is a lack of clarity regarding the assumptions behind the formula and how it applies to different scenarios. The discussion does not resolve the specific interpretations of the probabilities involved.

broegger
Hi.

Can anyone explain the meaning of this formula from statistical physics to me:

[tex]S = -k\sum_r{p_r\ln p_r}[/tex]​

Ok, I know that S is the entropy, the p's are probabilities of some sort - but somehow this is not satisfactory :-)
 
What this formula tells you is that for each state of macroscopic equilibrium there corresponds a large number of possible microscopic states or molecular configurations. The entropy, S, of a system is related to the total number of possible microscopic states of that system, called the thermodynamic probability p, by the Boltzmann relation:

[tex]S = k\ln p[/tex]

So from a microscopic point of view, the entropy of a system increases when the molecular randomness or uncertainty increases.
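That last point can be checked numerically. Here is a minimal sketch (my own illustration, not from the thread) that evaluates the Gibbs sum from the first post with Boltzmann's constant set to 1:

```python
import math

def gibbs_entropy(probs, k=1.0):
    """S = -k * sum_r p_r ln p_r.
    Terms with p_r == 0 contribute nothing, since p ln p -> 0 as p -> 0."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Eight equally likely microstates: maximal uncertainty for 8 states.
uniform = [1.0 / 8] * 8
print(gibbs_entropy(uniform))   # ln(8), about 2.079

# A distribution concentrated on one state: less randomness, less entropy.
biased = [0.7, 0.1, 0.1, 0.1, 0.0, 0.0, 0.0, 0.0]
print(gibbs_entropy(biased))    # smaller than ln(8)
```

The uniform distribution gives the largest entropy for a fixed number of states, matching the statement that entropy grows with molecular randomness or uncertainty.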

(stole most of that from my textbook but, eh, gets the job done :-p)
Edit: Your formula is slightly different from mine. Yours treats the p_r as a probability mass function, so the probabilities summed over all r should equal one.
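In fact the two forms agree in the special case where all microstates are equally likely. Setting [itex]p_r = 1/\Omega[/itex] for each of [itex]\Omega[/itex] microstates (a sketch of the equal-probability case, added for illustration):

[tex]S = -k\sum_{r=1}^{\Omega}\frac{1}{\Omega}\ln\frac{1}{\Omega} = k\ln\Omega[/tex]

so the sum formula reduces to the Boltzmann relation, with [itex]\Omega[/itex] playing the role of the thermodynamic probability.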
 
Ok, thanks guys!
 
