Fundamental definition of entropy


Discussion Overview

The discussion revolves around the fundamental definition of entropy, specifically the relationship expressed as S ∝ ln(Ω), where Ω represents the number of microstates. Participants explore the meaning of S, the origin of the logarithmic relationship, and practical examples of applying this definition in various contexts.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant questions the meaning of S and whether it refers to macrostates, seeking clarification on the logarithmic relationship and practical applications.
  • Another participant explains that S is the entropy, with dimensions of energy per unit absolute temperature, and discusses the Boltzmann constant as the proportionality factor.
  • A different participant elaborates on the multiplication of microstates when combining systems, stating that the logarithm is used because addition is preferred over multiplication in this context.
  • This participant provides an example involving urns and marbles to illustrate how entropy changes when systems are combined, emphasizing the increase in entropy when systems are allowed to interact.
  • Another viewpoint presented is that entropy can also be viewed as a measure of information, linking the concept to bits and the logarithmic representation of information.

Areas of Agreement / Disagreement

Participants express various interpretations of entropy and its implications, with no clear consensus on a singular definition or application. Multiple competing views on the nature of entropy and its mathematical representation remain evident.

Contextual Notes

Some limitations include the dependence on definitions of microstates and macrostates, as well as the unresolved nature of how entropy is perceived in different contexts (e.g., information theory vs. thermodynamics).

FeDeX_LaTeX
Hello;

"[itex]S\propto\ln\Omega[/itex], where [itex]\Omega[/itex] is the number of microstates" is what a user told me was the fundamental definition of entropy. What is S? Is it the number of macrostates? And where does the ln come from? Can I see an example of where this formula would be used in practice?

Thanks.
 


FeDeX_LaTeX said:
Hello;

"[itex]S\propto\ln\Omega[/itex], where [itex]\Omega[/itex] is the number of microstates" is what a user told me was the fundamental definition of entropy. What is S? Is it the number of macrostates? And where does the ln come from? Can I see an example of where this formula would be used in practice?

Thanks.

S is the entropy, with dimensions of energy per unit absolute temperature. The constant of proportionality is the Boltzmann constant, k (using a logarithm base other than e only rescales it by a dimensionless constant).

This expression can be useful when evaluating the relative or absolute entropies of systems with countable numbers of microstates (or where the number of microstates can be approximated to some degree of relative accuracy).

It is also incredibly useful pedagogically, because it is the *definition* of entropy (for a simple system where all microstates have the same probability of being populated). When people (students) start talking about fuzzy-headed ideas of entropy as disorder or "heat randomness", you can show them that simple equation and explain simply that no, it is just a measure of the configurational freedom of the system.
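As a concrete illustration (my own, not from the thread), here is a sketch that evaluates S = k ln Ω for a toy system of N independent two-state units, where Ω = 2^N; taking N to be Avogadro's number is just an illustrative choice:

```python
import math

# Boltzmann's formula S = k * ln(Omega) for N independent two-state units,
# so Omega = 2**N and ln(Omega) = N * ln(2). (Taking N = Avogadro's number
# is illustrative: one mole of two-state units.)
k_B = 1.380649e-23        # Boltzmann constant, J/K (exact SI value)
N = 6.02214076e23         # Avogadro's number

S = k_B * N * math.log(2)  # equals R * ln(2)
print(S, "J/K")            # ~5.76 J/K
```

This is the well-known R ln 2 entropy per mole of a two-state system (e.g. the residual entropy of a frozen-in binary disorder).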

A similar relation holds for more complicated examples:

[tex]S=-k\sum_{i}P_i \ln P_i[/tex]

This is the general expression for the entropy from statistical thermodynamics, and it is almost as easy to understand as the simpler one. More configurations (or microstates) means higher entropy, although "accessible" configurations (i.e. those that have a significant probability of being populated at a given internal energy of the system) are more effective at raising the entropy.

Simple, huh?
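A minimal Python sketch of the general expression (with its conventional minus sign, S = -k Σ p_i ln p_i; the function name is my own), showing that it reduces to S = k ln Ω when all Ω microstates are equally likely:

```python
import math

def gibbs_entropy(probs, k=1.0):
    """S = -k * sum_i p_i ln p_i; states with p_i = 0 contribute nothing."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# With Omega equally likely microstates (p_i = 1/Omega) this reduces to
# the simpler formula S = k ln(Omega):
omega = 8
print(gibbs_entropy([1.0 / omega] * omega))  # ln(8) ~ 2.0794
print(math.log(omega))                       # ln(8) ~ 2.0794

# A sharply peaked distribution has fewer "accessible" states, hence lower entropy:
print(gibbs_entropy([0.9] + [0.1 / (omega - 1)] * (omega - 1)))
```

The last line illustrates the point about accessible configurations: eight states exist, but with one state taking 90% of the probability the entropy is far below ln 8.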
 


FeDeX_LaTeX said:
"[itex]S\propto\ln\Omega[/itex], where [itex]\Omega[/itex] is the number of microstates" is what a user told me was the fundamental definition of entropy. What is S? Is it the number of macrostates? And where does the ln come from? Can I see an example of where this formula would be used in practice?
If you combine two systems, then the numbers of microstates multiply:
[tex]\Omega_{1+2}=\Omega_1\Omega_2[/tex]
The logarithm is there simply because people prefer to add quantities rather than multiply them:
[tex]S_{1+2}=S_1+S_2[/tex]
S is entropy and has a one-to-one relation to the number of microstates.

Here is an example which is not real world practical, but good to explain what microstates mean:
Say you have three distinct urns with 5 identical marbles in total. There are
[tex]\binom{u+m-1}{m}=\binom{3+5-1}{5}=21[/tex] ways
to distribute these over the urns (like 005, 113 and so on). Therefore [itex]S_1=\ln 21[/itex]

Now you have another three urns with 8 marbles in total (you do not know the exact distribution of the marbles). The entropy is [itex]S_2=\ln\binom{3+8-1}{8}=\ln 45[/itex]

If you consider these two urn sets next to each other, they seem to have [itex]21\cdot 45=945[/itex] possible realizations in total, i.e. [itex]S_{1+2}=\ln 945[/itex]

But if you now really combine the two systems into one so that marbles can interchange between them freely, then the entropy will be
[tex]S_{\text{contact}}=\ln \binom{13+6-1}{13}=\ln 8568[/tex]
since now you have 13 marbles distributed in 6 urns.

As you see, bringing the systems into contact increases the entropy compared to the case where the systems sit next to each other but remain isolated from one another.

The second law is plain probability theory.
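The stars-and-bars counts in the urn example can be checked directly; this short sketch (the function name `microstates` is my own) uses Python's `math.comb`:

```python
from math import comb, log

def microstates(urns, marbles):
    """Stars and bars: ways to put identical marbles into distinct urns."""
    return comb(urns + marbles - 1, marbles)

s1 = log(microstates(3, 5))          # 3 urns, 5 marbles
s2 = log(microstates(3, 8))          # 3 urns, 8 marbles
s_isolated = s1 + s2                 # urn sets side by side, no exchange
s_contact = log(microstates(6, 13))  # marbles exchange freely: 6 urns, 13 marbles

print(microstates(3, 5), microstates(3, 8), microstates(6, 13))  # 21 45 8568
print(s_contact > s_isolated)  # True: contact raises the entropy
```

`math.comb` is available in Python 3.8+; the final comparison is the second-law statement of the example in one line.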
 


Aside from the good descriptions others have given, you can look at this yet another way. Entropy is a measure of information. The unit of information is the bit. The number of bits required to distinguish among n equally likely alternatives is the logarithm of n in base 2.
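A tiny sketch of that correspondence: labelling n equally likely microstates takes log2(n) bits, so S = k ln Ω is just the information content log2(Ω) rescaled by k ln 2.

```python
import math

# Bits needed to label n equally likely alternatives: log2(n).
for n in (2, 8, 1024):
    print(n, math.log2(n), "bits")  # 1.0, 3.0, 10.0 bits respectively

# Same quantity in "thermodynamic" units: ln(n) = log2(n) * ln(2)
n = 1024
print(math.isclose(math.log(n), math.log2(n) * math.log(2)))  # True
```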
 
