Fundamental definition of entropy

  • #1
FeDeX_LaTeX
Gold Member

Main Question or Discussion Point

Hello;

"[itex]S\propto\ln\Omega[/itex], where [itex]\Omega[/itex] is the number of microstates" is what a user told me was the fundamental definition of entropy. What is S? Is it the number of macrostates? And where does the ln come from? Can I see an example of where this formula would be used in practice?

Thanks.
 

Answers and Replies

  • #2
SpectraCat
Science Advisor


S is the entropy, with dimensions of energy per unit absolute temperature. The constant of proportionality is the Boltzmann constant, k (times a dimensionless constant).

This expression can be useful when evaluating the relative or absolute entropies of systems with countable numbers of microstates (or where the number of microstates can be approximated to some degree of relative accuracy).

It is also incredibly useful pedagogically, because it is the *definition* of entropy (for a simple system where all microstates have the same probability of being populated). When people (students) start talking about fuzzy-headed ideas of entropy as disorder or "heat randomness", you can show them that simple equation and explain simply that no, it is just a measure of the configurational freedom of the system.
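To make that concrete, here is a minimal sketch (my addition, not from the original post) that evaluates [itex]S=k\ln\Omega[/itex] for N independent two-state spins, where every microstate is equally likely and [itex]\Omega=2^N[/itex]:

[code]
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(num_microstates):
    # S = k_B * ln(Omega), assuming all microstates are equally likely
    return k_B * math.log(num_microstates)

# N independent two-state spins have Omega = 2**N microstates, so
# S = k_B * N * ln(2): the entropy is extensive (it doubles when N doubles).
for N in (1, 10, 100):
    S = boltzmann_entropy(2 ** N)
    print(f"N = {N:3d}   S = {S:.3e} J/K")
[/code]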

A similar relation holds in the more general case:

[tex]S=-k\sum_{i}P_i \ln P_i[/tex]

This is the general expression for entropy from statistical thermodynamics, and it is almost as easy to understand as the simpler one: more configurations (or microstates) mean higher entropy, although "accessible" configurations (i.e. those that have a significant probability of being populated at a given internal energy of the system) are the ones that effectively raise the entropy.

Simple, huh?
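As an illustrative sketch (my addition; the two-level system and the temperatures are just assumptions for the example), here is the general formula applied to Boltzmann-weighted populations, showing how "accessible" states control the entropy:

[code]
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def gibbs_entropy(probabilities):
    # S = -k_B * sum(p_i * ln p_i); unpopulated (p = 0) states contribute nothing
    return -k_B * sum(p * math.log(p) for p in probabilities if p > 0)

def boltzmann_populations(energies, T):
    # p_i proportional to exp(-E_i / (k_B * T)), normalized by the partition sum
    weights = [math.exp(-E / (k_B * T)) for E in energies]
    Z = sum(weights)
    return [w / Z for w in weights]

# Two-level system with a 1e-21 J gap. At low T the upper state is barely
# accessible and S is near 0; at high T both states are populated and S
# approaches k_B * ln(2).
energies = [0.0, 1.0e-21]
for T in (10, 100, 1000):
    p = boltzmann_populations(energies, T)
    print(f"T = {T:5d} K   S/k_B = {gibbs_entropy(p) / k_B:.3f}")
[/code]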
 
  • #3


"[itex]S\propto\ln\Omega[/itex], where [itex]\Omega[/itex] is the number of microstates" is what a user told me was the fundamental definition of entropy. What is S? Is it the number of macrostates? And where does the ln come from? Can I see an example of where this formula would be used in practice?
If you combine two systems, the numbers of microstates multiply:
[tex]\Omega_{1+2}=\Omega_1\Omega_2[/tex]
The only reason there is a logarithm is that people prefer to add things rather than multiply them:
[tex]S_{1+2}=S_1+S_2[/tex]
S is entropy and has a one-to-one relation to the number of microstates.
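Spelled out with the proportionality constant k, the additivity follows in one line from the product rule above:

[tex]S_{1+2}=k\ln\Omega_{1+2}=k\ln(\Omega_1\Omega_2)=k\ln\Omega_1+k\ln\Omega_2=S_1+S_2[/tex]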

Here is an example which is not real world practical, but good to explain what microstates mean:
Say you have three distinct urns with 5 identical marbles in total. There are
[tex]\binom{u+m-1}{m}=\binom{3+5-1}{5}=21[/tex] ways
to distribute these over the urns (like 005, 113 and so on). Therefore [itex]S_1=\ln 21[/itex]

Now you have another three urns with 8 marbles in total (you do not know the exact distribution of marbles). The entropy is [itex]S_2=\ln\binom{3+8-1}{8}=\ln 45[/itex]

If you consider these two urn sets next to each other, there are [itex]21\cdot 45=945[/itex] possible realizations in total, so [itex]S_{1+2}=\ln 945[/itex].

But if you now really combine the two systems into one so that marbles can interchange between them freely, then the entropy will be
[tex]S_{\text{contact}}=\ln \binom{13+6-1}{13}=\ln 8568[/tex]
since now you have 13 marbles distributed in 6 urns.

As you see, when you bring the systems into contact, the entropy increases compared to the entropy of the two systems considered together but in isolation.
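A quick sketch to check these counts (my addition; math.comb is Python's standard-library binomial coefficient):

[code]
from math import comb, log

def microstates(urns, marbles):
    # Stars and bars: ways to distribute identical marbles over distinct urns
    return comb(urns + marbles - 1, marbles)

omega_1 = microstates(3, 5)         # 21
omega_2 = microstates(3, 8)         # 45
omega_contact = microstates(6, 13)  # 8568

print(omega_1, omega_2, omega_1 * omega_2, omega_contact)
# Entropy increases on contact: ln(8568) > ln(21 * 45) = ln(945)
print(log(omega_contact) > log(omega_1 * omega_2))  # True
[/code]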

The second law is plain probability theory.
 
  • #4


Aside from the good descriptions others have given, you can look at this in yet another way. Entropy is a measure of information. The unit of information is the bit, and the number of bits needed to single out one of n equally likely possibilities is the logarithm of n in base 2.
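A minimal sketch of that connection (my addition, reusing the urn count from the previous post): entropy in units of k is information measured in nats, and dividing by ln 2 converts it to bits:

[code]
import math

omega = 8568                # microstates of the combined urn system above
nats = math.log(omega)      # S / k, entropy in natural units
bits = math.log2(omega)     # the same information measured in bits
print(f"{nats:.3f} nats = {bits:.3f} bits")  # bits = nats / ln(2)
[/code]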
 
