Fundamental definition of entropy

AI Thread Summary
Entropy, denoted as S, is defined by the equation S ∝ ln(Ω), where Ω represents the number of microstates, with S having dimensions of energy per unit absolute temperature. The Boltzmann constant serves as the proportionality factor, and this formula is essential for understanding the configurational freedom of a system. When combining systems, the total number of microstates multiplies, leading to an increase in entropy, which aligns with the second law of thermodynamics. An example involving urns illustrates how the number of ways to distribute marbles translates to entropy calculations, demonstrating that combining systems increases overall entropy. Ultimately, entropy can also be viewed as a measure of information, where the logarithm reflects the number of bits needed to represent configurations.
FeDeX_LaTeX
Science Advisor
Hello;

"S\propto\ln\Omega, where \Omega is the number of microstates" is what a user told me was the fundamental definition of entropy. What is S? Is it the number of macrostates? And where does the ln come from? Can I see an example of where this formula would be used in practice?

Thanks.
 


FeDeX_LaTeX said:
Hello;

"S\propto\ln\Omega, where \Omega is the number of microstates" is what a user told me was the fundamental definition of entropy. What is S? Is it the number of macrostates? And where does the ln come from? Can I see an example of where this formula would be used in practice?

Thanks.

S is the entropy, with dimensions of energy per unit absolute temperature. The constant of proportionality is the Boltzmann constant, k (times a dimensionless constant).

This expression can be useful when evaluating the relative or absolute entropies of systems with countable numbers of microstates (or where the number of microstates can be approximated to some degree of relative accuracy).

It is also incredibly useful pedagogically, because it is the *definition* of entropy (for a simple system where all microstates have the same probability of being populated). When people (students) start talking about fuzzy-headed ideas of entropy as disorder or "heat randomness", you can show them that simple equation and explain simply that no, it is just a measure of the configurational freedom of the system.
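To make the "used in practice" part concrete, here is a minimal Python sketch (not from the thread) that evaluates S = k\ln\Omega for a toy system of N independent two-state spins, for which \Omega = 2^N; the spin system is just an assumed illustration:

Code:
import math

# Boltzmann constant in J/K (exact value in the current SI)
k_B = 1.380649e-23

def boltzmann_entropy(omega):
    """S = k_B * ln(Omega) for a system whose microstates are equally likely."""
    return k_B * math.log(omega)

# Toy example: N independent two-state spins have Omega = 2**N microstates,
# so the entropy should come out as N * k_B * ln 2.
N = 100
omega = 2 ** N
print(boltzmann_entropy(omega))   # ~9.57e-22 J/K
print(N * k_B * math.log(2))      # same value, computed directly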

A similar relation holds for more complicated cases:

S=-k\sum_{i}P_i \ln P_i

This is the general expression for the entropy from statistical thermodynamics, and it is almost as easy to understand as the simpler one. In general, more configurations (or microstates) mean higher entropy, although "accessible" configurations (i.e. those that have a significant probability of being populated at a given internal energy of the system) are the ones that effectively raise the entropy.
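As an illustration (not in the original reply), here is a short Python sketch of that general formula, with k set to 1 so the entropy comes out in natural units; it reduces to \ln\Omega when all \Omega microstates are equally probable:

Code:
import math

def gibbs_entropy(probs, k=1.0):
    """S = -k * sum(p_i * ln p_i); terms with p_i = 0 contribute nothing."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Equal probabilities p_i = 1/Omega reduce this to k * ln(Omega):
omega = 4
uniform = [1 / omega] * omega
print(gibbs_entropy(uniform))   # ln 4 ≈ 1.386 (with k = 1)
print(math.log(omega))          # same value

# A non-uniform distribution over the same four states has lower entropy:
print(gibbs_entropy([0.7, 0.1, 0.1, 0.1]))  # ≈ 0.940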

Simple, huh?
 


FeDeX_LaTeX said:
"S\propto\ln\Omega, where \Omega is the number of microstates" is what a user told me was the fundamental definition of entropy. What is S? Is it the number of macrostates? And where does the ln come from? Can I see an example of where this formula would be used in practice?
If you combine two systems, then the numbers of microstates multiply:
\Omega_{1+2}=\Omega_1\Omega_2
The only reason there is a logarithm is that people prefer to add things rather than multiply them:
S_{1+2}=S_1+S_2
S is entropy and has a one-to-one relation to the number of microstates.

Here is an example which is not real world practical, but good to explain what microstates mean:
Say you have three distinct urns with 5 identical marbles in total. There are
\binom{u+m-1}{m}=\binom{3+5-1}{5}=21 ways
to distribute these over the urns (like 005, 113 and so on). Therefore S_1=\ln 21

Now you have another three urns with 8 marbles in total (you do not know the exact distribution of marbles). The entropy is S_2=\ln 45

If you consider these two urn sets next to each other, they have 21\cdot 45=945 possible realizations in total, so S_{1+2}=\ln 945.

But if you now really combine the two systems into one so that marbles can interchange between them freely, then the entropy will be
S_{\text{contact}}=\ln \binom{13+6-1}{13}=\ln 8568
since now you have 13 marbles distributed in 6 urns.

As you see, when you bring the systems into contact the entropy increases compared to the entropy when the systems are considered together but in isolation.

The second law is plain probability theory.
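As a quick check (not part of the original reply), the urn counts and the entropy comparison above can be reproduced with Python's math.comb:

Code:
from math import comb, log

def stars_and_bars(marbles, urns):
    """Number of ways to distribute identical marbles over distinct urns."""
    return comb(marbles + urns - 1, marbles)

omega_1 = stars_and_bars(5, 3)        # 21
omega_2 = stars_and_bars(8, 3)        # 45
omega_contact = stars_and_bars(13, 6) # 8568

S_isolated = log(omega_1) + log(omega_2)  # = ln(21 * 45) = ln 945
S_contact = log(omega_contact)            # = ln 8568
print(omega_1, omega_2, omega_contact)
print(S_isolated, S_contact, S_contact > S_isolated)  # contact entropy is larger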
 


Aside from the good descriptions others have given, you can look at this yet another way. Entropy is a measure of information. The unit of information is the bit. The number of bits required to label n equally likely configurations is the logarithm of n in base 2.
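For instance (my own illustration, not from the thread), the combined urn system above with \Omega = 8568 realizations corresponds to about 13 bits:

Code:
import math

# Entropy of the combined urn system in bits (log base 2) rather than
# nats (natural log); the two differ only by a constant factor.
omega = 8568
print(math.log2(omega))                # ≈ 13.06 bits
print(math.log(omega) / math.log(2))   # same value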
 