
Fundamental definition of entropy

  1. Mar 15, 2010 #1

    FeDeX_LaTeX

    Gold Member

    Hello;

    "[itex]S\propto\ln\Omega[/itex], where [itex]\Omega[/itex] is the number of microstates" is what a user told me was the fundamental definition of entropy. What is S? Is it the number of macrostates? And where does the ln come from? Can I see an example of where this formula would be used in practice?

    Thanks.
     
  3. Mar 15, 2010 #2

    SpectraCat

    Science Advisor

    Re: Entropy

    S is the entropy, with dimensions of energy per unit absolute temperature. The constant of proportionality is the Boltzmann constant, k (times a dimensionless constant that depends on the base of the logarithm).

    This expression can be useful when evaluating the relative or absolute entropies of systems with countable numbers of microstates (or where the number of microstates can be approximated to some degree of relative accuracy).

    It is also incredibly useful pedagogically, because it is the *definition* of entropy (for a simple system where all microstates have the same probability of being populated). When people (students) start talking about fuzzy-headed ideas of entropy as disorder or "heat randomness", you can show them that simple equation and explain simply that no, it is just a measure of the configurational freedom of the system.

    A similar relation holds for more complicated cases, where the microstates are not all equally probable:

    [tex]S=-k\sum_{i}P_i \ln P_i[/tex]

    This is the general expression for the entropy from statistical thermodynamics, and it is almost as easy to understand as the simpler one. More configurations (or microstates) --> higher entropy in general, although "accessible" configurations (i.e. those that have a significant probability of being populated at a given internal energy of the system) are more effective at raising the entropy.

    Simple, huh?
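
    Here is a minimal Python sketch of that idea (illustrative only; the function names are my own), checking numerically that the general formula reduces to k ln Ω when all Ω microstates are equally probable:

    [code]
    import math

    K_B = 1.380649e-23  # Boltzmann constant in J/K

    def boltzmann_entropy(omega):
        """S = k ln(Omega) for Omega equally probable microstates."""
        return K_B * math.log(omega)

    def gibbs_entropy(probabilities):
        """S = -k * sum_i p_i ln(p_i) for a general distribution over microstates."""
        return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)

    omega = 56
    uniform = [1.0 / omega] * omega      # all 56 microstates equally likely
    print(boltzmann_entropy(omega))      # k ln 56
    print(gibbs_entropy(uniform))        # same number, from the general formula

    skewed = [0.5, 0.25, 0.25]           # unequal probabilities -> lower entropy
    print(gibbs_entropy(skewed), "<", boltzmann_entropy(3))
    [/code]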
     
  4. Mar 15, 2010 #3
    Re: Entropy

    If you combine two systems, then the numbers of microstates multiply:
    [tex]\Omega_{1+2}=\Omega_1\Omega_2[/tex]
    The only reason there is a logarithm is that people prefer to add things instead of multiplying them:
    [tex]S_{1+2}=S_1+S_2[/tex]
    S is entropy and has a one-to-one relation to the number of microstates.

    Here is an example which is not practical in the real world, but good for explaining what microstates mean:
    Say you have three distinct urns with 5 identical marbles in total. There are
    [tex]\binom{u+m-1}{m}=\binom{3+5-1}{5}=\binom{7}{5}=21[/tex] ways
    to distribute these over the urns (like 005, 113 and so on). Therefore [itex]S_1=\ln 21[/itex]

    Now you have another three urns with 8 marbles in total (you do not know the exact distribution of marbles). The entropy is [itex]S_2=\ln\binom{3+8-1}{8}=\ln 45[/itex]

    If you consider these two urn sets next to each other but isolated from one another, there are [itex]21\cdot 45=945[/itex] possible realizations in total, so [itex]S_{1+2}=\ln 945[/itex]

    But if you now really combine the two systems into one so that marbles can interchange between them freely, then the entropy will be
    [tex]S_{\text{contact}}=\ln \binom{13+6-1}{13}=\ln 8568[/tex]
    since now you have 13 marbles distributed in 6 urns.

    As you can see, when you bring the systems into contact, the entropy increases compared to the entropy when the systems are considered together but kept in isolation from each other.

    The second law is plain probability theory.
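
    To make the counting concrete, here is a small Python sketch of the urn example (illustrative only, assuming Python 3.8+ for math.comb):

    [code]
    from math import comb, log

    def microstates(urns, marbles):
        """Stars and bars: ways to put identical marbles into distinct urns."""
        return comb(urns + marbles - 1, marbles)

    omega_1 = microstates(3, 5)          # 21 ways for 5 marbles in 3 urns
    omega_2 = microstates(3, 8)          # 45 ways for 8 marbles in 3 urns
    omega_contact = microstates(6, 13)   # 8568 ways once all 13 marbles move freely over 6 urns

    s_isolated = log(omega_1) + log(omega_2)   # ln(21 * 45) = ln 945
    s_contact = log(omega_contact)             # ln 8568

    print(omega_1, omega_2, omega_contact)
    print(s_isolated, "<", s_contact)          # entropy increases when the systems are brought into contact
    [/code]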
     
  5. Mar 15, 2010 #4
    Re: Entropy

    Aside from the good descriptions others have given, you can look at this yet another way. Entropy is a measure of information. The unit of information is the bit, and the number of bits needed to specify one of n equally likely possibilities is the logarithm of n in base 2. That is another way of seeing where the logarithm comes from.
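
    For instance, switching between the natural log and log base 2 only rescales the entropy by a constant factor, so the same quantity can be quoted in nats or in bits. A quick Python check of my own, reusing the microstate count from the urn example above:

    [code]
    from math import log, log2

    omega = 8568                  # microstate count from the combined urn example above
    print(log(omega))             # entropy in nats, ln(8568) is about 9.06
    print(log2(omega))            # entropy in bits, log2(8568) is about 13.06
    print(log(omega) / log(2))    # same as log2(omega): changing base is just a constant factor
    [/code]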
     