A line about entropy on Wikipedia says:

"The entropy of a molecule is defined as S = k ln(Ω), where Ω is the number of states available to the molecule."

What does that "states available" mean? And how does it relate to the other definition of S, S = Q/T?

BTW,
Does the equation dS = dQ/T mean that if an infinitesimally small amount of heat dQ is passed into a system from the surroundings, then the entropy of the system MUST rise by dQ/T?

Mute
Homework Helper
First, recall that dS = dQ/T is only valid for a quasistatic process. That said, Wikipedia's page "Entropy (statistical thermodynamics)" gives a derivation in the box on the right side of the page, for a canonical system. It uses the first law to establish the result.
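As a numeric illustration of why the quasistatic condition matters (hypothetical numbers, assuming water with a constant specific heat is heated quasistatically):

```python
import math

# Hypothetical example: heat 1 kg of water quasistatically
# from 300 K to 350 K, assuming constant specific heat c.
m = 1.0       # mass, kg
c = 4186.0    # specific heat of water, J/(kg K)
T1, T2 = 300.0, 350.0

# dQ = m c dT, so integrating dS = dQ/T from T1 to T2 gives:
delta_S = m * c * math.log(T2 / T1)   # J/K

# A naive Q/T with a single fixed temperature gives a different
# number, because T changes throughout the process; the integral
# result always lies between the two naive bounds.
Q = m * c * (T2 - T1)
print(delta_S)         # entropy change from the integral
print(Q / T2, Q / T1)  # naive bounds that bracket delta_S
```

The point is that dS = dQ/T applies to each infinitesimal step at the instantaneous temperature T, not to the finite heat Q divided by any single temperature.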

http://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)

Note that the formula used for that derivation,

$$S = -k_B\sum p_i \ln p_i$$

reduces to $S = k_B\ln \Omega$ when the system is microcanonical, as then $p_i = 1/\Omega$.
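This reduction is easy to check numerically. Here is a small sketch (in units where $k_B = 1$) showing that the Gibbs entropy of a uniform distribution over Ω states equals ln Ω, and that any non-uniform distribution over the same states gives less:

```python
import math

# Gibbs entropy S = -sum_i p_i ln p_i (units where k_B = 1).
def gibbs_entropy(probs):
    return -sum(p * math.log(p) for p in probs if p > 0)

Omega = 10
# Microcanonical case: all Omega states equally likely, p_i = 1/Omega.
uniform = [1.0 / Omega] * Omega
print(gibbs_entropy(uniform))   # ~ ln(10) ≈ 2.302585
print(math.log(Omega))

# A non-uniform distribution over the same states has lower entropy:
skewed = [0.5] + [0.5 / (Omega - 1)] * (Omega - 1)
print(gibbs_entropy(skewed) < math.log(Omega))  # True
```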

As for your question "What does that 'states available' mean?" --

Textbooks give the example of the free expansion of a gas into a vacuum. They say that, due to the volume increase, there are now "more points in space" available for particles to occupy, therefore there are more possible states, all equally probable, and therefore the entropy has increased. Another example the textbooks give: after you add heat to a system, the distribution of particle speeds shifts toward higher speeds, so there are more possible values of speed that the molecules can have.
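The free-expansion example can be made quantitative with the standard ideal-gas result ΔS = nR ln(V₂/V₁), which follows from S = k_B ln Ω because each of the N particles has proportionally more volume available, so Ω grows by a factor (V₂/V₁)^N. A sketch with hypothetical numbers:

```python
import math

# Free expansion of an ideal gas: volume doubles.
# Delta S = n R ln(V2/V1), from S = k_B ln(Omega) with
# Omega proportional to V^N and N k_B = n R.
R = 8.314          # gas constant, J/(mol K)
n = 1.0            # moles
V1, V2 = 1.0, 2.0  # only the ratio V2/V1 matters

delta_S = n * R * math.log(V2 / V1)
print(delta_S)     # ~ 5.76 J/K for one mole doubling its volume

# Per-particle view: ln(Omega2/Omega1) = ln(2**N) = N ln 2,
# and multiplying by k_B gives n R ln 2 — the same answer.
```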

I find it tough to reconcile those explanations with the mathematical form that has the logarithm in it. Any space has an infinite number of points, and any continuous distribution of speeds has an infinite number of possible values. Okay, one infinity can be larger than another infinity, and I don't have any trouble with that. What I have trouble with is visualizing the act of taking the logarithm of that infinity and multiplying it by the Boltzmann constant.

atyy