Understanding Entropy: Definition, Relationship to S=Q/T, and Impact of dS=dQ/T

  • Thread starter: BoyangQin
  • Tags: Entropy
AI Thread Summary
The discussion centers on the definition of entropy, specifically the concept of "states available" to a molecule, represented by Ω in the equation S = k_B ln(Ω). This refers to the number of distinct microstates that a system can occupy, which increases with changes like gas expansion or heat addition, leading to higher entropy. The relationship between this statistical definition and the thermodynamic definition S = Q/T is explored, noting that dS = dQ/T applies only to quasistatic processes. The conversation also touches on the difficulty of reconciling the logarithm in the statistical formula with the infinite number of points in a continuous distribution. Overall, the complexities of entropy calculations and interpretations in statistical mechanics are highlighted.
BoyangQin
A line about entropy on Wikipedia says:

"The entropy of a molecule is defined as S=K*ln(Ω) , where Ω is the number of states available to the molecule."

What does "states available" mean? And how is it related to the other definition of S, S = Q/T?

BTW,
Does the equation dS = dQ/T mean that if an infinitesimally small amount of heat dQ is passed into a system from the surroundings, then the entropy of the system MUST rise by dQ/T?
 
First, recall that dS = dQ/T is only valid for a quasistatic (reversible) process; in general the Clausius inequality dS ≥ dQ/T holds, with equality only for reversible heat exchange. Free expansion, where dQ = 0 but the entropy still increases, is the standard counterexample. That said, Wikipedia's page "Entropy (statistical thermodynamics)" gives a derivation in the box on the right side of the page, for a canonical system. It uses the first law to establish the result.

http://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)

Note that the formula used for that derivation,

S = -k_B \sum_i p_i \ln p_i

reduces to S = k_B \ln \Omega when the system is microcanonical, as then p_i = 1/\Omega.
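
To spell out that reduction (a one-line check, assuming all Ω microstates are equally likely): substituting p_i = 1/Ω into the sum gives

S = -k_B \sum_{i=1}^{\Omega} \frac{1}{\Omega} \ln\frac{1}{\Omega} = -k_B \, \Omega \cdot \frac{1}{\Omega} \, (-\ln\Omega) = k_B \ln\Omega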
 
As for your question "What does that 'states available' mean?" --

Textbooks give the example of the free expansion of a gas into a vacuum. They say that, due to the volume increase, there are now "more points in space" available for particles to occupy, therefore more possible states, all equally probable, and therefore the entropy has increased. Another example the textbooks give: after you add heat to a system, so that the peak of the distribution of particle speeds has moved up, there are more possible values of speed that molecules can have.
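
To put a rough number on the free-expansion example (a standard ideal-gas counting argument, assuming the number of accessible position states for N independent particles scales as V^N):

\Omega \propto V^N \quad\Rightarrow\quad \Delta S = k_B \ln\frac{\Omega_2}{\Omega_1} = N k_B \ln\frac{V_2}{V_1}

so doubling the volume gives \Delta S = N k_B \ln 2 = nR \ln 2, which matches the thermodynamic result for the free expansion of an ideal gas.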

I find it tough to reconcile those explanations with the mathematical form that has the logarithm in it. Any space has an infinite number of points, and any continuous distribution of speeds has an infinite number of possible values. Okay, one infinity can be larger than another infinity, and I don't have any trouble with that. What I have trouble with is visualizing the act of taking the logarithm of that infinity and multiplying it by the Boltzmann constant.
 
mikelepore said:
I find it tough to reconcile those explanations with the mathematical form that has the logarithm in it. Any space has an infinite number of points, and any continuous distribution of speeds has an infinite number of possible values. Okay, one infinity can be larger than another infinity, and I don't have any trouble with that. What I have trouble with is visualizing the act of taking the logarithm of that infinity and multiplying it by the Boltzmann constant.

There's a problem related to logs of probability densities. P(x)dx is a probability, and so it is unitless, which means P(x) itself has units, and it's weird to take the log of a quantity with units. One way to get around that is to say that only entropy differences make sense; then you get the log of a ratio P(x)/Q(x), and the units disappear. More generally, the mutual information is invariant under an invertible change of variables. Or you can insist that quantum mechanics is true and the states are really discrete, so the counting is finite.
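
Here is a small numerical sketch of that point (my own illustration, not from the thread, using the closed-form entropy and KL divergence of Gaussians): the differential entropy of a distribution shifts when you change the units of x, while the KL divergence between two distributions of the same variable does not.

import numpy as np

# Sketch: differential entropy depends on the units of x, but the
# KL divergence between two distributions of x does not.

def gaussian_entropy(sigma):
    # Differential entropy of N(0, sigma^2): (1/2) ln(2*pi*e*sigma^2)
    return 0.5 * np.log(2 * np.pi * np.e * sigma**2)

def gaussian_kl(mu1, sigma1, mu2, sigma2):
    # KL( N(mu1, sigma1^2) || N(mu2, sigma2^2) ), closed form
    return (np.log(sigma2 / sigma1)
            + (sigma1**2 + (mu1 - mu2)**2) / (2 * sigma2**2) - 0.5)

# The same physical width expressed in millimetres (1.0) and metres (0.001):
print(gaussian_entropy(1.0) - gaussian_entropy(0.001))   # ln(1000) ~ 6.91

# KL between the same two Gaussians, in millimetres and then in metres:
print(gaussian_kl(0.0, 1.0, 0.5, 2.0))          # ~ 0.349
print(gaussian_kl(0.0, 0.001, 0.0005, 0.002))   # ~ 0.349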
 