Understanding Entropy: Definition, Relationship to S=Q/T, and Impact of dS=dQ/T

  • Context: Graduate
  • Thread starter: BoyangQin
  • Tags: Entropy

Discussion Overview

The discussion revolves around the concept of entropy, specifically its definition, the relationship between different equations involving entropy (S=Q/T and S=K*ln(Ω)), and the implications of the equation dS=dQ/T. Participants explore the meaning of "states available" in the context of entropy and raise questions about the mathematical interpretations of these concepts.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant questions the meaning of "states available" in the context of the entropy definition S=K*ln(Ω) and its relation to S=Q/T.
  • Another participant notes that dS=dQ/T is valid only for quasistatic processes and references a Wikipedia page for a derivation related to canonical systems.
  • Examples from textbooks are provided to illustrate how the increase in volume during free expansion leads to more available states for gas particles, thus increasing entropy.
  • Concerns are raised about reconciling the mathematical form of entropy with the concept of infinite states, particularly regarding the logarithmic function and its application to continuous distributions.
  • A participant discusses the challenges of taking logarithms of probability densities and suggests that only entropy differences are meaningful, which could help resolve issues with units in the logarithmic expressions.
  • There is mention of the implications of quantum mechanics on the discreteness of states, which could affect the interpretation of entropy.

Areas of Agreement / Disagreement

Participants express various interpretations and concerns regarding the definitions and implications of entropy, indicating that multiple competing views remain without a clear consensus on the issues discussed.

Contextual Notes

Participants highlight limitations in understanding the mathematical treatment of entropy, particularly regarding the treatment of infinite states and the units involved in logarithmic calculations. The discussion remains open-ended with unresolved questions about the nature of entropy and its mathematical representation.

BoyangQin
A line about entropy on Wikipedia says:

"The entropy of a molecule is defined as S=K*ln(Ω) , where Ω is the number of states available to the molecule."

What does "states available" mean? And how does it relate to the other definition of entropy, S = Q/T?

BTW,
Does the equation dS = dQ/T mean that if an infinitesimal amount of heat dQ is passed into a system from the surroundings, then the entropy of the system MUST rise by exactly dQ/T?
 
First, recall that dS = dQ/T is only valid for a quasistatic process. That said, Wikipedia's page "Entropy (statistical thermodynamics)" gives a derivation in the box on the right side of the page, for a canonical system. It uses the first law to establish the result.

http://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)

Note that the formula used for that derivation,

S = -k_B \sum_i p_i \ln p_i

reduces to S = k_B \ln \Omega when the ensemble is microcanonical, since then p_i = 1/\Omega for each of the \Omega equally probable microstates.
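As a quick numerical check of that reduction (a sketch in Python, not from the thread): with all \Omega states equally probable, the Gibbs sum collapses to Boltzmann's formula.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    # S = -k_B * sum_i p_i ln p_i, skipping zero-probability states
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

omega = 1000                     # number of accessible microstates
uniform = [1.0 / omega] * omega  # microcanonical: all states equally likely
s_gibbs = gibbs_entropy(uniform)
s_boltzmann = k_B * math.log(omega)
print(s_gibbs, s_boltzmann)  # the two expressions agree
```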
 
As for your question "What does that 'states available' mean?" --

Textbooks give the example of the free expansion of a gas into a vacuum. They say that, because of the volume increase, there are now "more points in space" available for particles to occupy; there are therefore more possible states, all equally probable, so the entropy has increased. Another example the textbooks give: after you add heat to a system, the distribution of particle speeds shifts toward higher speeds, so there are more possible values of speed that molecules can have.

I find it tough to reconcile those explanations with the mathematical form that has the logarithm in it. Any space has an infinite number of points, and any continuous distribution of speeds has an infinite number of possible values. Okay, one infinity can be larger than another infinity, and I don't have any trouble with that. What I have trouble with is visualizing the act of taking the logarithm of that infinity and multiplying it by the Boltzmann constant.
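The free-expansion example can be made quantitative with the standard ideal-gas result ΔS = N k_B ln(V2/V1): when the volume doubles, each particle's count of accessible positions doubles, so Ω grows by a factor of 2^N. A minimal sketch (assuming one mole; illustrative, not from the thread):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

# Free expansion of one mole of ideal gas into double the volume.
# Omega grows by 2^N, so S = k_B ln(Omega) rises by N k_B ln 2.
N = N_A
delta_S = N * k_B * math.log(2.0)
print(delta_S)  # ~5.76 J/K, i.e. R ln 2
```

Note that the divergent "number of points in space" never appears: only the *ratio* of volumes (and hence a finite entropy difference) enters the formula.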
 
mikelepore said:
As for your question "What does that 'states available' mean?" --

Textbooks give the example of the free expansion of a gas into a vacuum. They say that, because of the volume increase, there are now "more points in space" available for particles to occupy; there are therefore more possible states, all equally probable, so the entropy has increased. Another example the textbooks give: after you add heat to a system, the distribution of particle speeds shifts toward higher speeds, so there are more possible values of speed that molecules can have.

I find it tough to reconcile those explanations with the mathematical form that has the logarithm in it. Any space has an infinite number of points, and any continuous distribution of speeds has an infinite number of possible values. Okay, one infinity can be larger than another infinity, and I don't have any trouble with that. What I have trouble with is visualizing the act of taking the logarithm of that infinity and multiplying it by the Boltzmann constant.

There's a problem with taking logs of probability densities. P(x)dx is a probability and is therefore unitless, so P(x) itself carries units, and taking the log of a quantity with units is awkward. One way to get round that is to say that only entropy differences make sense; then you get the log of the ratio P(x)/Q(x), and the units disappear. More generally, the mutual information is invariant under an invertible change of variables. Or you can insist that quantum mechanics is true and the states are really discrete.
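The entropy-differences point can be illustrated with the differential entropy of a uniform density: rescaling the measurement units shifts every entropy by the log of the conversion factor, but differences between entropies are unchanged. A small sketch (illustrative, not from the thread):

```python
import math

def diff_entropy_uniform(width):
    # Differential entropy of Uniform(0, width): p(x) = 1/width,
    # so h = -integral of p ln p dx = ln(width)
    return math.log(width)

# Two distributions with widths measured in metres...
h_narrow_m, h_wide_m = diff_entropy_uniform(1.0), diff_entropy_uniform(2.0)
# ...and the same two measured in millimetres (widths scaled by 1000):
h_narrow_mm, h_wide_mm = diff_entropy_uniform(1000.0), diff_entropy_uniform(2000.0)

# Each individual entropy shifts by ln(1000) under the unit change,
# but the entropy *difference* is unit-independent:
print(h_wide_m - h_narrow_m, h_wide_mm - h_narrow_mm)  # both equal ln 2
```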
 
