Ideal gas configurational entropy - Swendsen 4.6?

SUMMARY

The discussion centers on the interpretation of entropy in section 4.6 of Swendsen's "An Introduction to Statistical Mechanics and Thermodynamics." The participant expresses confusion regarding the conclusion that entropy is maximized when the probability distribution is maximized, specifically in the context of equations 4.18, 4.19, and 4.20. They agree with the derivation of these equations but question the reasoning that leads to the assertion that total entropy, S_tot, is maximized at equilibrium. The participant emphasizes that while S_tot can be approximated as the sum of entropies S_A and S_B, this does not imply that the defined entropy in equation 4.18 is maximized at equilibrium.


dslowik
I am teaching myself Stat Mech / thermo from Swendsen's "An Introduction to Statistical Mechanics and Thermodynamics". Up to this point I find the book interesting and clear to follow.
But in Section 4.6, Probability and Entropy, he confuses me by concluding that the entropy he defines there is maximized iff the corresponding probability distribution is maximized. This conclusion does not (seem to me to) follow from his line of reasoning. Namely:

I agree that eq. 4.18 follows from eq. 4.15 and definition 4.16, 4.17.
I agree that eq. 4.19 is the maximum (with respect to N_A) height of the binomial distribution, that it is negligible compared with the other terms in eq. 4.18, and therefore that eq. 4.20 follows to an excellent approximation. However, eq. 4.20 is a very good approximation to eq. 4.18 at any N_A; in fact it is worst at N_A equal to its equilibrium value, since eq. 4.19 gives the error in equating the total S to S_A + S_B, and that error is maximized at equilibrium. So I'm not following the conclusion that the entropy defined in eq. 4.18 is maximized when equilibrium is reached (i.e., when N_A is at its mean/peak of the binomial distribution). It seems the conclusion is merely that the total S_tot equals S_A + S_B (to a very good approximation, one that is worst at equilibrium), and that is all...

PS: I will add the eqs. when I master inserting LaTeX into a post...
 
I have re-written the reasoning contained in Swendsen Sec. 4.6 here to my liking. The source of my initial confusion is discussed in the last paragraph. Eq. numbers are from Swendsen:

Given a box of volume [itex]V[/itex] partitioned into two parts, A and B, such that [itex]V_A+V_B=V[/itex], and [itex]N=N_A+N_B[/itex] particles free to move within [itex]V[/itex]. Then the probability of having [itex]N_A[/itex] particles in the volume [itex]V_A[/itex] is given by the binomial distribution:

[itex]P(N_A,N_B) =\frac{N!}{N_A!N_B!}\Big(\frac{V_A}{V}\Big)^{N_A}\Big(\frac{V_B}{V}\Big)^{N_B}[/itex].
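As a quick numerical sketch of this distribution (the values N = 100 and V_A/V = 0.3 are hypothetical, chosen just for illustration), one can check that it is normalized and that it peaks at the expected occupation [itex]\langle N_A\rangle = N\,V_A/V[/itex]:

```python
# Numerical check of the binomial distribution P(N_A, N_B) above.
# Hypothetical example values: N = 100 particles, V_A/V = 0.3.
from math import comb

N = 100
p_A = 0.3          # V_A / V
p_B = 1.0 - p_A    # V_B / V

def P(N_A):
    """Probability of finding N_A of the N particles in V_A."""
    return comb(N, N_A) * p_A**N_A * p_B**(N - N_A)

# The distribution is normalized and peaks at <N_A> = N * V_A/V = 30.
total = sum(P(n) for n in range(N + 1))
peak = max(range(N + 1), key=P)
print(total)   # ~1.0
print(peak)    # 30
```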

Taking ln gives: eq. 4.15: [itex]\ \ \ \ln[P(N_A,N_B)] = \ln\Big[\frac{V_A^{N_A}}{N_A!}\Big] + \ln\Big[\frac{V_B^{N_B}}{N_B!}\Big] - \ln\big[\frac{V^{N}}{N!}\big][/itex]

Defining S: eq. 4.16: [itex]S(N,V) \equiv k\ln\Big(\frac{V^{N}}{N!}\Big) + kXN\ \ [/itex] ([itex]k[/itex] and [itex]X[/itex] are constants.)

These give: eq. 4.18: [itex]S_{tot}(N_A,V_A,N_B,V_B) \equiv S(N_A,V_A) + S(N_B,V_B) = k\ln[P(N_A,N_B)] + S(N,V)[/itex]
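Eq. 4.18 is an exact identity, not an approximation, and a short numerical check makes this concrete (a minimal sketch; the values of N_A, N_B, V_A, V_B are hypothetical, and I set k = 1, X = 0 since both are constants):

```python
# Numerical check of eq. 4.18: S(N_A,V_A) + S(N_B,V_B) = k*ln P + S(N,V).
# Hypothetical values; k = 1 and X = 0 for simplicity.
from math import lgamma, log

k, X = 1.0, 0.0
V_A, V_B = 3.0, 7.0
V = V_A + V_B
N_A, N_B = 25, 75
N = N_A + N_B

def S(N, V):
    # S(N,V) = k*ln(V^N / N!) + k*X*N, using lgamma(N+1) = ln(N!)
    return k * (N * log(V) - lgamma(N + 1)) + k * X * N

lnP = (lgamma(N + 1) - lgamma(N_A + 1) - lgamma(N_B + 1)
       + N_A * log(V_A / V) + N_B * log(V_B / V))

lhs = S(N_A, V_A) + S(N_B, V_B)
rhs = k * lnP + S(N, V)
print(abs(lhs - rhs))  # ~0: the identity is exact up to floating-point error
```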

Now the equilibrium value of [itex]N_A[/itex] is [itex]\langle N_A\rangle[/itex], which occurs at the maximum of the binomial distribution in the large [itex]N[/itex] limit[itex]{}^*[/itex], i.e., at the maximum of [itex]S_{tot}(N_A,V_A,N_B,V_B)[/itex] since [itex]S(N,V)[/itex] is constant. Thus we have found that equilibrium occurs when [itex]S_{tot}(N_A,V_A,N_B,V_B)[/itex] is maximized.

Further, [itex]S(N,V)[/itex] is, to a very good approximation (in the thermodynamic limit), an additive (extensive) quantity. This follows since the term [itex]\ln[P(N_A,N_B)][/itex] in eq. 4.18 is negligible in comparison to the other terms; even at its peak we have[itex]{}^{**}[/itex]:
eq. 4.19:[itex]\ \ \ \ln[P(N_A,N_B)]|_{equil}\approx -\frac{1}{2}\ln(2\pi\langle N_A\rangle(V_B/V))[/itex],
while [itex]S(N,V)[/itex] is of order [itex]N[/itex].
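A numerical sketch of eq. 4.19 (the values N = 10,000 and V_A/V = 0.4 are hypothetical): the exact peak value of [itex]\ln P[/itex] matches the Gaussian estimate and is only [itex]O(\ln N)[/itex], tiny next to the [itex]O(N)[/itex] entropy terms:

```python
# Check of eq. 4.19: at the peak, ln P ~ -(1/2)*ln(2*pi*<N_A>*(V_B/V)),
# which is O(ln N) and hence negligible next to S(N,V) ~ O(N).
from math import lgamma, log, pi

N = 10_000
p_A = 0.4            # V_A / V (hypothetical value)
p_B = 1.0 - p_A
mean_NA = N * p_A    # <N_A> = 4000

N_A = round(mean_NA)
lnP_peak = (lgamma(N + 1) - lgamma(N_A + 1) - lgamma(N - N_A + 1)
            + N_A * log(p_A) + (N - N_A) * log(p_B))
gauss = -0.5 * log(2 * pi * mean_NA * p_B)

print(lnP_peak, gauss)   # both ~ -4.8, i.e. O(ln N), while S(N,V) is O(N)
```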

However
[itex]S_{tot}(N_A,V_A,N_B,V_B) = S(N_A,V_A) + S(N_B,V_B)\ {\color{red}=}\ S(N,V)[/itex]
is a dangerous equation to write, in the sense that [itex]S(N,V)[/itex] is exactly constant, while the variation of [itex]S_{tot}(N_A,V_A,N_B,V_B) = S(N_A,V_A) + S(N_B,V_B)[/itex] contains all the "entropy is maximized in the most probable configuration" physics. Thus we may prefer: [itex]S_{tot}(N_A,V_A,N_B,V_B) = S(N_A,V_A) + S(N_B,V_B) \approx S(N,V)[/itex].

[itex]{}^*[/itex] Swendsen shows this follows from Stirling's approximation in Sec. 3.12.
[itex]{}^{**}[/itex] This follows from a Gaussian approximation to the binomial distribution for large N as shown in Sec. 3.9.
 
... [itex]S_{tot}(N_A,V_A,N_B,V_B)[/itex] being exactly additive,
[itex]S(N,V)[/itex] being approximately so,
and the relatively small difference
[itex]S_{tot}(N_A,V_A,N_B,V_B) - S(N,V) = k\ln[P(N_A,N_B)][/itex]
being responsible for the result that (the additive) entropy is maximized at equilibrium.
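Since [itex]S(N,V)[/itex] is constant, maximizing [itex]S_{tot}[/itex] over [itex]N_A[/itex] is the same as maximizing [itex]k\ln[P(N_A,N_B)][/itex]; a minimal sketch (N = 1000 and V_A/V = 0.25 are hypothetical values) confirms the maximum sits at [itex]\langle N_A\rangle[/itex]:

```python
# Sketch: S_tot(N_A) - S(N,V) = k*ln P(N_A) is maximized at N_A = <N_A>,
# so the most probable (equilibrium) N_A maximizes the total entropy.
from math import lgamma, log

N = 1000
p_A = 0.25          # V_A / V (hypothetical value)

def lnP(N_A):
    return (lgamma(N + 1) - lgamma(N_A + 1) - lgamma(N - N_A + 1)
            + N_A * log(p_A) + (N - N_A) * log(1 - p_A))

best = max(range(N + 1), key=lnP)
print(best)          # 250 == N * V_A/V = <N_A>
```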
 
