PeterDonis said:
Yes, Jaynes explicitly says that Pauli did not prove that entropy must be extensive; he just assumed it and showed what the phenomenology would have to look like if it was.
Jaynes, in section 7 of his paper, shows how this factor arises, in appropriate cases (as he notes, entropy is not always perfectly extensive), in a proper analysis that includes the effects of changing ##N##. As he comments earlier in his paper (and as I have referenced him in previous posts), if your analysis only considers the case of constant ##N##, you can't say anything about how entropy varies as ##N## changes. To address the question of extensivity of entropy at all, you have to analyze processes where ##N## changes.
I believe my argument addresses the case where ##N## changes, since it deals with two systems that exchange particles.
Considering a simpler problem may help illustrate my point. There are two bags, one with ##V_1## pockets and the other with ##V_2## pockets. There are ##N## balls, each with a different number written on it, so that they are distinguishable. Someone knows, at every moment, which pocket each ball is in. You play a game in which, at each step, you choose one of the balls at random and move it to one of the pockets, also chosen at random. After a large number of steps of the game, what is the most probable number ##N_1## of balls in the bag with ##V_1## pockets?
I guess most people would agree that simple intuition suggests ##N_1=N V_1/(V_1+V_2)##. But how can that be shown mathematically?
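Before the counting argument, here is a quick numerical sanity check of that intuition: a minimal Monte Carlo sketch of the game exactly as described, in Python (the values of ##V_1##, ##V_2## and ##N## below are arbitrary illustrative choices).

```python
import random
from collections import Counter

# Arbitrary illustrative parameters
V1, V2 = 3, 7        # pockets in bag 1 and bag 2
N = 1000             # distinguishable balls
STEPS = 200_000      # moves in the game

# State: pockets[i] is the pocket occupied by ball i.
# Pockets 0..V1-1 belong to bag 1, pockets V1..V1+V2-1 to bag 2.
pockets = [random.randrange(V1 + V2) for _ in range(N)]
n1 = sum(1 for p in pockets if p < V1)   # current number of balls in bag 1

samples = []
for step in range(STEPS):
    ball = random.randrange(N)                      # pick a ball at random
    new_pocket = random.randrange(V1 + V2)          # pick a pocket at random
    n1 += (new_pocket < V1) - (pockets[ball] < V1)  # update the bag-1 count
    pockets[ball] = new_pocket
    if step > STEPS // 2:                           # sample the second half of the run
        samples.append(n1)

print("most frequent N1:", Counter(samples).most_common(1)[0][0])
print("N*V1/(V1+V2):   ", N * V1 / (V1 + V2))
```

With these numbers the most frequent value of ##N_1## should come out at, or within a ball or two of, ##N V_1/(V_1+V_2)=300##.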
There will be ##\Omega_1=V_1^{N_1}## ways to put ##N_1## balls in one bag and ##\Omega_2=V_2^{N_2}## ways to put ##N_2## balls in the other bag (each ball can go into any pocket, and a pocket can hold any number of balls).
If you are one of those who believe there is no logical reason to divide by the factorial, since the balls are distinguishable, I guess you believe that equilibrium maximizes ##\Omega_1(N_1)\times\Omega_2(N_2)##.
If you apply the logarithm and differentiate with respect to ##N_1##, keeping in mind that ##N_1+N_2=N##, you get ##\ln(V_1)=\ln(V_2)## as the equilibrium condition. Since the numbers ##V_1## and ##V_2## are arbitrary, that would suggest there is no equilibrium. One may believe this is a paradox. One may insist that the only possible explanation for the existence of an equilibrium is that the balls are in fact indistinguishable and that a permutation does not count as a different state. In fact, there is no paradox, only bad math.
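For reference, the differentiation just mentioned gives, with ##N_2=N-N_1##,
##\frac{\partial}{\partial N_1}\ln\left[\Omega_1(N_1)\,\Omega_2(N_2)\right]=\frac{\partial}{\partial N_1}\left[N_1\ln V_1+(N-N_1)\ln V_2\right]=\ln V_1-\ln V_2,##
which can only vanish if ##V_1=V_2##; that is the source of the apparent paradox.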
Since the balls are distinguishable, the number of states of the composite two-bag system with ##N_1## balls in the first bag and ##N_2## in the second is
##\Omega_1(N_1)\times\Omega_2(N_2)\times[N!/(N_1!\,N_2!)]##.
Note that the term ##[N!/(N_1!\,N_2!)]## is not included in order to obtain a consistent equilibrium. It is included to correctly count all possible states of the system. I am still waiting for those who claim that there is no logical reason to include this term to explain how else they would count states...
Including the necessary factor ##[N!/(N_1!\,N_2!)]##, applying the logarithm, and differentiating with respect to ##N_1##, you get ##\ln(V_1/N_1)=\ln(V_2/N_2)##. Of course, you have to remember that ##N_1+N_2=N##, and assume that both ##N_1## and ##N_2## are large enough to use Stirling's approximation.
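Written out with Stirling's approximation ##\ln n!\approx n\ln n-n##, the quantity being maximized is
##\ln\!\left[\Omega_1\Omega_2\,\frac{N!}{N_1!\,N_2!}\right]\approx N_1\ln V_1+N_2\ln V_2+N\ln N-N_1\ln N_1-N_2\ln N_2,##
and differentiating with respect to ##N_1## (with ##\partial N_2/\partial N_1=-1##) gives
##\ln\frac{V_1}{N_1}-\ln\frac{V_2}{N_2}=0\;\Rightarrow\;\frac{N_1}{V_1}=\frac{N_2}{V_2}\;\Rightarrow\;N_1=\frac{N V_1}{V_1+V_2},##
exactly the intuitive answer above.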
My conclusion: when one uses proper combinatorial logic, one sees that, for a system of distinguishable elements, equilibrium happens when
##\partial\ln[\Omega_1(N_1)/N_1!]/\partial N_1=\partial\ln[\Omega_2(N_2)/N_2!]/\partial N_2##.
Most people would agree that entropy is defined such that, at equilibrium,
##\partial S_1(N_1)/\partial N_1=\partial S_2(N_2)/\partial N_2##, and comparing this with the condition above leads to
##S_1=k \ln[\Omega_1(N_1)/N_1!]## and ##S_2=k \ln[\Omega_2(N_2)/N_2!]##.
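The same conclusion can be checked directly, without Stirling, by enumerating ##W(N_1)=\frac{N!}{N_1!\,N_2!}V_1^{N_1}V_2^{N_2}## exactly: the maximum sits at ##N_1=N V_1/(V_1+V_2)##, and the two derivatives above agree there. A minimal sketch in Python (again with arbitrary illustrative parameters):

```python
from math import comb, log

# Arbitrary illustrative parameters
V1, V2, N = 3, 7, 1000

# W(N1): number of states of the composite system with exactly N1 of the
# N distinguishable balls in the bag that has V1 pockets.
def W(N1):
    return comb(N, N1) * V1**N1 * V2**(N - N1)

N1_star = max(range(N + 1), key=W)
print("most probable N1:", N1_star)            # 300
print("N*V1/(V1+V2):   ", N * V1 / (V1 + V2))  # 300.0

# d/dn ln(V^n/n!) = ln(V) - ln(n) (Stirling); the two sides of the
# equilibrium condition, evaluated at the maximum:
print("bag 1:", log(V1) - log(N1_star))        # ln(3/300) = ln(0.01)
print("bag 2:", log(V2) - log(N - N1_star))    # ln(7/700) = ln(0.01)
```

For these parameters both printed derivatives equal ##\ln 0.01##, so the condition ##\partial S_1/\partial N_1=\partial S_2/\partial N_2## is met at the most probable split.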
Note also that this argument is consistent with ##S= k \ln(W)##, except that ##S## is the entropy of the system composed of the two bags, and ##W## is proportional to the number of accessible states of this composite system.
This point is stressed by Swendsen, and I quote:
"Although Boltzmann never addressed Gibbs’ Paradox directly, his approach to statistical mechanics provides a solid basis for its resolution. Boltzmann defined the entropy in terms of the probability of the macroscopic state of a composite system. Although the traditional definition of the entropy is often attributed to Boltzmann, this attribution is not correct. The equation on Boltzmann’s tombstone, ##S = k \log W##, which is sometimes called in evidence, was never written by Boltzmann and does not refer to the logarithm of a volume in phase space. The equation was first written down by Max Planck, who correctly attributed the ideas behind it to Boltzmann. Planck also stated explicitly that the symbol “W” stands for the German word “Wahrscheinlichkeit” (which means probability) and refers to the probability of a macroscopic state. The dependence of Boltzmann’s entropy on the number of particles requires the calculation of the probability of the number of distinguishable particles in the each subsystem of a composite system. The calculation of this probability requires the inclusion of the binomial coefficent, ##N!/(N_1!N_2!)##, where ##N_1## and ##N_2## are the numbers of particles in each subsystem and ##N = N_1 + N_2##. This binomial coefficient is the origin of the missing factor of ##1/N!## in the traditional definition, and leads to an expression for the entropy that is extensive."
From: Robert H. Swendsen, "Gibbs’ Paradox and the Definition of Entropy", Entropy 2008, 10, 15-18.