I am teaching myself statistical mechanics and thermodynamics from Swendsen's "An Introduction to Statistical Mechanics and Thermodynamics". So far I find the book interesting and clear to follow, but in Section 4.6, "Probability and Entropy", he confuses me by concluding that the entropy he defines there is maximized iff the corresponding probability distribution is maximized. This conclusion does not (seem to me to) follow from his line of reasoning. Namely:

I agree that eq. 4.18 follows from eq. 4.15 and definitions 4.16 and 4.17. I also agree that eq. 4.19 is the maximum (with respect to N_A) height of the binomial distribution, that it is negligible compared with the other terms in eq. 4.18, and that eq. 4.20 therefore follows to an excellent approximation. However, eq. 4.20 is a very good approximation to eq. 4.18 at *any* N_A; in fact it is worst when N_A equals its equilibrium value: eq. 4.19 gives the error made in equating the total S to S_A + S_B, and that error is maximized at equilibrium.

So I do not follow how one can conclude that the entropy defined in eq. 4.18 is maximized when equilibrium is reached (i.e., when N_A is at the mean/peak of the binomial distribution). It seems the conclusion is merely that S_tot equals S_A + S_B to a very good approximation (one that is worst at equilibrium), and that is all.

PS: I will add the equations once I master inserting LaTeX into a post.
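In the meantime, here is a small numeric sketch of the point I am making (the function names and the choice N = 10^6 are my own, not from the book). It compares the exact log-multiplicity ln C(N, N_A) with the additive "S_A + S_B"-style Stirling approximation N·H(N_A/N): the discrepancy is tiny compared to the entropies themselves, but it is *largest* exactly at the equilibrium value N_A = N/2, which is what puzzles me about the argument.

```python
from math import lgamma, log

def ln_binom(N, k):
    # exact ln of the binomial coefficient C(N, k), via log-gamma
    return lgamma(N + 1) - lgamma(k + 1) - lgamma(N - k + 1)

def stirling_sum(N, k):
    # additive approximation: N * H(k/N) in nats,
    # i.e. the "S_A + S_B" form with the correction term dropped
    x = k / N
    return -N * (x * log(x) + (1 - x) * log(1 - x))

N = 10**6
for k in (N // 10, N // 4, N // 2):
    exact = ln_binom(N, k)
    approx = stirling_sum(N, k)
    print(f"N_A = {k:>7d}: exact = {exact:12.2f}, "
          f"approx = {approx:12.2f}, error = {approx - exact:6.3f}")
```

Running this shows errors of order ln N (a few nats against entropies of order 10^5), growing as N_A moves toward N/2, consistent with the error term (roughly (1/2) ln(2πN x(1-x))) being maximal at equilibrium.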