- #1

SchroedingersLion

I have a fundamental question about the first part of Swendsen's Intro to StatMech and Thermodynamics (book).

Suppose we have two isolated systems of volumes ##V_1## and ##V_2##. We distribute ##N## ideal gas particles across the two systems with total energy ##E##.

Suppose we bring the systems into contact so that they can exchange particles.

Now, following Boltzmann's original train of thought to relate the probabilities of individual microstates to the thermodynamic quantity called entropy, we make the following assumptions:

1) The macroscopic state of an isolated system goes from less probable to more probable states.

2) The most probable state is called equilibrium, a state where all macroscopic system descriptions are stationary.

3) Everything we do not know is equally likely.

From 3), it follows that each position within the two volumes is equally likely to be occupied by a particle. From this, one can derive a probability distribution ##P(N_1|N)## for finding ##N_1## of the ##N## particles in ##V_1##. It is a binomial distribution with a well-defined maximum. The author defines the equilibrium value of ##N_1## as the maximizer of ##P(N_1|N)##, i.e.

$$

N_{1}^{eq}=\underset{N_1}{\text{argmax}}\,P(N_1|N).

$$
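For reference, here is my own sketch of the standard form of this distribution (Swendsen's notation may differ): each particle independently lands in ##V_1## with probability ##p = V_1/(V_1+V_2)##, so

$$
P(N_1|N) = \binom{N}{N_1}\, p^{N_1}\, (1-p)^{N-N_1}, \qquad p = \frac{V_1}{V_1+V_2}.
$$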

The author makes it very clear that the actual particle number ##N_1## keeps fluctuating in time. But close to the thermodynamic limit, the distribution ##P(N_1|N)## is sharply peaked, so that ##N_{1}^{eq}=\text{argmax}\,P(N_1|N)=\langle N_1\rangle## with negligible variance. That means that in equilibrium, a macroscopic measurement (one that cannot resolve the small variance, i.e. the statistical fluctuations in ##N_1##) will always give the same result ##N_{1}^{eq}##.
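To convince myself of the sharp peak, I ran a quick numeric sanity check (my own sketch, not from the book). It evaluates the binomial distribution in log-space (to avoid overflow at large ##N##, assuming equal volumes so ##p = 1/2##) and confirms that the maximizer coincides with the mean and that the relative fluctuation scales like ##1/\sqrt{N}##:

```python
import math

def log_P(k, N, p):
    """Log of the binomial probability P(k | N), computed via lgamma to avoid overflow."""
    return (math.lgamma(N + 1) - math.lgamma(k + 1) - math.lgamma(N - k + 1)
            + k * math.log(p) + (N - k) * math.log(1 - p))

N = 10_000   # total particle number
p = 0.5      # V_1 / (V_1 + V_2); equal volumes assumed here

logs = [log_P(k, N, p) for k in range(N + 1)]
k_max = max(range(N + 1), key=lambda k: logs[k])  # argmax of P(N_1 | N)

# Shift by the peak value before exponentiating, then compute moments.
weights = [math.exp(l - logs[k_max]) for l in logs]
Z = sum(weights)
mean = sum(k * w for k, w in zip(range(N + 1), weights)) / Z
var = sum((k - mean) ** 2 * w for k, w in zip(range(N + 1), weights)) / Z

print(k_max)                        # argmax = N*p = 5000
print(math.sqrt(var) / mean)        # relative fluctuation, roughly 0.01 ~ 1/sqrt(N)
```

So already for ##N = 10^4##, a measurement of ##N_1## deviates from ##N_{1}^{eq}## by only about one percent.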

The author gives Boltzmann's (configurational) entropy definition for the subsystems:

$$

S(N_i,V_i) = \ln\left(P(N_i|N)\right), \quad \text{for } i\in \left\{1,2\right\}.

$$

The total configurational entropy can then be written as

$$

S(N,V) = S(N_1,V_1) + S(N_2,V_2).

$$

Then it follows that the equilibrium value ##N_{1}^{eq}## maximizes entropy ##S(N_1,V_1)##.
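As a consistency check (my own sketch, not the book's derivation): applying Stirling's approximation ##\ln n! \approx n\ln n - n## to ##\ln P(N_1|N)## and setting the derivative with respect to ##N_1## to zero gives

$$
\frac{\partial}{\partial N_1}\ln P(N_1|N) = 0 \;\Longrightarrow\; \frac{N_1^{eq}}{V_1} = \frac{N - N_1^{eq}}{V_2},
$$

i.e. equal particle densities in the two subvolumes, which is what one expects for the equilibrium state.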

I am unsure about the entropy of a constrained state where we force ##N_1 \neq N_{1}^{eq}## particles to be within ##V_1##, assuming the connection to ##V_2## is closed (a wall impermeable to particles).

**Question 1)**

If I want the entropy of this state, would I just insert this number ##N_1## into ##S(N_1,V_1) = \ln\left(P(N_1|N)\right)## from before? Then I would have a state of low entropy, and, based on the three assumptions above, the entropy would be maximized upon release of the constraint. But I don't see why I would be allowed to use the function ##P(N_1|N)##, since it was derived under the assumption that there is no barrier.

**Question 2)**

Since the constraint fixes the particle number and there is no macroscopic change anymore, does that not mean that the constrained state is also an equilibrium state? The difference from the unconstrained equilibrium state would then be that there are no more microscopic fluctuations...

The distribution ##P(N_1|N)## is now a delta peak. So what is the logarithm of a delta peak? You see how I am getting confused ^^"

Best,

SL