Equilibrium, entropy, probability - Release of constraints

  • #1
Hi everyone,

I have a fundamental question about the first part of Swendsen's Intro to StatMech and Thermodynamics (book).

Suppose we have two isolated systems of volumes ##V_1## and ##V_2##. We distribute ##N## ideal gas particles across the two systems with total energy ##E##.
Suppose we bring the systems into contact so that they can exchange particles.

Now, following Boltzmann's original train of thought to relate the probabilities of individual microstates to the thermodynamic quantity called entropy, we make the following assumptions:
1) The macroscopic state of an isolated system goes from less probable to more probable states.
2) The most probable state is called equilibrium, a state where all macroscopic system descriptions are stationary.
3) Everything we do not know is equally likely.

From 3), it follows that each position within the two volumes is equally likely to be occupied by the particles. From this, one can derive a probability density ##P(N_1|N)## to find ##N_1## of the ##N## particles in ##V_1##. It is a binomial distribution with well-defined maximum. The author defines the equilibrium value for ##N_1## to be the maximizer of ##P(N_1|N)##, i.e.
$$
N_{1}^{eq}=\operatorname{argmax}_{N_1} P(N_1|N).
$$

The author makes it very clear that the actual particle number ##N_1## keeps fluctuating in time. But close to the thermodynamic limit, the distribution ##P(N_1|N)## is sharply peaked, so that ##N_{1}^{eq}=\operatorname{argmax}_{N_1} P(N_1|N)=\langle N_1\rangle## with negligible variance. That means in equilibrium, a macroscopic measurement (= one that cannot resolve the small variance, i.e. the statistical fluctuations in ##N_1##) will always give the same result ##N_{1}^{eq}##.
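For what it's worth, this sharp-peak behavior is easy to check numerically. The sketch below (my own illustration, not from the book) treats ##N_1## as binomial with ##p = V_1/V##, finds the argmax of ##P(N_1|N)##, and shows that the relative fluctuation shrinks like ##1/\sqrt{N}##; the values ##p = 1/2## (equal volumes) and the particle numbers are assumed example parameters:

```python
from math import lgamma, log, sqrt

def logP(n1, n, p):
    """ln P(N_1 = n1 | N = n) for a binomial distribution,
    computed in log space to avoid floating-point underflow."""
    return (lgamma(n + 1) - lgamma(n1 + 1) - lgamma(n - n1 + 1)
            + n1 * log(p) + (n - n1) * log(1 - p))

p = 0.5  # equal volumes V_1 = V_2 (assumed example value)
for n in (100, 10_000):
    n1_eq = max(range(n + 1), key=lambda n1: logP(n1, n, p))  # argmax of P
    mean = n * p                    # <N_1>
    sigma = sqrt(n * p * (1 - p))   # standard deviation of N_1
    print(n, n1_eq, sigma / mean)   # relative fluctuation ~ 1/sqrt(n)
```

For the larger ##N## the argmax coincides with ##\langle N_1\rangle = Np## and the relative width drops by a factor of ten, which is the "macroscopic measurements always give ##N_1^{eq}##" statement in miniature.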

The author gives Boltzmann's (configurational) entropy definition for the subsystems:
$$
S(N_i,V_i) = \ln P(N_i|N), \quad \text{for } i\in \left\{1,2\right\}.
$$
The total configurational entropy can then be written as
$$
S(N,V) = S(N_1,V_1) + S(N_2,V_2).
$$

Then it follows that the equilibrium value ##N_{1}^{eq}## maximizes entropy ##S(N_1,V_1)##.
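Numerically, that maximization looks like the following hedged sketch (my own example, with assumed values ##N = 1000## and equal volumes so ##p = 1/2##): define ##S(N_1) = \ln P(N_1|N)## and check that any fixed ##N_1 \neq N_1^{eq}## gives strictly lower entropy.

```python
from math import lgamma, log

# Illustrative parameters (not from the book): N particles, p = V_1/V.
N, p = 1000, 0.5

def S(n1):
    """Configurational entropy S(N_1) = ln P(N_1 | N),
    evaluated in log space for numerical stability."""
    return (lgamma(N + 1) - lgamma(n1 + 1) - lgamma(N - n1 + 1)
            + n1 * log(p) + (N - n1) * log(1 - p))

n1_eq = max(range(N + 1), key=S)
print(n1_eq)                       # the entropy-maximizing particle number
print(S(n1_eq) > S(n1_eq - 100))   # a constrained N_1 has lower entropy
```

The same function ##S## evaluated away from the peak is what Question 1 below is asking about.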

I am unsure about the entropy of a constrained state where we force ##N_1 \neq N_{1}^{eq}## particles to be within ##V_1##. Assuming the connection to ##V_2## is closed (adiabatic barrier).
Question 1)
If I want the entropy of this state, would I just insert this number ##N_1## into ##S(N_1,V_1) = \ln P(N_1|N)## from before? Then I would have a state of low entropy, and based on the three assumptions above, upon release of the constraint the entropy would be maximized. But I don't see why I would be allowed to use the function ##P(N_1|N)##, as it was derived under the assumption that there is no barrier.
Question 2)
Since the constraint fixes the particle number and there is no macroscopic change anymore... does that not mean that the constrained state is also an equilibrium state? The difference to the unconstrained equilibrium state would then be that there are no more microscopic fluctuations...
The distribution ##P(N_1|N)## is a delta peak now. So what is the logarithm of a delta peak? You see how I am getting confused ^^"

Best,
SL
 

Answers and Replies

  • #2
Suppose we bring the systems into contact so that they can exchange particles.

I am unsure about the entropy of a constrained state where we force ##N_1 \neq N_{1}^{eq}## particles to be within ##V_1##. Assuming the connection to ##V_2## is closed (adiabatic barrier).
If there is an exchange of particles, there can be no adiabatic barrier. Could you explain this part of your question more clearly?

Question 1)
If I want the entropy of this state, would I just insert this number ##N_1## into ##S(N_1,V_1) = \ln P(N_1|N)## from before? Then I would have a state of low entropy, and based on the three assumptions above, upon release of the constraint the entropy would be maximized. But I don't see why I would be allowed to use the function ##P(N_1|N)##, as it was derived under the assumption that there is no barrier.
Note that the total number of particles is ##N = N_1 + N_2##, and you don't know how they were originally distributed between the two volumes ##V_1## and ##V_2##.
In the book, ##N_1## is the number of particles that remain in the volume ##V_1## when the gas is free to occupy the total volume ##V = V_1 + V_2##; this value is independent of the content of ##V_2##.
The contribution to the entropy from the particles in ##V_1## is ##S(N_1, V_1) = \ln P(N_1|N)##,
and this probability equals the probability of finding ##N - N_1## particles in the volume ##V_2##.
In the same way, the contribution from ##V_2## is ##S(N_2, V_2) = \ln P(N_2|N)##, which is the same as the probability that ##N - N_2## particles are in ##V_1##, regardless of what ##V_1## contains.

The total entropy is the sum ##S(N_1, V_1) + S(N_2, V_2) = S(N, V)##,
as it should be: the probability that one volume holds a certain number of particles equals the probability that the other volume holds the complementary number, so the entropies add, and one does "not" sum the two probabilities.
You are understanding the meaning of ##N## in the book differently from what you propose in the problem.
Question 2)
Since the constraint fixes the particle number and there is no macroscopic change anymore... does that not mean that the constrained state is also an equilibrium state? The difference to the unconstrained equilibrium state would then be that there are no more microscopic fluctuations...
The distribution ##P(N_1|N)## is a delta peak now. So what is the logarithm of a delta peak? You see how I am getting confused ^^"

I do not understand what you mean by the restriction: at the beginning the volumes are separated, and then they are joined into a single larger volume.
 
  • #3
Thanks for the response.

I just don't understand how the entropy of the constrained system is connected to the unconstrained system.

I have ##N## total particles and a total volume ##V## that is split between two subvolumes, ##V=V_1+V_2##, where ##N_1## particles are in ##V_1## and ##N_2## particles are in ##V_2##, with ##N=N_1 + N_2##.

Two cases:
1) There is no barrier, and the particles can distribute themselves freely. In this case, the probability ##P(N_1|N)## is maximized at the equilibrium value of ##N_1##. Thus, the function ##S(N_1) := \ln P(N_1|N)## is also maximized. Note that in deriving the function ##S(N_1)##, the author relied on the fact that there is no barrier.
That means in any other scenario, I would not be allowed to use the derived function ##S(N_1)##, because the probability density ##P## would look different.

2) There is a barrier, and the particles are stuck in either ##V_1## or ##V_2##. Now, if ##N_1## is not the equilibrium value, what is the entropy of this state? Do I still insert the value ##N_1## into the function ##S(N_1)## from before? That does not make sense in my eyes, as this function was derived under the assumption that particles can move freely... The probability ##P(N_1|N)## will have a different shape in this case, so the entropy function will look different as well.
 
