Gibbs paradox: an urban legend in statistical physics

Summary
The discussion centers on the assertion that the Gibbs paradox in classical statistical physics is a misconception, with claims that no real paradox exists regarding the mixing of distinguishable particles. Participants reference key works by Van Kampen and Jaynes, arguing that Gibbs's original conclusions about indistinguishability were flawed and that the N! term in entropy calculations is logically necessary rather than paradoxical. The conversation highlights the distinction between classical and quantum mechanics, emphasizing that classical mechanics treats particles as distinguishable, which should yield measurable mixing entropy. Some contributors challenge the notion that the paradox reveals inconsistencies in classical models, suggesting that it is an urban legend rather than a fundamental issue. The debate underscores the ongoing confusion and differing interpretations surrounding the implications of the Gibbs paradox in both classical and quantum contexts.
  • #91
autoUFC said:
I really feel I am being trolled as people keep disagreeing with me, but without addressing my arguments.

I see no indication that anyone in this thread is trolling.

autoUFC said:
You say there is no classical reason to count this way. What other way to count exists?

A way that takes into account what happens if ##N## changes. If you only consider processes in which ##N## is constant, which is all you have considered in your posts, you cannot say anything about the extensivity of entropy. To even address that question at all, you need to consider processes in which ##N## changes. That is the key point Jaynes makes in the passage from his paper that I quoted in my previous post.

autoUFC said:
One does not find in Jaynes the term ##[N! /( N_1! N_2!)]## that I see as the key to the problem.

Jaynes in Section 7 of his paper is discussing a general treatment of extensivity (how entropy varies with ##N##), not the particular case of two types of distinguishable particles that you are considering. His general analysis applies to your case and for that case it will give the term you are interested in.
 
  • Informative
Likes hutchphd
  • #92
PeterDonis said:
If I am understanding Jaynes' argument correctly, he is arguing that you can justify the ##1 / N!## factor in classical physics using a phenomenological argument. At the bottom of p. 2 of his paper, he says:
He then references Pauli (1973) for a correct phenomenological analysis of extensivity, which would then apply to both the classical and quantum cases. He discusses this analysis in more detail in Section 7 of his paper.
I checked Pauli vol. III. There is only the treatment of ideal-gas mixtures within phenomenological thermodynamics, and there you start from the correct extensive entropy. After the usual derivation of the entropy change when letting two different gases diffuse into each other,
$$\bar{S}-S=R [n_1 \ln(V/V_1)+n_2 \ln(V/V_2)] > 0,$$
he states:

"The increase in entropy, ##\bar{S}-S##, is independent of the nature of the two gases. They must simply be different. If both gases are the same, then the change in entropy is zero; that is
$$\bar{S}-S=0.$$
We see, therefore, that there is no continuous transition between two gases. The increase in entropy is always finite, even if the gases are only infinitesimally different. However, if the two gases are the same, then the change in entropy is zero. Therefore, it is not allowed to let the difference between two gases gradually vanish. (This is important in quantum theory.)"

The issue with the counting in statistical mechanics is not discussed, but I think the statement is very clear that there is mixing entropy for different gases and (almost trivially) none for identical gases. I still don't see where in all this there is a justification in classical statistics for the factor ##1/N!## in the counting of "complexions" other than the phenomenological input that the entropy must be extensive. From a microscopic point of view there is no justification other than the indistinguishability due to quantum theory. I think you need quantum theory to justify the factor ##1/N!## in the counting of complexions, and that you also need quantum theory to get a well-defined entropy, because you need ##h=2 \pi \hbar## as the phase-space-volume measure to count phase-space cells. Then you also don't get the horrible expressions for the classical entropy with dimensionful arguments of logarithms. That's all I'm saying.
 
  • #93
vanhees71 said:
I checked Pauli vol. III. There is only the treatment of ideal-gas mixtures within phenomenological thermodynamics, and there you start from the correct extensive entropy.

Yes, Jaynes explicitly says that Pauli did not prove that entropy must be extensive, he just assumed it and showed what the phenomenology would have to look like if it was true.

vanhees71 said:
there is no continuous transition between two gases. The increase in entropy is always finite, even if the gases are only infinitesimally different

What does "infinitesimally different" mean? There is no continuous parameter that takes one gas into the other. You just have two discrete cases: one type of gas particle, or two types of gas particle mixed together.

Jaynes' treatment seems to me to correctly address this as well. Let me try to summarize his argument:

(1) In the case where we have just one type of gas particle, removing the barrier between the two halves of the container does not change the macrostate at all. We still have one type of gas particle, spread through the entire container. So macroscopically, the process of removing the barrier is easily reversible: just reinsert the barrier. It is true that particles that were confined to the left half of the container before are now allowed to be in the right half, and vice versa, and reinserting the barrier does not put all of the particles that were confined to each half back where they were originally; the precise set of particles that are confined to each half will be different after the barrier is reinserted, as compared to before it was removed. But since all of the particles are of the same type, we have no way of distinguishing the state before the barrier was removed from the state after the barrier was reinserted, so there is no change in entropy.

(2) In the case where we have two types of gas particle, removing the barrier does change the macrostate; now we have to allow for particles of both types being in both halves of the container, instead of each type being confined to just one half. This is reflected in the fact that the process of removing the barrier is now not reversible: we can't just reinsert the barrier and get back the original macrostate. To get back the original macrostate, we would have to pick out all the particles that were in the "wrong" half of the container and move them back to where they were before the barrier was removed. The mixing entropy ##N k \ln 2## is a measure of how much information is required to perform that operation, which will require some external source of energy and will end up increasing the entropy of that external source by at least that much (for example by forcing heat to flow from a hot reservoir to a cold reservoir and decreasing the temperature difference between them).

There is no continuum between these two alternatives; they are discrete. Alternative 2 obtains if there is any way of distinguishing the two types of gas particle available to us. It doesn't depend on any notion of "how different" they are.
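
To put a rough number on point (2) above (a toy check of my own, assuming one mole of gas split half-and-half; none of these numbers come from Jaynes's paper): the mixing entropy ##N k \ln 2## corresponds to exactly one bit of "which half was it originally in?" information per particle.

```python
import math

k = 1.380649e-23   # Boltzmann constant in J/K
N = 6.022e23       # assumed: one mole of gas particles, half in each side initially

# Mixing entropy for an equal split, S_mix = N k ln 2
S_mix = N * k * math.log(2)

# Information needed to restore the original macrostate:
# one bit ("was it originally left or right?") per particle.
bits = S_mix / (k * math.log(2))

print(f"S_mix = {S_mix:.3f} J/K")                 # about 5.76 J/K for one mole
print(f"information equivalent = {bits:.3e} bits (= N)")
```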

vanhees71 said:
I still don't see, where in all this is a justification in classical statistics for the factor ##1 / N!## in the counting of "complexions"

Jaynes, in section 7 of his paper, shows how this factor arises, in appropriate cases (as he notes, entropy is not always perfectly extensive), in a proper analysis that includes the effects of changing ##N##. As he comments earlier in his paper (and as I have referenced him in previous posts), if your analysis only considers the case of constant ##N##, you can't say anything about how entropy varies as ##N## changes. To address the question of extensivity of entropy at all, you have to analyze processes where ##N## changes.

vanhees71 said:
you need ##h = 2 \pi \hbar## as the phase-space-volume measure to count phase-space cells. Then you also don't get the horrible expressions for the classical entropy with dimensionful arguments of logarithms.

Jaynes addresses this as well: he notes that, in the phenomenological analysis, a factor arises which has dimensions of action, but there is no explanation for where it comes from. I agree you need quantum mechanics to explain where this factor comes from.
 
  • Like
Likes dextercioby, hutchphd and vanhees71
  • #94
PeterDonis said:
What does "infinitesimally different" mean? There is no continuous parameter that takes one gas into the other. You just have two discrete cases: one type of gas particle, or two types of gas particle mixed together.
Don't ask me. That's what Pauli wrote. Of course, there's no continuous parameter, but that's also not explainable within classical mechanics. There's no consistent classical model of matter. It's not by chance that quantum theory was discovered because of the inconsistencies of classical statistical physics. It all started with Planck's black-body radiation law: classical statistics of the electromagnetic field leads to the Rayleigh-Jeans catastrophe, so there was no other way out than Planck's "act of desperation". Other examples are the specific heats at low temperature, the impossibility of deriving Nernst's 3rd law from classical statistics, and, last but not least, the Gibbs paradox discussed here.
PeterDonis said:
Jaynes' treatment seems to me to correctly address this as well. Let me try to summarize his argument:
Sure, I fully agree with Jaynes's treatment, and of course there's no paradox if you use the information-theoretical approach and the indistinguishability of identical particles from quantum theory.

I just don't think that there is any classical justification for the crucial factor ##1/N!##, because classical mechanics tells you that you can follow any individual particle's trajectory in phase space. That this is practically impossible is not an argument for this factor, because you have to count different microstates compatible with the macrostate under consideration.

It's anyway a useless discussion, because today we know the solution to all these quibbles: it's QT!
 
  • Like
Likes hutchphd
  • #95
vanhees71 said:
classical mechanics tells you that you can follow any individual particle's trajectory in phase space. That this is practically impossible is not an argument for this factor, because you have to count different microstates compatible with the macrostate under consideration.

This would imply that in classical mechanics there can never be any such thing as a gas with only one type of particle, or even a gas with just two types: a gas containing ##N## particles would have to have ##N## types of particles, since each individual particle must be its own type: each individual particle is in principle distinguishable from every other one, and microstates must be counted accordingly.

If this is the case, then there can't be any Gibbs paradox in classical mechanics because you can't even formulate the scenario on which it is based.
 
  • Like
Likes dextercioby, hutchphd and vanhees71
  • #96
I have a question related to this topic. Since the canonical ensemble corresponds to an open system, and the micro-canonical ensemble corresponds to a closed system, is it true that the idea of density matrix is the appropriate way to describe canonical ensembles and not micro-canonical ensembles?
 
  • #97
PeterDonis said:
If this is the case, then there can't be any Gibbs paradox in classical mechanics because you can't even formulate the scenario on which it is based.
Not sure about that.
If entropy is only a function of macroscopic variables then you don't even take into account which particle is which. So it shouldn't matter whether they are distinguishable or not.
Even for distinguishable particles you would conclude that mixing two identical gases will not increase entropy, wouldn't you?
 
  • Like
Likes vanhees71
  • #98
PeterDonis said:
This would imply that in classical mechanics there can never be any such thing as a gas with only one type of particle, or even a gas with just two types: a gas containing ##N## particles would have to have ##N## types of particles, since each individual particle must be its own type: each individual particle is in principle distinguishable from every other one, and microstates must be counted accordingly.

If this is the case, then there can't be any Gibbs paradox in classical mechanics because you can't even formulate the scenario on which it is based.
Well, yes. That's an extreme formulation, but I'd say it's correct. It shows only once more that classical mechanics is not the correct description of matter, because it contradicts the observations. You need quantum theory!
 
  • #99
love_42 said:
I have a question related to this topic. Since the canonical ensemble corresponds to an open system, and the micro-canonical ensemble corresponds to a closed system, is it true that the idea of density matrix is the appropriate way to describe canonical ensembles and not micro-canonical ensembles?
All quantum states are described by statistical operators, a.k.a. density matrices.

The difference between the standard ensembles is under which constraints you maximize the Shannon-Jaynes-von Neumann entropy to find the equilibrium statistical operator:

Microcanonical: You have a closed system and know only the values of the conserved quantities (the 10 from spacetime symmetry: energy, momentum, angular momentum, center-of-mass velocity, and maybe some conserved charges like baryon number, strangeness, electric charge).

Canonical: The considered system is part of a larger system but you can exchange (heat) energy between the systems. The energy of the subsystem is known only on average.

Grand-Canonical: The considered system is part of a larger system but you can exchange (heat) energy and particles between the systems. The energy and conserved charges of the subsystem are known only on average.
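
A minimal numerical sketch of the canonical case (the toy energy levels and target average energy below are made-up numbers, not anything from this thread): maximizing the Shannon entropy under a fixed average energy gives ##p_i \propto e^{-\beta E_i}##, with the Lagrange multiplier ##\beta## fixed by the constraint, here found by bisection.

```python
import math

# Toy spectrum: a few energy levels in arbitrary units (made-up numbers).
E = [0.0, 1.0, 2.0, 3.0, 4.0]
U_target = 1.2   # prescribed average energy (the canonical constraint)

def avg_energy(beta):
    """Average energy of the maximum-entropy distribution p_i ~ exp(-beta*E_i)."""
    w = [math.exp(-beta * e) for e in E]
    Z = sum(w)
    return sum(e * wi for e, wi in zip(E, w)) / Z

# <E>(beta) decreases monotonically with beta, so bisection finds the multiplier.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if avg_energy(mid) > U_target:
        lo = mid    # average energy still too high -> increase beta
    else:
        hi = mid
beta = 0.5 * (lo + hi)

Z = sum(math.exp(-beta * e) for e in E)
p = [math.exp(-beta * e) / Z for e in E]
S_over_k = -sum(pi * math.log(pi) for pi in p)   # Shannon entropy in units of k
print(f"beta = {beta:.4f}, <E> = {avg_energy(beta):.4f}, S/k = {S_over_k:.4f}")
```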
 
  • Like
Likes dextercioby
  • #100
Philip Koeck said:
Not sure about that.
If entropy is only a function of macroscopic variables then you don't even take into account which particle is which. So it shouldn't matter whether they are distinguishable or not.
Even for distinguishable particles you would conclude that mixing two identical gases will not increase entropy, wouldn't you?
True, but this is the phenomenological approach, and you don't use statistical physics to derive the phenomenological thermodynamic quantities to begin with, so no problem with counting occurs. That's the strength of phenomenological thermodynamics: you take a few simple axioms based on observations and describe very many phenomena very well. That's why the physicists of the 18th and 19th centuries came so far with thermodynamics that they thought there were only a "few clouds" on the horizon, and that these would be overcome with more accurate observations and better axioms.

Together with the statistical approach a la Boltzmann, Maxwell, Gibbs, and later Planck and Einstein, one had to learn that what was really needed was a "revolutionary act of desperation" and quantum theory had to be developed.
 
  • Like
Likes Philip Koeck
  • #101
Philip Koeck said:
If entropy is only a function of macroscopic variables then you don't even take into account which particle is which. So it shouldn't matter whether they are distinguishable or not.

This is basically the point of Jaynes' phenomenological discussion in his paper. At the phenomenological level, it is of course correct.

However, in the post I was responding to, @vanhees71 was not talking about the phenomenological level. He was talking about a statistical treatment that looks at and counts microstates. At that level, no, entropy is not only a function of macroscopic variables; you have to know the microstates and their distribution.
 
  • Like
Likes vanhees71 and Philip Koeck
  • #102
Of course you have to know the microstates but not necessarily the distribution.

The entropy is a functional of the distribution, and this view makes it possible to use entropy, in its information-theoretical meaning, to deduce the probabilities (or the probability distribution for continuous observables) from the available information via the maximum-entropy principle. It yields the distribution of "least prejudice": the one that maximizes the entropy, i.e., the "missing information", under the constraint of being compatible with your knowledge about the state.

For the special case of equilibrium the only knowledge is about the conserved quantities of the system, and the corresponding constraints can be described also in different ways leading to the microcanonical, canonical, and grand canonical ensembles, which are all phenomenologically equivalent (only!) in the "thermodynamic limit", i.e., for really macroscopic systems with very many particles.
 
  • Like
Likes Philip Koeck
  • #103
I'm trying to understand the different levels of macro and micro that seem to be involved here.

Just checking whether people agree with my thinking.

I'll take the microcanonical derivation of BE or FD statistics as an example.
In my text on ResearchGate (mentioned earlier) I use W for the number of different ways a particular distribution of particles among energy levels can be realized.
In equilibrium this number W is maximized under the constraint of constant energy and possibly constant number of particles.
I believe this should mean that for a given system (with given energy levels and single particle states on each level) the distribution among energy levels with the largest W depends only on the total energy and number of particles.
So at equilibrium entropy is actually completely defined by macroscopic variables.
(Here I assume that there is a unique relation between S and W, such as ##S = k \ln W##.)

On the other hand there are many other distributions that give the same total energy.
They all have a smaller W and they correspond to non-equilibrium states of the system.
Is statistical entropy also defined as ##k \ln W## for these distributions?

Then there is a level that's even more "micro": I could also count how many particles are in each single particle state belonging to each energy level, not just how many there are on each level.
Is this ever used in a definition of entropy?

(Cautiously put in brackets: For distinguishable particles I could even ask which particle is in which single particle state.)
 
  • #104
I think the logical argument is rather to start with counting for a general non-equilibrium state and then derive the equilibrium case from the maximum-entropy principle under the constraints of constant energy and particle number.

For an ideal gas of fermions the counting is like this:

To be able to count you first have to put the particles in a finite box, but to have proper momentum operators you should impose periodic rather than rigid boundary conditions. Then the number of single-particle states in a phase-space cell turns out to be (counting the number of momentum eigenstates in a momentum volume ##\mathrm{d}^3 p## around ##\vec{p}_j##)
$$G_j=g \frac{\mathrm{d}^3 x \mathrm{d}^3p}{(2 \pi \hbar)^3},$$
where ##g=(2s+1)## is the degeneracy due to spin.

Now consider the number of possibilities to put ##N_j## particles in phase-space cell ##j##. For fermions ##N_j \leq G_j## since you can put only one particle into each one-particle state. Thus the number of possibilities is the same as drawing ##N_j## balls out of an urn containing ##G_j## labelled balls without repetition, i.e.,
$$\Gamma_j=\binom{G_j}{N_j}=\frac{G_j!}{N_j!(G_j-N_j)!}.$$
The entropy is given by (using Stirling's approximation for the factorials)
$$S=k \sum_j \ln \Gamma_j \simeq k \sum_j [G_j \ln G_j -N_j \ln N_j-(G_j-N_j)\ln(G_j-N_j)].$$
Defining the occupation per state
$$f_j=\frac{N_j}{G_j}, \quad \text{i.e.} \quad N_j =G_j f_j= \frac{g f_j\,\mathrm{d}^3 x\, \mathrm{d}^3 p}{(2 \pi \hbar)^3},$$
you get
$$S=-k \sum_j G_j [f_j \ln f_j+(1-f_j) \ln(1-f_j)] = -k \int_{\mathbb{R}^6} \frac{\mathrm{d}^3 x \mathrm{d}^3 p}{(2 \pi \hbar)^3} g [f \ln f+(1-f) \ln(1-f)].$$
To find the equilibrium distribution it's simplest to use the grand-canonical ensemble and maximize the entropy under the constraints of given average energy and particle number, using ##E(\vec{p})=\vec{p}^2/(2m)## (or ##E(\vec{p})=c \sqrt{m^2 c^2+\vec{p}^2}## for relativistic gases):
$$U=\int_{\mathbb{R}^6} \frac{\mathrm{d}^3 \vec{x} \mathrm{d}^3 \vec{p}}{(2 \pi \hbar)^3} g E(\vec{p}) f(\vec{x},\vec{p}),$$
$$N=\int_{\mathbb{R}^6} \frac{\mathrm{d}^3 \vec{x} \mathrm{d}^3 \vec{p}}{(2 \pi \hbar)^3} g f(\vec{x},\vec{p}).$$
With the Lagrange multipliers ##\lambda_1## and ##\lambda_2## (into which a factor ##k## has been absorbed for convenience) you get
$$\delta S-k\lambda_1 \delta U -k\lambda_2 \delta N=-k \int_{\mathbb{R}^6} \frac{\mathrm{d}^3 \vec{x} \mathrm{d}^3 \vec{p}}{(2 \pi \hbar)^3} g\, \delta f\, [\ln f -\ln (1-f) + \lambda_1 E(\vec{p}) +\lambda_2 ] \stackrel{!}{=}0.$$
Since this must hold for all variations ##\delta f## we find
$$\ln[f/(1-f)]=-\lambda_1 E(\vec{p})-\lambda_2$$
or
$$\frac{f}{1-f}=\exp[-\lambda_1 E(\vec{p})-\lambda_2] \; \Rightarrow\; f=\frac{1}{\exp[\lambda_1 E(\vec{p})+\lambda_2]+1}.$$
The usual analysis of the resulting thermodynamics gives ##\lambda_1=1/(k T)## and ##\lambda_2=-\mu/(k T)##, where ##\mu## is the chemical potential and ##T## the (absolute) temperature of the gas,
$$f=\frac{1}{\exp[(E(\vec{p})-\mu)/(k T)]+1},$$
which is the Fermi-Dirac distribution of an ideal gas, as it should be.
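
As a small numerical cross-check (my own sketch with made-up levels, degeneracies, and particle number, not part of the derivation above): for a discrete set of cells the chemical potential ##\mu## is fixed by requiring ##\sum_j G_j f_j = N##, which can be solved by bisection since the left-hand side is monotonic in ##\mu##; the entropy then follows from ##S=-k\sum_j G_j[f_j\ln f_j+(1-f_j)\ln(1-f_j)]##.

```python
import math

kT = 1.0                                   # temperature in energy units (k*T), arbitrary choice
levels = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]    # toy single-particle energies (made up)
g = [2, 2, 4, 4, 6, 6]                     # degeneracies G_j of the corresponding cells
N_target = 8.0                             # required total particle number

def fd(E, mu):
    """Fermi-Dirac occupation f = 1 / (exp((E - mu)/kT) + 1)."""
    return 1.0 / (math.exp((E - mu) / kT) + 1.0)

def total_N(mu):
    return sum(gj * fd(Ej, mu) for Ej, gj in zip(levels, g))

# total_N(mu) increases monotonically with mu, so bisection finds the chemical potential.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if total_N(mid) < N_target:
        lo = mid
    else:
        hi = mid
mu = 0.5 * (lo + hi)

# Entropy of the resulting distribution, S/k = -sum_j G_j [f ln f + (1-f) ln(1-f)]
S_over_k = -sum(
    gj * (f * math.log(f) + (1 - f) * math.log(1 - f))
    for Ej, gj in zip(levels, g)
    for f in [fd(Ej, mu)]
)
print(f"mu = {mu:.4f}, N = {total_N(mu):.4f}, S/k = {S_over_k:.4f}")
```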
 
  • Like
Likes dextercioby, etotheipi and Philip Koeck
  • #105
PeterDonis said:
Yes, Jaynes explicitly says that Pauli did not prove that entropy must be extensive, he just assumed it and showed what the phenomenology would have to look like if it was true.
Jaynes, in section 7 of his paper, shows how this factor arises, in appropriate cases (as he notes, entropy is not always perfectly extensive), in a proper analysis that includes the effects of changing ##N##. As he comments earlier in his paper (and as I have referenced him in previous posts), if your analysis only considers the case of constant ##N##, you can't say anything about how entropy varies as ##N## changes. To address the question of extensivity of entropy at all, you have to analyze processes where ##N## changes.

I believe my argument addresses the case where N changes, as it deals with two systems that exchange particles.

Considering a simpler problem may help illustrate my point. There are two bags, one has ##V_1## pockets and the other has ##V_2## pockets. There are ##N## balls, each ball has a different number written on it, so that they are distinguishable. Someone knows at any moment which pocket each of the balls is in. You play a game where at each step you choose one of the balls at random and move it to one of the pockets, also chosen at random. After a large number of steps of the game, what would be the most probable number ##N_1## of balls in the bag with ##V_1## pockets?

I guess that most people would agree that simple intuition suggests that ##N_1=N V_1/(V_1+V_2)##. But how to show that mathematically?

There will be ##\Omega_1=V_1^{N_1}## ways to put ##N_1## balls in one bag and ##\Omega_2=V_2^{N_2}## ways to put ##N_2## balls in the other bag.

If you are one of those who believe that there is no logical reason to divide by the factorial, since the balls are distinguishable, I guess you believe that the equilibrium maximizes ##\Omega_1(N_1)\times\Omega_2(N_2)##.
If you apply the logarithm and differentiate with respect to ##N_1##, considering that ##N_1+N_2=N##, you get ##\ln(V_1)=\ln(V_2)## as the equilibrium condition. Since the numbers ##V_1## and ##V_2## are arbitrary, that would suggest that there is no equilibrium. One may believe that this is a paradox. One may insist that the only possible explanation for the existence of an equilibrium is that the balls are in fact indistinguishable and a permutation would not count as a different state. In fact, there is no paradox, only bad math.

Since the balls are distinguishable, the number of states for the system composed of the two bags is
##\Omega_1(N_1)\times\Omega_2(N_2)\times[N!/(N_1! N_2!)]##.
Note that the term ##[N!/(N_1! N_2!)]## is not included to obtain a consistent equilibrium. It is included to correctly count all possible states for the system. I am still waiting for those who claim that there is no logical reason to include this term to explain how else they would count states...

Including the necessary ##[N!/(N_1! N_2!)]##, applying the logarithm, and differentiating with respect to ##N_1##, you get ##\ln(V_1/N_1)=\ln(V_2/N_2)##. Of course, you have to remember that ##N_1+N_2=N##, and consider that both ##N_1## and ##N_2## are large enough to use Stirling's approximation.
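
Together with ##N_1+N_2=N## this gives ##N_1 = N V_1/(V_1+V_2)##, matching the intuitive guess above. A brute-force check (my own sketch, with arbitrary made-up values for ##N##, ##V_1##, ##V_2##): including the binomial factor, the multiplicity ##W(N_1)=\binom{N}{N_1} V_1^{N_1} V_2^{N_2}## indeed peaks at ##N_1 \approx N V_1/(V_1+V_2)##.

```python
import math

N, V1, V2 = 100, 3, 7   # made-up numbers of balls and pockets

def log_W(N1):
    """ln of the number of states with N1 distinguishable balls in bag 1:
    W(N1) = C(N, N1) * V1**N1 * V2**(N - N1)."""
    N2 = N - N1
    return math.log(math.comb(N, N1)) + N1 * math.log(V1) + N2 * math.log(V2)

best_N1 = max(range(N + 1), key=log_W)
print("most probable N1 :", best_N1)              # -> 30
print("N*V1/(V1+V2)     :", N * V1 / (V1 + V2))   # -> 30.0
```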

My conclusion: when one uses proper combinatorial logic, one sees that, for systems of distinguishable elements, equilibrium happens when
##\partial\ln[\Omega_1(N_1)/N_1!]/\partial N_1=\partial\ln[\Omega_2(N_2)/N_2!]/\partial N_2##.

Most people would agree that the definition of entropy is such that
##\partial S_1(N_1)/\partial N_1=\partial S_2(N_2)/\partial N_2##, and that leads to
##S_1=k \ln[\Omega_1(N_1)/N_1!]## and ##S_2=k \ln[\Omega_2(N_2)/N_2!]##.

Note also that this argument is consistent with ##S= k \ln(W)##, only that S is the entropy of the system composed of the two bags, and ##W## is proportional to the number of accessible states of this composite system.

This point is stressed by Swendsen, and I quote:

"Although Boltzmann never addressed Gibbs’ Paradox directly, his approach to statistical mechanics provides a solid basis for its resolution. Boltzmann defined the entropy in terms of the probability of the macroscopic state of a composite system. Although the traditional definition of the entropy is often attributed to Boltzmann, this attribution is not correct. The equation on Boltzmann’s tombstone, ##S = k \log W##, which is sometimes called in evidence, was never written by Boltzmann and does not refer to the logarithm of a volume in phase space. The equation was first written down by Max Planck, who correctly attributed the ideas behind it to Boltzmann. Planck also stated explicitly that the symbol “W” stands for the German word “Wahrscheinlichkeit” (which means probability) and refers to the probability of a macroscopic state. The dependence of Boltzmann’s entropy on the number of particles requires the calculation of the probability of the number of distinguishable particles in the each subsystem of a composite system. The calculation of this probability requires the inclusion of the binomial coefficent, ##N!/(N_1!N_2!)##, where ##N_1## and ##N_2## are the numbers of particles in each subsystem and ##N = N_1 + N_2##. This binomial coefficient is the origin of the missing factor of ##1/N!## in the traditional definition, and leads to an expression for the entropy that is extensive."

From: Robert H. Swendsen, "Gibbs' Paradox and the Definition of Entropy", Entropy 2008, 10, 15-18.
 
  • Like
Likes Philip Koeck and dextercioby
  • #106
autoUFC said:
I believe my argument addresses the case where N changes, as it deals with two systems that exchange particles.

It's not enough for two systems to exchange particles. The number of particles assigned to a system has to change. If you have two halves of a container of gas, each half containing ##N## particles, with no barrier between them, the two systems (two halves of the container) can exchange particles, but ##N## is still not changing; you still have ##N## particles in each half.

For ##N## to change, you would have to have a barrier between the halves and introduce some kind of process, like an osmotic pressure gradient (with the barrier a semi-permeable membrane), that would move particles one way across the barrier but not the other. And then you would have to add a chemical potential term to your equations, as Jaynes describes.
 
  • Like
Likes vanhees71
  • #107
PeterDonis said:
It's not enough for two systems to exchange particles. The number of particles assigned to a system has to change. If you have two halves of a container of gas, each half containing ##N## particles, with no barrier between them, the two systems (two halves of the container) can exchange particles, but ##N## is still not changing; you still have ##N## particles in each half.

For ##N## to change, you would have to have a barrier between the halves and introduce some kind of process, like an osmotic pressure gradient (with the barrier a semi-permeable membrane), that would move particles one way across the barrier but not the other. And then you would have to add a chemical potential term to your equations, as Jaynes describes.

Why?

In any book on thermodynamics or statistical mechanics one sees several examples of thermodynamic processes in isolated systems that are composed of subsystems. In fact, statements such as the second law of thermodynamics are made with regard to isolated systems.

In any case, how would a semi-permeable membrane or an osmotic gradient change the number of particles? In an isolated system N stays constant no matter what, unless one considers something like chemical reactions. It seems to me that these are unnecessary complications, since the question at hand, the inclusion of ##1/N!##, is explained by a simple isolated composite system, as Swendsen and others have demonstrated.

Can you be more clear about what your objection is?
 
  • #108
autoUFC said:
how would a semi-permeable membrane or an osmotic gradient change the number of particles?

Because particles would be able to pass through the membrane one way, but not the other, so the number of particles in both halves would change each time a particle passed through the membrane.

Other cases involving variation of ##N## include phase changes (Jaynes mentions Gibbs' analysis of vapor pressures) and chemical reactions (which is where the term "chemical potential" as a name for the coefficient of the ##dN## term in the entropy originally came from).
 
  • #109
autoUFC said:
Can you be more clear about what your objection is?

I'm not sure what "objection" you are talking about. In the quote of mine that you referenced in your post #105, I wasn't even responding to you.

If you intend your argument as a general proof that entropy must be extensive in classical mechanics with distinguishable particles, then your argument must have a flaw somewhere, since entropy is not always extensive in classical mechanics with distinguishable particles (Jaynes in his paper gives examples of cases where it isn't).
 
  • #110
PeterDonis said:
I'm not sure what "objection" you are talking about. In the quote of mine that you referenced in your post #105, I wasn't even responding to you.

If you intend your argument as a general proof that entropy must be extensive in classical mechanics with distinguishable particles, then your argument must have a flaw somewhere, since entropy is not always extensive in classical mechanics with distinguishable particles (Jaynes in his paper gives examples of cases where it isn't).
My argument is that by combinatorial logic the entropy of a system of N permutable elements is ##S=k \ln(\Omega(N)/N!)##. Identical classical particles are an example of permutable elements, as one assumes that swapping two of them counts as a different state.

Extensivity is a consequence of this. The only situation I can see where extensivity would not hold is the case where statistical independence does not hold. To be clear, I would say that two systems are statistically independent if the number of accessible states of one system does not depend on the microstate the other system is in. This is a usual requirement for a consistent extensive entropy. Stars in a galaxy are an example of a system where statistical independence does not hold, because gravitational interactions are long-ranged. Can you say whether the examples Jaynes mentions are of this kind, where there is no independence?
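
As a quick numerical illustration of that claim (my own sketch, reusing the crude ball-and-pocket counting ##\Omega(N)=V^N## with arbitrary numbers): with the ##1/N!## included, doubling both ##N## and ##V## doubles the entropy up to an ##O(\ln N)## Stirling correction; without it, the mismatch grows like ##2N\ln 2##.

```python
import math

def S_over_k(N, V, include_factorial=True):
    """ln(V**N / N!) if include_factorial else ln(V**N), for the toy counting Omega = V**N."""
    s = N * math.log(V)
    if include_factorial:
        s -= math.lgamma(N + 1)   # ln(N!)
    return s

N, V = 10_000, 50_000.0   # arbitrary particle and "pocket" (cell) numbers
for include in (True, False):
    s1 = S_over_k(N, V, include)
    s2 = S_over_k(2 * N, 2 * V, include)
    print(f"1/N! included: {include}  S(2N,2V) - 2*S(N,V) = {s2 - 2 * s1:.2f}")
# with 1/N!: difference ~ 0.5*ln(pi*N) ~ 5    (negligible next to S ~ N)
# without  : difference ~ 2*N*ln(2)   ~ 13863 (non-extensive)
```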

Maybe you are considering that the entropy of mixing of two different kinds of gases would be an example of non-extensivity? That is not the case. Entropy of mixing is not a violation of extensivity.
 
  • #111
autoUFC said:
Entropy of mixing is not a violation of extensivity.

Certainly it is. You have two systems, with ##N_1## and ##N_2## particles and entropies ##S_1## and ##S_2##. You mix the two systems together to form a total system with ##N = N_1 + N_2## particles, but the new system's entropy is not ##S_1 + S_2##; it is ##S = S_1 + S_2 + S_\text{mixing}##.
 
  • Like
Likes etotheipi and vanhees71
  • #112
PeterDonis said:
Certainly it is. You have two systems, with ##N_1## and ##N_2## particles and entropies ##S_1## and ##S_2##. You mix the two systems together to form a total system with ##N = N_1 + N_2## particles, but the new system's entropy is not ##S_1 + S_2##; it is ##S = S_1 + S_2 + S_\text{mixing}##.
Ok. That does not make entropy non-extensive. Before you mix, the entropy is ##S = S_1 + S_2##. After you mix, ##S = S_1 + S_2 + S_\text{mixing}= S'_1 + S'_2##, where ##S'_1## and ##S'_2## are the entropies of each subsystem after mixing, which are larger because mixing is an out-of-equilibrium process. However, both before and after mixing the total entropy is the sum of the entropies of each subsystem; mixing certainly is not the same as non-extensivity.
 
  • #113
Of course, the Gibbs paradox only occurs when the particles are indistinguishable, and there are two separate issues to be solved:

(a) The non-extensivity of entropy when using the classical counting, where all identical particles have to be considered as distinguishable. Boltzmann and Gibbs solved this problem (the contradiction with the phenomenological entropy) by introducing the factor ##1/N!##, leading to the Sackur-Tetrode formula in the case of ideal gases, which is valid within the classical realm (i.e., where Bose-Einstein or Fermi-Dirac quantum features are negligible, i.e., the gas is "non-degenerate").

(b) Having accepted this adaptation of the counting of states, borrowing the quantum indistinguishability of identical particles for classical statistics, there is still mixing entropy, and it's well justified: if you have non-identical ideal gases, first separated in two partial volumes but at the same pressure and temperature (i.e., the particle numbers fulfill ##N_1/N_2=V_1/V_2##), and then adiabatically take out the dividing wall and let the two gases diffuse into each other until the mixture is in equilibrium at the same temperature and pressure, the entropy increases, and that's the mixing entropy.

The apparent paradox is that the only thing you need to assume is that the gas molecules are not identical, and this can be an apparently small difference (like different isotopes of the same atoms, or even the same atoms in different intrinsic states like ortho- and para-helium). The point, however, is that the particles of the two gases are in some way distinguishable, and then you have to count such that you get the mixing entropy, which is always the same, no matter how small the distinguishing feature of the two sorts of gases might be: ##S_{\text{mix}}=k [(N_1+N_2) \ln(N_1+N_2)-N_1 \ln(N_1)-N_2 \ln N_2]=k[N_1 \ln (N/N_1) + N_2 \ln(N/N_2)]>0##. The moment you have identical gases, the mixing entropy must vanish.
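
Spelling out the algebra behind the last equality (with ##N=N_1+N_2##):
$$S_{\text{mix}}=k\left[N_1\ln N + N_2 \ln N -N_1\ln N_1-N_2\ln N_2\right]=k\left[N_1\ln\frac{N}{N_1}+N_2\ln\frac{N}{N_2}\right],$$
which for ##N_1=N_2=N/2## reduces to ##S_{\text{mix}}=N k\ln 2##, the value quoted earlier in the thread.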

Of course, once one accepts the indistinguishability of identical particles in the counting of microstates, borrowed from quantum theory and applied in otherwise classical statistics, and accepts the information-theoretical meaning of entropy, there's no more Gibbs paradox: if you distribute identical particles over (equilibrium) microstates, it doesn't matter whether you keep the dividing wall or not when counting the microstates given the equilibrium conditions (same temperature and pressure implies for ideal gases simply ##N_1/N_2=V_1/V_2##, and you just throw indistinguishable particles into the entire volume ##V_1+V_2##, no matter whether the divider is in place or not). There's no measurable difference between the gases in the one or the other part of the total volume whether there's the divider or not, and thus there's no increase of entropy when the gases diffuse after the wall is adiabatically taken out.

As Pauli rightly says in his book: there's no smooth transition between the case with non-identical and with identical gases in the two partial volumes, and thus there's no paradox in having a finite mixing entropy for non-identical gases vs. zero mixing entropy for identical particles in the setup of the Gibbs paradox.

But that is, of course, also a generic quantum feature, i.e., (a) identical particles are really indistinguishable: in contradistinction to macroscopic classical "particles", quantum particles are really indistinguishable. E.g., each electron has precisely the same intrinsic quantum numbers without the slightest deviation, i.e., the same mass, spin 1/2, electric charge, weak isospin, and baryon number; and (b) there's no way to follow individual identical particles if they are not strictly separated by spatial constraints, and thus the full Hamiltonian of the many-body system commutes with all permutation operators for identical particles. Together with some topological arguments (C. DeWitt-Morette et al.) this implies that for identical quantum particles the many-body states are either totally symmetric (bosons) or totally antisymmetric (fermions). Within local relativistic QFT (particularly from the microcausality condition and the realization of the proper orthochronous Poincare group by local unitary transformations of the corresponding field operators) the relationship between "spin and statistics" also follows, i.e., half-integer-spin particles must necessarily be fermions and integer-spin particles must necessarily be bosons.

M. G. G. Laidlaw and C. M. DeWitt, Feynman Functional Integrals for Systems of Indistinguishable Particles, Phys. Rev. D 3, 1375 (1971), https://link.aps.org/abstract/PRD/v3/i6/p1375

S. Weinberg, The Quantum Theory of Fields, vol. 1, Cambridge University Press (1995).

All these profound findings are not understandable within classical (statistical) physics!
 
  • Like
Likes hutchphd
  • #114
vanhees71 said:
The point, however, is that the particles of the two gases are in some way distinguishable, and then you have to count such that you get the mixing entropy, which is always the same, no matter how small the distinguishing feature of the two sorts of gases might be: ##S_{\text{mix}}=k [(N_1+N_2) \ln(N_1+N_2)-N_1 \ln(N_1)-N_2 \ln N_2]=k[N_1 \ln (N/N_1) + N_2 \ln(N/N_2)]>0##. The moment you have identical gases, the mixing entropy must vanish.

That is not totally precise. You are right that distinguishability of the particles is a necessary condition for entropy of mixing. However, it is not a sufficient condition.
HPt wrote in his post that

"[There is mixing entropy only] if you know which particle is in which partial volume."

So this is the sufficient condition: you need to have some SIMPLE way to know which particle is in which subsystem at the beginning in order to have entropy of mixing.

For instance, in the case of buckyballs, one may start with the buckyballs of higher molecular mass in one subsystem and the ones of lower molecular mass in the other subsystem. In this case you get an increase in entropy by mixing. If you start already in a scrambled state and cannot determine which buckyball is in which subsystem, you do not have an increase in entropy due to mixing.
(Not relevant to the point I am trying to convey here, but I should mention that I am only now appreciating HPt's point that buckyballs are small enough to be treated as quantum particles.)

The point that entropy of mixing depends on knowing where each distinguishable element is also shows up in my example of the macroscopic balls with numbers written on them. There will be an entropy increase due to mixing if there is a SIMPLE way to know where each ball is, for instance, if the balls with even numbers start in one subsystem and the ones with odd numbers in the other subsystem. In this case there is entropy of mixing.

A curious thing is that you may have a complete list of all the particles that records, for each one, which subsystem it was in at the beginning. You can know precisely the starting point of each particle, but not in a SIMPLE way. Considering that the initial state is an equilibrium state, with the right number of particles in each subsystem, the entropy of the list of initial conditions is the same as the entropy of the system after mixing. In this case there is no entropy of mixing.

I guess that this could be regarded as a paradox. In my understanding this is a good illustration of the connection between entropy in physics and entropy in information theory.
 
  • #115
autoUFC said:
That is not totally precise. You are right that distinguishability of the particles is a necessary condition for entropy of mixing. However, it is not a sufficient condition.
HPt wrote in his post that

"[There is mixing entropy only] if you know which particle is in which partial volume."
Of course, if the gases are already mixed in both parts of the volume under equilibrium conditions, then nothing changes when taking out the wall. In the Gibbs paradox one discusses what happens when the non-identical gases are separated in the two parts of the volume and then the dividing wall is adiabatically taken out.
 
  • #116
vanhees71 said:
Of course, if the gases are already mixed in both parts of the volume under equilibrium conditions, then nothing changes when taking out the wall. In the Gibbs paradox one discusses what happens when the non-identical gases are separated in the two parts of the volume and then the dividing wall is adiabatically taken out.
If the particles are distinguishable, as in the case of the many buckyballs with distinct isotopes, the gases are non-identical. But when you remove the partition there is no increase in entropy. Entropy of mixing depends on your having a SIMPLE way to determine the initial condition. So, if no such way exists, removing the partition between two systems of a macroscopically large number of distinguishable particles does not increase entropy.
 
  • #117
Exactly what does SIMPLE mean?
How identical do the particles need to be?
Quantum mechanics gives you an unequivocal answer: same quantum numbers. Otherwise you need to wave your hands, which I guess is why this is post #117.
 
  • Like
Likes vanhees71
  • #118
autoUFC said:
before and after mixing the total entropy is the sum of the entropies of each subsystem

No, it isn't after mixing. There is no way to assign ##S_\text{mixing}## to either subsystem individually; you can only assign it to the whole system. So there is no way to express the total entropy after mixing as the sum of subsystem entropies.

This is an example of the point Jaynes makes in his paper, that when you have multiple systems interacting with each other, the only meaningful entropy is the entropy of the whole system that contains all of them.

autoUFC said:
mixing certainly is not the same as non-extensivity

You're contradicting yourself; you just said mixing does make entropy non-extensive, but now you're denying it.

I think you have not fully thought through this subject.
 
  • Like
Likes vanhees71

Similar threads

  • · Replies 3 ·
Replies
3
Views
2K
Replies
3
Views
1K
Replies
4
Views
5K
  • · Replies 16 ·
Replies
16
Views
2K
  • · Replies 19 ·
Replies
19
Views
4K
  • · Replies 3 ·
Replies
3
Views
2K
  • · Replies 3 ·
Replies
3
Views
3K
  • · Replies 8 ·
Replies
8
Views
3K
  • · Replies 9 ·
Replies
9
Views
2K
  • · Replies 4 ·
Replies
4
Views
2K