Gibbs' Paradox and Ideal Gas Mixture Entropy

SUMMARY

The discussion centers on Gibbs' Paradox and its implications for the entropy of ideal gas mixtures in thermodynamics. Participants clarify that entropy is determined up to a constant, denoted as \( S_0 \), which can depend on the number of particles \( N \). The paradox arises from the indistinguishability of particles, leading to an apparent contradiction in entropy calculations when mixing identical gases. Quantum statistics provide a more accurate framework for understanding these phenomena, particularly at absolute zero, where entropy approaches zero.

PREREQUISITES
  • Understanding of Gibbs' Paradox in thermodynamics
  • Familiarity with classical thermodynamics and entropy concepts
  • Knowledge of quantum statistics and its implications
  • Basic grasp of Nernst's postulate and absolute zero
NEXT STEPS
  • Study the implications of Gibbs' Paradox on entropy calculations in ideal gases
  • Explore the role of quantum statistics in thermodynamics
  • Investigate Nernst's postulate and its application to various states of matter
  • Learn about the relationship between entropy and distinguishability in statistical mechanics
USEFUL FOR

Physicists, thermodynamic researchers, and students studying statistical mechanics who seek to deepen their understanding of entropy and its foundational principles in thermodynamics.

Palindrom
Hi all.

About Gibbs' paradox (with the entropy of mixing for ideal gases): Is the paradox a thermodynamic paradox?
Because what's bothering me is that in thermodynamics, the entropy is determined up to a "constant" \( S_0 \), which in my opinion depends on \( N \). And so there shouldn't be a paradox, because we would have to choose the functional dependence as
\[
S_0\left( N \right) = - N k_B \ln N + \tilde S_0 .
\]
Am I right? Do you understand what I'm trying to say?
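To make explicit what that choice of constant does, here is a quick check, assuming the usual classical monatomic ideal-gas entropy (the per-particle constant \( \tilde s_0 \) is just my shorthand):

\[
S(N,V,T) = N k_B \ln V + \tfrac{3}{2} N k_B \ln T + S_0(N)
\]

With \( S_0(N) = - N k_B \ln N + N \tilde s_0 \) this becomes

\[
S = N k_B \ln\!\left(\frac{V}{N}\right) + \tfrac{3}{2} N k_B \ln T + N \tilde s_0 ,
\]

which is extensive, \( S(\lambda N, \lambda V, T) = \lambda\, S(N,V,T) \), so joining two identical samples produces no spurious mixing term.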
 
Yes, I understand. Indeed, in thermodynamics, entropy is specified up to a constant. Note that this constant is fixed by imposing the third principle (Nernst-Planck):

\[
\lim_{T\rightarrow 0} S(T,V,N,\dots) = 0
\]

So physically acceptable solutions must satisfy the third principle...

Daniel.
 
Yes, but doesn't that principle refer only to ideal crystals (as opposed to ideal gases)?
 
Which ideal crystals? Ah, you mean the original Nernst 1909 formulation. Well, since the discovery of quantum statistics, things have changed.

Daniel.
 
Allow me to quote Ryogo Kubo [1], where he says:

"<<The entropy of a chemically uniform body of finite density approaches a limiting value as the temperature goes to absolute 0 regardless of pressure, density or phase>>. It is therefore convenient to take the state at 0 K as the standard state (see eq. (2.11)) by assuming

\[
\lim_{T\rightarrow 0} S = S_0 \equiv 0 \qquad (3.25)
\]

Then the entropy at any state is uniquely determined. The entropy defined in this way is sometimes called absolute entropy."

Daniel.

----------------------------------------------------------------
[1] R. Kubo, "Thermodynamics", North-Holland, 1968, p. 140.
 
How very interesting.
My prof. said there was a counterexample or something (quartz?).
Why do we need quantum statistics to assume that the limit is 0? Or does quantum statistics tell us that the limit exists?

Thanks a lot, by the way!
 
For any possible system, quantum statistics (which correctly assumes that the microscopic dynamics is governed by von Neumann's equation) is the right way to do it...

I don't know about a counterexample. And what specifically would that mean...? The entropy at 0 K is not zero, but is it

a) negative,
b) positive, or
c) infinite?

Daniel.
 
Palindrom said:
Hi all.

About Gibbs' paradox (with the entropy of mixing for ideal gases): Is the paradox a thermodynamic paradox?
Because what's bothering me is that in thermodynamics, the entropy is determined up to a "constant" \( S_0 \), which in my opinion depends on \( N \). And so there shouldn't be a paradox, because we would have to choose the functional dependence as
\[
S_0\left( N \right) = - N k_B \ln N + \tilde S_0 .
\]
Am I right? Do you understand what I'm trying to say?

dextercioby is correct: you generally need to split your thinking between the classical thermodynamic explanation and the quantum-statistical one, which is the better explanation of the situation.

Gibbs' paradox concerns the fact that in classical thermodynamics, entropy is properly an extensive variable of a system (and it is). When we take a given system of n moles of gas and double its size, we find that the energy indeed doubles (energy is another extensive variable), but some of the standard equations for entropy give double the entropy plus an extra amount equal to 2nR ln 2, which is the entropy of mixing. If the doubling occurs because a quantity of this gas, equally divided by a partition, has the partition removed, shouldn't the entropy double, plus the additional entropy of mixing? Yes, but only if the molecules had little labels on them to make them distinguishable. If the gas is identical on both sides before we remove the partition, the molecules carry no labels and are all indistinguishable, so there is no entropy of mixing.
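To see where that extra \( 2nR\ln 2 \) comes from, here is the arithmetic, assuming the non-extensive classical form \( S = nR\ln V + \tfrac{3}{2} nR \ln T + n s_0 \) per side (the constant \( s_0 \) is my notation). Before removing the partition, each side holds n moles in volume V; afterwards, 2n moles occupy 2V:

\[
S_{\text{before}} = 2\left[\, nR \ln V + \tfrac{3}{2}\, nR \ln T + n s_0 \right]
\]
\[
S_{\text{after}} = 2nR \ln (2V) + \tfrac{3}{2}\,(2n) R \ln T + 2n s_0 = S_{\text{before}} + 2nR \ln 2 .
\]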

The paradox is why this issue of distinguishability should matter at all. The two sides did indeed mix, and 'unmixing' them is truly irreversible whether or not the individual molecules carry labels. Of course, we can back out this extra quantity by saying that our original statistical mechanics overcounted: some of the states we counted were not physically distinct once the particles are treated as indistinguishable.
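The overcounting fix mentioned above is the familiar division of the classical phase-space count by \( N! \); with Stirling's approximation it reproduces, up to an extensive constant, the dependence Palindrom proposed in the opening post. A sketch:

\[
S_{\text{corrected}} = S_{\text{classical}} - k_B \ln N! \approx S_{\text{classical}} - N k_B \ln N + N k_B ,
\]

which converts \( N k_B \ln V \) into \( N k_B \ln (V/N) \) plus an extensive constant, and the spurious mixing term for identical gases drops out.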

I like W. Pauli's comments on the paradox: "...the entropy of a system as a characteristic of its state which (in contradistinction to energy) depends on our knowledge of the system. If this knowledge of ours is maximal - that is, the most precise knowledge which is compatible with the laws of nature...then entropy is always zero."

I think the problem gets worse with Nernst's postulate at absolute zero. Nernst only says that \( \Delta S \) appears to go to zero as the temperature goes to absolute zero. Planck generalized this, saying the simplest treatment is to assign all matter (not just crystals) zero entropy at absolute zero. But of course this is an approximation when dealing with macroscopic systems that are nowhere near zero. Classical thermodynamics does not actually treat 'absolute' values of entropy very well; it is meant to handle entropy differences. And that is the whole point: when dealing with higher temperatures, it is convenient to have an absolute reference, but the closer we go to absolute zero, the worse that estimate of the reference becomes.
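In symbols, the two formulations differ roughly as follows (my paraphrase, not a quotation of either author):

\[
\text{Nernst:}\quad \lim_{T \to 0} \Delta S = 0 \ \text{for isothermal processes},
\qquad
\text{Planck:}\quad \lim_{T \to 0} S = 0 .
\]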

Terrific topic you've brought up, it's got me thinking.
 
