
#1
Apr 18, 2005, 11:57 AM

P: 266

Hi all.
About Gibbs' paradox (the entropy of mixing for ideal gases): is it actually a thermodynamic paradox? What bothers me is that in thermodynamics the entropy is only determined up to a "constant" [tex]S_0[/tex], which in my opinion depends on [tex]N[/tex]. If so, there shouldn't be a paradox, because we would just have to choose the functional dependence as [tex]S_0 \left( N \right) = Nk_B \ln \left( N \right) + \tilde S_0[/tex]. Am I right? Do you understand what I'm trying to say?
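As a quick numeric check of the point above, here is a minimal sketch (in dimensionless units, with an arbitrary constant `c` and illustrative values of N and V) comparing the naive classical ideal-gas entropy with the version carrying an N-dependent constant. Note the correction enters with a minus sign in this convention, since it must cancel the over-counting:

```python
import math

kB = 1.0  # Boltzmann constant in arbitrary (dimensionless) units

def S_naive(N, V, c=1.5):
    # Classical ideal-gas entropy with an N-independent constant:
    # S = N kB [ln V + c]  (temperature dependence absorbed into c)
    return N * kB * (math.log(V) + c)

def S_corrected(N, V, c=1.5):
    # With an N-dependent constant S0(N) = -N kB ln N added:
    # S = N kB [ln(V/N) + c]
    return N * kB * (math.log(V / N) + c)

N, V = 1000, 50.0
# Doubling the system should exactly double an extensive entropy.
print(S_naive(2 * N, 2 * V) - 2 * S_naive(N, V))          # 2 N kB ln 2 > 0: not extensive
print(S_corrected(2 * N, 2 * V) - 2 * S_corrected(N, V))  # 0.0: extensive
```

So the "paradoxical" excess is exactly the term the N-dependent constant removes.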



#2
Apr 18, 2005, 01:49 PM

Sci Advisor
HW Helper
P: 11,863

Yes, I understand. Indeed, in thermodynamics entropy is specified up to a constant. Note that this constant is fixed by imposing the third law (Nernst-Planck):
[tex] \lim_{T\rightarrow 0}S(T,V,N,...)=0 [/tex]
So physically acceptable solutions must satisfy the third law.

Daniel.



#3
Apr 18, 2005, 02:32 PM

P: 266

Yes, but doesn't that principle refer only to ideal crystals (as opposed to ideal gases)?




#4
Apr 18, 2005, 02:53 PM

Sci Advisor
HW Helper
P: 11,863

What ideal crystals? Ah, you mean the original Nernst 1909 formulation. Well, since the discovery of quantum statistics, things have changed.

Daniel.



#5
Apr 18, 2005, 03:27 PM

Sci Advisor
HW Helper
P: 11,863

Allow me to quote Ryogo Kubo [1], where he says:

"'The entropy of a chemically uniform body of finite density approaches a limiting value as the temperature goes to absolute 0, regardless of pressure, density or phase.' It is therefore convenient to take the state at 0 K as the standard state (see eq. (2.11)) by assuming
[tex] \lim_{T\rightarrow 0}S=S_{0}\equiv 0 [/tex] (3.25)
Then the entropy at any state is uniquely determined. The entropy defined in this way is sometimes called absolute entropy."

Daniel.

[1] R. Kubo, "Thermodynamics", North-Holland, 1968, p. 140.



#6
Apr 19, 2005, 10:02 AM

P: 266

How very interesting.
My professor said there was a counterexample or something (quartz?). Why do we need quantum statistics to assume that the limit is 0? Or does quantum statistics only tell us that the limit exists? Thanks a lot, by the way!



#7
Apr 19, 2005, 03:50 PM

Sci Advisor
HW Helper
P: 11,863

For any possible system, quantum statistics (which correctly assumes that the microscopic dynamics is governed by von Neumann's equation) is the right way to do it...
I don't know about a counterexample. And what specifically would that mean? That the entropy at 0 K is not zero, but is instead (a) negative, (b) positive, or (c) infinite?

Daniel.


#8
Apr 20, 2005, 05:38 AM

P: n/a

Gibbs' paradox says that in classical thermodynamics entropy is properly an extensive variable of a system (and it is). But when we take a given system of n moles of gas and double its size, we find that the energy indeed doubles (another extensive variable), while some of the standard equations for entropy yield double the entropy plus an extra amount equal to 2nR ln 2, which is the entropy of mixing.

If we consider the doubling to occur by taking a quantity of this gas equally divided by a partition and then removing the partition, the entropy should double, plus the additional entropy of mixing, correct? Yes, but only if the molecules had little labels on them to make them distinguishable. If the gas is identical on both sides before we remove the partition, then the molecules have no labels and are all indistinguishable, so there is no entropy of mixing.

The paradox is why this issue of distinguishability should matter at all. The two sides did indeed mix, and 'unmixing' them is truly irreversible whether the individual molecules carried labels or not. We can, of course, back out this extra quantity by saying that our original statistical mechanics over-counted, because some of the states were not physically distinguishable once we treated the particles as indistinguishable.

I like W. Pauli's comment on the paradox: "...the entropy of a system as a characteristic of its state which (in contradistinction to energy) depends on our knowledge of the system. If this knowledge of ours is maximal - that is, the most precise knowledge which is compatible with the laws of nature - then entropy is always zero."

I think the problem gets worse with Nernst's postulate at absolute zero. He only says that delta S appears to go to zero at absolute zero. Planck generalizes and says the simplest way to treat this is to say that all matter (not just crystals) has zero entropy at absolute zero. But of course this is an approximation when dealing with macroscopic systems that are nowhere near zero. Classical thermodynamics does not actually treat 'absolute' values of entropy very well; it is meant to handle entropy differences. And that is the whole point: at higher temperatures it is convenient to have an absolute reference, but the closer we go to absolute zero, the worse that estimate of the reference becomes.

Terrific topic you've brought up; it's got me thinking.
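To put a number on the entropy of mixing mentioned above, here is a minimal sketch (the function name and the equal-moles example are illustrative) of the standard mixing formula for two distinguishable ideal gases at the same temperature and pressure:

```python
import math

R = 8.314  # molar gas constant, J/(mol K)

def mixing_entropy(n1, n2):
    # Entropy of mixing two *distinguishable* ideal gases at equal T and P:
    # dS = -R (n1 ln x1 + n2 ln x2), where x_i are mole fractions.
    n = n1 + n2
    x1, x2 = n1 / n, n2 / n
    return -R * (n1 * math.log(x1) + n2 * math.log(x2))

n = 1.0
print(mixing_entropy(n, n))  # equal amounts give 2 n R ln 2, about 11.5 J/K
# For identical gases the same formula would wrongly predict this value;
# indistinguishability (the N! over-counting correction) makes the true
# entropy change on removing the partition zero.
```

This is exactly the 2nR ln 2 excess discussed above, which appears or vanishes depending on whether the two gases are distinguishable.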

