1. Apr 18, 2005

### Palindrom

Hi all.

What's bothering me is that in thermodynamics the entropy is determined only up to a "constant", $$$S_0$$$, which in my opinion depends on $$$N$$$. So there shouldn't be a paradox, because we are free to choose the functional dependence $$$S_0 \left( N \right) = - N k_B \ln \left( N \right) + \tilde S_0$$$.
Am I right? Do you understand what I'm trying to say?
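A quick numerical sanity check of this idea (my own sketch, with units, thermal-wavelength terms, and additive constants dropped; the function names and numbers are illustrative): with $$$S = N k_B \ln V$$$ the entropy fails to be extensive, while folding in $$$S_0(N) = -N k_B \ln N$$$, i.e. $$$S = N k_B \ln(V/N)$$$, makes doubling the system exactly double the entropy.

```python
import math

k_B = 1.0  # Boltzmann constant in arbitrary units

def S_naive(N, V):
    # Entropy without the N-dependent constant: S = N k_B ln V
    return N * k_B * math.log(V)

def S_corrected(N, V):
    # Entropy with S_0(N) = -N k_B ln N folded in: S = N k_B ln(V / N)
    return N * k_B * math.log(V / N)

N, V = 100.0, 5.0
# Doubling the system should double an extensive entropy.
print(S_naive(2 * N, 2 * V) - 2 * S_naive(N, V))          # 2 N k_B ln 2 != 0: not extensive
print(S_corrected(2 * N, 2 * V) - 2 * S_corrected(N, V))  # ~0: extensive
```

The leftover piece in the naive version is exactly the $$$2 N k_B \ln 2$$$ mixing term that shows up in the Gibbs paradox.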

2. Apr 18, 2005

### dextercioby

Yes, I understand. Indeed, in thermodynamics, entropy is specified only up to a constant. Note that this constant is fixed by imposing the third principle (Nernst-Planck):

$$\lim_{T\rightarrow 0}S(T,V,N,...)=0$$

So physically acceptable solutions must satisfy the third principle...

Daniel.

3. Apr 18, 2005

### Palindrom

Yes, but doesn't that principle refer only to ideal crystals (as opposed to ideal gases)?

4. Apr 18, 2005

### dextercioby

What ideal crystals? Ah, you mean the original Nernst 1909 formulation. Well, since the discovery of quantum statistics, things have changed.

Daniel.

5. Apr 18, 2005

### dextercioby

Allow me to quote Ryogo Kubo [1],when he says

"'The entropy of a chemically uniform body of finite density approaches a limiting value as the temperature goes to absolute zero, regardless of pressure, density or phase.' It is therefore convenient to take the state at 0 K as the standard state (see eq. (2.11)) by assuming

$$\lim_{T\rightarrow 0}S=S_{0}\equiv 0$$ (3.25)

Then the entropy at any state is uniquely determined. The entropy defined in this way is sometimes called absolute entropy."

Daniel.

----------------------------------------------------------------
[1] R. Kubo, Thermodynamics, North-Holland, 1968, p. 140.

6. Apr 19, 2005

### Palindrom

How very interesting.
My professor said there was a counterexample or something (quartz?).
Why do we need quantum statistics to assume that the limit is zero? Or does quantum statistics tell us that the limit exists?

Thanks a lot, by the way!

7. Apr 19, 2005

### dextercioby

For any possible system, quantum statistics (which correctly assumes that the microscopic dynamics is governed by von Neumann's equation) is the right way to do it...

I don't know about a counterexample. And what specifically would that mean? If the entropy at 0 K is not zero, is it

a) negative,
b) positive, or
c) infinite?

Daniel.

8. Apr 20, 2005

### nickdanger

dextercioby is correct: you generally need to separate the classical thermodynamic explanation from the quantum-statistical one, which is the better description of the situation.

The Gibbs paradox says that in classical thermodynamics, entropy should be an extensive variable of a system (and it is). But when we take a given system of n moles of gas and double its size, we find that the energy indeed doubles (another extensive variable), while some of the standard equations for entropy give double the entropy plus an extra amount equal to 2nR ln 2, which is the entropy of mixing. If the doubling occurs by taking a quantity of this gas equally divided by a partition and removing the partition, then the entropy should double, plus the additional entropy of mixing, correct? Yes, but only if the molecules had little labels on them to make them distinguishable. If the gas is identical on both sides before we remove the partition, then the molecules carry no labels and are all indistinguishable, so there is no entropy of mixing.

The paradox is why this issue of distinguishability should matter at all. The two sides did indeed mix, and 'unmixing' them is truly irreversible whether or not the individual molecules carry labels. We can, of course, remove the extra quantity by saying that our original statistical mechanics overcounted: some of the states we counted were not physically distinct once the particles are taken to be indistinguishable.
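The overcounting can be made concrete with a toy state count (my own illustration, not from the thread): N particles distributed over M cells. Labeling the particles gives W = M^N microstates; dividing by N! (the standard Boltzmann correction for indistinguishable particles) removes the spurious mixing term, up to small Stirling-type corrections, when two identical halves are merged.

```python
import math

def ln_W_labeled(N, M):
    # Distinguishable (labeled) particles in M cells: W = M**N
    return N * math.log(M)

def ln_W_unlabeled(N, M):
    # Boltzmann-corrected count: W = M**N / N!  (good approximation for M >> N)
    return N * math.log(M) - math.lgamma(N + 1)

N, M = 50, 10**6
# Removing the partition between two identical halves turns
# (N particles in M cells) + (N particles in M cells) into 2N particles in 2M cells.
dS_labeled = ln_W_labeled(2 * N, 2 * M) - 2 * ln_W_labeled(N, M)
dS_corrected = ln_W_unlabeled(2 * N, 2 * M) - 2 * ln_W_unlabeled(N, M)

print(dS_labeled / (2 * N * math.log(2)))  # ~1.0: the spurious 2 N k ln 2 mixing term
print(dS_corrected)                        # small, of order ln N rather than N
```

The corrected difference is not exactly zero, because dividing by N! is itself only the leading approximation to the true indistinguishable count, but it stays of order ln N rather than N, so the entropy per particle comes out extensive.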

I like W. Pauli's comment on the paradox: "...the entropy of a system as a characteristic of its state which (in contradistinction to energy) depends on our knowledge of the system. If this knowledge of ours is maximal - that is the most precise knowledge which is compatible with the laws of nature...then entropy is always zero."

I think the problem gets worse with Nernst's postulate at absolute zero. Nernst only says that delta S appears to go to zero at absolute zero. Planck generalized this, saying the simplest treatment is to assign all matter (not just crystals) zero entropy at absolute zero. But of course this is an approximation when dealing with macroscopic systems that are nowhere near zero temperature. Classical thermodynamics does not actually handle 'absolute' values of entropy very well; it is meant to handle entropy differences. And that is the whole point: at higher temperatures it is convenient to have an absolute reference, but the closer we go to absolute zero, the worse that reference becomes.

Terrific topic you've brought up; it's got me thinking.