fox26
Chestermiller said: Chestermiller submitted a new PF Insights post,
Understanding Entropy and the 2nd Law of Thermodynamics.
Continue reading the Original PF Insights Post.
Chestermiller said:Wow. Thank you for finally clarifying your question.
You are asking how the absolute entropy of a system can be determined. This is covered by the 3rd Law of Thermodynamics. I never mentioned the 3rd Law of Thermodynamics in my article. You indicated that, in my article, I said that "some important person in thermodynamics, I don't remember who, so call him "X" (maybe it was Clausius), had determined that the entropy of any system consisting of matter (in equilibrium) at absolute zero would be zero, so letting S2 = SYS at absolute zero, we would have entropy(S2) = 0." I never said this in my article or in any of my comments. If you think so, please point out where. My article only deals with relative changes in entropy from one thermodynamic equilibrium state to another.
Chet,
My statement about what you had said regarding zero entropy at 0 K was not a direct quote
(using quotation marks); it was an indirect quote, using the word "that," and it included a part
I wasn't attributing to you: the "some important person in thermodynamics, I don't remember
who, so call him "X" (maybe it was Clausius)" was my own comment on what you had said. I
admit it wasn't perfectly clear which parts I was attributing to you and which were mine, but
making such things completely unambiguous in English often requires overly long and awkward
constructions, as it would have in this case. Also, I didn't say that you had made the
zero-entropy-at-0-K statement in your article; in fact, I thought you had made it while replying
to a comment on your article. After you stated in your email that you hadn't said it in your
article or in any of your comments, I looked back over them and found that it occurred in a
quote from INFO-MAN which you had included in one of your comments.
X in that quote was "Kelvin," not "Clausius." According to INFO-MAN, Kelvin had said that a pure
substance (mono-molecular?--fox26's question, not Kelvin's) at absolute zero would have zero
entropy. Using "entropy" in the statistical mechanical sense, this statement attributed to Kelvin
is true (classically, though not quantum mechanically).
Fine, but that brings up what may be a serious problem with the thermodynamic equation:
Δ(entropy) for a reversible process between equilibrium states A and B of a system SYS
= the integral of dq/T from A to B.

Suppose SYS is a pure gas in a closed container and A is SYS at 0 K. To evaluate the integral
one must know the relation between dq and dT. If that relation is either dq = C dT, which
you've used in evaluating such integrals, with C the (constant) heat capacity of SYS, say at
constant volume, or dq = (k dT/2) x (the number of degrees of freedom of SYS), which is
implied by the Equipartition Theorem, then the integral of dq/T between A and B is the
integral, from 0 K to the final temperature T1, of some non-zero constant P times dT/T, i.e.
P[ln(T1) - ln(0)] = ∞ for T1 > 0 (though it might be better to regard the integral as simply
undefined). The problem isn't solved by requiring the lower (starting) temperature T0 to be
non-zero while allowing it to be anything above zero, because the integral between T0 and
any T1 > 0 can be made arbitrarily (though finitely) large by taking T0 suitably small but
non-zero.

Thus, if (1) Kelvin's sentence is true with "entropy" having the thermodynamic as well as the
statistical mechanical meaning, (2) the Δ(entropy) = ∫dq/T law holds for thermodynamic as
well as statistical mechanical entropy, and (3) a linear relation between dq and dT holds, then
the thermodynamic entropy of any non-empty system in equilibrium at any temperature
T1 above absolute zero can't be finite, even though the statistical mechanical entropy of such
a (finite) system can be made arbitrarily small by taking T1 to be some suitable temperature
above 0 K. Surely the thermodynamic entropy can't be so different from the statistical
mechanical entropy that the conclusion of the previous sentence is true.

The problem's solution might be that the heat capacity C varies at low temperatures in such a
way (for example C ∝ √T) that the integral is finite, or that the Equipartition Theorem breaks
down at low temperatures. But consider a gas of classical point particles interacting
elastically only when they collide--an ideal gas (never mind that they would almost never
collide). For such a gas the Equipartition Theorem leads to, and maybe is equivalent to, the
Ideal Gas Law, which can be shown mathematically to hold even down to absolute zero; with
real gases it breaks down severely at low temperatures only because of their departures,
including their being quantum mechanical, from the stated conditions. What is the solution
of this problem? Must thermodynamics give up the Δ(entropy) = ∫dq/T law as an exact law,
and, at low temperatures, even as a nearly exact one?
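As a numerical sanity check on the argument above, the sketch below estimates ∫C(T) dT/T after the substitution u = ln T (under which the integrand becomes C(e^u) du), and shows that with a constant C the result grows without bound as the lower limit T0 → 0, while with the C ∝ √T form mentioned above it stays bounded. This is just an illustration of the mathematics of the integral; the √T form is an assumed example, not a claim about any real substance.

```python
import math

def delta_S(C, T0, T1, steps=100000):
    """Midpoint-rule estimate of the integral of C(T)/T dT from T0 to T1,
    computed after substituting u = ln T, so the integrand is C(e^u) du."""
    a, b = math.log(T0), math.log(T1)
    h = (b - a) / steps
    return sum(C(math.exp(a + (i + 0.5) * h)) * h for i in range(steps))

# Constant heat capacity, dq = C dT with C = 1:
# Delta S = ln(T1/T0), which diverges as T0 -> 0.
for T0 in (1e-2, 1e-4, 1e-6):
    print(f"C const:   T0 = {T0:.0e}  Delta S ~ {delta_S(lambda T: 1.0, T0, 300.0):.2f}")

# C proportional to sqrt(T):
# Delta S = 2*(sqrt(T1) - sqrt(T0)), which stays bounded as T0 -> 0.
for T0 in (1e-2, 1e-4, 1e-6):
    print(f"C ~ sqrtT: T0 = {T0:.0e}  Delta S ~ {delta_S(math.sqrt, T0, 300.0):.2f}")
```

The constant-C runs keep growing as T0 shrinks (by ln(10²) ≈ 4.6 per two decades), while the √T runs settle near 2√T1, which is the point of the resolution suggested above: a heat capacity that vanishes fast enough as T → 0 keeps the entropy integral finite.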