
Insights Understanding Entropy and the 2nd Law of Thermodynamics - comments

  1. Dec 16, 2017 #61
    Chet,
    My statement about what you had said regarding zero entropy at 0 K was not a direct
    quote from you (using quotation marks); it was an indirect quote, using the word "that", and it
    included a part which I wasn't attributing to you: the "some important person in thermodynamics,
    I don't remember who, so call him 'X' (maybe it was Clausius)". That part was my own comment on
    what you had said. I admit it wasn't perfectly clear which parts I was attributing to you and which
    were mine, but making such things completely unambiguous in English often requires overly long
    and awkward constructions, as it would have here. Also, I didn't say that you had made the
    zero-entropy-at-0-K statement in your article; in fact, I thought you had made it while replying to
    a comment about your article, but after you stated in your email that you hadn't said it in your
    article or in any of your comments, I looked back over them and found that it occurred in a quote
    from INFO-MAN which you had included in one of your comments. The "X" in that quote was
    Kelvin, not Clausius. According to INFO-MAN, Kelvin had said that a pure substance
    (mono-molecular?--fox26's question, not Kelvin's) at absolute zero would have zero entropy.
    Using "entropy" in the statistical mechanical sense, this statement attributed to Kelvin is true
    (classically, though not quantum mechanically).

    Fine, but that brings up what may be a serious problem with the thermodynamic equation: for a
    reversible process between equilibrium states A and B of a system SYS,
    ##\Delta S = \int_A^B dq/T##. Suppose SYS is a pure gas in a closed container and A is SYS at
    0 K. To evaluate the integral one must know the relation between dq and dT; suppose it is either
    ##dq = C\,dT##, which you've used in evaluating such integrals, with C the (constant) heat
    capacity of SYS, say at constant volume, or ##dq = (f/2)\,k\,dT## with f the number of degrees of
    freedom of SYS, which is implied by the equipartition theorem. Then the integral of dq/T between
    A and B is the integral, between 0 K and the final temperature T1, of some non-zero constant P
    times dT/T, which is ##P[\ln(T_1) - \ln(0)] = \infty## for ##T_1 > 0## (though it might be better to
    regard the integral as simply undefined). This problem isn't solved by requiring the lower
    (starting) temperature T0 to be non-zero but allowing it to be anything above zero, because the
    integral between T0 and any ##T_1 > 0## can be made arbitrarily (finitely) large by making T0
    suitably small but non-zero. Thus if (1) Kelvin's sentence is true with "entropy" having the
    thermodynamic as well as the statistical mechanical meaning, (2) the ##\Delta S = \int dq/T## law
    is true for thermodynamic as well as statistical mechanical entropy, and (3) a linear relation
    between dq and dT holds, then the thermodynamic entropy of any (non-empty) system in
    equilibrium at any temperature T1 above absolute zero can't be finite, even though the statistical
    mechanical entropy of such a (finite) system can be made arbitrarily small by taking T1 to be
    some suitable temperature above 0 K. Surely the thermodynamic entropy can't be so different
    from the statistical mechanical entropy that the conclusion of the previous sentence is true. The
    problem's solution might be that the heat capacity C varies at low temperatures in such a way
    (for example ##C \propto \sqrt{T}##) that the integral is finite, or that the equipartition theorem
    breaks down at low temperatures. But at least for a gas of classical point particles interacting
    elastically only when they collide, which is an ideal gas (never mind that they would almost
    never collide), the equipartition theorem leads to, and maybe is equivalent to, the ideal gas law,
    which can be shown mathematically to hold for such a gas even down to absolute zero; with real
    gases it breaks down severely at low temperatures only because of their departures, including
    their being quantum mechanical, from the stated conditions. What is the solution of this problem?
    Must thermodynamics give up the ##\Delta S = \int dq/T## law as an exact law, and at low
    temperatures even as a nearly exact one?
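
    To make the divergence concrete, here is a minimal numerical sketch; the heat capacity C and
    the final temperature are arbitrary illustrative values, not data from anything above:

    ```python
    # Sketch of the divergence above: with constant heat capacity C,
    # Delta(S) = integral from T0 to T1 of C dT/T = C*ln(T1/T0), which grows
    # without bound as T0 -> 0. C and T1 are arbitrary illustrative values.
    import math

    C = 1.0     # constant heat capacity (arbitrary units)
    T1 = 300.0  # final temperature, K

    for T0 in [1.0, 1e-3, 1e-6, 1e-9]:
        dS = C * math.log(T1 / T0)  # exact value of the integral
        print(f"T0 = {T0:8.0e} K  ->  Delta(S) = {dS:7.2f}")
    # Each factor-of-1000 reduction in T0 adds C*ln(1000) ~ 6.9 to Delta(S),
    # so the entropy change diverges as T0 approaches absolute zero.
    ```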
     
  2. Dec 17, 2017 #62
    For pure materials, the ideal gas is only a model for real gas behavior above the melting point and at low reduced pressures, ##p/p_{critical}##. For real gases, the heat capacity is not constant, and varies with both temperature and pressure. So, the solution to your problem is, first of all, to take into account the temperature-dependence of the heat capacity (and pressure-dependence, if necessary). Secondly, real materials experience phase transitions, such as condensation, freezing, and changes in crystal structure (below the freezing point). So one needs to take into account the latent heat effects of these transitions in calculating the change in entropy. And, finally, before and after phase transitions, the heat capacity of the material can be very different (e.g., ice and liquid water).
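
    As a schematic illustration of that accounting: integrate C(T)/T over each single-phase interval
    and add a latent-heat term ##\Delta H / T_{transition}## at each phase change. All numbers below
    are made-up placeholders, not properties of any real substance:

    ```python
    # Schematic entropy accounting: sum the C(T)/T integrals over each phase
    # plus Delta_H/T at each transition. All values are illustrative only.
    import math

    a = 1e-5            # low-T solid heat capacity coefficient: C_solid = a*T**3
    C_liq = 75.0        # liquid heat capacity, treated as constant, J/(mol K)
    T_fus = 273.0       # melting point, K
    dH_fus = 6000.0     # latent heat of fusion, J/mol
    T_final = 298.15    # final temperature, K

    # solid: integral of (a*T**3)/T dT from 0 to T_fus = a*T_fus**3/3 (finite!)
    S_solid = a * T_fus**3 / 3
    # melting: latent-heat entropy step
    S_fusion = dH_fus / T_fus
    # liquid: integral of C_liq/T dT from T_fus to T_final
    S_liquid = C_liq * math.log(T_final / T_fus)

    S_total = S_solid + S_fusion + S_liquid
    print(f"S({T_final} K) ~ {S_total:.1f} J/(mol K)")  # finite absolute entropy
    ```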
     
  3. Dec 18, 2017 #63

    DrDu

    Science Advisor

    No, the third law was formulated by Walther Nernst. Also, he did not find that the absolute entropy at ##T = 0## is 0. Rather, he found that the entropy of an ideal crystal becomes independent of all the other variables of the system (like p) in the limit ##T \to 0##. So the entropy at ##T = 0## is a constant, and this constant can conveniently be chosen to be 0.
     
  4. Dec 19, 2017 #64

    DrDu

    Science Advisor

    Even in phenomenological thermodynamics, the heat capacity C generically depends on temperature. The equipartition theorem is a theorem from classical mechanics. It is approximately applicable if the number of quanta in each degree of freedom is >>1. In solids, this leads to the well-known Dulong-Petit rule, stating that the heat capacity per atom in a solid is approximately ##3k_\mathrm{B}##. At lower temperatures, the heat capacity decreases continuously as the degrees of freedom start to "freeze out", with the exception of the sound modes. This leads to the celebrated Debye expression for the heat capacity at low temperatures, ##C_V \propto T^3##.
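
    As a quick check of why this removes the low-temperature divergence: with ##C_V = aT^3## the
    integrand ##C_V/T = aT^2## is finite at ##T = 0##, so the entropy integral converges. The Debye
    coefficient below is an arbitrary illustrative constant:

    ```python
    # With C_V = a*T**3, S(T1) = integral_0^T1 a*T**2 dT = a*T1**3/3: finite.
    # 'a' and T1 are arbitrary illustrative values.
    a = 1e-3    # Debye coefficient (arbitrary units)
    T1 = 10.0   # upper temperature, K

    # analytic value of the entropy integral
    S_analytic = a * T1**3 / 3

    # simple midpoint-rule check of the same integral
    n = 100000
    h = T1 / n
    S_numeric = sum(a * ((i + 0.5) * h) ** 2 for i in range(n)) * h

    print(S_analytic, S_numeric)  # both finite, ~0.333: no divergence at T = 0
    ```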
     
  5. Dec 28, 2017 #65
    Chet,
    Is this too long for a comment?

    Thank you for explaining the solution, for real materials, to my infinite entropy change
    problem--maybe; as I indicated, I suspected something of the kind might be the explanation. Do
    you know, however, that taking into account both the variation of heat capacity with temperature
    and pressure, and also phase transitions, always leads to a finite value of ##\int dq/T## when
    integrated between 0 K and a higher temperature? Do you care? Maybe you are concerned only
    with changes in entropy for processes operating between two non-zero temperatures. Did you use
    entropy change calculations in your chemical engineering job? I know that they can be used in
    some cases to show that a proposed process is impossible, by showing that it would involve a
    reduction in the entropy of an isolated system, thus violating the second law. A well-known
    example is the impossibility of operating a Carnot-cycle heat engine with efficiency greater than
    the limit set by the requirement that the reduction in entropy caused by the removal of thermal
    energy from the high temperature heat bath be accompanied by at least as great an increase in
    entropy caused by the addition of thermal energy to the low temperature heat bath.
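
    For concreteness, a small sketch of that entropy bookkeeping; the bath temperatures and heat
    values are illustrative numbers only:

    ```python
    # A cyclic engine removes entropy Q_hot/T_hot from the hot bath and adds
    # Q_cold/T_cold to the cold bath, so dS_total = Q_cold/T_cold - Q_hot/T_hot
    # must be >= 0, giving eta = 1 - Q_cold/Q_hot <= 1 - T_cold/T_hot.
    T_hot, T_cold = 600.0, 300.0   # bath temperatures, K (illustrative)
    Q_hot = 1000.0                 # heat drawn from the hot bath, J

    eta_carnot = 1.0 - T_cold / T_hot   # maximum allowed efficiency = 0.5

    for eta in [0.3, 0.5, 0.7]:
        Q_cold = Q_hot * (1.0 - eta)                # heat rejected to cold bath
        dS_total = Q_cold / T_cold - Q_hot / T_hot  # entropy change of the baths
        verdict = "allowed" if dS_total >= 0 else "violates 2nd law"
        print(f"eta = {eta:.1f}: dS_total = {dS_total:+.2f} J/K -> {verdict}")
    ```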

    One might think that the ##\Delta S = \int dq/T## law should always give a finite ##\Delta S## for
    pure ideal gases as well as for real materials, but as I (simply) demonstrated, it doesn't do so for
    ideal gases with the lower temperature equal to 0 K, even though ideal gases would not
    experience any variation of heat capacity with temperature or pressure, or any phase transitions.
    I recently thought about this problem some more, stimulated by the PF discussion, and arrived at
    a (now obvious to me) solution, or at least an explanation, that actually favors the thermodynamic
    definition of entropy change, which gives an infinite, or arbitrarily large, entropy change for
    processes (involving ideal gases) with starting temperatures at, or arbitrarily near, absolute zero,
    over the statistical mechanical view, which requires any finite system to have only finite absolute
    entropy at any temperature, including absolute zero, and so gives only a finite entropy change for
    processes, for finite systems, operating between any two states at any two finite temperatures,
    even when one is absolute zero. This solution or explanation will take quite a number of lines to
    state. I hope it is not so obvious that I am wasting your time, mine, and that of anyone else
    reading this by going through it.

    The basic reason that the statistical mechanical (SM) entropy S of a pure (classical) ideal gas
    SYS in any equilibrium macrostate at 0 K or above is zero or a finite positive number, whereas its
    thermodynamic (THRM) entropy change between an equilibrium macrostate of SYS at 0 K and
    one at any higher temperature is infinite, is this: the SM entropy of SYS in some equilibrium
    macrostate MAC is calculated using a discrete approximation NA to the uncertainty of the exact
    state of SYS when in MAC (NA is the number of microstates available to SYS when it is in MAC,
    with some mostly arbitrary definition of the size in phase space of a microstate of SYS), whereas
    the THRM entropy change between two equilibrium macrostates of SYS is calculated using the
    (multi-dimensional) area or volume in phase space of the set of microstates available to SYS in
    those macrostates, which can be any positive real number (and for a macrostate at 0 K is 0). The
    details follow:

    The state of an ideal gas SYS composed of N point particles, each of mass m, which interact
    only by elastic collision is specified by a point Ps in 6N-dimensional phase space: 3N coordinates
    for position and 3N for momentum. If the gas is in equilibrium, confined to a cube 1 unit on a
    side, and has a thermal energy of E, then SM and THRM both consider Ps to be equally likely to
    be anywhere on the energy surface ES determined by E, which is the set of all points
    corresponding to SYS having a thermal energy of E; the probability density of Ps is the same
    positive constant at each point x ∈ ES, and 0 elsewhere. Since E is purely (random) kinetic
    energy, ##E = \sum_{i=1}^{N} p_i^2/2m##, where ##p_i## is the ith particle's momentum, this
    energy surface is the set of all points whose position coordinates lie within the unit cube in the
    position part of the phase space of SYS and whose momentum coordinates lie on the
    (3N-1)-dimensional sphere MS in momentum space centered at the origin with radius
    ##r = \sqrt{2mE}##. The area (or volume) where Ps might be is proportional to the area A of MS,
    and ##A \propto r^{3N-1}##.

    The entropy S of SYS is proportional to the logarithm of the area of phase space where Ps might
    be, ##S \propto \ln(A)##, therefore ##S = c_1 + c_2\ln(E)##, and since ##E \propto T## by the
    equipartition theorem, ##S = c_1 + c_2[c_3 + \ln(T)]##. Thus ##dS/dE \propto dS/dT = c_2/T##,
    so ##dS/dE = c_4/T##, and choosing ##c_4 = 1##, ##dS = dE/T##. This shows the origin of your
    THRM dS law for ideal gases (with dE = dq), which you probably knew.

    SM approximates this law, adequately for high T and hence large A, by dividing phase space into
    boxes with more-or-less arbitrary dimensions of position and momentum, and replacing A by the
    number NA of boxes which contain at least one point of ES. This makes S a function of T which
    is not even continuous, let alone differentiable, but for large T the jumps in NA, and so in S, as a
    function of T are small enough compared to S to ignore, and the SM entropy can approximately
    follow the dS = dE/T law too, and be about equal to the THRM entropy, for suitable box
    dimensions. However, as T approaches 0 K, the divergence of the SM entropy from the THRM
    entropy using these box dimensions becomes severe. As T decreases in steps by factors of,
    say, D, the THRM entropy S decreases by the constant amount ln(D) per step, becoming
    arbitrarily negative for low enough T, though T never quite reaches 0 K by this process. For
    T = 0 K, A = 0, so ##S = c\ln(A) = c\ln(0) = -\infty## (with c some positive constant). But since
    the energy surface ES must intersect at least one box of the SM partition of phase space, NA can
    never go below 1, no matter how small T, and so A, become. Thus the SM entropy S can never
    go below ##c\ln(1) = 0##.

    The THRM absolute entropy can be finite, except at T = 0, because although ΔS from a state of
    SYS whose T is arbitrarily close to 0 K to a state at a higher T can be arbitrarily large (positive),
    S at the starting state can be negative enough that the resulting S at the higher temperature is
    some finite constant, regardless of how near 0 K the starting state is. For the SM entropy the
    situation is different: although the SM ΔS is about as large as the THRM ΔS, the SM S at the
    starting state can never be less than 0. The temperature at which the SM entropy S gets stuck at
    0, unable to go lower for lower T, is not a basic feature of the laws of the universe: making SYS
    bigger or making the boxes of the SM partition of phase space smaller would lower the sticking
    temperature, and of course making SYS smaller or the boxes larger would raise it.
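
    A toy illustration of this "sticking at zero" behaviour; the box size and areas below are arbitrary
    toy numbers, not a real phase-space computation:

    ```python
    # S_thrm = ln(A) for accessible phase-space area A; the coarse-grained
    # count NA = max(ceil(A/box), 1) gives S_sm = ln(NA). As A shrinks toward
    # 0 (low T), S_thrm -> -infinity while S_sm bottoms out at ln(1) = 0.
    import math

    box = 1.0   # phase-space cell size (arbitrary units)

    for A in [1e3, 1.0, 1e-3, 1e-6, 1e-9]:
        S_thrm = math.log(A)                 # continuous (THRM-like) entropy
        NA = max(math.ceil(A / box), 1)      # ES always meets at least one box
        S_sm = math.log(NA)                  # coarse-grained (SM-like) entropy
        print(f"A = {A:8.0e}: S_thrm = {S_thrm:+7.2f}, S_sm = {S_sm:5.2f}")
    ```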

    I have read somewhere (of course, maybe it was written by a low-temperature physicist) that the
    amount of interesting phenomena for a system within a range of temperatures is proportional to
    the ratio of the highest to the lowest temperature of that range, not to their difference. If so, there
    would be as much of such phenomena between 0.001 K and 0.01 K as between 100 K and
    1000 K, but the usual SM entropy measure would show no entropy difference between any two
    states of a very small system in the lower temperature range, while showing a non-zero
    difference between different states in the upper range, so it would be of no help in analyzing
    processes in the lower range, even though of some help in the upper range (or if not for a given
    system and these two particular temperature ranges, then for some other pair of ranges each
    with a 10-to-1 temperature ratio). On the other hand, the THRM entropy measure would show as
    much entropy difference (which would be non-zero) between states at the bottom and top of the
    lower range as between states at the bottom and top of the upper range.
     
  6. Dec 28, 2017 #66
    Actually, as a chemical engineer who worked on processes substantially above absolute zero, I have no interest in this whatsoever.
    The concept of entropy figures substantially in the practical application of thermodynamics to chemical process engineering, but not in the qualitative way that you describe. Entropy is part of the definition of Gibbs free energy which is essential to quantifying chemical reaction and phase equilibrium behavior in the design and operation of processes involving distillation, gas absorption, ion exchange, crystallization, liquid extraction, chemical reactors, etc.
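
    For instance, a minimal sketch of how entropy enters such a calculation, via ##\Delta G = \Delta H - T\Delta S## and the equilibrium constant ##K = e^{-\Delta G/RT}##; the reaction data below are hypothetical:

    ```python
    # Entropy's role in Gibbs free energy: Delta_G = Delta_H - T*Delta_S sets
    # the equilibrium constant K = exp(-Delta_G/(R*T)). Values are hypothetical.
    import math

    R = 8.314       # gas constant, J/(mol K)
    dH = -50_000.0  # reaction enthalpy, J/mol (hypothetical)
    dS = -100.0     # reaction entropy, J/(mol K) (hypothetical)

    for T in [300.0, 500.0, 800.0]:
        dG = dH - T * dS               # Gibbs free energy change of reaction
        K = math.exp(-dG / (R * T))    # equilibrium constant
        print(f"T = {T:5.0f} K: dG = {dG/1000:+7.1f} kJ/mol, K = {K:.3g}")
    ```
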
    I personally have no interest in this, but other members might. Still, I caution you that Physics Forums encourages discussion only of mainstream theories, and specifically prohibits discussing personal theories. I ask you to start a new thread with what you want to cover (which seems tangential to the main focus of the present thread), possibly in the Beyond the Standard Model forum. You can then see whether anyone else has interest in this or whether it is just deemed a personal theory. I'm hoping that @DrDu and @DrClaude might help out with this.

    For now, I think that the present thread has run its course, and I'm hereby closing it.
     