
Absolute entropy, definition, measurement, practical use

  1. Aug 2, 2006 #1
    Following a few threads here, I would be interested in getting this question clarified as much as possible:

    How is the absolute entropy defined, and why ?
    How can it be measured ?
    Is it needed for applications, and when ? (examples ?)​

    Thanks for helping clarify this foundation of thermodynamics.

    Last edited: Aug 2, 2006
  3. Aug 2, 2006 #2

    Andrew Mason

    Science Advisor
    Homework Helper

    The entropy of a substance compared to its entropy at absolute zero. The entropy at 0 K is considered to be zero.
    I don't think it can be measured. It is calculated. You would need to know the heat capacity of the substance at all temperatures to determine the entropy change from absolute 0: ([itex]\Delta S = \int_0^TdQ/T = \int_0^TC_pdT/T[/itex]).
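    A minimal numerical sketch of that integral (assuming, purely for illustration, a Debye-model heat capacity in place of measured [itex]C_p[/itex] data; the 428 K Debye temperature used here is the usual textbook value for aluminium):

[code]
# Sketch: absolute entropy from S(T) = integral of C/T' dT' from ~0 to T,
# using a Debye-model heat capacity as a stand-in for measured C_p data.
import numpy as np
from scipy.integrate import quad

R = 8.314        # gas constant, J/(mol K)
theta_D = 428.0  # Debye temperature, K (textbook value for aluminium)

def c_debye(T):
    """Debye molar heat capacity C(T) in J/(mol K)."""
    # integrand x^4 e^x / (e^x - 1)^2 written in an overflow-safe form
    f = lambda x: x**4 * np.exp(-x) / (1.0 - np.exp(-x))**2
    val, _ = quad(f, 0.0, theta_D / T)
    return 9.0 * R * (T / theta_D)**3 * val

def entropy(T, T_min=1.0):
    """S(T) = integral of C(T')/T' dT' from ~0 to T, with S(0) = 0 (3rd law).
    The contribution below 1 K is negligible since C ~ T^3 there."""
    val, _ = quad(lambda t: c_debye(t) / t, T_min, T)
    return val

print(entropy(298.15))  # roughly 25-26 J/(mol K); the tabulated value for Al is 28.3
[/code]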

    I don't think absolute entropy has any practical use. Entropy tables for steam are widely used by engineers, but these do not use absolute entropy. They simply take an arbitrary state as 0 and compare entropy changes in moving to another state.

    Last edited: Aug 2, 2006
  4. Aug 3, 2006 #3


    Homework Helper

    From the Clausius definition of entropy [itex]dS = \delta Q/T[/itex] (where [itex]\delta Q[/itex] is an inexact differential), when we integrate we obviously get an arbitrary constant [itex]S_0[/itex] out of the deal, i.e. the entropy is defined up to that arbitrary constant. To set an absolute scale so that we don't have to deal with differences in entropy all the time, we appeal to the 3rd law and define the entropy of a perfect crystal as being zero at absolute zero. This sets our (arbitrary) absolute entropy scale (at least in the phenomenological sense...).

    We can't directly measure entropy; we have equations that relate it to other properties which we can measure, so we can determine the entropy of the system that way. For instance, for a reversible process at constant temperature, we know [itex]\Delta S = Q/T[/itex]. So, we measure the heat removed from or added to the system, and divide by the temperature of the system to find the change in entropy. I don't think we can get anything but the change in entropy out of this. If we want an absolute number, we have to define some sort of standard system which we're comparing our current system of study to (as I recall...). Despite being able to set an absolute scale for entropy, it's rather impractical to actually use it, I think...
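    For instance (a standard textbook illustration, using the usual value of about 6.01 kJ/mol for the molar enthalpy of fusion of ice): melting one mole of ice reversibly at 273.15 K gives

    [tex]\Delta S = \frac{Q_{rev}}{T} = \frac{6010\ \mathrm{J/mol}}{273.15\ \mathrm{K}} \approx 22\ \mathrm{J/(K\,mol)}.[/tex]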

    As for applications... I can't think of any in which entropy is typically the end quantity we're interested in measuring... Which is unfortunate, because I rather like entropy and would be interested in measuring it. =P At any rate, entropy has certainly been measured; standard entropies (in J/(K mol)) are given in the back of my introductory chemistry text, and the steam tables in the back of my thermal physics text give the specific entropy of water (in J/(K kg)) alongside the enthalpy and other properties. I imagine most of this information is used to calculate other quantities of interest, such as the Helmholtz and Gibbs free energies. I suppose you could also use it to calculate response functions like heat capacities, but because you can't really measure entropy, people tend to use Maxwell relations to express response functions in variables that are easily measured directly.

    Entropy is certainly an important theoretical concept with grand implications, but I'm not so sure that measuring it has much practical use (beyond experimental tests of theory, perhaps). There are still others, namely information theorists like E.T. Jaynes, who assert that entropy is not really a physical property of a system (I'm not entirely convinced of this as of yet), but rather a representation of our ignorance of a system, and using the principle of maximum entropy they can do fancy things like image reconstruction - but I don't know if that requires any use of actual values of the entropy itself. I don't want to say too much about information entropy and its relation to thermodynamic entropy, though, as I am not so clear on the details of the matter.

    I hope I haven't made any grievous errors there. Entropy and thermodynamics is one of those topics that keeps me thinking, so I would hope I've understood at least what I've explained above.
  5. Aug 4, 2006 #4

    The (isothermal) equilibrium of chemical reactions at high (enough) temperatures is determined by the variation of entropy. Indeed

    [itex]\Delta G = \Delta H - T\Delta S[/itex]

    and at high temperatures the entropy term ([itex]T\Delta S[/itex]) will dominate. This explains, for example, why transformations from the solid to the gas phase are favored at higher temperatures.
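    As a rough numerical illustration (using approximate textbook values at 298 K for the decomposition CaCO3(s) -> CaO(s) + CO2(g), and crudely neglecting the temperature dependence of [itex]\Delta H[/itex] and [itex]\Delta S[/itex]):

[code]
# DG = DH - T*DS for CaCO3(s) -> CaO(s) + CO2(g),
# with approximate standard values at 298 K
dH = 178.3e3   # J/mol
dS = 160.6     # J/(mol K)

for T in (298.15, 800.0, 1110.0, 1500.0):
    dG = dH - T * dS
    print(f"T = {T:7.2f} K   dG = {dG/1e3:+7.1f} kJ/mol")

# dG changes sign near T ~ dH/dS ~ 1110 K, roughly where limestone
# calcination becomes spontaneous at 1 bar: the T*dS term wins at high T.
[/code]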

    The variation of entropy can be calculated by accumulating contributions from a reference temperature:

    [itex]\Delta S(T) = \Delta S(T_0) + [\Delta S(T) - \Delta S(T_0)][/itex]

    The last term, [itex]\Delta S(T) - \Delta S(T_0)[/itex], can be calculated from specific heats and latent heats (written out explicitly below).
    In particular, the first term can be taken at the absolute zero temperature, i.e. the reference can be chosen as [itex]T_0 = 0[/itex].
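    Made explicit, for each substance the bracketed term reads (assuming, for illustration, a single phase transition at [itex]T_t[/itex] with latent heat [itex]L[/itex] between [itex]T_0[/itex] and [itex]T[/itex]):

    [tex]S(T) - S(T_0) = \int_{T_0}^{T_t}\frac{C_p}{T'}\,dT' + \frac{L}{T_t} + \int_{T_t}^{T}\frac{C_p}{T'}\,dT'[/tex]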

    From this, clearly, we don't need the entropy S(0) of each reactant. But, clearly also, we need [itex]\Delta S(0)[/itex], the absolute reaction entropy at the absolute zero temperature. From the third law we must assume that [itex]\Delta S(0) = 0[/itex]. Therefore, the question of the absolute entropy for an individual substance may be somewhat theoretical, but it seems practical at least for (absolute) reaction entropies.

    So, I am left with some difficulties:
    how can we be sure that [itex]\Delta S(0) = 0[/itex]?
    and what is the physical basis?
    should we take that as a principle (a 3rd principle), partly based on observations?
    what are the observations?
    and: can absolute entropy variations be measured, and how?

    Thanks already for your discussion,



    I find the Jaynes point-of-view quite natural.
    It is clear that the preparation of a system determines its behaviour.
    Therefore, by extension, entropy should not only be associated with a thermodynamic (equilibrium) state of a system but with any prepared state of a system, including the least prepared one: thermodynamic equilibrium.
    For equilibrium thermodynamics, it is then not surprising that a property of the preparation of a system (what is known about it) becomes a property of the system itself.
    Last edited: Aug 4, 2006
  6. Aug 5, 2006 #5


    Homework Helper

    Your notation is a tad confusing.

    From [itex]G = H - TS[/itex] it certainly follows that [itex]\Delta G = \Delta H - T\Delta S[/itex] at constant temperature, but [itex]\Delta (Variable)[/itex] is the variable's value in some state minus its value in some other state. If [itex]\Delta S[/itex], etc., is what you mean by DS, then I'm somewhat uncertain as to what you mean by treating that as a function of temperature. The only interpretation I can think of that makes sense to me is that

    [itex]\Delta S(T) = S(T) - S(T')[/itex] (and [itex]\Delta S(T_0) = S(T_0) - S(T')[/itex])

    where T' is some reference temperature. If we take this reference temperature to be T' = 0, then [itex]\Delta S(0) = S(0) - S(0) = 0[/itex].

    Again, entropy itself is defined only up to an arbitrary constant: [tex]S(T) = \int_{T'=0}^{T'=T}\frac{\delta Q}{T'} + S(0)[/tex]. We're free to define S(0) so that it is zero to make our lives easier, but another statement of the third law is that the entropy of a perfect crystal is zero at absolute zero, so it's not quite so arbitrary. I'm not sure how this comes about in phenomenological thermodynamics, but I would imagine that in statistical thermodynamics, there's only one microstate consistent with the macroscopic state of a perfect crystal at T = 0, i.e. [itex]\Omega = 1[/itex] and by [itex]S = k_B \ln \Omega[/itex], we have S = 0. Of course, this doesn't hold in general - not all objects have zero entropy at absolute zero (they can retain a residual entropy, for example because of geometrical frustration).
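    The classic example (Pauling's estimate for ice, quoting the usual textbook numbers): hydrogen disorder leaves roughly [itex]\Omega = (3/2)^N[/itex] configurations for N molecules, so the residual molar entropy is

    [tex]S_{res} = k_B \ln (3/2)^{N_A} = R\ln\tfrac{3}{2} \approx 3.4\ \mathrm{J/(K\,mol)},[/tex]

    which is close to the discrepancy actually found between the calorimetric and spectroscopic entropies of water.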

    Changes in entropy (which again are what I assume you're referring to by "variations in entropy") must be determined indirectly, by calculating them from the various relations we have between entropy and other thermodynamic variables. We can't measure entropy directly.

    In terms of Jaynes's view, one of the issues I have with it is that we have functions with entropy as an independent variable, for example U(S,V,N). If S represents, in some sense, our ignorance of the state of the system, then this would seem to suggest that by changing what we know about the system we change the energy, U, of the system. Shouldn't this change in energy, an actual physical property of the system, come from something other than our change in knowledge? Certainly changing our knowledge of the system will change our knowledge of the value of U, but we like to assume that energy is a unique property of a system (within certain uncertainty limits), so if we discard entropy as a physical property in its own right, how do we retain energy as a physical property?
  7. Aug 5, 2006 #6

    Sorry for the lack of clarity.

    The variations I considered were variations occurring during an isothermal chemical reaction: for example [itex]\Delta S = S(\mathrm{products}) - S(\mathrm{reactants})[/itex].

    During the reaction considered, the temperature is constant. Of course, the reaction can be studied at different temperatures. Then the variations of the thermodynamic functions with temperature come into play.

    So my question was then as follows. Apparently the determination of the thermodynamic equilibrium really does depend on the entropy variation at 0 K ( S(products)(0 K) - S(reactants)(0 K) ). The 3rd law amounts to setting this term to zero. So, this is quite an important statement. And the 3rd law seems important for the compilation of data for practical chemistry. Knowing the theoretical idea behind it (atoms and their arrangements), I find it impressive to see such a very theoretical statement playing a role in data of such practical importance: for the whole chemical industry at least. Therefore, I would like to be sure about that. Finally, I would appreciate understanding this historically: I guess that chemical equilibrium had been studied extensively and documented probably before the 3rd law took its place in thermodynamics.

    Concerning Jaynes' point of view. It is clear that the relation U(S,V,N) does not exist in general, but it exists for systems at thermodynamic equilibrium. For example, it is possible to prepare a system in a very improbable way but with a very high energy. Maybe a (filtered) electron beam in an accelerator is such a system.

    Last edited: Aug 5, 2006