
Insights Understanding Entropy and the 2nd Law of Thermodynamics - comments

  1. Apr 30, 2015 #1
  2. jcsd
  3. Apr 30, 2015 #2
    Awesome first entry chestermiller!
  4. Apr 30, 2015 #3
    Thanks Chester.
    Yes. I really did find that clear.

    Which is not to say I understood it...

What is special about the reversible paths? Are all the other paths, the non-reversible ones, the same, or do some integrate to different values ≤ ΔS than others?
    Last edited: Apr 30, 2015
  5. Apr 30, 2015 #4



Kudos chestermiller. That was clear and understandable, and the historical perspective really helped.

I look forward to the day when chestermiller makes it similarly easy to understand why this second law implies that "the entropy of the universe tends to a maximum". And how it relates to the kind of information debated in the Hawking/Susskind "black hole wars."
  6. Apr 30, 2015 #5
    Thanks Jimster.

Reversible paths minimize the dissipation of mechanical energy to thermal energy, and maximize the ability of temperature differences to be converted into mechanical energy. In reversible paths, the pressure exerted by the surroundings at the interface with the system is only slightly higher or lower than the pressure throughout the system, and the temperature at the interface with the surroundings is only slightly higher or lower than the temperature throughout the system. This situation is maintained over the entire path from the initial to the final equilibrium state of the system.

    For irreversible paths, the dissipation of mechanical energy to thermal energy is the result of viscous dissipation. The same thing happens if you compress a combination of a spring and (viscous) damper connected in parallel. If you compress the combination very rapidly from an initial length to a final length, you generate lots of heat in the damper (since the force carried by the damper is proportional to the velocity difference across the damper). On the other hand, if you compress the combination very slowly, the force carried by the damper is much less, and you generate much less heat. The amount of work you need to do in the latter case to bring about the compression is also much less. This is a very close analogy to what happens when you cause a gas in a cylinder to compress.
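The spring/damper analogy can be sketched numerically. For a spring (stiffness k) and damper (coefficient c) in parallel, compressed by a distance dL at constant speed over a time T, the work splits into a recoverable elastic part and a dissipated part. All parameter values below are illustrative, not from the discussion:

```python
# Spring (stiffness k) and viscous damper (coefficient c) in parallel,
# compressed by dL at constant speed v = dL/T over a time T.
# Force at compression x: F = k*x + c*v
# Total work: W = 0.5*k*dL**2 + c*dL**2/T
# The first term is stored elastically; the second is dissipated as heat.

def compression_work(k, c, dL, T):
    """Return (total work done, heat dissipated in the damper) for a
    constant-speed compression by dL over time T."""
    elastic = 0.5 * k * dL**2      # recoverable spring energy
    dissipated = c * dL**2 / T     # heat generated in the damper
    return elastic + dissipated, dissipated

k, c, dL = 100.0, 50.0, 0.1        # assumed illustrative values (SI units)

W_fast, Q_fast = compression_work(k, c, dL, T=0.01)   # rapid compression
W_slow, Q_slow = compression_work(k, c, dL, T=100.0)  # quasi-static

print(Q_fast, Q_slow)  # the rapid path dissipates far more heat
```

As T grows, the dissipated term vanishes and the work approaches the purely elastic (reversible) limit, which mirrors the slow-compression case described above.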
    They integrate to different values < ΔS. The equal sign does not apply to irreversible paths. They are all less.

  7. Apr 30, 2015 #6
Thanks Chet, I hope it's okay if I keep asking you questions. It really is my favorite way to learn, and I can't get enough of the second law, and I'm sure I will learn something - not the least of which will be precision of terms.

In the case of a gas in a cylinder with a piston (aka "the damper"), why does the amount of dissipation vary with the amount of force per unit time? What does the time rate of force have to do with the efficiency of conversion to mechanical energy? Why does the difference at any given time between the system and surroundings dictate the reversibility, as opposed to, say, the amount of energy transferred altogether?
    Last edited: Apr 30, 2015
  8. Apr 30, 2015 #7
    Hi Jimster. You ask great questions.

    Why don't you introduce this in a separate thread, and we'll work through it together? First we'll consider the spring/damper system to get an idea of how a difference between a rapid deformation and a very slow deformation (between the same initial and final states) plays out in terms of mechanical energy dissipated in the damper and work done. The idea is for you to get a feel for how this works.

  9. May 1, 2015 #8
Thanks Chet. Your explanation of the "Clausius Inequality" and your answer on the difference between reversible and non-reversible paths were helpful and lucid, and it means a lot to know they are at least sensible questions.

I don't suppose @techmologist and I could get your help reading an old Gavin Crooks paper from 1999 on the "generalized fluctuation theorem"? We've got a thread going in the cosmology forum. PeterDonis has been helping us (humoring us, more like it). It's under @techmologist's question "why are there heat engines?" It's pretty rambly at this point, so I would be more than happy to restart it, focusing back on Crooks' paper and its handful of equations, and drill in with your guidance.
  10. May 1, 2015 #9
    I'll take a look and see if I can contribute. There are lots of pages and lots of posts, so it may take me a while to come up to speed. No guarantees.

  11. May 2, 2015 #10
    Hello Chestermiller.

    "There have been nearly as many formulations of the second law as there have been discussions of it."

    ~P. W. Bridgman

    Entropy and the Second Law of Thermodynamics is not exactly an intuitive concept. While I think your article is basically a good one, it is obviously somewhat limited in scope, and my only critique is that you did not cover some of the most important aspects of entropy.

    I agree that most people have a very hard time grasping entropy and the second law of thermodynamics. But I am not sure I understand why your article keeps referring to reversible processes and adiabatic idealizations. In natural systems, the entropy production rate of every process is always positive (ΔS > 0) or zero (ΔS = 0). But only idealized adiabatic (perfectly insulated) and isentropic (frictionless, non-viscous, pressure-volume work only) processes actually have an entropy production rate of zero. Heat is produced, but not entropy. In nature, this ideal can only be an approximation, because it requires an infinite amount of time and no dissipation.

    You hardly mention irreversible processes. An irreversible process degrades the performance of a thermodynamic system, and results in entropy production. Thus, irreversible processes have an entropy production rate greater than zero (ΔS > 0), and that is really what the second law is all about (beyond the second law analysis of machines or devices). Every naturally occurring process, whether adiabatic or not, is irreversible (ΔS > 0), since friction and viscosity are always present.

    Here is my favorite example of an irreversible thermodynamic process, the Entropy Rate Balance Equation for Control Volumes:

    dS_cv/dt = Σ_j (Q̇_j/T_j) + Σ_i ṁ_i·s_i − Σ_e ṁ_e·s_e + σ̇_cv,   with σ̇_cv ≥ 0

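As a minimal numeric sketch of how the steady-state form of this balance is applied (one inlet, one exit, heat transfer at boundary temperature Tb), consider adiabatic throttling of air, a classic irreversible process. The gas properties and operating numbers below are assumed for illustration:

```python
# Steady-state entropy rate balance for a control volume with one inlet,
# one exit, and heat transfer at boundary temperature Tb:
#   0 = Qdot/Tb + mdot*(s_in - s_out) + sigma_dot
#   => sigma_dot = mdot*(s_out - s_in) - Qdot/Tb
import math

R = 287.0    # J/(kg*K), specific gas constant of air
cp = 1005.0  # J/(kg*K), assumed constant

def delta_s_ideal_gas(T_in, p_in, T_out, p_out):
    """Specific-entropy change of an ideal gas with constant cp."""
    return cp * math.log(T_out / T_in) - R * math.log(p_out / p_in)

# Adiabatic throttling of air: temperature unchanged, pressure drops, Qdot = 0.
mdot = 0.5   # kg/s, assumed flow rate
sigma_dot = mdot * delta_s_ideal_gas(300.0, 4e5, 300.0, 1e5) - 0.0

print(sigma_dot)  # positive (W/K): the throttling process generates entropy
```

Even with no heat transfer and no temperature change, the pressure drop across the valve makes sigma_dot strictly positive, which is exactly the irreversibility the balance equation captures.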
    And here are are a couple of other important things you did not mention about entropy:

    1) Entropy is a measure of molecular disorder in a system. According to Kelvin, a pure substance at absolute zero temperature is in perfect order, and its entropy is zero. This is the less commonly known Third Law of Thermodynamics.

    2) "A system will select the path or assemblage of paths out of available paths that minimizes the potential or maximizes the entropy at the fastest rate given the constraints." This is known as the Law of Maximum Entropy Production. "The Law of Maximum Entropy Production thus has deep implications for evolutionary theory, culture theory, macroeconomics, human globalization, and more generally the time-dependent development of the Earth as a ecological planetary system as a whole." http://www.lawofmaximumentropyproduction.com/

    And apparently, I just got another trophy since this is my first post!
    Last edited: May 2, 2015
  12. May 2, 2015 #11
No problem, Chet, don't feel obliged. It might be just as well if you were to do something you thought would be most helpful, rather than follow us down a rabbit hole. This is the Crooks paper:


    The Entropy Production Fluctuation Theorem and the Nonequilibrium Work Relation for Free Energy Differences
    Gavin E. Crooks
    (Submitted on 29 Jan 1999 (v1), last revised 29 Jul 1999 (this version, v4))
    There are only a very few known relations in statistical dynamics that are valid for systems driven arbitrarily far-from-equilibrium. One of these is the fluctuation theorem, which places conditions on the entropy production probability distribution of nonequilibrium systems. Another recently discovered far-from-equilibrium expression relates nonequilibrium measurements of the work done on a system to equilibrium free energy differences. In this paper, we derive a generalized version of the fluctuation theorem for stochastic, microscopically reversible dynamics. Invoking this generalized theorem provides a succinct proof of the nonequilibrium work relation.

I'm interested in the fluctuation theorem, just understanding it really; it seems to underpin the second law? What I liked about Crooks' formulation is that I thought I could see more of how "entropy," path selection, and work are related. But I have little confidence that I understand it.
  13. May 2, 2015 #12
    Thanks INFO_MAN. It's nice to be appreciated.

Yes. You are correct. I deliberately limited the scope. Possibly you misconstrued my objective. It was definitely not to write a treatise on entropy and the 2nd law. I was merely trying to give beginning thermodynamics students who are struggling with the basic concepts the minimum understanding they need just to do their homework. As someone relatively new to Physics Forums, you may not be aware of the kinds of questions we get from novices. A typical recurring question is: How come the entropy change is not zero for an irreversible adiabatic process if the change in entropy is equal to the integral of dq/T and dq = 0? Homework problems frequently involve irreversible adiabatic expansion or compression of an ideal gas in a cylinder with a piston. Students are often asked to determine the final equilibrium state of the system and the change in entropy. You can see how, if they were asking questions like the previous one, they would have trouble doing a homework problem like this.
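That recurring homework problem can be worked through in a few lines. For an ideal monatomic gas expanding irreversibly and adiabatically against a constant external pressure (the numbers below are illustrative, not a problem from the thread), the energy balance fixes the final equilibrium state, and ΔS is then evaluated along a reversible path between the same two end states:

```python
# Irreversible adiabatic expansion of an ideal gas against a constant
# external pressure P_ext (illustrative values).
# Energy balance (Q = 0):  n*cv*(T2 - T1) = -P_ext*(V2 - V1),
# with V2 = n*R*T2/P_ext at the final equilibrium state.
# Entropy change via a reversible path between the same end states:
#   dS = n*cv*ln(T2/T1) + n*R*ln(V2/V1)
import math

R = 8.314          # J/(mol*K)
cv = 1.5 * R       # monatomic ideal gas

n, T1, P1 = 1.0, 300.0, 2.0e5   # initial state (mol, K, Pa)
P_ext = 1.0e5                   # constant external (and final) pressure

V1 = n * R * T1 / P1
# Solve the energy balance for T2:
#   n*cv*T2 + n*R*T2 = n*cv*T1 + P_ext*V1
T2 = (n * cv * T1 + P_ext * V1) / (n * cv + n * R)
V2 = n * R * T2 / P_ext

dS = n * cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)
print(T2, dS)  # dS > 0 even though Q = 0: the process is irreversible
```

This is exactly the point the article makes: the actual path is irreversible, but the entropy change is computed along an alternative reversible path connecting the same initial and final equilibrium states.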

My original introduction to the tutorial was somewhat longer than in the present version, and spelled out the objectives more clearly. However, the Physics Forums guidelines set a goal of about 400 words for Insights articles, and the present version of my article is already well over 1000 words. Here is the introductory text that I cut out:

    In this author's judgement, the primary cause of the (students') confusion is the poor manner in which these concepts are taught in textbooks and courses.

    The standard approach is to present the chronological development of the subject in a straight line from beginning to end. Although this is the way that the subject had developed historically, it is not necessarily the best way to teach the subject. It is much more important for the students to gain a solid understanding of the material by whatever means possible than to adhere to a totally accurate account of the chronological sequence. Therefore, in the present document, we have created a somewhat fictionalized account of the historical sequence of events in order to minimize the historical discussion, focus more intently on the scientific findings, and make the concepts clearer and less confusing to students.

Another shortcoming of existing developments is that the physical situations they discuss are not specified precisely enough, and the mathematical relationships likewise lack proper constraints on their applicability and limitations (particularly the so-called Clausius Inequality). There is also a lack of a concise mathematical statement of the second law of thermodynamics expressed in such a way that it can be confidently applied to practical situations and problem solving. In the present development, we have endeavored to overcome these shortcomings.

This is an example of one of those instances I was referring to in which the constraints on the equations are not spelled out clearly enough, and, as a result, confusion can ensue. The situation you are referring to here with the inequality (ΔS > 0) and equality (ΔS = 0) applies to the combination of the system and the surroundings, and not just to a closed system. Without this qualification, the student might get the idea that for a closed system, ΔS ≥ 0 always, which is, of course, not the case.

    Even though reversible processes are an idealization, there is still a need for beginners to understand them. First of all they provide an important limiting case with which irreversible processes can be compared. In geometry, there is no such thing as a perfect circle, a perfect rectangle, a perfect square, etc., but yet we still study them and apply their concepts in our work and lives. Secondly, some of the processes that occur in nature and especially in industry can approach ideal reversible behavior. Finally, and most importantly, reversible processes are the only vehicle we have for determining the change in entropy between two thermodynamic equilibrium states of a system or material.
I'm sorry that impression came through to you because that was not my intention. I feel that it is very important for students to understand the distinction between real irreversible process paths and ideal reversible process paths. Irreversible process paths are what really happens. But reversible process paths are what we need to use to get the change in entropy for a real irreversible process path.
    This equation applies to the more general case of an open system for which mass is entering and exiting, and I was trying to keep things simple by restricting the discussion to closed systems. Also, entropy generation can be learned by the struggling students at a later stage.

    As I said above, I was trying to limit the scope exclusively to what the beginning students needed to understand in order to do their homework.

  14. May 2, 2015 #13
That was great Chet. It helps to know the purpose and scope. Hey, can you explain to a confused student why the change in entropy in a closed system is not always greater than or equal to 0? I think I know (Poincaré recurrence?) but I also think I'm probably wrong.
  15. May 2, 2015 #14
    Suppose you compress a gas isothermally and reversibly in a closed system. To hold the temperature constant, do you have to add heat or remove heat? After you compress the gas to a smaller volume at the same temperature, are the number of quantum states available to it greater or fewer?

    You are aware that, in thermodynamics, there is a difference between a closed system and an isolated system, correct?

    Last edited: May 2, 2015
  16. May 2, 2015 #15
No sir, I was not clear on that precise difference of terms! Now I am. I believe you need to remove heat. Hmm, the quantum states. That one really makes me think, with great confusion, which is not good, since the answer should probably be obvious. In the closed system that has been isothermally compressed (heat removed), I would say the number of states is fewer? But it's basically a guess. I don't know how to decompose that question with any confidence. I think I know something about the parts, but probably have way too many questions and misconceptions tangled up in it. Please do illuminate!

    I say fewer because the volume is less, and so the available "locations" are reduced. But this does not seem very satisfactory, right, or clear.
    Last edited: May 2, 2015
  17. May 2, 2015 #16
    Both your answers are correct. You remove heat from the system in an isothermal reversible compression, so ΔS < 0 (q is negative). The number of states available to the system is fewer, so, by that criterion also, ΔS < 0.

    A closed system is one that cannot exchange mass with its surroundings, but it can exchange heat and mechanical energy (work W). An isolated system is one that can exchange neither mass, heat, nor work.
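The sign conventions in this exchange can be checked with the standard ideal-gas formulas (the values below are illustrative): for an isothermal reversible compression, ΔS = nR ln(V2/V1) and Q = TΔS, both negative when V2 < V1.

```python
# Isothermal reversible compression of an ideal gas from V1 to V2 < V1.
#   dS = n*R*ln(V2/V1)   and   Q = T*dS   (reversible, isothermal)
import math

R = 8.314                 # J/(mol*K)
n, T = 1.0, 300.0         # mol, K (assumed values)
V1, V2 = 0.020, 0.010     # m^3, compressed to half the volume

dS = n * R * math.log(V2 / V1)   # entropy change of the gas
Q = T * dS                        # heat exchanged along the reversible path

print(dS, Q)  # both negative: entropy decreases, heat must be removed
```

This matches both criteria in the post above: heat is removed (q negative), and halving the volume reduces the number of available states, so ΔS < 0 for the gas (the entropy of the surroundings rises by at least as much).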

  18. May 2, 2015 #17
    I find your "temperature at the interface with the surroundings" confusing in that to me it implies an average temperature between the system and the surroundings at that point. Would it be more clear to say "temperature of the surroundings at the interface" or am I missing something?
  19. May 2, 2015 #18
    What you're missing is that, at the interface, the local temperature of the system matches the temperature of the surroundings. There is no discontinuity in temperature (or in force per unit area) at the interface. However, in an irreversible process, the temperature within the system varies with distance from the interface.

    Last edited: May 2, 2015
  20. May 2, 2015 #19
    So you're saying there is a temperature gradient between the "bulk" system and the interface, but no temperature gradient between the "bulk" surroundings and the interface?
  21. May 2, 2015 #20
    Yes. With an irreversible process, there is a temperature difference between the average temperature in the system and the temperature at the interface. However, at the very interface, the local system temperature matches the local surroundings temperature.
    Not necessarily. I've tried to get us focused primarily on the system. I'm assuming that we are not concerning ourselves with the details of what is happening within the surroundings, except at the interface, where we are assuming that either the heat flux or the temperature is specified. (Of course, more complicated boundary conditions can also be imposed, and are included within the framework of our methodology). Thus, the "boundary conditions" for work and heat flow on the system are applied at the interface.

