
Entropy and temperature

  1. Dec 10, 2014 #1
    I am unable to grasp why entropy is inversely proportional to temperature. My book says that "Heat added to a system at a lower temperature causes a higher entropy increase than heat added to the same system at a higher temperature." What is meant by this statement?
     
  3. Dec 10, 2014 #2

    Bystander


    How is entropy defined in your textbook?
     
  4. Dec 10, 2014 #3
    The standard way: the amount of randomness of a system. Then it goes on to say that if we add more heat, the particles will fly around more quickly and their randomness will increase, so S is proportional to Q. And then it says, "Heat added to a system at a lower temperature causes a higher entropy increase than heat added to the same system at a higher temperature," so S is inversely proportional to T.
     
  5. Dec 10, 2014 #4

    Bystander


    Entropy is defined as the integral of dq/T from absolute zero to the temperature of the system; it was defined that way nearly two centuries ago, and it will be that for a long time to come. For exchange of heat between a system and its surroundings at some constant T, the change in entropy is then just Q/T. For a fixed value of Q, the higher the temperature, the lower the entropy change; the lower the value of T, the higher the magnitude of the entropy change.
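    A quick numerical sketch of this point (the 1000 J and the two temperatures are made-up values):

```python
# Same amount of heat exchanged reversibly at two different constant
# temperatures; the entropy change is Q/T in each case.
Q = 1000.0            # heat added, J (illustrative value)

dS_cold = Q / 300.0   # entropy change at 300 K
dS_hot = Q / 600.0    # entropy change at 600 K

print(dS_cold)        # ~3.33 J/K
print(dS_hot)         # ~1.67 J/K: same Q, higher T, smaller entropy change
```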
     
  6. Dec 10, 2014 #5
    I think what the OP is asking is "how does the statistical thermo relationship for entropy (or entropy change) reconcile with the classical thermo relationship ∫dq/T?"

    Chet
     
  7. Dec 10, 2014 #6

    anorlunda

    Staff: Mentor

    Bystander is correct, but I sympathize with the student and the teacher. The Wikipedia disambiguation entry on entropy lists 36 definitions, including 13 within the context of thermodynamics. Of course, when you dig deeper, these definitions are not contradictory. Still, I can't think of any other scientific term as confusing as entropy.
     
  8. Dec 10, 2014 #7

    Bystander


    Had there been mention of "partition function(s)" rather than the word "randomness" I might have pursued that line of discussion. Being something of a "thermosaur," it struck me as more useful to stick to the original definition and attempt to clarify that for the OP.
     
  9. Dec 11, 2014 #8
    I am still confused.
     
  10. Dec 11, 2014 #9
    Maybe this write-up will help:

    FIRST LAW OF THERMODYNAMICS

    Suppose that we have a closed system that at initial time ti is in an initial equilibrium state, with internal energy Ui, and at a later time tf, it is in a new equilibrium state with internal energy Uf. The transition from the initial equilibrium state to the final equilibrium state is brought about by imposing a time-dependent heat flow across the interface between the system and the surroundings, and a time-dependent rate of doing work at the interface between the system and the surroundings. Let [itex]\dot{q}(t)[/itex] represent the rate of heat addition across the interface between the system and the surroundings at time t, and let [itex]\dot{w}(t)[/itex] represent the rate at which the system does work on the surroundings at the interface at time t. According to the first law (basically conservation of energy),
    [tex]\Delta U=U_f-U_i=\int_{t_i}^{t_f}{(\dot{q}(t)-\dot{w}(t))dt}=Q-W[/tex]
    where Q is the total amount of heat added and W is the total amount of work done by the system on the surroundings at the interface.

    The time variation of [itex]\dot{q}(t)[/itex] and [itex]\dot{w}(t)[/itex] between the initial and final states uniquely characterizes the so-called process path. There are an infinite number of possible process paths that can take the system from the initial to the final equilibrium state. The only constraint is that Q-W must be the same for all of them.

    If a process path is irreversible, then the temperature and pressure within the system are inhomogeneous (i.e., non-uniform, varying with spatial position), and one cannot define a unique pressure or temperature for the system (except in the initial and final equilibrium states). However, the pressure and temperature at the interface can be measured and controlled, using the surroundings to impose the temperature and pressure boundary conditions that we desire. Thus, TI(t) and PI(t) can be used to impose the process path that we desire. Alternatively, and even more fundamentally, we can directly control, by well-established methods, the rate of heat flow and the rate of doing work at the interface ([itex]\dot{q}(t)[/itex] and [itex]\dot{w}(t)[/itex]).

    Both for reversible and irreversible process paths, the rate at which the system does work on the surroundings is given by:
    [tex]\dot{w}(t)=P_I(t)\dot{V}(t)[/tex]
    where [itex]\dot{V}(t)[/itex] is the rate of change of system volume at time t. However, if the process path is reversible, the pressure P within the system is uniform, and

    [itex]P_I(t)=P(t)[/itex] (reversible process path)

    Therefore, [itex]\dot{w}(t)=P(t)\dot{V}(t)[/itex] (reversible process path)

    Another feature of reversible process paths is that they are carried out very slowly, so that [itex]\dot{q}(t)[/itex] and [itex]\dot{w}(t)[/itex] are both very close to zero over the entire process path. However, the amount of time between the initial equilibrium state and the final equilibrium state (tf-ti) becomes exceedingly large. In this way, Q-W remains constant and finite.
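    The bookkeeping above can be sketched numerically; the rate functions below are invented just to have something to integrate, and a midpoint rule stands in for the time integrals:

```python
# Integrate assumed heat and work rates over a process path and
# verify the first law: Delta U = Q - W.
import math

def q_dot(t):           # heat addition rate, W (arbitrary example function)
    return 5.0 * math.sin(t)

def w_dot(t):           # rate of doing work on the surroundings, W
    return 2.0 * t

def integrate(f, a, b, n=10000):   # simple midpoint rule
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

t_i, t_f = 0.0, 2.0
Q = integrate(q_dot, t_i, t_f)     # total heat added over the path
W = integrate(w_dot, t_i, t_f)     # total work done by the system
delta_U = Q - W                    # first law
print(Q, W, delta_U)
```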

    SECOND LAW OF THERMODYNAMICS

    In the previous section, we focused on the infinite number of process paths that are capable of taking a closed thermodynamic system from an initial equilibrium state to a final equilibrium state. Each of these process paths is uniquely determined by specifying the heat transfer rate [itex]\dot{q}(t)[/itex] and the rate of doing work [itex]\dot{w}(t)[/itex] as functions of time at the interface between the system and the surroundings. We noted that the cumulative amount of heat transfer and the cumulative amount of work done over an entire process path are given by the two integrals:
    [tex]Q=\int_{t_i}^{t_f}{\dot{q}(t)dt}[/tex]
    [tex]W=\int_{t_i}^{t_f}{\dot{w}(t)dt}[/tex]
    In the present section, we will be introducing a third integral of this type (involving the heat transfer rate [itex]\dot{q}(t)[/itex]) to provide a basis for establishing a precise mathematical statement of the Second Law of Thermodynamics.

    The discovery of the Second Law came about in the 19th century, and involved contributions by many brilliant scientists. There have been many statements of the Second Law over the years, couched in complicated language and multi-word sentences, typically involving heat reservoirs, Carnot engines, and the like. These statements have been a source of unending confusion for students of thermodynamics for over a hundred years. What has been sorely needed is a precise mathematical definition of the Second Law that avoids all the complicated rhetoric. The sad part about all this is that such a precise definition has existed all along. The definition was formulated by Clausius back in the 1800's.

    Clausius wondered what would happen if he evaluated the following integral over each of the possible process paths between the initial and final equilibrium states of a closed system:
    [tex]I=\int_{t_i}^{t_f}{\frac{\dot{q}(t)}{T_I(t)}dt}[/tex]
    where TI(t) is the temperature at the interface with the surroundings at time t. He carried out extensive calculations on many systems undergoing a variety of both reversible and irreversible paths and discovered something astonishing. He found that, for any closed system, the values calculated for the integral over all the possible reversible and irreversible paths (between the initial and final equilibrium states) were not arbitrary; instead, there was a unique upper bound (maximum) to the value of the integral. Clausius also found that this result was consistent with all the "word definitions" of the Second Law.

    Clearly, if there was an upper bound for this integral, this upper bound had to depend only on the two equilibrium states, and not on the path between them. It must therefore be regarded as a point function of state. Clausius named this point function Entropy.

    But how could the value of this point function be determined without evaluating the integral over every possible process path between the initial and final equilibrium states to find the maximum? Clausius made another discovery. He determined that, out of the infinite number of possible process paths, there existed a well-defined subset, each member of which gave the same maximum value for the integral. This subset consisted of what we call today the reversible process paths. So, to determine the change in entropy between two equilibrium states, one must first conceive of a reversible path between the states and then evaluate the integral. Any other process path will give a value for the integral lower than the entropy change.

    So, mathematically, we can now state the Second Law as follows:

    [tex]I=\int_{t_i}^{t_f}{\frac{\dot{q}(t)}{T_I(t)}dt}\leq\Delta S=\int_{t_i}^{t_f} {\frac{\dot{q}_{rev}(t)}{T(t)}dt}[/tex]
    where [itex]\dot{q}_{rev}(t)[/itex] is the heat transfer rate for any of the reversible paths between the initial and final equilibrium states, and T(t) is the system temperature at time t (which, for a reversible path, is equal to the temperature at the interface with the surroundings). This constitutes a precise mathematical statement of the Second Law of Thermodynamics.
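    As a numerical sketch of the inequality (assumed numbers: 1 kg of water with a roughly constant heat capacity, heated from 300 K to 350 K):

```python
# Irreversible path: contact with a single reservoir at 350 K, so the
# interface temperature is T_I = 350 K throughout.
# Reversible path: a sequence of reservoirs tracking the water temperature,
# giving the integral of C dT / T.
import math

C = 4186.0                 # heat capacity of 1 kg of water, J/K (approx.)
T1, T2 = 300.0, 350.0

Q = C * (T2 - T1)          # same total heat for both paths
I_irrev = Q / T2           # integral of q_dot/T_I with T_I fixed at T2
dS = C * math.log(T2 / T1) # reversible integral, the entropy change

print(I_irrev, dS)         # I_irrev < dS, as the inequality requires
```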
     
  11. Dec 11, 2014 #10
    Thanks for the detailed post. But was entropy defined this way just because someone found something interesting while evaluating an integral? I want to know intuitively why it is inversely proportional to the temperature at which the heat is added.
     
  12. Dec 11, 2014 #11

    anorlunda

    Staff: Mentor

    Yashbhatt, I believe that the source of your confusion is that you envision increasing the temperature while holding everything else constant. That would make the entropy change decrease. But putting heat in increases both Q and T (making dQ positive), and it is the integral of the ratio dQ/T that gives the entropy change.
     
  13. Dec 11, 2014 #12
    I don't know how to answer this except to say that the Second Law evolved as a consequence of a huge body of empirical observations. The thermodynamic definition of entropy followed directly from these observations (so that they could be captured in a concise mathematical form) and from the recognition that a point thermodynamic function of this form must exist.

    Also note that, if dQ/T is being used to evaluate entropy change between two thermodynamic states, the path between these two states must be reversible. Not just any path will do.

    Chet
     
  14. Dec 11, 2014 #13
    But what if it is defined normally, without the integral? That's the way it is defined in my book.

    My teacher told me something about phase change. She said that if heat is added to water at a higher temperature, the water will vaporize, and this causes a smaller entropy increase than if the same heat were added in such a way that the state remained the same.
    Is that relevant?
     
  15. Dec 11, 2014 #14
    Although it is typical to come across the thermodynamics (classical) definition of entropy before the statistical mechanics (quantum) definition of entropy, the statistical mechanics definition is clearer and helps to understand where the thermodynamics definition comes from. In statistical mechanics, the entropy is
    ##S = k_B \log n##, where ##n## is the number of microscopic quantum states available to a system with known bulk properties (such as total energy, volume), and ##k_B## is just a constant used to match the definition to the classical thermodynamics definition.

    Basically, with a macroscopic sized system, (say a gas in a cylinder), you typically have the total energy relatively constant, but you don't know the energy of each particle since they are constantly colliding with each other and trading energy. The entropy tells you how many configurations of particle energies there are for a given total energy (and volume, etc...).

    You also need the definition of temperature, which is less obvious than you think. In thermodynamics, the definition is
    ##\frac{1}{T} = \left(\frac{\partial{S}}{\partial{U}}\right)_{V,N}##
    Using this definition, the answer to your original question is quite obvious. You might have come across a definition of temperature as an average energy per degree of freedom, such that ##\left<E\right>= k_B T/2##. This can be derived from above using kinetic theory, but this definition is not as fundamental and requires some idealized assumptions.
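    A small sketch of these two definitions together, for a toy system of N two-level spins (the level spacing and all numbers are illustrative): the entropy is ##k_B \log \Omega##, and the temperature comes from a finite-difference estimate of ##\partial S/\partial U##.

```python
# N independent spins, each with energy 0 or eps. Omega(n) = C(N, n)
# microstates for n excited spins; S = kB ln Omega, and 1/T = dS/dU is
# estimated by a finite difference (dU = eps per extra excitation).
from math import comb, log

kB = 1.380649e-23     # Boltzmann constant, J/K
eps = 1.0e-21         # assumed level spacing, J
N = 1000

def S(n):             # entropy with n excited spins
    return kB * log(comb(N, n))

def T(n):             # temperature from 1/T = dS/dU
    dS_dU = (S(n + 1) - S(n)) / eps
    return 1.0 / dS_dU

print(T(100))   # fewer excitations (less energy) -> lower temperature
print(T(400))   # more excitations (more energy) -> higher temperature
```

The point of the sketch: adding the same energy eps at low energy (low T) opens up proportionally many more microstates, i.e., a larger entropy increase, than adding it at high energy (high T).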
     
  16. Jan 27, 2015 #15
    Now, I have got some vague idea about this after reading this answer on Quora (http://www.quora.com/Why-does-entropy-depend-on-the-temperature-at-which-heat-is-added-to-the-system):

    Suppose entropy were defined in terms of heat alone; then heat would flow whenever there was a difference in heat, not a difference in temperature. But that is not the case, because substances have different heat capacities, so what drives heat flow is a temperature difference. So we need a term that involves T.

    Moreover, the heat capacity (or specific heat) of a substance increases with temperature, so the temperature rise when heat is added at a higher temperature is smaller than when it is added at a lower temperature.

    Is this correct?
     
    Last edited by a moderator: May 7, 2017
  17. Jan 27, 2015 #16
    Personally, it doesn't make sense to me.

    Chet
     