
Entropy of an isolated system

  1. Apr 19, 2015 #1
    1. The problem statement, all variables and given/known data
    There is a line in a question:
    "Entropy of an isolated system is always maximised at equilibrium."
    And it is given true.
    But Why?

    2. Relevant equations

    3. The attempt at a solution
    In an isolated system the heat input is zero, and we know that entropy change = heat input/temperature.
    Therefore the entropy change should be zero. But here it is given that entropy is maximum at equilibrium, which means it changes during the reaction. According to me it should stay constant throughout. So where am I wrong?
  3. Apr 19, 2015 #2



    The modern definition of the entropy [itex]S[/itex] is the statistical definition of entropy;
    [itex]S\equiv k_B \log(W)[/itex],
    where [itex]W[/itex] is the number of "ways" that the elementary parts of the system (its atoms and energy quanta and such) can be arranged so that the total system has the same macroscopic properties (i.e., total energy, total volume, total particle number, etc), and [itex]k_B[/itex] is Boltzmann's constant to convert from units of energy to temperature.

    The reason that a system in thermal equilibrium has maximum entropy comes from the very nature of thermal equilibrium:
    Thermal equilibrium is reached when the system has a statistically stable distribution of energy that doesn't change with time.

    There are many ways to distribute energy among the atoms of a system (some atoms could have more energy than others, even by chance).
    Over time, all possible distributions of energy are equally likely to occur in a closed system (this is called the ergodic hypothesis).

    However, there are vastly more ways of distributing energy in a closed system nearly uniformly than in any other way.
    Because the entropy is a measure of the number of ways of distributing energy (as well as the atoms and photons carrying it), the equilibrium distribution is the one with maximum entropy.
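The counting argument above can be sketched numerically with a toy model (my own, not from this thread): two identical Einstein solids, each with a made-up number N of oscillators, sharing a made-up number q of energy quanta. The near-uniform split of energy has by far the most microstates, and hence the largest [itex]S=k_B\log(W)[/itex]:

```python
from math import comb, log

# Two identical subsystems, each with N oscillators, sharing q energy quanta
# (a toy Einstein-solid model; N and q are hypothetical illustration values).
N, q = 50, 100

def multiplicity(q1):
    """Microstate count when subsystem A holds q1 quanta and B holds q - q1."""
    return comb(q1 + N - 1, N - 1) * comb(q - q1 + N - 1, N - 1)

omegas = [multiplicity(q1) for q1 in range(q + 1)]
best = max(range(q + 1), key=lambda q1: omegas[q1])
print("most probable split:", best, "of", q)  # the near-uniform split wins

# Entropy (in units of k_B) is the log of the multiplicity:
S_equal  = log(multiplicity(q // 2))   # 50/50 split
S_skewed = log(multiplicity(q // 5))   # 20/80 split
print("S(uniform) > S(skewed):", S_equal > S_skewed)  # True
```

Because the multiplicity is so sharply peaked at the even split, states that look "spread out" overwhelmingly dominate, which is exactly why the equilibrium distribution is the maximum-entropy one.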

    If an isolated system undergoes a reaction, then it was not in equilibrium to start with.

    The system reaches equilibrium only when the reaction reaches equilibrium (when reactants are being converted into products just as fast as vice versa).
  4. Apr 19, 2015 #3
    I think what your statement means is:
    There is the maximum number of possible distributions at equilibrium, therefore the system has the maximum entropy at equilibrium. But should we disregard the equation [itex]\Delta S=q_{rev}/T[/itex] here completely? Is it wrong here?
  5. Apr 19, 2015 #4
    I got a similar statement which says:
    [itex]\Delta S_{system}[/itex] for an irreversible adiabatic compression is greater than zero.
    Here the same problem stands in my way.
  6. Apr 20, 2015 #5
    The integral of dq/T is equal to the entropy change only if the process is reversible. If the process in your isolated system is not reversible, you need to dream up a process that takes your system between the same initial and final thermodynamic equilibrium states, and is reversible. You then evaluate the integral of dq/T for that path to get the entropy change. That path may be quite different from the irreversible path. The same goes for your question about an irreversible adiabatic compression. Here are some notes that I wrote up about the first and second laws of thermodynamics that may help:
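As a concrete sketch of that recipe (hypothetical numbers, not from the thread): for a free expansion of an ideal gas in an isolated container, the actual irreversible path gives ∫dq/T = 0, but the entropy change must be evaluated along a reversible isothermal path between the same end states:

```python
from math import log

R = 8.314  # J/(mol K), gas constant

# Irreversible free expansion of n moles of ideal gas from V1 to V2 = 2*V1
# in an isolated container: q = 0 and w = 0, so T is unchanged.
n, V1, V2 = 1.0, 0.01, 0.02  # mol, m^3 (made-up illustration values)

# Along the ACTUAL (irreversible) path, the integral of dq/T is zero:
integral_actual = 0.0

# To get the entropy change, use a REVERSIBLE isothermal expansion between
# the same two states: dq_rev = P dV = nRT/V dV, so ∫dq_rev/T = nR ln(V2/V1).
delta_S = n * R * log(V2 / V1)

print(f"dS = {delta_S:.2f} J/K")  # about 5.76 J/K, strictly positive
print("Clausius inequality holds:", integral_actual <= delta_S)  # True
```

Note that the reversible comparison path is not adiabatic even though the actual path is; that distinction comes up again below for the adiabatic compression question.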


    Suppose that we have a closed system that at initial time ti is in an initial equilibrium state, with internal energy Ui, and at a later time tf, it is in a new equilibrium state with internal energy Uf. The transition from the initial equilibrium state to the final equilibrium state is brought about by imposing a time-dependent heat flow across the interface between the system and the surroundings, and a time-dependent rate of doing work at the interface between the system and the surroundings. Let [itex]\dot{q}(t)[/itex] represent the rate of heat addition across the interface between the system and the surroundings at time t, and let [itex]\dot{w}(t)[/itex] represent the rate at which the system does work on the surroundings at the interface at time t. According to the first law (basically conservation of energy),
    [tex]\Delta U=U_f-U_i=\int_{t_i}^{t_f}{(\dot{q}(t)-\dot{w}(t))dt}=Q-W[/tex]
    where Q is the total amount of heat added and W is the total amount of work done by the system on the surroundings at the interface.

    The time variation of [itex]\dot{q}(t)[/itex] and [itex]\dot{w}(t)[/itex] between the initial and final states uniquely characterizes the so-called process path. There are an infinite number of possible process paths that can take the system from the initial to the final equilibrium state. The only constraint is that Q-W must be the same for all of them.
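A quick numerical illustration of that constraint (the Q and W values are made up): two different process paths between the same pair of equilibrium states must give the same Q − W, even though Q and W individually differ from path to path:

```python
# Two hypothetical process paths between the same equilibrium states.
# First law: delta_U = Q - W must agree, even though Q and W differ by path.
delta_U = 300.0  # J, fixed by the end states alone

path_a = {"Q": 500.0, "W": 200.0}   # more heat in, more work done by system
path_b = {"Q": 100.0, "W": -200.0}  # W < 0 means work is done ON the system

for name, p in (("A", path_a), ("B", path_b)):
    print(name, "Q - W =", p["Q"] - p["W"])  # both print 300.0
```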

    If a process path is irreversible, then the temperature and pressure within the system are inhomogeneous (i.e., non-uniform, varying with spatial position), and one cannot define a unique pressure or temperature for the system (except at the initial and the final equilibrium state). However, the pressure and temperature at the interface can be measured and controlled using the surroundings to impose the temperature and pressure boundary conditions that we desire. Thus, [itex]T_I(t)[/itex] and [itex]P_I(t)[/itex] can be used to impose the process path that we desire. Alternatively, and even more fundamentally, we can directly control, by well-established methods, the rate of heat flow and the rate of doing work at the interface, [itex]\dot{q}(t)[/itex] and [itex]\dot{w}(t)[/itex].

    Both for reversible and irreversible process paths, the rate at which the system does work on the surroundings is given by:

    [tex]\dot{w}(t)=P_I(t)\dot{V}(t)[/tex]

    where, again, [itex]P_I(t)[/itex] is the pressure at the interface with the surroundings, and where [itex]\dot{V}(t)[/itex] is the rate of change of system volume at time t. However, if the process path is reversible, the pressure P within the system is uniform, and

    [itex]P_I(t)=P(t)[/itex] (reversible process path)

    Therefore, [itex]\dot{w}(t)=P(t)\dot{V}(t)[/itex] (reversible process path)

    Another feature of reversible process paths is that they are carried out very slowly, so that [itex]\dot{q}(t)[/itex] and [itex]\dot{w}(t)[/itex] are both very close to zero over the entire process path. However, the amount of time between the initial equilibrium state and the final equilibrium state (tf-ti) becomes exceedingly large. In this way, Q-W remains constant and finite.


    In the previous section, we focused on the infinite number of process paths that are capable of taking a closed thermodynamic system from an initial equilibrium state to a final equilibrium state. Each of these process paths is uniquely determined by specifying the heat transfer rate [itex]\dot{q}(t)[/itex] and the rate of doing work [itex]\dot{w}(t)[/itex] as functions of time at the interface between the system and the surroundings. We noted that the cumulative amount of heat transfer and the cumulative amount of work done over an entire process path are given by the two integrals:

    [tex]Q=\int_{t_i}^{t_f}{\dot{q}(t)dt}\qquad W=\int_{t_i}^{t_f}{\dot{w}(t)dt}[/tex]
    In the present section, we will be introducing a third integral of this type (involving the heat transfer rate [itex]\dot{q}(t)[/itex]) to provide a basis for establishing a precise mathematical statement of the Second Law of Thermodynamics.

    The discovery of the Second Law came about in the 19th century, and involved contributions by many brilliant scientists. There have been many statements of the Second Law over the years, couched in complicated language and multi-word sentences, typically involving heat reservoirs, Carnot engines, and the like. These statements have been a source of unending confusion for students of thermodynamics for over a hundred years. What has been sorely needed is a precise mathematical definition of the Second Law that avoids all the complicated rhetoric. The sad part about all this is that such a precise definition has existed all along. The definition was formulated by Clausius back in the 1800's.

    Clausius wondered what would happen if he evaluated the following integral over each of the possible process paths between the initial and final equilibrium states of a closed system:

    [tex]I=\int_{t_i}^{t_f}{\frac{\dot{q}(t)}{T_I(t)}dt}[/tex]

    where [itex]T_I(t)[/itex] is the temperature at the interface with the surroundings at time t. He carried out extensive calculations on many systems undergoing a variety of both reversible and irreversible paths and discovered something astonishing. He found that, for any closed system, the values calculated for the integral over all the possible reversible and irreversible paths (between the initial and final equilibrium states) were not arbitrary; instead, there was a unique upper bound (maximum) to the value of the integral. Clausius also found that this result was consistent with all the "word definitions" of the Second Law.

    Clearly, if there was an upper bound for this integral, this upper bound had to depend only on the two equilibrium states, and not on the path between them. It must therefore be regarded as a point function of state. Clausius named this point function Entropy.

    But how could the value of this point function be determined without evaluating the integral over every possible process path between the initial and final equilibrium states to find the maximum? Clausius made another discovery. He determined that, out of the infinite number of possible process paths, there existed a well-defined subset, each member of which gave the same maximum value for the integral. This subset consisted of what we call today the reversible process paths. So, to determine the change in entropy between two equilibrium states, one must first dream up a reversible path between the states and then evaluate the integral. Any other process path will give a value for the integral lower than the entropy change.

    So, mathematically, we can now state the Second Law as follows:

    [tex]I=\int_{t_i}^{t_f}{\frac{\dot{q}(t)}{T_I(t)}dt}\leq\Delta S=\int_{t_i}^{t_f} {\frac{\dot{q}_{rev}(t)}{T(t)}dt}[/tex]
    where [itex]\dot{q}_{rev}(t)[/itex] is the heat transfer rate for any of the reversible paths between the initial and final equilibrium states, and T(t) is the system temperature at time t (which, for a reversible path, is equal to the temperature at the interface with the surroundings). This constitutes a precise mathematical statement of the Second Law of Thermodynamics.
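As a sanity check of this inequality on a concrete irreversible process (hypothetical numbers): a hot copper block equilibrating with a cold reservoir. The Clausius integral along the actual path uses the interface temperature (the reservoir's), while ΔS comes from a reversible path through a sequence of intermediate-temperature reservoirs:

```python
from math import log

# Hot copper block (the system) dropped into a cold reservoir; made-up values.
m, c = 1.0, 385.0              # kg, J/(kg K) for copper
T_hot, T_cold = 400.0, 300.0   # K

# Heat leaves the system across an interface held at the reservoir temperature,
# so the Clausius integral uses T_I = T_cold throughout the actual path:
Q = m * c * (T_cold - T_hot)   # < 0: heat flows out of the block
integral = Q / T_cold          # value of I for this irreversible path

# delta_S from a reversible path: cool the block through a continuous sequence
# of reservoirs, so dq_rev = m c dT at system temperature T:
delta_S = m * c * log(T_cold / T_hot)  # integral of m c dT / T

print(f"I = {integral:.2f} J/K, dS = {delta_S:.2f} J/K")
print("Second law satisfied (I <= dS):", integral <= delta_S)  # True
```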

  7. Apr 20, 2015 #6
    So what I understood from your notes is that if a process is irreversible, we may dream up a path which is reversible but has the same starting and ending points as that irreversible path, to find the entropy change of that irreversible path. Am I correct here?
  8. Apr 20, 2015 #7
    Yes. Spot on.

  9. Apr 20, 2015 #8
    But still I can't relate this discussion to the problem of irreversible adiabatic compression (i.e., why is it spontaneous?). I supposed the process could be made reversible by kicking "irreversible" out, so the new process with the same end points would be a reversible adiabatic process. But then I still don't know what to do with the dq/T integral, since the process is still adiabatic and has q = 0. So what do I do now?
  10. Apr 20, 2015 #9
    You can't get between the same two end points by a reversible adiabatic process if the end points correspond to an irreversible adiabatic process. Any reversible process between the same two end points cannot be completely adiabatic.
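To make this concrete (a worked sketch with made-up numbers): compress a monatomic ideal gas irreversibly and adiabatically by suddenly loading the piston to a constant external pressure, find the final state from the first law, and then evaluate ΔS along a reversible (necessarily non-adiabatic) path between the same end states:

```python
from math import log

# Irreversible adiabatic compression of 1 mol of a monatomic ideal gas by a
# sudden constant external pressure (all numbers hypothetical).
R = 8.314
n, Cv = 1.0, 1.5 * R           # monatomic: Cv = (3/2) R
T1, P1 = 300.0, 1.0e5          # initial equilibrium state
P_ext = 2.0e5                  # piston suddenly loaded to twice P1

V1 = n * R * T1 / P1
# Adiabatic (q = 0), so dU = -W = -P_ext (V2 - V1), with V2 = n R T2 / P_ext.
# Solving n Cv (T2 - T1) = -P_ext (n R T2 / P_ext - V1) for T2:
T2 = (n * Cv * T1 + P_ext * V1) / (n * Cv + n * R)
V2 = n * R * T2 / P_ext

# Entropy change via a reversible path between the same end states (this
# path exchanges heat, so there is no q = 0 problem):
delta_S = n * Cv * log(T2 / T1) + n * R * log(V2 / V1)
print(f"T2 = {T2:.1f} K, dS = {delta_S:.3f} J/K")  # dS > 0
```

The irreversible work heats the gas more than a reversible adiabat would, so the temperature term outweighs the (negative) volume term and ΔS comes out positive, consistent with the statement in post #4.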

    If you are interested, here is a thread that worryingchem and I have been working on that you may want to follow:


    Start with post #15. We just finished working on the problem of irreversible expansion for the isothermal case, and we have just begun to work on the analysis of an irreversible adiabatic expansion.
