
Change in entropy

  1. Dec 25, 2012 #1
    "change" in entropy

    While reading a textbook on introductory thermodynamics , I came across the following-

    "When a system is in equilibrium, the entropy is maximum and the change in entropy ΔS is zero "
    And also

    "We can say that for a spontaneous process, entropy increases till it reaches a maximum, at equilibrium where the change in entropy is zero "

    (here entropy refers to total entropy, ie system plus surroundings)

    I fail to understand how one can define "change" for an instant.
    Like "change of entropy is zero at equilibrium". To define change we need to compare two different states. In this case, equilibrium is one of the states. What is the other state to which it is being compared to? Is it the initial state of the system?

    How do we calculate this "change in entropy", at various instants of the process? Can we write it as a function of time?

    (I have the same problem with free energy, they always say "change in free energy is zero at equilibrium")
     
  3. Dec 25, 2012 #2
    Re: "change" in entropy

    It's not just 'change for an instant'.....try reading the first two sections here:

    http://en.wikipedia.org/wiki/Second_law_of_thermodynamics

    and see if that clarifies it for you. See the first formula.

    but be prepared, 'entropy' is very tricky!

    Here is one idea I did not uncover for quite a while:

    The relationship between entropy and information is subtle and complex.

    Suppose I give you a box of gas and ask you what you think the distribution of the gas is. A logical guess is equally dispersed, right? That would not be a surprising answer.....the gas diffuses and reaches an equilibrium [maximum entropy] unless disturbed.

    Now let's put it in a really strong gravitational field and give the field time to reach equilibrium: now the "most likely" state would be "clumpy", maybe like the universe....again entropy is maximum....


    [I don't think anybody really understands entropy yet....like gravity, maybe....That's why John von Neumann suggested to Claude Shannon when Shannon was developing information theory at Bell Labs he use entropy instead of "uncertainty" in explanations....opponents would be intimidated because nobody knows what 'entropy' is!!!!! ]
     
    Last edited: Dec 25, 2012
  4. Dec 25, 2012 #3
    Re: "change" in entropy

    They meant the "rate of change of entropy", not the "change in entropy". At thermal equilibrium, the first derivative of entropy with respect to time is zero.

    In other words,
    0=dS/dt,
    where S is the entropy of the system at equilibrium and t is the time.

    This would be a "total" derivative and not a partial derivative. The physical quantity called entropy is not changing. That is why I chose "d" instead of "∂" in the expression.

    Sometimes, writers use the word "change" when they really mean "rate of change".
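
    The relaxation toward dS/dt = 0 can be seen in a minimal numerical sketch (all numbers here are made up for illustration): two identical blocks exchange heat at a rate proportional to their temperature difference, the total entropy S(t) rises throughout, and the finite-difference estimate of dS/dt falls to zero as the blocks reach a common temperature.

    ```python
    import math

    # Two identical blocks (heat capacity C) exchange heat at rate k*(T1 - T2).
    # Each block's entropy is S = C*ln(T) up to an additive constant.
    C, k, dt = 1.0, 0.5, 0.01
    T1, T2 = 400.0, 200.0    # initial temperatures, K

    def total_entropy(T1, T2):
        return C * (math.log(T1) + math.log(T2))

    entropies = []
    for step in range(2000):
        q = k * (T1 - T2) * dt   # heat flowing from the hot block to the cold one
        T1 -= q / C
        T2 += q / C
        entropies.append(total_entropy(T1, T2))

    # dS/dt estimated by finite differences: large early on, ~0 at equilibrium
    rate_early = (entropies[10] - entropies[9]) / dt
    rate_late = (entropies[-1] - entropies[-2]) / dt
    print(rate_early, rate_late)
    ```

    The total entropy never decreases along the way; only its *rate* of change goes to zero, which is the sense in which "ΔS is zero at equilibrium" should be read.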
     
  5. Dec 25, 2012 #4
    Re: "change" in entropy

    Change for an instant? What do you think calculus is based on? :tongue:

    As for calculation, the fundamental thermodynamic relation is [itex]dU = TdS - PdV[/itex], where dU, dS, and dV are infinitesimal changes in internal energy, entropy, and volume (respectively).
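
    The relation dU = TdS - PdV can be checked numerically for a case where the entropy is known in closed form. This sketch uses 1 mol of a monatomic ideal gas (U = (3/2)nRT, PV = nRT, S = nR(3/2 ln T + ln V) + const); the specific state values are made up, and the first-order estimate dS = (dU + P dV)/T should agree with the exact ΔS for a small step.

    ```python
    import math

    n, R = 1.0, 8.314
    T1, V1 = 300.0, 0.0240    # initial state (K, m^3)
    T2, V2 = 300.3, 0.0241    # nearby state: a small step in T and V

    # Exact entropy change for a monatomic ideal gas
    exact_dS = n * R * (1.5 * math.log(T2 / T1) + math.log(V2 / V1))

    # First-order estimate from the fundamental relation dS = (dU + P dV)/T
    dU = 1.5 * n * R * (T2 - T1)
    P = n * R * T1 / V1
    approx_dS = (dU + P * (V2 - V1)) / T1

    print(exact_dS, approx_dS)   # nearly equal for a small step
    ```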
     
  6. Dec 25, 2012 #5

    HallsofIvy

    Staff Emeritus
    Science Advisor

    Re: "change" in entropy

    I was going to say that!

     
  7. Dec 26, 2012 #6
    Re: "change" in entropy

    Again, the book means "rate of change of free energy is zero at equilibrium." Or rather, "the first derivative of the free energy with respect to time is zero at equilibrium":
    dG/dt=0
    where G is the free energy and t is the time. The letter "d" stands for "differential."

    The "derivative" is a concept from calculus. It can be expressed as:
    dG/dt = lim [dt→0] {G(t+dt)-G(t)}/dt,
    where lim[] is the limit with respect to whatever is in the [], t is a specific time, and dt is an increment of time.

    One sees that there is a "comparison" implied by the derivative operator. Saying "the rate of change of free energy is zero" means that for every ε>0 there is a δ>0 such that whenever |dt|<δ:
    |G(t+dt)-G(t)| < ε|dt|.
    Any time you evaluate a limit, you are making a comparison between two quantities. So the derivative of free energy with respect to time also implies a comparison between two values of free energy at slightly different times.

    "Equilibrium" means the state of not changing with time. Calculus just formalizes the concept.
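
    The limit quotient above can be evaluated numerically for a toy relaxation. The function G(t) below is invented for illustration (exponential decay toward an equilibrium value G_eq); the point is just that the same difference quotient {G(t+dt)-G(t)}/dt is large while the system is still relaxing and essentially zero once it has equilibrated.

    ```python
    import math

    # Hypothetical free energy relaxing toward equilibrium: G(t) = G_eq + A*e^(-t/tau)
    G_eq, A, tau = -50.0, 10.0, 2.0

    def G(t):
        return G_eq + A * math.exp(-t / tau)

    def dG_dt(t, dt=1e-6):
        # the difference quotient {G(t+dt) - G(t)}/dt with a small dt
        return (G(t + dt) - G(t)) / dt

    print(dG_dt(0.0))    # large and negative: still relaxing
    print(dG_dt(50.0))   # essentially zero: equilibrium
    ```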
     
  8. Dec 26, 2012 #7
    Re: "change" in entropy

    If the time rate of change is zero then surely future change ΔS is also zero?

    There was nothing in the OP quotes from his book that was time related. The introduction of 'instant' was his own.
     
  9. Dec 28, 2012 #8
    Re: "change" in entropy

    That may well be a problem with the book rather than the OP. Some writers forget to say with which independent variable the "change" in the dependent variable is associated.

    Nevertheless, there are many textbooks and other references that state explicitly that the "change" is with respect to time. An equilibrium state is one that doesn't change with time.
     
  10. Dec 29, 2012 #9
    Re: "change" in entropy

    Thanks for the help guys! I guess I understand it better now.

    What the book probably meant by ΔG was that the difference between the free energies of the reactants and products at equilibrium is zero, and as mentioned on the Wikipedia page it can be shown that

    ΔG (between reactants and products) = dG/dξ (where ξ is the extent of reaction)

    Hence, similar to what Darwin123 said, if dG/dt is zero then dG/dξ is also zero, and hence ΔG is also zero.

    So maybe "difference in free energies (of reactants and products) is zero at equilibrium" would be more descriptive than "change in free energy is zero at equilibrium".

    For entropy I guess what the text meant was that dS/dt is zero at equilibrium (like Darwin123 said)
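
    The "ΔG equals the slope dG/dξ, and both vanish at equilibrium" picture can be sketched for the simplest case, A ⇌ B treated as an ideal mixture. The standard free energies muA0 and muB0 below are made-up numbers; for this model G(x) = (1-x)μA° + xμB° + RT[x ln x + (1-x) ln(1-x)], the slope dG/dx = ΔG° + RT ln(x/(1-x)) is nonzero away from equilibrium and zero exactly at the minimum of G.

    ```python
    import math

    R, T = 8.314, 298.0
    muA0, muB0 = 0.0, -2000.0    # standard free energies in J/mol, hypothetical
    dG0 = muB0 - muA0

    def G(x):
        # free energy per mole of mixture; x = fraction converted to B, 0 < x < 1
        return ((1 - x) * muA0 + x * muB0
                + R * T * (x * math.log(x) + (1 - x) * math.log(1 - x)))

    # Analytic equilibrium: dG/dx = dG0 + RT*ln(x/(1-x)) = 0
    x_eq = 1.0 / (1.0 + math.exp(dG0 / (R * T)))

    def slope(x, h=1e-6):
        # central-difference estimate of dG/dx
        return (G(x + h) - G(x - h)) / (2 * h)

    print(slope(0.2), slope(x_eq))   # nonzero away from equilibrium, ~0 at x_eq
    ```

    So "ΔG = 0 at equilibrium" is a statement about the slope of G along the reaction coordinate, not about G itself being zero.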
     