
Entropy definition

  1. Nov 10, 2009 #1
    Hi All,

    I have a question on entropy.

    I have had some exposure to classical thermodynamics, and I remember entropy being defined via the integral of dQ / T ("differential" heat / absolute temperature).

    Hence for any adiabatic transformation the entropy change is zero.

    Then I came across a different definition, involving the logarithm of the number of possible states, with its related ideas of order, disorder, the arrow of time, etc.

    I tried to understand more and so far I am not even sure the two are equivalent.

    Indeed, one can imagine a system where work is done, achieving some improvement in order, still in adiabatic conditions.

    I understand this may be a trivial question, and would welcome any help, reference, or comment.

    Thank you and Best Regards

    Muzialis
     
  3. Nov 10, 2009 #2
    Actually that's not a trivial question.
    Heat is not difficult to define here: it's what's left over, so [itex]\mathrm{d}Q=\mathrm{d}E+p\mathrm{d}V[/itex]. However, it's not obvious how one should define temperature! The definition with reversible Carnot engines shows that, under some assumptions, and if you manipulate a set of systems reversibly, the temperatures and the entropy can indeed be found from [itex]\mathrm{d}S=\mathrm{d}Q/T[/itex]. For this definition you have to make sure you really know what reversible means for whatever system you are dealing with.

    The statistical definition [itex]S=\ln \Omega[/itex] is the one I prefer. Using it, one can show that if you maximize the entropy while keeping a definite energy and volume, you get
    [tex]\mathrm{d}E=x\mathrm{d}S-y\mathrm{d}V[/tex]
    where x and y are just parameters at this point. Physical reasoning identifies y as the pressure, provided it is constant over the whole boundary. Now if you also identify x as the temperature, you get back to the same definition as in the thermodynamic case.
    This derivation requires some statistical mechanics. You can look up the concepts, and I could write the derivation down once you are a little familiar with them.

    Basically you can take the [itex]S=\ln \Omega[/itex] definition of entropy and if you are willing to identify x as the temperature (you don't have another definition of temperature at this point), then you get back to the thermodynamic definition.

    Or you stay with macroscopic quantities only, look for reversible processes, and define temperature and entropy by [itex]T_1/T_2=\mathrm{d}Q_1/\mathrm{d}Q_2[/itex] and [itex]\mathrm{d}S=\mathrm{d}Q/T[/itex] when you observe a heat transfer (under reversible conditions for the set of systems) driven by a Carnot engine which at the same time generates or consumes work.
    With this macroscopic picture you cannot say anything about microscopic states. If your microscopic system turns out to have the required reversible dynamics, then you can again equate these two definitions.
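    To make the ratio definition concrete (the numbers here are made up purely for illustration): if a reversible Carnot engine absorbs [itex]\mathrm{d}Q_1=3\,\mathrm{J}[/itex] from one body while rejecting [itex]\mathrm{d}Q_2=2\,\mathrm{J}[/itex] to another, the definition assigns [itex]T_1/T_2=3/2[/itex]; the entropy [itex]\mathrm{d}Q_1/T_1[/itex] lost by the first body then exactly equals the entropy [itex]\mathrm{d}Q_2/T_2[/itex] gained by the second, so the total entropy is unchanged, as it must be for a reversible process.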

    Strictly speaking, the Carnot definition can't say anything about entropy or temperature if the total process isn't reversible. In that case the best you can do is hope you are dealing with a constant-temperature process, so that you at least know the temperature (it is whatever it was before). I think there is no guarantee that the entropy doesn't change wildly. The only thing you can say is that a system left to itself will evolve only toward states of increased total entropy.
     
  4. Nov 12, 2009 #3
    Gerenuk,

    thank you for your useful explanation. I am looking up the concepts but would still be very interested in seeing the derivation of the equation you proposed.

    I also thought I would present the concrete case stimulating my curiosity.
    It is the classic theory of rubber elasticity, whose main traits are presented, for example, at http://www.doitpoms.ac.uk/tlplib/stiffness-of-rubber/entropy.php

    There, rubber's stiffness is explained by entropic considerations (using the log definition), but I am struggling to understand how that stiffness could be justified in adiabatic (hence dS = 0) conditions.

    Thank you very much again

    Muzialis
     
  5. Nov 12, 2009 #4

    Mapes


    Make sure to distinguish between the total entropy change (which is zero for an adiabatic reversible process) and the configurational entropy change of the polymer chains (which decreases with deformation).

    Just because the first is approximately zero doesn't disprove the model of rubber elasticity. Entropy can also change (or balance out to zero) due to a temperature change in the material. (Question: would adiabatically, reversibly deformed rubber get hotter or colder?)
     
  6. Nov 12, 2009 #5
    Hi there,

    and many thanks for your response. The introduction of total and partial entropies has confused me even more, but that is usual just before understanding.

    If you stretch rubber, it gives out heat. I appreciate that, in adiabatic conditions, this would mean the temperature would increase.

    Still, if in adiabatic conditions the total entropy variation is zero while the conformational entropy is changing, what else is balancing out the entropy? How can this be done by temperature?

    thank you very much.

    Kindest Regards

    Muzialis
     
  7. Nov 12, 2009 #6

    Mapes


    The entropy balances out through the increase in temperature. The configurational entropy of the straightened chains decreases, the vibrational entropy of the bonds increases with increasing temperature, and the total entropy change is approximately zero.

    Interestingly, the opposite is predicted in metals; since stretching increases the entropy of primary bonds, the temperature is predicted to decrease during adiabatic reversible stretching.

    (But note that no process is truly reversible, and irreversibility will tend to increase the temperature during stretching in both cases due to lost work.)
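    To sketch the answer to my question above in equations (a sketch only, treating the rubber as a homogeneous system described by T and L, and writing [itex]C_L[/itex] for the heat capacity at constant length; this identity is standard and not specific to any model of rubber):

    [tex]\left(\frac{\partial T}{\partial L}\right)_S=-\frac{T}{C_L}\left(\frac{\partial S}{\partial L}\right)_T[/tex]

    For an elastomer [itex](\partial S/\partial L)_T<0[/itex] (stretching straightens the chains and lowers the conformational entropy), so an adiabatic, reversible stretch raises the temperature; for a metal [itex](\partial S/\partial L)_T>0[/itex] and the temperature drops.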
     
  8. Nov 12, 2009 #7
    Mapes,

    many thanks for this.

    Basically the thermodynamic definition is concerned with global entropy, while you are talking about partial contributions.

    But then I am still perplexed. In the entropic theory of rubber elasticity, it is said that the restoring force after a sample is stretched derives from the lower (conformational) entropy of this better-organized state, hence creating the tendency to go back to the relaxed configuration (higher entropy).

    From what you say, this concept applies to the conformational entropy. Why does it not apply to the bond-vibrational one?
    In the equation dU = TdS + FdL (just the first law of thermodynamics for a piece of rubber stretched by a force F along a length L), the dS is the total entropy, not specifically the vibrational or the conformational one.

    But my warmest thanks for all your posts; I am sure I am getting there.

    Muzialis
     
  9. Nov 12, 2009 #8

    Mapes


    Because bond energy dominates over entropy when we're talking about primary bonds. The mechanism of elastic recovery in this case is the driving force to find a minimum-energy bond length rather than the driving force to increase conformational entropy. (Does this answer your question?)
     
  10. Nov 12, 2009 #9
  11. Nov 12, 2009 #10

    Mapes


    I must be misunderstanding. What's your complete question? :smile:
     
  12. Nov 12, 2009 #11
    Mapes,

    in relation to our last exchange, the question is: why does rubber recover from the stretched configuration?

    The bond energy is not deemed the cause at all in the link I sent you; rather, the reason is the conformational entropy.

    But, as you say, the conformational entropy is counterbalanced by the vibrational one (as one would expect from the fact that the transformation is adiabatic, dS = 0).

    So now, if this dS = 0, then the equation in the link, F = -T dS/dL, indicated as the origin of the force, says that in adiabatic conditions rubber has no stiffness!

    I hope it is clearer now. Thanks for your patience.

    Muzialis
     
  13. Nov 12, 2009 #12

    Mapes


    Ah, got it. It's two different entropies. For a polymer,

    [tex]F\approx -T\frac{\partial S_\mathrm{conformational}}{\partial L}[/tex]

    For an adiabatic, reversible process,

    [tex]dS_\mathrm{total}=0[/tex]
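    One way to see why [itex]\mathrm{d}S_\mathrm{total}=0[/itex] along the adiabatic path does not kill the force (a sketch, using the Helmholtz free energy and the ideal-rubber assumption that the internal energy is nearly independent of length at fixed temperature): the force is an isothermal derivative,

    [tex]F=\left(\frac{\partial U}{\partial L}\right)_T-T\left(\frac{\partial S}{\partial L}\right)_T\approx-T\left(\frac{\partial S}{\partial L}\right)_T[/tex]

    so what enters is the entropy change at constant temperature, not the total entropy change along the adiabat (which is zero).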
     
  14. Nov 12, 2009 #13
    Mapes,

    that is exactly the point! Why, for "a polymer", does the entropy equal the conformational entropy? Is there not a level of arbitrariness there? In the laws of thermodynamics no specific entropy is mentioned!

    Thanks again

    Muzialis
     
  15. Nov 12, 2009 #14

    Mapes


    Well, there are always vibrational, configurational, conformational, electronic, nuclear, etc., components of entropy. But the conformational entropy is what gets tweaked most when you stretch an elastomer. That's why the other terms are assumed to be negligible in that first equation.
     
  16. Nov 12, 2009 #15
    Mapes,

    you have to forgive this combination of lack of understanding and stubbornness...

    I understand there may be various entropic components, but if, as you say, every component but the conformational one is negligible for an elastomer, then the entropy balance cannot be zero (since it is precisely those other, supposedly negligible, contributions that would have to ensure the total entropy variation in an adiabatic process is zero).

    Are we not facing a contradiction here?

    Basically I am saying: adiabatic, hence dQ = 0, hence dS = 0. But if every contribution to the entropy apart from the conformational one is negligible, then dS cannot be zero.

    Thank you for this most interesting discussion

    Muzialis
     
  17. Nov 12, 2009 #16

    Mapes


    I do see what you're saying. The problem may lie in treating a polymer thermodynamically as a homogeneous system. When considering the stretching of individual chains, this assumption breaks down, and we have to assume that the only significant consequence of stretching is a decrease in conformational entropy. Sorry, I haven't seen a better explanation.
     
  18. Nov 12, 2009 #17
    Mapes,

    I see what you say.

    Let me thank you very much for your valuable input.

    Have a good evening

    Marco
     
  19. Nov 12, 2009 #18
    I'll do the derivation for energy and volume. It is similar if you have a string with a length (instead of a volume).
    First you consider a lot of identical systems, since all your statements about one system are really statistical statements about an ensemble of many copies of your system. If you look at all these systems, then [itex]n_i[/itex] of them will be in a state [itex]i[/itex] with energy [itex]E_i[/itex] and volume [itex]V_i[/itex]. Let's take an example. If you label the possible states with indices [itex]i\in \{1,\dots,5\}[/itex], the ensemble might look like 112322355333343, which ordered is 112223333333455 (I ordered the copies according to their state index). Now, if all other combinations of 1, 2, 3, 4, 5 are also possible (say 111111111113335), how many sequences are just reorderings of my first example? Combinatorics tells us (as the order doesn't play a role) that it is
    [itex]\frac{15!}{2!3!7!1!2!}[/itex] or in general
    [tex]
    \Omega=\frac{n!}{\prod n_i!}
    [/tex]
    Of course, looking at the problem this way, a pattern like 112223333333455 is much more likely than 111111111111111, since the latter arises from a unique labeling, whereas the first is the same as the reordered 131233223334535 and all other permutations. Defining
    [tex]
    S=\ln\Omega
    [/tex]
    this is basically the second law of thermodynamics, which says that the entropy of an isolated system always increases (I skip the Boltzmann constant for brevity). Since there are so many particles involved in a gas, one can show that the "more likely" states are actually incredibly more likely than all others. That's why the second law isn't violated in practice, even though in principle it could be.
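    If you like, you can check the counting numerically; here is a quick sketch in Python (it just reproduces the example above, nothing more):

[code]
from math import factorial, log

# occupation numbers of the example 112322355333343:
# two 1s, three 2s, seven 3s, one 4, two 5s
n_i = [2, 3, 7, 1, 2]
n = sum(n_i)  # 15 copies in total

# multiplicity Omega = n! / (n_1! * n_2! * ...)
omega = factorial(n)
for k in n_i:
    omega //= factorial(k)

print(omega)       # 10810800 orderings share this occupation pattern
print(log(omega))  # S = ln(Omega), about 16.2 (Boltzmann constant set to 1)

# by contrast, the pattern 111111111111111 has Omega = 1, i.e. S = 0
[/code]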

    --------------- TBC ---------------
     
  20. Nov 12, 2009 #19
    We can introduce the probabilities
    [tex]
    p_i=\frac{n_i}{n}
    [/tex]
    that a given copy out of those many is in state i. Assuming that n is a very large number (plus some more large-number assumptions), we can use Stirling's approximation of the factorial to derive
    [tex]
    S=-\sum p_i\ln p_i
    [/tex]
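    To fill in the Stirling step ([itex]\ln m!\approx m\ln m-m[/itex] for large [itex]m[/itex]):
    [tex]
    \ln\Omega=\ln n!-\sum_i\ln n_i!\approx\left(n\ln n-n\right)-\sum_i\left(n_i\ln n_i-n_i\right)=-\sum_i n_i\ln\frac{n_i}{n}=-n\sum_i p_i\ln p_i
    [/tex]
    The overall factor of [itex]n[/itex] just says the ensemble entropy is extensive in the number of copies, so the [itex]S[/itex] above is to be read as the entropy per copy.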

    And remember we argued that the entropy should attain a maximum
    [tex]
    S\to\text{max}
    [/tex]
    Additionally, we want the average energy and volume of all these copies to be the values of energy and volume that we measure on our single system. This corresponds to the equations
    [tex]
    E=\sum_i E_i p_i
    [/tex]
    [tex]
    V=\sum_i V_i p_i
    [/tex]
    and of course
    [tex]
    1=\sum_i p_i
    [/tex]
     
  21. Nov 12, 2009 #20
    The task of maximizing the entropy under the given constraints can be solved with Lagrange multipliers to find
    [tex]
    p_i=\frac{e^{-\beta E_i-\alpha V_i}}{Z}
    [/tex]
    [tex]
    Z=\sum_i e^{-\beta E_i-\alpha V_i}
    [/tex]
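    For completeness, here is the Lagrange step sketched out (the multipliers are named so as to match the result above; [itex]\lambda[/itex] enforces normalization):
    [tex]
    \frac{\partial}{\partial p_i}\left[-\sum_j p_j\ln p_j-\beta\sum_j E_j p_j-\alpha\sum_j V_j p_j-\lambda\sum_j p_j\right]=-\ln p_i-1-\beta E_i-\alpha V_i-\lambda=0
    [/tex]
    Solving for [itex]p_i[/itex] gives [itex]p_i=e^{-\beta E_i-\alpha V_i}e^{-1-\lambda}[/itex], and the normalization [itex]\sum_i p_i=1[/itex] fixes [itex]e^{1+\lambda}=Z[/itex].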

    Plugging [itex]p_i[/itex] back into the definition of [itex]S[/itex] and doing a bit of maths, you get
    [tex]
    \mathrm{d}S=\beta \mathrm{d}E+\alpha \mathrm{d}V
    [/tex]
    which you can rearrange to
    [tex]
    \mathrm{d}E=\frac{1}{\beta}\mathrm{d}S-\frac{\alpha}{\beta}\mathrm{d}V
    [/tex]
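    Spelling out the "bit of maths": from [itex]-\ln p_i=\beta E_i+\alpha V_i+\ln Z[/itex],
    [tex]
    S=-\sum_i p_i\ln p_i=\beta E+\alpha V+\ln Z
    [/tex]
    and differentiating [itex]Z[/itex] under the sum gives [itex]\mathrm{d}\ln Z=-E\,\mathrm{d}\beta-V\,\mathrm{d}\alpha[/itex], so the [itex]\mathrm{d}\beta[/itex] and [itex]\mathrm{d}\alpha[/itex] terms cancel and only [itex]\mathrm{d}S=\beta\,\mathrm{d}E+\alpha\,\mathrm{d}V[/itex] survives.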
    Now we define these global parameters, which so far have no physical meaning: [itex]T=1/\beta[/itex] is the temperature and [itex]p=\alpha/\beta[/itex] the pressure. The identification with pressure actually makes sense, since we know from the physics where pressure already has a meaning that [itex]\mathrm{d}E=-p\mathrm{d}V[/itex] (for adiabatic changes).

    This way you arrive at your identity
    [tex]
    \mathrm{d}E=T\mathrm{d}S-p\mathrm{d}V
    [/tex]

    The advantage over the direct thermodynamic definition is that you still have the underlying framework, with all its equations for the possible states of the copies [itex]E_i[/itex], [itex]V_i[/itex] and the probabilities [itex]p_i[/itex], which you can calculate with the partition function [itex]Z[/itex], provided you know these system state energies and volumes. So you can calculate in this direction, but not the other way round. You still know that
    [tex]
    p_i=\frac{e^{-(E_i+pV_i)/T}}{Z}
    [/tex]
    [tex]
    Z=\sum_i e^{-(E_i+pV_i)/T}
    [/tex]
    [tex]
    S=\frac{E+pV}{T}+\ln Z
    [/tex]
    (Note: maybe I got some signs wrong somewhere :smile: but the result is correct)
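    If you want to convince yourself numerically, here is a toy sketch in Python (the two states and all the numbers are completely made up; it only checks that the two expressions for [itex]S[/itex] agree):

[code]
import math

# made-up microstates (E_i, V_i) of a toy two-state system
states = [(0.0, 1.0), (2.0, 1.5)]
T, p = 1.5, 0.8  # arbitrary temperature and pressure (k_B = 1)

# partition function Z = sum_i exp(-(E_i + p*V_i)/T)
Z = sum(math.exp(-(E + p * V) / T) for E, V in states)
probs = [math.exp(-(E + p * V) / T) / Z for E, V in states]

E_avg = sum(pr * E for pr, (E, V) in zip(probs, states))
V_avg = sum(pr * V for pr, (E, V) in zip(probs, states))

S_direct = -sum(pr * math.log(pr) for pr in probs)   # S = -sum p ln p
S_formula = (E_avg + p * V_avg) / T + math.log(Z)    # S = (E + pV)/T + ln Z
print(S_direct, S_formula)  # the two values agree
[/code]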
     