Entropy and enthalpy

  1. Feb 29, 2008 #1
    I decided to try to learn what entropy is today, and I swear I've been sitting here for 4 hours and I still don't have the foggiest idea what it is. It's driving me insane. I can't think anymore because of the stress that's building up from the fact that I just can't comprehend the concept.

    What I've read is that it ties into the second law of thermodynamics and that, basically, it is a measure of the forces that tend to disperse molecules or heat and distribute them uniformly in a closed system. That makes perfect sense to me.

    Here's where the contradictions start. Other sites say that entropy is a measure of disorder and that nature tends to go towards an unorganized, disordered state. Personally, I see the dispersion of matter or energy toward a uniform state as organized and ordered. There's nothing disorganized about that.

    What am I missing here? I can't make any sense of the explanations on the internet. Some of them use the example of two metal blocks where one is hotter than the other. Let's say block 1 is hot and block 2 is cold. They say that if heat transfers from block 1 to block 2, the entropy of block 1 rises while the entropy of block 2 decreases. If that's the case, I have no idea what entropy is.

    I'll be too pissed off to go to bed unless I understand the concept, and the way things are looking, I'm not going to be sleeping tonight. Can anyone help me understand it? If I can just figure out what entropy is, I might be calm enough to go to sleep. I'll leave enthalpy for another day.
     
    Last edited: Feb 29, 2008
  3. Feb 29, 2008 #2
    Technically,

    [tex]S = k\ln\Omega[/tex]

    where k is Boltzmann's constant and [tex]\Omega[/tex] is the multiplicity, or the number of microscopic states the system can take on that have the same energy. It's easy to figure out what [tex]\Omega[/tex] is for simple systems, and one generally finds that the multiplicity function is sharply peaked: it's really, really unlikely to find a system far away from the state of maximum entropy. So, if a system starts in a low-entropy state (a block of ice), it will tend to go to a higher-entropy state (a puddle). The second law of thermodynamics isn't really a physical law in the sense of [tex]F = ma[/tex], but the statistics always work out such that the probability of the entropy not increasing isn't even worth considering.
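    (To make the multiplicity idea concrete, here is a minimal numerical sketch, added as an illustration and not part of the original post, for a toy system of N two-state spins; N = 100 is just an example value.)

[code]
# Toy illustration of S = k*ln(Omega) for N independent two-state spins.
# Omega(n) = number of microstates with exactly n spins "up" = C(N, n).
# The multiplicity is sharply peaked around n = N/2.
from math import comb, log

k = 1.380649e-23   # Boltzmann's constant, J/K
N = 100            # number of spins (toy value)

for n in (0, 10, 25, 50):
    omega = comb(N, n)   # multiplicity of the macrostate "n spins up"
    S = k * log(omega)   # entropy of that macrostate
    print(f"n = {n:3d}   Omega = {omega:.3e}   S = {S:.3e} J/K")
[/code]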
     
  4. Feb 29, 2008 #3

    Mapes

    User Avatar
    Science Advisor
    Homework Helper
    Gold Member

    Entropy is a difficult concept to understand, no doubt about it. It's a "something" that seems to pass from one body to another and can apparently be generated from nowhere.

    Forget about the disorder and the dispersal analogies; they're too flawed to work for you. The most fundamental definition of entropy that I know of is [itex]S = -k \sum p_i \ln p_i[/itex], where [itex]p_i[/itex] is the probability of the system being in microstate [itex]i[/itex] (a microstate is an assignment of a quantum state to each particle that is compatible with macroscopic observables like temperature and pressure). If all microstates are equally probable, we have the familiar [itex]S=k\ln \Omega[/itex], where [itex]\Omega[/itex] is the number of microstates. The best description that I've heard of the 2nd law is that entropy (equivalently, the number of microstates) tends to increase, and that entropy is maximized when a system is at equilibrium.
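    (A small numerical sketch of the [itex]S = -k \sum p_i \ln p_i[/itex] formula, added as an illustration and not part of the original post: for a uniform distribution over [itex]\Omega[/itex] microstates it reduces to [itex]k\ln\Omega[/itex], and any non-uniform distribution gives a smaller entropy.)

[code]
# Gibbs form of the entropy: S = -k * sum(p_i * ln(p_i)).
# For equal probabilities p_i = 1/Omega this reduces to S = k * ln(Omega).
from math import log

k = 1.380649e-23  # Boltzmann's constant, J/K

def entropy(probs):
    """Entropy of a discrete probability distribution, in J/K."""
    return -k * sum(p * log(p) for p in probs if p > 0)

omega = 4
uniform = [1.0 / omega] * omega   # all four microstates equally likely
skewed = [0.7, 0.1, 0.1, 0.1]     # same microstates, unequal probabilities

print(entropy(uniform), k * log(omega))  # these two agree
print(entropy(skewed))                   # smaller than the uniform case
[/code]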

    Here is the problem with the disorder argument: disorder is subjective. Who's to say whether a messy room is randomly disordered or whether each item has been carefully positioned, with definite rules, by its owner? Additionally, disorder on the microscale and on the macroscale can be difficult to identify. Alluding to your point, a glass of crushed ice looks disordered, while a glass of water looks ordered. However, the glass of water has far more entropy.

    Here is the problem with the energy dispersal argument: Consider two rings of pure material spinning in opposite directions at a very low temperature, arbitrarily close to absolute zero. The system velocity and angular momentum are zero; the only important number is the rotational speed. There are very few possible microstates (tending to one as we approach absolute zero) that are compatible with the system, because random atomic motion is nearly eliminated due to the low temperature. Each atom is pretty much limited to following its circular path with essentially no thermal vibration. The entropy is very low, almost zero.

    If the rings are now brought into contact, they will eventually slow each other to a stop by friction. Now the rotational speed is zero and the material is hotter, say at some temperature [itex]T[/itex] well above absolute zero. There is now a huge number of possible microstates, because the random thermal energy could be apportioned to the particles in an enormous number of combinations without us ever knowing the difference. (It doesn't matter whether atom #45,391,567,958,... is going fast and #198,562,994,261,... is going slow or vice versa, as long as the energies add up to put the bulk material at temperature [itex]T[/itex].)

    This is where I have a problem with "energy dispersal." The energy isn't more dispersed after we connect the rings. The energy didn't go anywhere; the system is closed. Neither has the energy spread out spatially. The average energy of the particles is still the same. I think the dispersal definition falls short here, while the microstates definition explains the spontaneity of the process with no problems.

    So I encourage you to think in terms of microstates, as nathan12343 pointed out above. When we heat a system, its number of microstates increases. When we cool a system, its number of microstates decreases (but the energy we removed increases the number of microstates of the surroundings). When we do work on a system, there is no change in the number of microstates. The number of possible microstates in the entire universe tends to increase (this is the Second Law).

    It may be useful to think of entropy as something that "flows", but you have to be careful (nothing is actually flowing). Entropy is the conjugate extensive variable to temperature, just as volume is the conjugate extensive variable to pressure. Just as two systems will exchange volume in order to equalize their pressure, two systems will "exchange" entropy to equalize their temperature. But what is really happening is that energy is being transferred, increasing the number of possible microstates in one system while decreasing the possible number in another.

    Finally, you should know that entropy is conserved for reversible processes, but entropy is created whenever energy is transferred in response to a gradient in an intensive variable, like temperature or pressure. In fact, "reversible" means that energy is being transferred without any gradient in temperature, pressure, etc. This never occurs in reality, but we can come arbitrarily close, and it's a useful idealization.

    Good luck with your efforts!
     
  5. Feb 29, 2008 #4

    Andy Resnick

    User Avatar
    Science Advisor
    Education Advisor

    I'm not sure anyone understands "entropy". It's one of the more difficult concepts, tying in threads from mechanical work, statistics, information, and probably some others. Part of the difficulty is that there's not a good definition of 'energy', either.

    First, understand that not all energy possessed by an object is accessible. Of the food we eat, we can only use (IIRC) about 80% of the calories.

    Second, understand that the amount of energy possessed by an object is due, in part, to its configuration. A folded protein has a different energy than an unfolded protein. A ball up in the sky has a different energy than the ball on the ground.

    "Entropy", as I think of it, is a measure of the amount of energy within an object (or system) that we cannot access. That's not very precise, and there's lots of mathematical derivations assigning a value to that amount of energy: in terms of statistics (kT ln(W)), in terms of information (kT ln(2)), in terms of thermodynamic concepts (dS = dQ/T), probably some others.

    I really struggled with thermodynamics for a long time, trying to get an intuitive feel for all those different energies (enthalpy, Gibbs, Helmholtz, etc.), Maxwell relations, and logarithmic derivatives, all the while wondering what the point was. It may help to remember that thermodynamics is one of the oldest branches of physics; it predates knowledge of atoms. The concepts used may appear foreign to us, as we have become accustomed to quantum mechanics and microscopic dynamics.

    That, coupled with an embarrassing lack of a decent undergrad text, causes no end of headaches. FWIW, if you can find a copy of Truesdell's "The Tragicomical History of Thermodynamics", you may find it helpful.
     
  6. Feb 29, 2008 #5
    As Andy_Resnick pointed out, thermodynamics is one of the oldest branches of physics, so it must be possible to make sense of entropy without recourse to statistical mechanics. Not that there is anything wrong with statistical interpretations of entropy. In fact, many thermodynamic properties can be accurately estimated with the help of statistical mechanics, and the statistical interpretation adds a great deal of insight into the concept of entropy.

    Have you attempted to calculate the net entropy change when the two blocks of metal are put into contact with one another? If you can calculate this, it might help in understanding the entropy concept better.

    Suppose you consider just one block of metal (the system) at a temperature [itex]T_2[/itex] which is then put into contact with a huge heat bath held at temperature [itex]T_1[/itex] (i.e. a bath with essentially infinite heat capacity) such that [itex]T_2 > T_1[/itex]. After a while, the block will have cooled to temperature [itex]T_1[/itex] and will have given up an amount of energy [itex]Q = C(T_2 - T_1)[/itex], where C is the heat capacity of the metal, here assumed to be constant over the temperature range considered. Since the heat bath was kept at temperature [itex]T_1[/itex] the whole time, the entropy change of the bath is [itex]Q/T_1[/itex]. Unfortunately, we don't yet know the entropy change of the block, because the cooling process was not carried out reversibly. To find the change, the system must be restored reversibly to its original state. This can be done by placing the block in contact with successively hotter heat baths whose temperatures differ from each other by an infinitesimal amount. The heat absorbed by the block in each infinitesimal step is dQ = C dT, and the entropy change of the block is (dQ)/T = (C/T) dT. The total entropy change of the block in this reversible heating is then the integral of this between the limits [itex]T_1[/itex] and [itex]T_2[/itex]: [itex]C \ln(T_2/T_1)[/itex]. Now, entropy is a function of state, depending only on the temperature in this case, so the entropy change of the block in the cooling process is just the negative of this. The net change in entropy for the cooling process is therefore

    [tex]\Delta S_{total} = \Delta S_{bath} + \Delta S_{block}[/tex]

    [tex] = C(T_2 - T_1)/T_1 - C \ln (T_2/T_1) > 0 [/tex]
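    (To make the inequality concrete, here is a small numerical check, added as an illustration and not part of the original post; the values C = 1 J/K, T1 = 300 K, T2 = 400 K are arbitrary examples.)

[code]
# Numerical check of dS_total = C*(T2 - T1)/T1 - C*ln(T2/T1) > 0.
# Example values only: block initially at T2 = 400 K, bath at T1 = 300 K, C = 1 J/K.
from math import log

C, T1, T2 = 1.0, 300.0, 400.0

dS_bath = C * (T2 - T1) / T1   # bath absorbs Q = C*(T2 - T1) at fixed T1
dS_block = -C * log(T2 / T1)   # state function: block cools from T2 to T1
print(dS_bath + dS_block)      # ~ +0.046 J/K, positive as expected
[/code]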

    I now invite you to calculate the net entropy change for the process you mentioned involving two blocks, one at [itex]T_1[/itex] and one at [itex]T_2[/itex], put into thermal contact with one another. HINT: Assume this process is adiabatic, i.e. there is no heat exchange with the surroundings during the temperature equilibration. Then restore each block separately to its original state in order to calculate the net entropy change in the process.

    An excellent text is "Thermodynamics" by G. N. Lewis and M. Randall (McGraw-Hill, 1961).
     
  7. Mar 1, 2008 #6

    GT1

    User Avatar

    Why is there no entropy change when work is done? If work is done by the system, the system has less energy, so why doesn't the entropy change when the energy of the system changes?
     
  8. Mar 1, 2008 #7

    Mapes

    User Avatar
    Science Advisor
    Homework Helper
    Gold Member

    Let's look at a practical example: a gas in a closed container. If we allow the gas to expand reversibly and adiabatically (no heat exchange) while doing work on the environment, its energy decreases as you said (and its temperature also decreases). If the volume had stayed constant, the entropy would have decreased too. However, the volume has increased, allowing the gas more possible microstates. This increase in entropy exactly offsets the decrease due to the loss of energy.

    Mathematically,

    [tex]dU=T\,dS-p\,dV[/tex]

    [tex]dS=\frac{1}{T}\,dU+\frac{p}{T}\,dV=0[/tex]
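    (A small numerical sketch of this cancellation, added as an illustration and not part of the original post, for one mole of a monatomic ideal gas expanding reversibly and adiabatically; the initial temperature and volume are arbitrary example values.)

[code]
# Reversible adiabatic expansion of a monatomic ideal gas: the entropy change
# splits into a negative "energy" piece and a positive "volume" piece that cancel:
#   dS = n*Cv*ln(T2/T1) + n*R*ln(V2/V1)
from math import log

R = 8.314      # J/(mol K)
Cv = 1.5 * R   # monatomic ideal gas
n = 1.0        # mol
gamma = 5.0 / 3.0

T1, V1 = 300.0, 1.0e-3                 # initial state (example values)
V2 = 2.0 * V1                          # expand to twice the volume
T2 = T1 * (V1 / V2) ** (gamma - 1.0)   # adiabat: T * V^(gamma-1) = const

dS_energy = n * Cv * log(T2 / T1)   # negative: the gas cools while doing work
dS_volume = n * R * log(V2 / V1)    # positive: more spatial microstates
print(dS_energy + dS_volume)        # ~0 (cancels to rounding error)
[/code]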
     
  9. Mar 1, 2008 #8

    GT1

    User Avatar

    Thanks Mapes!
     
  10. May 5, 2010 #9

    cwconnell

    User Avatar
    Gold Member

    I do not have any quick answers, but entropy is the degradation of the quality of energy. If you have an engine that burns fuel, e.g. an automobile, energy is being degraded and there is no return to that energy level for the closed system. Ever since the universal "bang", and ever since you were born (or maybe turned 10, in my son's case), you have been going downhill. As time passes, the quality of the energy of the universe is decreasing. [This applies to a closed system.]
     
  11. May 5, 2010 #10
    I think an explanation is due as to why Boltzmann's constant 'k' should carry over into statistical mechanics.

    James
     
  12. May 6, 2010 #11

    Mapes

    User Avatar
    Science Advisor
    Homework Helper
    Gold Member

    Briefly, entropy is proportional to the log of the number of microstates, a dimensionless number. Boltzmann's constant is the constant of proportionality that gives entropy its units (J/K) and connects it to our macroscale measurements of energy and temperature.
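    (For a rough sense of the numbers involved, added as an illustration and not part of the original post: a typical molar entropy of order 100 J/K corresponds to an astronomically large microstate count.)

[code]
# Rough scale set by Boltzmann's constant: a molar entropy of ~100 J/K
# corresponds to ln(Omega) ~ 10^24, i.e. Omega ~ exp(10^24) microstates.
k = 1.380649e-23   # J/K
S_molar = 100.0    # J/(mol K), example order of magnitude

ln_omega = S_molar / k
print(f"ln(Omega) ~ {ln_omega:.2e}")   # ~7.2e24
[/code]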
     
  13. May 6, 2010 #12
    "...Boltzmann's constant is the constant of proportionality that gives entropy its units (J/K) and connects it to our macroscale measurements of energy and temperature. "

    It is that connection that I question. Not because I think it is wrong, but because the connection appears to be handed from thermodynamics to statistical mechanics for free, instead of being derived from statistical mechanics. To me that makes it appear that the equation is still a thermodynamic equation, with the statistics part added on because the statistics are directly related to the reordering of energy. In other words, the success of the equation is still rooted in the thermodynamic derivation of thermodynamic entropy. This is not intended to be taken as an expert opinion.

    James
     
  14. May 6, 2010 #13
    I tend to think about this in the way explained by Mapes and Nathan.

    A qualitative explanation I like goes as follows. Macroscopic objects consist of atoms, which in turn consist of subatomic particles, etc. When we describe the world we perceive in practice, we do so in terms of macroscopic variables. We don't specify the exact physical state of objects. Even if we wanted to do that, our lack of knowledge about the exact fundamental laws of physics would mean that we couldn't do it anyway.

    What makes doing physics possible at all is that one can find a closed description of a physical system in terms of only the macroscopic variables plus a handful of extra variables that describe, in a statistical way, the effects of all the degrees of freedom that are left out. In thermodynamics those extra variables are quantities like internal energy, temperature, entropy, etc.
     
  15. May 6, 2010 #14
    It is a matter of choosing your units. Thermodynamics gives you a phenomenological description that is not able to explain how all the variables are related to each other. That means that you'll end up with variables that are related at the fundamental level, but in the thermodynamic description you cannot see that relationship.

    Historically, the thermodynamic relations were found empirically, and units were invented to measure quantities such as temperature. But if a more fundamental theory arises later and we can directly compare what used to be incompatible physical quantities, we end up with new constants, in this case the Boltzmann constant, which does the unit conversion from temperature to energy in the old thermodynamic units.


    Compare this with special relativity. Einstein found that the inertia of an object is explained by the energy of the object. But in classical physics the two quantities energy and mass are unrelated, and we had already defined our supposedly incompatible units for the two quantities. Relativity tells us that declaring the two quantities to be incompatible is wrong, and that in fact mass is precisely the rest energy of an object. What then happens is that the equations automatically compensate for our previous ignorance by doing the unit conversion inside the equation, i.e. we get the equation E = m c^2 instead of E = m.
     
  16. May 6, 2010 #15
    Does anyone know if entropy affects, or applies to non material objects such as information states?
     
  17. May 6, 2010 #16

    Andy Resnick

    User Avatar
    Science Advisor
    Education Advisor

    Of course! Look up "information entropy", 'negentropy', and Shannon's information theory.
     
  18. May 6, 2010 #17

    Andy Resnick

    User Avatar
    Science Advisor
    Education Advisor

    That's clearly false, amply demonstrated on the other relevant thread.
     
  19. May 6, 2010 #18
    I should have written "phenomenologically" or "heuristically". Physics often proceeds in a heuristic way, and only with hindsight can you figure out how things really work. That's why, i.m.o., the historical approach to physics teaching is not so good.
     
  20. May 6, 2010 #19
    So if I were to roll a pair of dice, would there be any implications for entropy in relation to the numbers that appeared on the faces? For instance would there be any difference in entropy if a seven came up as opposed to a twelve?
     
  21. May 6, 2010 #20
    I am not quite clear about your explanation: Are you saying that Boltzmann's constant did result from the derivation of statistical mechanics or that it was adopted from thermodynamics and used because it was convenient to use it for unit conversion?

    James
     