Why does entropy increase?

  1. Can't the phenomenon of increasing entropy be explained by the fact that, in a collision of two particles, the higher-energy particle always passes energy to the lower-energy particle (and never vice versa)? Energy would then become more evenly distributed in space...
     
  3. Mentz114

    Mentz114 4,240
    Gold Member

    Hmm. Can entropy be defined purely as the way energy is distributed in space?
     
  4. Yes, that's how I understand it. Via particle collisions, energy flows spontaneously from a hot (i.e. highly energetic) region to a cold (less energetic) region; as a result, energy becomes evenly distributed across the two regions and their temperatures become equal.

    Say the total energy passed from the hot region to the cold region is Q, the initial temperature of the hot region is T1, and the initial temperature of the cold region is T2. During the transfer, the hot region's entropy decreases by Q/T1 (approximately, since the temperature doesn't drop instantly) and the cold region's entropy increases by Q/T2. Since T1 > T2, the total entropy of the system increases.
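
    In symbols, combining the two changes above:
    [tex]
    \Delta S_{\textrm{total}} = -\frac{Q}{T_1} + \frac{Q}{T_2} = Q \, \frac{T_1 - T_2}{T_1 T_2} > 0 \quad \textrm{since } T_1 > T_2 .
    [/tex]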
     
  5. Though I'm not sure, I don't think that definition is right, because it wouldn't support the "heat death" hypothesis for the universe.
     
  6. Chris Hillman

    Chris Hillman 2,334
    Science Advisor

    Some salient characteristics of "entropy"

    Note that entropy is the central technical term in both statistical thermodynamics, a physical theory, and information theory, a (highly applicable!) mathematical theory allied to ergodic theory. It is essential to recognize that in both areas, many entropies are mathematically defined, and these definitions have a rather complicated mathematical relationship to one another. Thus, it is never proper to refer to "entropy" without qualification. Unfortunately, this bad habit is standard in the research literature, which is the cause of perennial and entirely unnecessary confusion :grumpy:

    More generally, please note that IMO the intersection between information theory and statistical thermodynamics is probably the most difficult thing in modern physics, involving as it does deep philosophical, technical, and interpretive problems. So don't underestimate the importance of being very, very careful in how you think and write.

    Fortunately, despite some sloppiness of terminology, you appear to have hit upon some correct intuition. First, physical systems which can do useful work must have some (possibly abstract) "inhomogeneity" which can be "leveraged" (possibly in some abstract sense) to run some kind of (possibly idealized) "engine". Second, entropy does increase when one "homogenizes" a system (possibly in some abstract but deterministic sense, or possibly in a statistical sense). If you adopt a mathematical notion of entropy closely allied to the notions most often encountered, you can prove a lemma to this effect (but only valid for this particular notion of entropy).

    One of the simplest mathematical definitions of "entropy" occurs in a very simple setting: partitions of a finite set (no probability measure needed, or if you prefer, we impose "counting measure"). Boltzmann suggested defining the entropy of such a partition [itex]\pi[/itex] as the log of the multinomial coefficient
    [tex] S(\pi) =
    \log \left(
    \begin{array}{ccccc} & & n & & \\
    n_1 & n_2 & \dots & n_{r-1} & n_r
    \end{array} \right)
    = \log \, \frac{n!}{n_1! \, n_2! \dots n_{r-1}! \, n_r!}
    [/tex]
    where the [itex]n_j[/itex] are the sizes of the r blocks of the partition, so that [itex]n = n_1 + n_2 + \dots + n_r[/itex]. For convenience we can allow zero values for the [itex]n_j[/itex], as long as this sum condition holds (and [itex]n>0[/itex]).
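
    For the computationally inclined, here is a minimal Python sketch of this definition (the function name is just for illustration); it uses the identity lgamma(k+1) = log k! to avoid forming enormous factorials:

    [code]
from math import lgamma

def boltzmann_entropy(blocks):
    """Log of the multinomial coefficient n! / (n_1! n_2! ... n_r!).

    lgamma(k + 1) equals log(k!), so the huge factorials are never
    formed explicitly.
    """
    n = sum(blocks)
    return lgamma(n + 1) - sum(lgamma(k + 1) for k in blocks)

# Moving one element from the largest block to a smaller block
# (compare the first exercise below) increases the entropy:
print(boltzmann_entropy([70, 20, 10]))  # approx 75.9
print(boltzmann_entropy([69, 21, 10]))  # approx 77.1
    [/code]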

    Exercise: Assume that not all blocks have the same size. Suppose with little loss of generality that [itex]n_1 > n_2 +1 > 0[/itex]. What happens to the Boltzmann entropy if [itex]n_1 \rightarrow n_1-1, \; n_2 \rightarrow n_2+1[/itex]?

    Exercise: read Weeks 247-250 and 252 (the parts dealing with Kleinian geometry) of This Week's Finds: http://www.math.ucr.edu/home/baez/TWF.html
    Recalling that a Kleinian geometry arises whenever we have a group acting on a set, can you recognize a Kleinian geometry lurking behind Boltzmann's entropy?

    Exercise: Try to discover and prove further formal properties of Boltzmann entropy. Can you "categorify" these to find analogous formal properties at the level of Kleinian geometry? Are those more general?

    Exercise: If you are familiar with another mathematical definition of "entropy", ditto.

    Exercise: If you are not familiar with any other mathematical definitions of "entropy", apply Stirling's approximation to the Boltzmann entropy and try to find some formal properties of the quantity you come up with. What is the natural mathematical setting for its definition?
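
    (A hint for that last exercise: writing [itex]p_j = n_j/n[/itex] and applying Stirling's formula [itex]\log k! \approx k \log k - k[/itex] gives
    [tex]
    S(\pi) \approx -n \sum_{j=1}^r p_j \log p_j ,
    [/tex]
    that is, [itex]n[/itex] times the Shannon entropy of the distribution [itex](p_1, \dots, p_r)[/itex], which is exactly the quantity studied in the paper recommended below.)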

    If any of this tickles your fancy, I recommend that you study one of the greatest (and most readable and most fun!) scientific papers of all time:
    http://www.math.uni-hamburg.de/home/gunesch/Entropy/shannon.ps

    Then read Cover and Thomas, Elements of Information Theory, Wiley, 1991. (By far the best introduction, the only one which adequately conveys some sense of the vast scope of this theory, which ranks with calculus, probability, and linear algebra as one of the most applicable of all mathematical theories.) See http://www.math.uni-hamburg.de/home/gunesch/entropy.html for more fun on-line stuff.

    A good book for anyone utterly baffled by the above is Lawrence Sklar, Space, Time, and Spacetime.

    (Urgent warning!: articles on information theory and physics in the Wikipedia have been the subject of a bitter edit war between a lone dissident with very odd and inchoate but very determined views, and everyone else. No, don't ask, but I would urge anyone to strictly avoid reading Wikipedia articles on this topic until you know a great deal already, since a naive student could easily be misled and eventually have to try to "unlearn" vast amounts of misinformation. Much much better to stick with reliable sources while you are still a student!)
     
    Last edited: Jun 3, 2007
  7. Hmm, so entropy obeys conservation of energy?
     
  8. That's exactly what your post implies: that energy is conserved but becomes impotent.

    edit

    Maybe my understanding of energy is elementary; do you mean that, by virtue of its inability to do work, it is no longer energy?
     
  9. Does the total amount of energy in the closed system of the universe remain constant?
     
  10. No offense, but telling me to rifle through other threads of abstruse information doesn't add to my understanding. A glib answer would have sufficed.
     
  11. I'm faced with a problem endemic to people like me: I yearn to know advanced topics, but to know these advanced topics I need the foundation. I'm working on a foundation; I'm a beginning physics major. For now, though, if I am curious about something it is pointless to try to read scholarly material, because it is very opaque to me. So if I need a curiosity satisfied, the best that I can hope for is a simplified answer.

    Though I can understand if and why you would be unwilling to provide one, it is still very stultifying.
     
  12. Searching for an answer would have helped you out more. That is all he is trying to hint at. If you are not willing to at least look at some threads where similar issues are discussed, it must not be all that important.
     
  13. Thanks Chris. I thought that the phenomenon of increasing entropy, or irreversible spontaneous dispersal of energy, could be explained simply from the fact that a more energetic particle always passes energy to a less energetic one, without any statistical ambiguity. Seems the matter is more complicated, though :surprised
     
  14. Entropy comments


    If energy were always transferred from high-energy particles to lower-energy ones, then all the chemical reactions in the universe would have gone to completion, but it does not work this way. A reaction comes to a halt when the entropy generated is at a maximum, i.e. the products formed and the reactants left together have the greatest disorder. This is where the reaction is in equilibrium.

    According to what I've read, the change in entropy of a system can be negative, which means that energy can be transferred from a lower-energy reservoir to a higher one; this requires some work input. But the entropy generated can't be negative in real life, due to the irreversibilities present, like friction.

    Consider an isolated (adiabatic) universe containing subsystems. Even though we cannot transfer heat into the system from outside, the entropy of the universe (the system) is still generated, due to the irreversibilities present in the processes going on within it.

    Final comment: the energy of the particles in the reaction is conserved, but entropy is generated or created during the reaction; at the equilibrium point this stops the reaction from proceeding further (or going back) before all the high-energy molecules have reacted.
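
    In symbols, for a reaction at constant temperature and pressure (a standard relation, sketched here for orientation):
    [tex]
    \Delta G = \Delta H - T \, \Delta S \le 0 ,
    [/tex]
    with equality at the equilibrium point; minimizing the Gibbs free energy is the same as maximizing the total entropy of the system plus its surroundings.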

    I hope this helps you.
     
  15. Chris Hillman

    Chris Hillman 2,334
    Science Advisor

    Yes, as sd_barry said, that's not really true. BTW, maximizing entropy is Legendre-dual to minimizing free energy. In ancient days I used to have a post archived at my website which explained Legendre duality in two ways, one geometric and the other analytic.
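
    In one line: the Helmholtz free energy is the Legendre transform of the internal energy with respect to entropy,
    [tex]
    F = U - TS , \qquad T = \frac{\partial U}{\partial S} ,
    [/tex]
    so the minimum-free-energy and maximum-entropy descriptions carry the same information.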

    Another comment on what sd_barry said: entropy in statistical mechanics is properly understood against the background of ergodic theory. The book by Lawrence Sklar, Space, Time, and Spacetime, has some excellent nontechnical discussion of the Poincaré recurrence theorem, which bears directly upon the Second Law.
     