Does equilibrium imply max. entropy in statistical mechanics?

  1. May 18, 2005 #1

    JesseM

    Science Advisor

    Does "equilibrium" imply max. entropy in statistical mechanics?

    I've gotten myself confused thinking about the meaning of "equilibrium" in statistical mechanics. I thought I remembered that an isolated system at equilibrium is equally likely to be in any possible microstate, which means there would be some very tiny probability of its being in a low-entropy state far from the maximum entropy possible for the system. But when physicists talk about "non-equilibrium thermodynamics", aren't they talking about systems which are far from maximum entropy? Perhaps the difference is that non-equilibrium thermodynamics deals with systems that are not isolated, but are receiving energy from the outside or something? Or maybe my definition of equilibrium is wrong, and an isolated system at equilibrium is not equally likely to be in every possible microstate, but just in those microstates corresponding to a maximum-entropy macrostate?
     
    Last edited: May 18, 2005
  3. May 18, 2005 #2

    dextercioby

    Science Advisor
    Homework Helper

    That's absolutely true. The highlighted part could be rigorously proven using the axiomatic formulation of equilibrium SM.


    What do you mean by far away? How do you measure "far"...? :uhh:

    Well, of course it's dealing with non-isolated systems; if the systems were isolated, how would the non-equilibrium states appear...? :uhh:

    Yep, it's true that if one imposes the maximum condition on the entropy functional, one gets equally probable microstates. I haven't been taught the other way around. You might search for it in textbooks on SM.

    Daniel.

    P.S. I hope you know the condition of equilibrium for a statistical system, right...?
    A time-independent Hamiltonian in the Schrödinger picture of classical and quantum mechanics, plus a time-independent probability density/density operator for the statistical system.
     
  4. May 18, 2005 #3

    JesseM

    Science Advisor

    It shouldn't really matter--no matter what cutoff you use for how low entropy must be before it's "far from maximum entropy", there should be some nonzero probability the system will have entropy this low, right? (assuming you restrict yourself to entropies which can be found in states somewhere in the system's phase space, of course). Just as an example, if all the molecules in a box filled with gas collected in one corner of the box, then I think most people would call that far from maximum entropy--would you still say the gas was at equilibrium if it spontaneously went into this state?
    Like I said, I was confused about whether "equilibrium" was a statement about the entropy or about the probability of different states. If you wait some sufficiently huge amount of time, random fluctuations in a system at equilibrium will produce states whose entropy is far lower than the maximum, no? And yet it would still be correct to say the system is at equilibrium at these rare moments, correct?
    Thanks, I didn't know that, although I might have learned it at some point when I took a course on this stuff.
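    [A quick numerical aside, not from the thread: under the toy assumption that each of N gas molecules sits in the left or right half of the box independently with probability 1/2, the probability of the "all in one half" state mentioned above can be sketched as follows.]

```python
# Toy sketch, assuming each of N molecules is independently in either
# half of the box with probability 1/2. The probability that all N are
# found in the left half at a given instant is (1/2)**N: nonzero, but
# astronomically small for large N.
for n in (10, 100, 1000):
    print(f"N = {n}: P(all left) = {0.5 ** n:.3e}")
```

    [For N = 1000 the probability is already below 1e-300, which is why such states are never observed in practice even though their probability is strictly nonzero.]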
     
  5. May 18, 2005 #4

    dextercioby

    Science Advisor
    Homework Helper

    Funny you mention fluctuations. You know that for equilibrium ensembles they go to zero in the thermodynamic limit for all observable quantities. So the more or less "spontaneous" separations of the molecules (which would violate Boltzmann's H theorem) are extremely improbable. But the probability is nevertheless nonzero.

    Daniel.
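    [To put a number on "fluctuations go to zero in the thermodynamic limit", here is a sketch under the same toy left/right model as above (an assumption for illustration, not from the thread): n_L is binomial, so its relative fluctuation falls off as 1/sqrt(N).]

```python
import math

# Toy binomial model: N particles independently left/right, so n_L has
# mean N/2 and standard deviation sqrt(N)/2. The relative fluctuation
# sigma/mean = 1/sqrt(N) vanishes as N -> infinity (thermodynamic limit).
def relative_fluctuation(n):
    mean = n / 2
    sigma = math.sqrt(n) / 2
    return sigma / mean  # equals 1 / sqrt(n)

for n in (100, 10**6, 6 * 10**23):
    print(f"N = {n:.1e}: sigma/mean = {relative_fluctuation(n):.1e}")
```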
     
  6. May 18, 2005 #5

    JesseM

    Science Advisor

    This course page from Arizona State University says that for an isolated system, "equilibrium" means maximum entropy, while the idea that each microstate should be equally probable is called the "principle of ergodicity":
    So would you say that the definitions on this page are wrong, or does "equilibrium macrostate" mean something different from "equilibrium ensemble", or did I misunderstand your earlier answer?
     
  7. May 18, 2005 #6

    dextercioby

    Science Advisor
    Homework Helper

    Nope, I said it. It all depends on the approach. There are two ways of doing SM: the traditional way, or the axiomatic way.

    I've explained what an equilibrium macrostate is for a statistical system. That, together with the equal probability of the accessible microstates in the microcanonical ensemble, induces a maximum entropy.

    Daniel.

    P.S. The principle of ergodicity (the ergodic hypothesis initially formulated by Boltzmann) is something else.
     
  8. May 18, 2005 #7

    dextercioby

    Science Advisor
    Homework Helper

    The whole point is that the traditional approach to SM (see for example Sections 6.1-6.3 of [1]) is rather confusing. That's the reason theorists go for axiomatic theories.

    Daniel.

    ---------------------------------------------------------
    [1] K. Huang, "Statistical Mechanics", 2nd edition, Wiley, 1987.
     
  9. May 18, 2005 #8

    JesseM

    Science Advisor

    Which section of your earlier posts are you referring to?
    Does "accessible" mean all possible microstates the system could be in, or only those microstates compatible with the maximum-entropy macrostate?

    This lecture slide gives an example of a system where the only parameter distinguishing different macrostates is nL, the number of particles in the left-hand side of a container. The next slide defines the "equilibrium macrostate" as the macrostate whose value of nL has the largest number of microstates. For example, if the number of particles N was 1000, the equilibrium macrostate would have nL=500. So, would the "accessible microstates" which are given equal probability in the microcanonical ensemble mean the set of all microstates with nL=500, or would it mean all possible microstates, with nL taking any value between 0 and 1000?
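    [To make the slides' example concrete, here is a small sketch assuming the multiplicity of the macrostate labeled by nL is the binomial coefficient C(N, nL), i.e. distinguishable particles with each side equally likely:]

```python
from math import comb

# Multiplicity of the macrostate with n_L particles on the left, for
# N = 1000 distinguishable particles: W(n_L) = C(N, n_L).
N = 1000
W = [comb(N, nL) for nL in range(N + 1)]

# The equilibrium macrostate is the one with the largest multiplicity.
nL_eq = max(range(N + 1), key=lambda nL: W[nL])
print(nL_eq)  # 500

# A macrostate like n_L = 200 is accessible but vastly outnumbered.
print(W[500] // W[200] > 10**80)  # True
```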
     
  10. May 18, 2005 #9

    dextercioby

    Science Advisor
    Homework Helper

    The post-scriptum of post #2.

    Well, these "accessible microstates" are the nastiest and most unintuitive part of the microcanonical ensemble. Read Huang.

    Daniel.
     
  11. May 19, 2005 #10

    JesseM

    Science Advisor

    The Amazon reviews say that Huang gives a very abstract and mathematically advanced presentation, and since my knowledge of thermodynamics is very elementary (I only took one course in college, and I've forgotten most of it, although some would probably come back to me if I reviewed my old textbook), I'm not too confident I'd be able to follow it. The reviews also say Huang focuses on the "kinetic" approach to thermodynamics rather than the "equilibrium" approach, and I think the equilibrium approach is what I studied. But I'll look for it at the library and see if it's any help.

    In the meantime, is it possible to say what "accessible states" would mean in the specific case I mentioned where the only macro-parameter is nL, the number of particles on the left side of the chamber? Or is there no simple answer to this question?

    I notice that this page says:
    So, equation 18 indicates the W microstates they are referring to as "accessible" are only the ones with entropy S, suggesting that states with different entropies would not be considered "accessible". And it is only these states they are calling equiprobable, while all other states are defined to have probability zero--does the last line mean this would be the probability distribution of the microcanonical ensemble, if S is taken as the maximum entropy?
     
  12. May 19, 2005 #11

    dextercioby

    Science Advisor
    Homework Helper

    Yes, I've told you. The probability density has a typical Heaviside-theta dependence: it is zero for the inaccessible states and 1/(number of accessible microstates) for the accessible ones.

    Then Landau & Lifshitz. I've also told you about Greiner, Neise & Stöcker.

    Daniel.
     
  13. May 19, 2005 #12
    JesseM

    Random questions.

    Equilibrium in the usual sense (macroscopic bodies with short-range forces, without constraints), together with isolation and stability, implies a maximum of entropy.

    Yes, in non-equilibrium thermodynamics one deals with systems outside the state of maximum entropy corresponding to equilibrium. One deals with states of non-maximum entropy in the usual sense.

    Indeed, an isolated system at equilibrium is equally likely to be in every possible microstate compatible with the constraints at the equilibrium state. The remaining nonequilibrium microstates are not visited in the equilibrium state, but can be visited instantaneously by means of thermal fluctuations.

    You can see that when the number of particles on the left and right is not the same, the macroscopic chemical potential on the two sides is not the same, violating the macroscopic condition for chemical (or material) equilibrium that you can find in any standard textbook.

    E.g. Irving M. Klotz and Robert M. Rosenberg, "Chemical Thermodynamics: Basic Theory and Methods", 6th edition.

    dextercioby

    Let me make a small comment. Your

    "That's absolutely true. The highlighted part could be rigorously proven using the axiomatic formulation of equilibrium SM."

    would better read as "is derived from equilibrium SM using some ad hoc unproven arguments".


    JesseM

    If you force the molecules to remain in the corner, e.g. with a wall, it is equilibrium (forced equilibrium). Otherwise it is not equilibrium, and the system spontaneously evolves toward the true equilibrium, filling the whole volume.

    Equilibrium, by definition, is the absence of variation of macroscopic quantities in isolated systems. At the usual level of standard SM, that is compatible with maximum entropy. But the converse is not always true.

    At those "rare" moments the system is out of equilibrium, because the fluctuation has put it out of equilibrium.

    dextercioby

    I am sorry to say this, but regarding fluctuations you are rather wrong (even assuming that the thermodynamic limit had some rigorous mathematical basis).

    Fluctuations do not violate Boltzmann's H theorem. This is a very common misunderstanding of Boltzmann's H theorem that one finds in the literature.

    For a more rigorous treatment of this topic, you can see my previous preprint (search on Google for physchem/0109003), my analysis of the situation, and the references cited therein (especially references 11, 12, and 13), or you can also consult an improved version that will be freely available on my web site in a few days (www.canonicalscience.com).

    You said that they "are extremely improbable. But the probability is nevertheless nonzero." This is totally false; fluctuations are completely common and verifiable experimentally with basic laboratory equipment. One can compute the size of a temperature fluctuation using Einstein's well-known formula. A simple two-decimal-digit thermometer or a commercial electronic pH meter is sufficient for detecting fluctuations in temperature or in concentration, even in macroscopic matter.

    Fluctuations in gas systems of around 100 molecules are even more abrupt.


    JesseM

    Just a note for you: the "principle of ergodicity" (sometimes called the ergodic hypothesis, among other names) is only applicable to equilibrium states. In other approaches, it is a theorem derived only for equilibrium states.

    The equilibrium ensembles that you cited are the "microscopic representations" of the state of macroscopic equilibrium. The ensemble corresponding to the usual macroscopic equilibrium state of the equilibrium thermodynamics that appears in textbooks is the microcanonical one. The canonical ensemble is for non-isolated systems interacting with a heat bath, and the grand canonical for open systems. All three are for equilibrium situations, but only the first is for isolated systems and has a direct link with the usual thermodynamics of closed systems.

    The slides are rather confusing and perhaps wrong. The second slide, for example, appears to say that at equilibrium one can also find macrostates with a small number of microstates (i.e. nonequilibrium ones). This is not true, since those states would not be equilibrium states. At equilibrium one finds just the equilibrium macrostate.

    For example, macrostates 1 and 2 in slide 1 are not equilibrium states (assuming that one can rigorously define equilibrium for a system of 4 particles). In that case macrostates 1 and 2 are accessible to the system, but would be seen as fluctuations away from the macroscopic equilibrium. I am preparing an article on this topic from a new, more general formulation (valid also for nonequilibrium states and mesoscopic regimes), rather rigorous and free of that class of misunderstanding.

    If the number of particles N was 1000, the equilibrium macrostate would have nL=500. Rigorously, the equilibrium microstates are those with nL=500; any other value (e.g. nL=200) would be a spontaneous fluctuation moving the system out of the equilibrium state.
     
    Last edited: May 19, 2005
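    [For scale, the "Einstein well-known formula" for temperature fluctuations mentioned above, <(dT)^2> = k_B T^2 / C_V, can be sketched under the assumption of an ideal monatomic gas, for which C_V = (3/2) N k_B and hence dT_rms / T = sqrt(2 / (3N)):]

```python
import math

# Einstein's estimate for temperature fluctuations: <(dT)^2> = k_B T^2 / C_V.
# Assumption: ideal monatomic gas with C_V = (3/2) N k_B, which gives the
# relative fluctuation dT_rms / T = sqrt(2 / (3 N)).
def relative_dT(n_molecules):
    return math.sqrt(2.0 / (3.0 * n_molecules))

for n in (100, 10**6, 6 * 10**23):
    print(f"N = {n:.1e}: dT_rms/T = {relative_dT(n):.1e}")
```

    [For N around 100 this gives a relative fluctuation of roughly 8%, while for a mole of gas it is of order 1e-12; that is the quantitative content of the disagreement above.]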
  14. May 19, 2005 #13

    JesseM

    Science Advisor

    Thanks, I'll look into these. One other quick question--you said the probability density function would be time-independent, which would mean the probability of finding the system in any microstate would be constant over time. So suppose we only have three microstates a, b, and c, and P(a) is the time-independent probability of finding the system in microstate a, P(b) the probability of finding it in b, P(c) of finding it in c. Also say that if you examine it and find it in a, and then after some time t examine it again, the probability it will then be found in b is P(a -> b), and so forth for the other transitions. Given this, would the time-independent condition imply the following must be true:

    P(a)*P(a -> a) + P(b)*P(b -> a) + P(c)*P(c -> a) = P(a)

    and likewise

    P(a)*P(a -> b) + P(b)*P(b -> b) + P(c)*P(c -> b) = P(b)

    and

    P(a)*P(a -> c) + P(b)*P(b -> c) + P(c)*P(c -> c) = P(c)

    ?
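    [A numerical aside: the three conditions above say exactly that P is a stationary distribution of the transition probabilities, i.e. a left eigenvector of the transition matrix with eigenvalue 1. A small check with hypothetical transition probabilities, chosen doubly stochastic so that the uniform distribution works:]

```python
# Hypothetical 3-state transition probabilities; T[j][i] is the probability
# of finding the system in state i a time t after it was found in state j.
# This matrix is doubly stochastic, so the uniform P is stationary.
states = ("a", "b", "c")
T = {
    "a": {"a": 0.5, "b": 0.3, "c": 0.2},
    "b": {"a": 0.3, "b": 0.4, "c": 0.3},
    "c": {"a": 0.2, "b": 0.3, "c": 0.5},
}
P = {s: 1.0 / 3.0 for s in states}  # candidate time-independent distribution

# Check the condition above: sum_j P(j) * P(j -> i) = P(i) for each i.
for i in states:
    inflow = sum(P[j] * T[j][i] for j in states)
    assert abs(inflow - P[i]) < 1e-12
print("P is stationary")
```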
     
  15. May 19, 2005 #14

    dextercioby

    Science Advisor
    Homework Helper

    "So the more or less "spontaneous" separations of the molecules (which would violate Boltzmann's H theorem) are extremely improbable. But the probability is nevertheless nonzero."

    Nope, there's my quote above yours. I didn't say what you claim I said, so please don't put words into my mouth. I might choke... :rolleyes:

    You obviously haven't read my post.

    Here's my quote:

    "Funny you mention fluctuations. You know that for equilibrium ensembles they go to zero in the thermodynamic limit for all observable quantities."

    It's correct. :approve:




    Great, I think the traditional approach to equilibrium SM is crap. :yuck: :tongue: The axiomatic one rules. :approve:


    Daniel.
     
  16. May 19, 2005 #15

    dextercioby

    Science Advisor
    Homework Helper

    More or less spontaneous separations of molecules in a closed system would definitely lower the entropy and come into conflict with the H theorem. However, they are possible solutions under Poincaré's recurrence theorem. But, as I said, highly improbable.

    These separations (which in a way remind one of Maxwell's demon) are not fluctuations. I know very well what fluctuations are.

    Daniel.
     
  17. May 19, 2005 #16
    I think that I quoted correctly.

    I'm sorry, but I thought that I had quoted you correctly in my previous post.

    1) There is no violation of the H theorem (as already explained).

    2) In the "thermodynamic limit", that is, constant concentration and large systems, ignoring surface effects of order (1/N), the fluctuations are not extremely improbable. They are very probable; in fact we can measure them without problems even in 1000 L of water. For small systems (e.g. 1000 molecules) fluctuations are even stronger and easily detectable.

    That was my point.
     
    Last edited: May 19, 2005
  18. May 19, 2005 #17
    Regarding the "axiomatic" formulation: it does not work better than the traditional SM approach.
     
  19. May 19, 2005 #18
    The lowering of entropy by "more or less" spontaneous separation of molecules in a closed system does not violate the H-theorem. Such separations are not highly improbable; they can be computed from the ratio of trajectories and measured in the laboratory.

    Moreover, could you say whether that is a fluctuation, to avoid possible misunderstanding on my part of your words?
     
  20. May 19, 2005 #19

    dextercioby

    Science Advisor
    Homework Helper

    If in an isolated closed system the entropy decreases by the formation of "clusters" of molecules, then the H theorem is violated.

    I didn't say the reverse.

    It doesn't, but it's definitely more elegant. Incidentally, as F. Schwabl [1] shows, it's more correct to use Boltzmann's formula instead of Gibbs' for nonequilibrium systems. But for equilibrium SM the two formulas describe the same physics, just in different ways.

    Daniel.

    ----------------------------------------------------------
    [1] F. Schwabl, "Statistical Mechanics", Springer Verlag.
     
  21. May 19, 2005 #20

    In an isolated closed system, a decrease of entropy does not violate the H-theorem. See the references quoted above.

    Perhaps it is a bit more elegant, but I do not see why it would be more correct to use Boltzmann's formula instead of Gibbs' for nonequilibrium systems. Both agree, if you are referring to the entropy formulas.
     