
H theorem: equilibrium in statistical mechanics

  1. Aug 20, 2009 #1
    Hi guys, I am reading Reif's book on statistical mechanics and have a question on the H theorem. In section 2.3, Reif gives (on page 54) both the definition of equilibrium as well as a fundamental postulate.

    Definition: "An equilibrium situation is characterized by the fact that the probability of finding the system in any one state is independent of time (i.e., the representative ensemble is the same irrespective of time). All macroscopic parameters describing the isolated system are then also time-independent."

    Fundamental Postulate: "An isolated system in equilibrium is equally likely to be in any of its accessible states."

    A few pages later Reif talks about how nonequilibrium situations tend to approach equilibrium, and then refers me to his appendix A.12 on the H theorem (pages 624-626). His arguments are essentially those on the Wikipedia page (read up to deltaS>=0):


    Here is the question: look at the formula for dS/dt just before deltaS>=0. It appears as if dS/dt is equal to zero if and only if (correct?) Pa = Pb for all a and all b, since each term in the double summation contributes a positive value to dS/dt unless the probabilities are equal. It therefore seems as if one could prove the fundamental postulate I listed above as follows: if the system is isolated and in equilibrium, then the probabilities Pi do not change in time, by virtue of Reif's definition of equilibrium. Thus, in particular, H (or S = -kH) does not change in time. However, dH/dt (or dS/dt) is zero if and only if all the probabilities are equal, and therefore we may conclude that isolation plus equilibrium implies the system is equally likely to be in any of its accessible states.

    This "proof" cannot be correct, for Reif emphasized that the fundamental postulate cannot be proved; it is an axiom on which statistical mechanics is based. So how is the above incorrect? Where did I go wrong?
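    The sign claim in the argument above can be checked numerically. A minimal sketch (the rates W_ab, the distribution, and the seed are made-up illustration values, not from Reif): every term (P_a - P_b)(ln P_a - ln P_b) is non-negative, and all terms vanish exactly when the probabilities are equal.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5

    # Symmetric transition rates W_ab = W_ba (arbitrary positive values)
    W = rng.random((n, n))
    W = (W + W.T) / 2
    np.fill_diagonal(W, 0.0)

    # A non-uniform probability distribution
    P = rng.random(n)
    P /= P.sum()

    # dS/dt = (k/2) * sum_ab W_ab (P_a - P_b)(ln P_a - ln P_b), with k = 1
    dS_dt = 0.5 * sum(W[a, b] * (P[a] - P[b]) * (np.log(P[a]) - np.log(P[b]))
                      for a in range(n) for b in range(n))
    print(dS_dt)  # positive: each term is (x - y)(ln x - ln y) >= 0

    # With equal probabilities every term vanishes
    P_eq = np.full(n, 1.0 / n)
    dS_dt_eq = 0.5 * sum(W[a, b] * (P_eq[a] - P_eq[b]) * (np.log(P_eq[a]) - np.log(P_eq[b]))
                         for a in range(n) for b in range(n))
    print(dS_dt_eq)  # exactly 0
    ```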

    Thanks in advance for all your help. :smile: (By the way, how do you pronounce the author's name?)
  3. Aug 21, 2009 #2
    I agree.
    This equation:

    dS/dt = (k/2) Σ_{a,b} W_{ab} (P_a − P_b) (ln P_a − ln P_b) >= 0

    implies the probabilities are all the same at equilibrium.

    Note that this then appears as a consequence of Fermi's golden rule.
    This also implies the existence of a continuum of states, like EM radiation, and this brings us back close to the beginning of the story: Max Planck.

    Have you come across any good reading about a link between continuum states and the second law?
  4. Aug 21, 2009 #3
    Second comment: what about the classical equivalent?
    I learned it by cutting phase space into small cells of size hbar.
    I was very unsatisfied by that, for two reasons:

    - I never clearly understood the equiprobability in this case
    - I was bothered by the reference to QM: couldn't the H-theorem be derived without it?

    Any hint?
  5. Aug 21, 2009 #4
    There is no assumption of a continuum of states here at all. What you need to assume is that the probability per unit time for a system to make a transition from state |i> to state |j>, W_{i,j}, is symmetric:

    W_{i,j} = W_{j,i}
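    Under that symmetry assumption, the master equation dP_i/dt = Σ_j W_{i,j} (P_j − P_i) drives any initial distribution toward the uniform one. A minimal Euler-integration sketch (the rates, seed, and step size are arbitrary choices of mine, just to illustrate the relaxation):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 4

    # Symmetric transition rates, W[i, j] = W[j, i]
    W = rng.random((n, n))
    W = (W + W.T) / 2
    np.fill_diagonal(W, 0.0)

    # A random initial distribution
    P = rng.random(n)
    P /= P.sum()

    dt = 1e-3
    for _ in range(50_000):
        # Master equation: dP_i/dt = sum_j W_ij (P_j - P_i)
        P = P + dt * (W @ P - W.sum(axis=1) * P)

    print(P)  # close to the uniform distribution [0.25, 0.25, 0.25, 0.25]
    ```

    Note that symmetry of W is exactly what makes the uniform distribution stationary: for P_i = 1/n, the gain and loss terms cancel row by row.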
  6. Aug 21, 2009 #5
    The reason why you can't use the H-theorem to prove the equal prior probability postulate is simply that the H-theorem makes an assumption which is known to be false.

    To see that something must be wrong, recall that the transition probabilities are taken to be symmetric, yet the H-theorem says that the entropy will increase in time or stay the same. This suggests that the entropy should in fact stay constant, and that's indeed what a more precise analysis shows: due to unitary time evolution, the fine-grained entropy stays equal to zero at all times.

    To regain the nonzero entropy we use in thermodynamics, one has to perform a coarse-graining procedure. To see that this coarse-grained entropy increases, one has to assume some special low-entropy initial conditions. The fact that entropy increases is then due to these assumed conspiratorial initial conditions.
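    Both halves of this point can be checked in a few lines: the fine-grained (von Neumann) entropy is invariant under unitary evolution (zero for a pure state), while a coarse-graining step — here, discarding the coherences, i.e. the off-diagonal elements in some basis — can only raise it. A sketch with a random mixed state and a random Hamiltonian (all numbers made up for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    d = 4

    def entropy(rho):
        """Von Neumann entropy S = -Tr(rho ln rho), via the eigenvalues."""
        w = np.linalg.eigvalsh(rho)
        w = w[w > 1e-12]
        return -np.sum(w * np.log(w))

    # A mixed state: random diagonal density matrix in some basis
    p = rng.random(d)
    p /= p.sum()
    rho = np.diag(p)

    # Unitary time evolution U = exp(-i H t) from a random Hermitian H
    H = rng.random((d, d)) + 1j * rng.random((d, d))
    H = H + H.conj().T
    E, V = np.linalg.eigh(H)
    t = 1.7
    U = V @ np.diag(np.exp(-1j * E * t)) @ V.conj().T

    rho_t = U @ rho @ U.conj().T
    print(entropy(rho), entropy(rho_t))  # equal: unitaries preserve the spectrum

    # Coarse graining: discard the coherences (off-diagonal elements)
    rho_cg = np.diag(np.real(np.diag(rho_t)))
    print(entropy(rho_cg))  # >= entropy(rho_t)
    ```

    The last inequality is the quantum version of coarse graining: the diagonal of a density matrix is majorized by its spectrum, so dephasing never lowers the entropy.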
  7. Aug 21, 2009 #6
    Isn't the coarse-graining procedure somehow included/hidden in Fermi's golden rule?
    The question of unitary evolution appears to me to be the same for the second law, the H-theorem, and Fermi's golden rule.

    As a student I was really shocked by Fermi's golden rule, precisely because it replaces an assumed unitary evolution by something which is not unitary anymore. And I'll keep silent about other difficulties, like the recourse to a "perturbation" that nature obviously ignores. I'll also avoid relating this topic to the projection postulate of QM, to avoid more useless headaches.

    I think that the H-theorem, as well as Fermi's golden rule, is difficult to teach and even harder to learn. It doesn't hurt to forget about the subtleties: after long considerations and more subtle maths, one ends up with the same result.
  8. Aug 21, 2009 #7
    In the case of Fermi's golden rule, you use the formula for the transition probability, consider a transition to a group of final states in the infinite-volume limit (in that limit there is a continuum of states), and take the limit t --> infinity.

    Because of these limits you can ignore coherences between the different final states (so you can just integrate the squared absolute values of the amplitudes). You will also not see the system evolving back to the initial state.
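    The effect of summing over a dense band of final states can be seen numerically: the first-order transition probability to a single state oscillates, but integrated over the band it grows linearly in t, giving the constant golden-rule rate 2*pi*rho0*|V|^2. A rough sketch (V, rho0, and the band are made-up numbers; hbar = 1):

    ```python
    import numpy as np

    V = 0.01      # matrix element (assumed constant over the band)
    rho0 = 50.0   # density of final states, per unit energy

    def P_one(w, t):
        """First-order transition probability to one state detuned by w."""
        return V**2 * (np.sin(w * t / 2) / (w / 2)) ** 2

    # Dense band of final states; grid chosen to avoid w = 0 exactly
    w = np.linspace(-20, 20, 400_000)
    dw = w[1] - w[0]

    for t in (5.0, 10.0, 20.0):
        P_total = np.sum(P_one(w, t)) * rho0 * dw
        rate = P_total / t
        print(t, rate)  # rate is ~ constant = 2*pi*rho0*V**2
    ```

    The t --> infinity limit is what makes the sinc^2 factor act like a delta function of width 1/t, so only (near-)energy-conserving transitions survive.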
  9. Aug 21, 2009 #8
    Thanks guys for the responses.

    Count Iblis:

    I think I understood what you said, but let me make sure I really have it. The transition probabilities W_{s,r} and W_{r,s} are due to perturbations from a small perturbing Hamiltonian H_1, and they are proportional to (in quantum bra-ket notation) |<r|H_1|s>|^2 and |<s|H_1|r>|^2. If there is no perturbing Hamiltonian H_1 and the system is in an exact energy eigenstate |E>, then it will, as you said, stay in it due to the unitary time evolution.

    Now, since the system is isolated (remember the wording of the fundamental postulate of statistical mechanics), there cannot be any perturbing Hamiltonian H_1, just the main one; let's call it H. Therefore, we cannot speak of a transition from one energy eigenstate to another, because such a thing does not happen. That is why Fermi's golden rule does not work in this case.

    This raises the question: if the system is not in an exact quantum eigenstate, then how are we supposed to describe systems in statistical mechanics? The fundamental postulate speaks of probabilities; what, then, are these probabilities? I will show you in the next two paragraphs an idea I had for how we're supposed to describe systems in statistical mechanics. Is it correct? Please let me know.

    As you said in your second post, in statistical mechanics we never take an exact energy, but a range between E and E + dE, where dE is macroscopically small but microscopically large (large!). How would we describe the system? Here's how I believe it would be done: we take all the possible wavefunctions |s> of the system whose mean (quantum mechanical) energy <s|H|s> lies between E and E + dE. All these states form our statistical ensemble.

    Now, there are many, many energy eigenstates between E and E + dE; call them |Ej>. In general, the quantum mechanical probability |<Ej|s>|^2 is different for different states |s>. But if we average all of these quantum mechanical probabilities across all the different states |s> that fit in the energy range, we obtain the statistical mechanics probabilities Pj, and these would be the probabilities the fundamental postulate refers to. Is this right? Is this how statistical mechanics is done?
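    One way to see how this averaging could come out equal: if the states |s> are drawn uniformly at random from the subspace spanned by the |Ej>, the average of each |<Ej|s>|^2 is exactly 1/N. A small NumPy sketch (N, the sample count, and the sampling scheme are my own choices; this illustrates a property of uniformly drawn states, it does not prove the postulate):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N = 6               # eigenstates |Ej> spanning the shell [E, E + dE]
    samples = 100_000   # ensemble members |s>

    # Random states |s> in the shell: complex Gaussian amplitudes, normalized.
    # Normalizing an isotropic Gaussian gives the uniform distribution on the sphere.
    c = rng.normal(size=(samples, N)) + 1j * rng.normal(size=(samples, N))
    c /= np.linalg.norm(c, axis=1, keepdims=True)

    # Quantum probabilities |<Ej|s>|^2 per state, then the ensemble average Pj
    Pj = np.mean(np.abs(c) ** 2, axis=0)
    print(Pj)  # each entry close to 1/N
    ```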

    Finally, I'm sorry to say, while I learned quantum mechanics at the level of Shankar's book, I'm just a beginner at thermal physics in general. Thus, I didn't really understand lalbatros's and Count Iblis's last posts. Are they relevant to what I just posted above? Like I said, now that I have seen why Fermi's golden rule does not apply here, I'm only interested in figuring out how states are handled in statistical mechanics.

    Thank you.
  10. Aug 22, 2009 #9
    Yes, what you do is consider a big ensemble of systems that would all be in the same macrostate but whose exact quantum states are different. The assumption is then that averaging over this ensemble will yield the correct answer for macroscopic observables of an individual system.

    If you have some individual member of the ensemble and measure e.g. the pressure, then that measurement can be formulated as a complicated observation of the many-particle system that involves a time average. So, the assumption is actually that if you take an individual member of the ensemble and perform some averaging over time of an observable, then you can replace that chaotic time averaging by averaging over the entire ensemble. The time averaging can thus be seen as a sort of poll of the entire ensemble (or vice versa).

    Now, an ensemble of systems that are in exact eigenstates would look time independent. But you know that you can formulate the expectation value of an observable as a trace over states and then you can evaluate that in any basis.

    The space spanned by energy eigenstates within the energy interval of dE is large enough to contain states that you can picture in a semi-classical way as containing molecules that move around.
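    The basis-independence of the trace mentioned above is easy to verify directly: Tr(rho A) gives the same expectation value no matter which orthonormal basis you evaluate it in. A minimal NumPy sketch (the density matrix, observable, and basis change are random, purely for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    d = 5

    # A density matrix rho (positive, unit trace) and a Hermitian observable A
    M = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = M @ M.conj().T
    rho /= np.trace(rho).real
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    A = A + A.conj().T

    # <A> = Tr(rho A), evaluated in the original basis...
    expval_1 = np.trace(rho @ A)

    # ...and after transforming both operators to a random orthonormal basis Q
    Q, _ = np.linalg.qr(rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d)))
    expval_2 = np.trace((Q.conj().T @ rho @ Q) @ (Q.conj().T @ A @ Q))

    print(expval_1.real, expval_2.real)  # identical
    ```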
  11. Aug 22, 2009 #10

    Clearly the H-theorem is a consequence of Fermi's golden rule.
    I also think, indeed, that ignoring coherences is the quantum way of coarse graining.
  12. Aug 22, 2009 #11
    Okay, so you're saying that statistical mechanics is done the following way:

    1. Specify the macrostate of the ensemble of systems. This includes the dimensions of the container holding the gas (e.g., the volume). It may also include external electromagnetic fields. Finally, it must include the energy E, measured to a macroscopic accuracy of dE.
    2. Take a large ensemble and measure certain values. You could, as you said, measure the average pressure by performing complicated observations on all the systems. Or you could measure the exact energy and determine the average probability of being in an exact eigenstate (called a microstate in statistical mechanics). This statistical mechanical probability Pj is obtained by averaging all the |<Ej|s>|^2. According to the fundamental postulate, if one starts with a system in equilibrium, then the values of all the average probabilities Pj should come out the same whether we waited 10 seconds or 20 seconds to take the measurements. But if the system is not in equilibrium, then we will get different, unequal values for the average probabilities depending on whether we waited 10 or 20 seconds. Of course, as mentioned above, we cannot use the H theorem to prove the fundamental postulate, since the H theorem speaks of perturbations and transition probabilities, which make no sense for an isolated system; it's applying a rule to the wrong situation.

    This leaves only one question, and then I think I've understood it: Count Iblis, you mentioned that we may either do a time average of the measurement or use a large ensemble and perform one measurement on each member. How can the time averaging work? Doesn't a quantum mechanical measurement change the state of the system? Thus, don't we have to use an average over systems in an ensemble rather than one system over time? If we keep measuring the pressure of a single system, we're going to get the same number again, again, again... and thus we won't have a good average value. Correct?