
Derivation of the key principle of statistical mechanics

  1. Feb 19, 2008 #1
    I am looking for an educational (or any available and detailed) derivation of the key principle of statistical mechanics:
    If a system in equilibrium can be in one of N states, then the probability of the system having energy E_n is P_n = (1/Q) e^{-E_n/kT},
    where Q is the partition function.
    I have looked it up in some books and did not find a real derivation; perhaps the derivation is too complicated? I am wondering whether Boltzmann was the first to derive it, whether his derivation is correct, and whether it has been revised since then.
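    For concreteness, the distribution in question can be checked numerically on a toy system (a sketch; the three energy levels below are made up, in units where kT carries the energy scale):

```python
import math

def boltzmann_probs(energies, kT):
    """Return P_n = exp(-E_n/kT) / Q for each level E_n."""
    weights = [math.exp(-E / kT) for E in energies]
    Q = sum(weights)                  # partition function
    return [w / Q for w in weights]

# Hypothetical three-level system (energies in units of kT)
probs = boltzmann_probs([0.0, 1.0, 2.0], kT=1.0)
print(probs)
```

    The probabilities sum to 1 by construction, and lower-energy states are always more probable at positive temperature.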
  3. Feb 19, 2008 #2



    A complete derivation can be found in McQuarrie's Statistical Mechanics, and probably in Hill's book too, which should be easier to find. It relies on the use of Lagrange multipliers to satisfy the particular constraints of an ensemble while determining the most likely distribution of states.
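    A sketch of the Lagrange-multiplier argument those books carry out in full (for occupation numbers n_i of levels E_i, with N = Σ n_i; the term N ln N is treated as constant under the constraints):

```latex
% Number of arrangements, with Stirling's approximation \ln n! \approx n \ln n - n:
W = \frac{N!}{\prod_i n_i!}, \qquad
\ln W \approx N \ln N - \sum_i n_i \ln n_i .

% Maximize \ln W subject to \sum_i n_i = N and \sum_i n_i E_i = E,
% using Lagrange multipliers \alpha and \beta:
\frac{\partial}{\partial n_i}\Big[\ln W - \alpha \sum_j n_j - \beta \sum_j n_j E_j\Big]
  = -\ln n_i - 1 - \alpha - \beta E_i = 0
\;\Longrightarrow\; n_i = e^{-1-\alpha}\, e^{-\beta E_i} .

% Normalization fixes e^{-1-\alpha}, and comparison with thermodynamics
% identifies \beta = 1/kT:
\frac{n_i}{N} = \frac{e^{-E_i/kT}}{Q}, \qquad Q = \sum_j e^{-E_j/kT} .
```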

    Boltzmann was the first to explore this area for dilute gases, and the work was relatively easy because the gas molecules can be assumed to be independent. It was Gibbs, however, who came up with the idea on which stat mech is based. He envisioned a virtual ensemble of systems, all identical to the original system in bulk properties, but different in their arrangement of microstates, and he applied thermodynamics to the ensemble. Genius!
  4. Feb 19, 2008 #3
    Thank you, I will look it up.
  5. Mar 13, 2008 #4
    A really nice reference is Callen (Thermodynamics and an Introduction to Thermostatistics). Alas, quantum proofs are easier to carry out than classical ones (have a look in the books of Landau, Huang, and Khinchin, which are technical), but you really don't need QM (that is, discrete states over phase space) to derive this kind of result. Pauli's stat mech book is also excellent: small and straightforward.
  6. Mar 13, 2008 #5
    The proof's not that hard, and I believe it can be found on Wikipedia. I'd be more interested in the validity of the assumption that each state has equal likelihood. How can we justify this, and under what circumstances is it true?
  7. Mar 13, 2008 #6
    Is this assumption the only unproved one, i.e. that each state has the same likelihood?
    I am still trying to understand the proof; it does not seem complete to me. I thought that maybe I am missing something that everybody else sees?
  8. Mar 14, 2008 #7
    Well, it depends on the constraints that act on the physical system of interest. From a dynamical point of view, the basic assumption is that the system's dynamics must be ergodic; roughly speaking, the phase space must be densely covered by the system's trajectories. As far as I know, this assumption cannot be proved in general, and it clearly fails for nonequilibrium dynamics (transient states). Another kind of argument is entropic/information-theoretic: entropy maximization with respect to the applied constraints (temperature, pressure, chemical potential, ...). Entropy maximization is one of the postulates of stat mech and thermodynamics, but it can also be demonstrated on the basis of the Boltzmann H-theorem (kinetic theory). All this is very well written up in Pauli's book, "Stat. Mech.", Dover (see also Huang). My feeling is that looking on the web won't be a great idea for this difficult, fundamental, intricate subject. Actually, there is a lot of research on this subject, because nowadays nano-objects increasingly operate in out-of-equilibrium regimes, and equilibrium stat mech barely applies to them (see the Gallavotti-Cohen fluctuation theorems, the Jarzynski and Crooks equalities, Anosov systems, Ruelle SRB measures, ...).
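    The entropy-maximization route can be illustrated numerically: among Boltzmann distributions, the multiplier β is fixed by the mean-energy constraint, and since ⟨E⟩ decreases monotonically with β, it can be found by bisection (a sketch on made-up levels):

```python
import math

def mean_energy(energies, beta):
    """Mean energy <E> of the Boltzmann distribution at inverse temperature beta."""
    w = [math.exp(-beta * E) for E in energies]
    Q = sum(w)
    return sum(E * wi for E, wi in zip(energies, w)) / Q

def solve_beta(energies, target_E, lo=1e-6, hi=50.0):
    """Bisect for the beta whose Boltzmann distribution has mean energy target_E.
    <E> decreases monotonically with beta, so bisection is safe."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_energy(energies, mid) > target_E:
            lo = mid   # mean energy too high: need larger beta (colder)
        else:
            hi = mid
    return 0.5 * (lo + hi)

levels = [0.0, 1.0, 2.0]          # hypothetical spectrum
beta = solve_beta(levels, target_E=0.6)
print(beta)
```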
    Last edited: Mar 14, 2008
  10. Mar 16, 2008 #9
    I have the following problems, which I feel are connected:
    1. It seems to me that the proof is for a set of distinguishable particles; I did not see it developed for a set of indistinguishable particles.
    2. You could say that this is because the treatment is classical and classical particles are distinguishable. On the other hand, the energies must form a quasi-continuous spectrum, which is important for the proof. So we are looking at very small volumes (I was also wondering how small we can go), which must bring in quantum effects. Are the particles still distinguishable in volumes as small as we need?
    3. So perhaps what I am looking for is a proof that holds for indistinguishable quantum particles.
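    The counting difference behind points 1-3 can be made concrete: N distinguishable particles in M one-particle states give M^N arrangements, while indistinguishable particles with unrestricted occupancy (Bose counting) give C(N+M-1, N). A toy check, not the full quantum derivation:

```python
from math import comb
from itertools import product

def count_distinguishable(N, M):
    """Each of N labeled particles independently picks one of M states."""
    return M ** N

def count_bose(N, M):
    """Indistinguishable particles, any occupancy: stars-and-bars count."""
    return comb(N + M - 1, N)

def count_bose_brute(N, M):
    """Brute force: collapse labeled assignments into unordered occupancies."""
    occupancies = {tuple(sorted(a)) for a in product(range(M), repeat=N)}
    return len(occupancies)

print(count_distinguishable(2, 3), count_bose(2, 3))  # prints: 9 6
```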
  11. Mar 16, 2008 #10
    Wikipedia mentions that Fermi-Dirac statistics deal with indistinguishable particles. As for how small you can go, the discussion page of the Wikipedia article touches on this. I believe there are higher-order Stirling approximations that are dealt with in modern physics books. Also, if you are looking for more issues to question, this is discussed three posts back by SeniorTotor; here's his quote: "All this is very well written up in Pauli's book, "Stat. Mech.", Dover (see also Huang)."
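    On the Stirling point: the quality of the approximation, and of its next-order correction, is easy to check directly (ln n! computed exactly via the log-gamma function):

```python
import math

def ln_factorial(n):
    """Exact ln(n!) via the log-gamma function."""
    return math.lgamma(n + 1)

def stirling(n):
    """Leading Stirling approximation: ln n! ~ n ln n - n."""
    return n * math.log(n) - n

def stirling_next(n):
    """Next order: add (1/2) ln(2*pi*n)."""
    return stirling(n) + 0.5 * math.log(2 * math.pi * n)

for n in (10, 100, 1000):
    print(n, ln_factorial(n) - stirling(n), ln_factorial(n) - stirling_next(n))
```

    Even at n = 1000 the leading form is off by several units in ln n!, while the corrected form is accurate to better than 10^-3, which is why stat-mech derivations can get away with the leading term only when n is macroscopically large.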
    Last edited: Mar 16, 2008
  12. Mar 16, 2008 #11
    Fermi-Dirac statistics are added on top of the classical statistics. I thought it should be possible to derive the formulae on the basis of indistinguishable particles from the beginning, rather than adding it on as an empirical fact of quantum mechanics.
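    The relation between the two statistics can be seen directly from the mean occupation numbers: the Fermi-Dirac occupation reduces to the Boltzmann factor when (E - mu)/kT is large (a sketch in units where k = 1; the sample values are arbitrary):

```python
import math

def fermi_dirac(E, mu, kT):
    """Mean occupation of a single-particle level of energy E (fermions)."""
    return 1.0 / (math.exp((E - mu) / kT) + 1.0)

def boltzmann(E, mu, kT):
    """Classical (Maxwell-Boltzmann) limit of the same occupation."""
    return math.exp(-(E - mu) / kT)

for E in (0.5, 2.0, 10.0):
    print(E, fermi_dirac(E, mu=0.0, kT=1.0), boltzmann(E, mu=0.0, kT=1.0))
```

    At E - mu = 10 kT the two agree to about one part in 10^4 of an already tiny occupation, which is the dilute limit where classical statistics suffice.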
  13. Mar 16, 2008 #12
    Why do you believe the derivation is empirical? I'll look at it to see if I can follow it. Where are you stuck?
  14. Apr 23, 2008 #13
    I looked in Feynman's book, Statistical Mechanics. The derivation starts with a set of distinguishable particles, and then he continues to add the requirements of the different statistics.