Entropy and partition function

  1. Jan 24, 2008 #1
    Is it possible to obtain the relation
    [tex] S = \log Z + \langle U \rangle /T[/tex]
    directly from the Boltzmann distribution?

Edit: It seems that we can if we use the von Neumann (Gibbs) entropy:
    [tex] S = -\Sigma p_i \log p_i [/tex]
    This suggests that the entropy of a single microstate should be
    [tex] s = -\log( \frac{e^{-\epsilon \beta}}{Z})[/tex]

    Is there some way to justify this last formula?
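Substituting the Boltzmann probabilities [tex]p_i = e^{-\beta\epsilon_i}/Z[/tex] into this entropy formula does seem to give the first relation directly (a quick sketch, with [tex]k_B = 1[/tex]):
[tex] S = -\sum_i p_i \log p_i = \sum_i p_i \left(\beta\epsilon_i + \log Z\right) = \log Z + \beta\langle U \rangle = \log Z + \frac{\langle U \rangle}{T}[/tex]
using [tex]\sum_i p_i = 1[/tex] and [tex]\sum_i p_i \epsilon_i = \langle U \rangle[/tex].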
    Last edited: Jan 25, 2008
  3. Jan 25, 2008 #2
Using log rules the last line becomes
[tex]\ln Z + \epsilon\beta[/tex]
which suggests identifying [tex]\epsilon/(kT)[/tex] with [tex]\langle U \rangle / T[/tex].
Does this help? Is epsilon the energy of the particular state?
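Written out, that log-rule step is (a small sketch)
[tex] s = -\log\frac{e^{-\beta\epsilon}}{Z} = \log Z + \beta\epsilon = \log Z + \frac{\epsilon}{kT}[/tex]
so averaging over states with the Boltzmann weights turns [tex]\epsilon/(kT)[/tex] into [tex]\langle U \rangle / T[/tex].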
  4. Jan 25, 2008 #3
Yes, epsilon is the energy of some particular state. My question is: is there some way to arrive at [tex] s = -\log (e^{-\epsilon \beta}/Z)[/tex] from the more familiar definition [tex] S = \log \Omega[/tex], where [tex]\Omega[/tex] is the number of accessible microstates?
  5. Jan 25, 2008 #4
    I just gave it a fair whack.
Got [tex] S/k = Z + \sum \beta E \, \frac{e^{-\beta E}}{Z}[/tex]
Haven't done stat. mech. for about a year, so I am a bit rusty. Does this help?
My experience is that somewhere along the way there is a small math trick that helps; spotting it is the hard part.
  6. Jan 26, 2008 #5
Yes, I have already got that far. I think you mean [tex]\log Z[/tex] in the first term.

In any event, I think I have figured this one out. In the microcanonical ensemble, the entropy is given by [tex] \log N[/tex], where N is the number of accessible microstates. By the fundamental assumption, the system is equally likely to be in any of these microstates, so each has probability [tex]p_i = 1/N[/tex], and we can rewrite [tex] \log N = - \Sigma_{i=1}^{N} \frac{1}{N} \log \left(\frac{1}{N}\right) = - \Sigma_i p_i \log p_i[/tex]. Thus, in the canonical ensemble, the appropriate generalization is the formula given above.

    All I was trying to do here was to see explicitly the connection between entropy in these two ensembles.
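As a numerical sanity check of that connection (a sketch in Python; the energy levels and temperature below are made-up values, and [tex]k_B = 1[/tex] throughout):
[code]
import numpy as np

# Toy spectrum and temperature (hypothetical values, k_B = 1).
energies = np.array([0.0, 1.0, 2.0, 5.0])
T = 1.5
beta = 1.0 / T

# Boltzmann weights, partition function, and canonical probabilities.
weights = np.exp(-beta * energies)
Z = weights.sum()
p = weights / Z

# Gibbs entropy of the canonical distribution.
S_gibbs = -np.sum(p * np.log(p))

# Thermodynamic form S = log Z + <U>/T.
U_mean = np.sum(p * energies)
S_thermo = np.log(Z) + U_mean / T
print(S_gibbs, S_thermo)  # agree to machine precision

# Uniform probabilities over N states recover the microcanonical S = log N.
N = len(energies)
p_uniform = np.full(N, 1.0 / N)
print(-np.sum(p_uniform * np.log(p_uniform)), np.log(N))
[/code]
Both printed pairs agree: the same Gibbs sum gives [tex]\log Z + \langle U \rangle / T[/tex] for Boltzmann weights and [tex]\log N[/tex] for equal weights.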
  7. Jan 26, 2008 #6
    You can just write the partition function as

    [tex]Z = \int dE \Omega\left(E\right)\exp\left(-\beta E\right)[/tex]

Next, take the log of the integrand and expand around its maximum, so you write the integrand as exp(expansion of the log) (i.e. you use the steepest descent method). You obtain the relation as the leading-order term, which becomes exact in the thermodynamic limit.
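To sketch that step (with [tex]k_B = 1[/tex] and [tex]S(E) = \log\Omega(E)[/tex]): the integrand is [tex]\exp\left[S(E) - \beta E\right][/tex], which peaks where [tex]\partial S/\partial E = \beta[/tex], i.e. at [tex]E = \langle U \rangle[/tex]. Keeping only the peak value,
[tex] Z \approx e^{S(\langle U \rangle) - \beta\langle U \rangle} \quad\Longrightarrow\quad \log Z \approx S - \frac{\langle U \rangle}{T}[/tex]
with corrections (from the width of the peak) that are subleading in the system size.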
  8. Jan 26, 2008 #7
The thermodynamic potential for a system in thermal contact with a bath is the Helmholtz free energy F:

[tex]F = U - TS[/tex]

so [tex]S=-\frac{\partial F}{\partial T}[/tex]

but [tex]F=-\frac{1}{\beta}\ln Z[/tex]

[tex]S=k\ln Z+\frac{1}{\beta Z}\frac{\partial Z}{\partial T}[/tex]

and the last term is the mean energy divided by T.
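Spelling out that last step (this post keeps Boltzmann's constant k explicit, unlike the [tex]k=1[/tex] convention earlier in the thread): since [tex]\frac{\partial Z}{\partial T} = \sum_i \frac{\epsilon_i}{kT^2}\, e^{-\beta\epsilon_i}[/tex],
[tex] \frac{1}{\beta Z}\frac{\partial Z}{\partial T} = \frac{kT}{Z}\sum_i \frac{\epsilon_i}{kT^2}\, e^{-\beta\epsilon_i} = \frac{1}{T}\sum_i \epsilon_i\, \frac{e^{-\beta\epsilon_i}}{Z} = \frac{\langle U \rangle}{T}[/tex]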