Entropy and partition function

AI Thread Summary
The discussion centers on deriving the relation S = log Z + ⟨U⟩/T from the Boltzmann distribution. It is established that using the VN entropy, S = -Σ p_i log p_i, allows for this derivation, with ε representing the energy of a specific state. The connection between entropy in the microcanonical and canonical ensembles is explored, emphasizing that the canonical ensemble's entropy can be expressed in terms of the partition function Z. The method of steepest descent is suggested for expanding the partition function, leading to a relationship that holds in the thermodynamic limit. Ultimately, the discussion clarifies the mathematical connections between entropy, the partition function, and energy in statistical mechanics.
Euclid
Is it possible to obtain the relation
S = \log Z + \langle U \rangle /T
directly from the Boltzmann distribution?

Edit: It seems that we can if we use the von Neumann (Gibbs) entropy:
S = -\sum_i p_i \log p_i
This suggests that the entropy associated with a single microstate should be
s = -\log\left( \frac{e^{-\epsilon \beta}}{Z}\right)

Is there some way to justify this last formula?
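As a quick sanity check, the identity S = \log Z + \langle U \rangle /T can be verified numerically from the Boltzmann distribution. A minimal sketch, assuming an arbitrary hypothetical three-level spectrum and units with k_B = 1 (so \beta = 1/T):

```python
import math

# Hypothetical three-level system (energies in units where k_B = 1)
energies = [0.0, 1.0, 2.5]
beta = 0.7  # inverse temperature, beta = 1/T

# Boltzmann distribution: p_i = exp(-beta * e_i) / Z
Z = sum(math.exp(-beta * e) for e in energies)
p = [math.exp(-beta * e) / Z for e in energies]

S_gibbs = -sum(pi * math.log(pi) for pi in p)       # S = -sum p_i log p_i
U_mean = sum(pi * e for pi, e in zip(p, energies))  # <U>
S_relation = math.log(Z) + beta * U_mean            # log Z + <U>/T

# The two expressions agree to machine precision, since
# -log p_i = beta * e_i + log Z holds term by term.
print(S_gibbs, S_relation)
```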
 
Using log rules the last line becomes
\log Z + \epsilon\beta
so the remaining step is justifying why \epsilon/(kT) turns into \langle U \rangle /T.
Does this help? Is \epsilon the energy of the particular state?
 
Yes, \epsilon is the energy of some particular state. My question is: is there some way to arrive at s = -\log (e^{-\epsilon \beta}/Z) from the more familiar definition S = \log \Omega, where \Omega is the number of accessible microstates?
 
I just gave it a fair whack.
Got S/k=Z+sum{beta.E.exp(-beta.E)/Z}
Have done stat. mech. for about a year so I am a bit rusty. Does this help?
My experience is that somewhere along the way lies a small math trick that helps, just getting to know it is hard
 
NoobixCube said:
I just gave it a fair whack.
Got S/k=Z+sum{beta.E.exp(-beta.E)/Z}
Have done stat. mech. for about a year so I am a bit rusty. Does this help?
My experience is that somewhere along the way lies a small math trick that helps, just getting to know it is hard

Yes I have already been that far. I think you mean \log Z in the first term.

In any event, I think I have figured this one out. In the microcanonical ensemble, the entropy is \log N, where N is the number of accessible microstates. By the fundamental assumption that the system is equally likely to be in any of its accessible microstates, p_i = 1/N, so this can be rewritten as \log N = -\log\left(\frac{1}{N}\right) = -\sum_i \frac{1}{N}\log\left(\frac{1}{N}\right) = -\sum_i p_i \log p_i. Thus, in the canonical ensemble, the appropriate generalization is the formula given above.

All I was trying to do here was to see explicitly the connection between entropy in these two ensembles.
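The equal-probability step above is easy to check numerically. A minimal sketch, assuming an arbitrary choice of N:

```python
import math

N = 8  # number of accessible microstates (arbitrary choice)

# Fundamental assumption: equal a priori probabilities, p_i = 1/N
p = [1.0 / N] * N

S_micro = math.log(N)                          # S = log N (microcanonical)
S_gibbs = -sum(pi * math.log(pi) for pi in p)  # -sum p_i log p_i

# The two agree: each of the N terms contributes (1/N) log N.
print(S_micro, S_gibbs)
```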
 
You can just write the partition function as

Z = \int dE \Omega\left(E\right)\exp\left(-\beta E\right)

Next, take the log of the integrand and expand it around its maximum, so you write the integrand as exp(expansion of the log) (i.e. you use the steepest descent method). You obtain the relation as the leading-order term, which becomes exact in the thermodynamic limit.
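A sketch of that steepest-descent step, writing S(E) = \log\Omega(E) for the microcanonical entropy (units with k_B = 1):

```latex
Z = \int dE\, \Omega(E)\, e^{-\beta E} = \int dE\, e^{S(E) - \beta E}
% the exponent is maximal at E^* where S'(E^*) = \beta;
% keeping only the leading term of the expansion:
\log Z \approx S(E^*) - \beta E^*
% identifying E^* with \langle U \rangle in the thermodynamic limit:
S = \log Z + \beta \langle U \rangle = \log Z + \langle U \rangle / T
```

The Gaussian fluctuation correction from the quadratic term of the expansion is subextensive, which is why the relation becomes exact in the thermodynamic limit.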
 
The thermodynamic potential for a system in thermal contact with a bath is the Helmholtz free energy F:

dF=-SdT-pdV

so S=-\left(\frac{\partial F}{\partial T}\right)_V

but F=-\frac{1}{\beta}\ln Z=-kT\ln Z, so

S=k\ln Z+\frac{1}{\beta Z}\frac{\partial Z}{\partial T}

and the last term is the mean value of the energy divided by T, i.e. \langle U \rangle / T.
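This thermodynamic route can also be checked with a numerical derivative. A minimal sketch, assuming the same kind of hypothetical discrete spectrum and units with k_B = 1:

```python
import math

energies = [0.0, 1.0, 2.5]  # hypothetical three-level system, k_B = 1

def lnZ(T):
    return math.log(sum(math.exp(-e / T) for e in energies))

def free_energy(T):
    return -T * lnZ(T)  # F = -kT ln Z, with k = 1

T, h = 2.0, 1e-6
# S = -dF/dT via a central finite difference
S_thermo = -(free_energy(T + h) - free_energy(T - h)) / (2 * h)

# Compare against S = ln Z + <U>/T computed directly
beta = 1.0 / T
Z = sum(math.exp(-beta * e) for e in energies)
U = sum(e * math.exp(-beta * e) for e in energies) / Z
S_stat = lnZ(T) + U / T

print(S_thermo, S_stat)  # agree up to finite-difference error
```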
 