Entropy and partition function

Discussion Overview

The discussion revolves around the relationship between entropy and the partition function in statistical mechanics, specifically exploring how to derive the entropy formula from the Boltzmann distribution and the connection between different ensembles. Participants examine various definitions and mathematical manipulations related to entropy, microstates, and the partition function.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • Some participants propose that the relation S = log Z + ⟨U⟩/T can be derived using the von Neumann (VN) entropy definition S = -Σ p_i log p_i.
  • Others question how to justify the formula s = -log(e^(-εβ)/Z) and seek connections to the definition S = log Ω, where Ω is the number of accessible microstates.
  • One participant mentions that ε represents the energy of a particular state and discusses the implications of this in relation to the average energy ⟨U⟩.
  • Another participant shares their calculations, suggesting that S/k = Z + Σ{βE exp(-βE)/Z} and expresses uncertainty about a mathematical trick that may simplify the derivation.
  • One participant describes a method involving the partition function Z expressed as an integral and suggests using the steepest descent method to derive relations in the thermodynamic limit.
  • A later reply discusses the Helmholtz free energy F and its relation to entropy, providing a formula for S in terms of F and Z.

Areas of Agreement / Disagreement

Participants express various viewpoints and approaches without reaching a consensus. There are multiple competing models and methods discussed, and the discussion remains unresolved regarding the derivation and justification of the entropy formulas.

Contextual Notes

Some limitations include the dependence on definitions of entropy and microstates, as well as unresolved mathematical steps in the derivations presented. The discussion also highlights the complexity of transitioning between different statistical ensembles.

Euclid
Is it possible to obtain the relation
[tex]S = \log Z + \langle U \rangle /T[/tex]
directly from the Boltzmann distribution?

Edit: It seems that we can if we use the VN entropy:
[tex]S = -\Sigma p_i \log p_i[/tex]
This suggests that the entropy of a single microstate should be
[tex]s = -\log( \frac{e^{-\epsilon \beta}}{Z})[/tex]

Is there some way to justify this last formula?
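One way to see why this candidate works (a sketch, in the thread's units with k = 1): average [tex]s = -\log p_i[/tex] over the Boltzmann weights [tex]p_i = e^{-\beta \epsilon_i}/Z[/tex]:

[tex]\langle s \rangle = -\sum_i p_i \log p_i = \sum_i p_i \left( \beta \epsilon_i + \log Z \right) = \log Z + \beta \langle U \rangle = \log Z + \langle U \rangle / T[/tex]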
 
Using log rules the last line becomes [tex]\log Z + \epsilon \beta[/tex], and then the step to justify is that [tex]\epsilon \beta = \epsilon/(kT)[/tex] turns into [tex]\langle U \rangle / T[/tex].
Does this help? Is [tex]\epsilon[/tex] the energy of the particular state?
 
Yes, epsilon is the energy of some particular state. My question is: is there some way to arrive at [tex]s = -\log (e^{-\epsilon \beta}/Z)[/tex] from the more familiar definition [tex]S = \log \Omega[/tex], where [tex]\Omega[/tex] is the number of accessible microstates?
 
I just gave it a fair whack.
Got [tex]S/k = Z + \Sigma\, \beta E\, e^{-\beta E}/Z[/tex]
Haven't done stat. mech. for about a year, so I am a bit rusty. Does this help?
My experience is that somewhere along the way lies a small math trick that helps; just getting to know it is the hard part.
 
NoobixCube said:
I just gave it a fair whack.
Got [tex]S/k = Z + \Sigma\, \beta E\, e^{-\beta E}/Z[/tex]
Haven't done stat. mech. for about a year, so I am a bit rusty. Does this help?
My experience is that somewhere along the way lies a small math trick that helps; just getting to know it is the hard part.

Yes I have already been that far. I think you mean [tex]\log Z[/tex] in the first term.

In any event, I think I have figured this one out. In the microcanonical ensemble, the entropy of a state is given by [tex]\log N[/tex], where N is the number of accessible microstates. This can be rewritten as [tex]\log N = - \Sigma_i \frac{1}{N} \log \left(\frac{1}{N}\right) = - \Sigma_i p_i \log p_i[/tex]. The last step follows from the fundamental assumption that a system is equally likely to be in any of its accessible microstates, so that [tex]p_i = 1/N[/tex]. Thus, in the canonical ensemble, the appropriate generalization is the formula given above.

All I was trying to do here was to see explicitly the connection between entropy in these two ensembles.
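As a quick numerical sanity check of that connection (a sketch with made-up energy levels, in units with k = 1), the Gibbs/von Neumann form and the thermodynamic form agree for any spectrum:

```python
import math

# Toy spectrum: three energy levels (hypothetical values, for illustration only)
energies = [0.0, 1.0, 2.5]
T = 1.3          # temperature, units with k = 1
beta = 1.0 / T

# Canonical partition function and Boltzmann probabilities
Z = sum(math.exp(-beta * E) for E in energies)
p = [math.exp(-beta * E) / Z for E in energies]

# Gibbs / von Neumann entropy: S = -sum_i p_i log p_i
S_gibbs = -sum(pi * math.log(pi) for pi in p)

# Thermodynamic form: S = log Z + <U>/T
U = sum(pi * E for pi, E in zip(p, energies))
S_thermo = math.log(Z) + U / T

print(S_gibbs, S_thermo)  # the two values agree
```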
 
You can just write the partition function as

[tex]Z = \int dE \Omega\left(E\right)\exp\left(-\beta E\right)[/tex]

Next, take the log of the integrand and expand around its maximum, so you write the integrand as exp(expansion of the log) (i.e., you use the steepest descent method). You obtain the relation as the leading-order term, which becomes exact in the thermodynamic limit.
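Sketching that steepest-descent step (with k = 1 and [tex]S(E) = \log \Omega(E)[/tex]):

[tex]Z = \int dE\, e^{S(E) - \beta E} \approx e^{S(E^*) - \beta E^*}, \qquad \frac{\partial S}{\partial E}\bigg|_{E^*} = \beta[/tex]

so [tex]\log Z \approx S(E^*) - \beta E^*[/tex]. Since [tex]E^*[/tex] coincides with [tex]\langle U \rangle[/tex] in the thermodynamic limit, rearranging gives [tex]S = \log Z + \langle U \rangle / T[/tex].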
 
The thermodynamic potential for a system in thermal contact with a bath is the Helmholtz free energy F:

[tex]dF = -S\,dT - p\,dV[/tex]

so [tex]S = -\frac{\partial F}{\partial T}[/tex]

but [tex]F = -\frac{1}{\beta} \ln Z[/tex]

hence [tex]S = k \ln Z + \frac{1}{\beta Z}\frac{\partial Z}{\partial T}[/tex]

and the last term is the mean energy divided by T.
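This route can also be checked numerically (a sketch with a hypothetical two-level system, k = 1): differentiate F = -T log Z by finite differences and compare against log Z + ⟨U⟩/T.

```python
import math

def log_Z(T, energies):
    """Log of the canonical partition function, in units with k = 1."""
    return math.log(sum(math.exp(-E / T) for E in energies))

# Hypothetical two-level system
energies = [0.0, 1.0]
T = 0.8
h = 1e-6  # finite-difference step

# F = -T log Z, so S = -dF/dT, approximated by a central difference
F = lambda t: -t * log_Z(t, energies)
S_from_F = -(F(T + h) - F(T - h)) / (2 * h)

# Compare with S = log Z + <U>/T computed directly
Z = sum(math.exp(-E / T) for E in energies)
U = sum(E * math.exp(-E / T) for E in energies) / Z
S_direct = log_Z(T, energies) + U / T

print(S_from_F, S_direct)  # the two values agree to finite-difference accuracy
```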
 
