Expressions for energy & entropy from free energy (discrete distribution)

SUMMARY

The discussion focuses on deriving expressions for energy and entropy from free energy in a discrete distribution context using the partition function. The partition function is defined as $$ Z = \sum_i e^{ \frac {-E_i} {k_BT} } $$, leading to the free energy expression $$ F = -k_B T \ln{Z} $$, specifically $$ F = -k_B T \ln{(1 + e^{ \frac {-\epsilon} {k_BT} })} $$. Participants clarify that the average energy $$ \langle E \rangle $$ can be calculated using probabilities associated with energy states, and they explore the relationship between free energy, internal energy, and entropy through thermodynamic relations.

PREREQUISITES
  • Understanding of statistical mechanics concepts, particularly the partition function.
  • Familiarity with thermodynamic relations, specifically $$ F = U - TS $$.
  • Knowledge of Boltzmann statistics and the Boltzmann factor.
  • Basic calculus for differentiation in thermodynamic equations.
NEXT STEPS
  • Study the derivation of the partition function in statistical mechanics.
  • Learn how to apply the thermodynamic relation $$ S = -\frac {\partial F}{\partial T} $$ to find entropy.
  • Investigate the relationship between internal energy and the partition function using $$ U = - \frac {\partial \ln Z}{\partial \beta} $$.
  • Explore ensemble averages and their significance in statistical mechanics.
USEFUL FOR

This discussion is beneficial for physics students, researchers in statistical mechanics, and anyone interested in the thermodynamic properties of discrete systems.

baseballfan_ny
Homework Statement
(a) Find an expression for the free energy as a function of ##\tau## (temperature) of a system with two states, one at energy 0 and one at energy ##\epsilon##. (b) From the free energy, find expressions for energy and entropy of the system. The entropy is plotted in Figure 3.11 (see below).
Relevant Equations
$$ Z = \sum_i e^{ \frac {-E_i} {k_BT} } $$
$$ F = -k_B T \ln{Z} $$
$$ F = U - TS $$
[Figure 3.11: entropy of the two-state system as a function of temperature]


So just by using the definition of the partition function...
$$ Z = \sum_i e^{ \frac {-E_i} {k_BT} } = e^{ \frac {-0} {k_BT} } + e^{ \frac {-\epsilon} {k_BT} } = 1 + e^{ \frac {-\epsilon} {k_BT} } $$

And then, a result we obtained in class, by using the Boltzmann H function to solve for ##S## and forming the expression ##U - TS##, gave us ##F = -k_B T \ln{Z}##. Applying that gives part (a)...

$$ F = -k_B T \ln\left( 1 + e^{ \frac {-\epsilon} {k_BT} } \right) $$

Now for (b), I'm confused. I'm not sure I have an equation to get ##S## and ##U## out of just this free energy expression. All I have is the free energy definition, ##F = U - TS##, and if I try to solve using that together with my expression for ##F## above, I'm just going to get something like ##U = U## or ##S = S##. So I'm not sure which equation or principle to use. I have thought about taking a differential at constant volume, but I don't think that would help, and the question seems pretty explicit about using the free energy expression from (a) to obtain both ##U## and ##S##.
 
Think of ##U## as ##\langle E \rangle##. For the two-state system, how would you calculate ##\langle E \rangle##? Hint: What is the probability that the system has energy 0? What is the probability the system has energy ##\epsilon##?
 
TSny said:
Think of ##U## as ##\langle E \rangle##. For the two-state system, how would you calculate ##\langle E \rangle##? Hint: What is the probability that the system has energy 0? What is the probability the system has energy ##\epsilon##?

Oh ok. So ...
$$ \langle E \rangle = \sum_{E = 0,\, \epsilon} E \, P(E) = \frac{e^{0/k_B T}}{Z} \cdot 0 + \frac{e^{-\epsilon/k_B T}}{Z} \cdot \epsilon $$

$$ \langle E \rangle = \frac{\epsilon \, e^{-\epsilon/k_B T}}{1 + e^{-\epsilon/k_B T}} $$

Is this the expression for "the energy of the system" that (b) is asking me for? So then I plug this into ## F = U - TS## to solve for entropy?
 
Yes
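
A sketch of that last step, taking ##U = \langle E \rangle## as above and solving ##F = U - TS## for ##S##:

$$ S = \frac{U - F}{T} = \frac{1}{T} \, \frac{\epsilon \, e^{-\epsilon/k_B T}}{1 + e^{-\epsilon/k_B T}} + k_B \ln\left( 1 + e^{-\epsilon/k_B T} \right) $$

This goes to 0 as ##T \to 0## and to ##k_B \ln 2## as ##T \to \infty##, consistent with the curve in Figure 3.11.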
 
That's interesting. I would have thought that by "energy of the system," they meant some sort of general expression, like the energy as a function of the state. But it actually refers to an ensemble average of the energy. Is there a reason for that? My guess is it has something to do with the system being discrete.
 
baseballfan_ny said:
That's interesting. I would have thought that by "energy of the system," they meant some sort of general expression, like the energy as a function of the state. But it actually refers to an ensemble average of the energy. Is there a reason for that? My guess is it has something to do with the system being discrete.
I'm not sure I understand what you are asking here. Taking the thermodynamic ##U## to correspond to the statistical mechanics ##\langle E \rangle## is standard, I believe.

Another approach to the problem is to first find ##S## using the thermodynamic relation ##S = -\frac {\partial F}{\partial T}##. Then you can find ##U## from ##F = U - TS##.

Or, maybe you have seen the relation ##U= - \large \frac {\partial \ln Z}{\partial \beta}##, where ##\beta = \large \frac 1 {kT}##. This can be used to find ##U##.
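
As a sketch of those two alternative routes, using the ##F## from part (a) and differentiating at constant volume:

$$ S = -\frac{\partial F}{\partial T} = k_B \ln\left( 1 + e^{-\epsilon/k_B T} \right) + \frac{\epsilon}{T} \, \frac{e^{-\epsilon/k_B T}}{1 + e^{-\epsilon/k_B T}}, $$

and, with ##\beta = \frac{1}{k_B T}##,

$$ U = -\frac{\partial \ln Z}{\partial \beta} = \frac{\epsilon \, e^{-\beta \epsilon}}{1 + e^{-\beta \epsilon}}, $$

which matches the ##\langle E \rangle## obtained above.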
 
TSny said:
Taking the thermodynamic U to correspond to the statistical mechanics ⟨E⟩ is standard, I believe.
That's exactly what I was asking :) Now that I think about it, I think our instructor mentioned it once briefly during class. I also got the same answer from trying the other two methods. Thanks for the help.
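
As a quick numerical cross-check of that agreement (a minimal sketch, not from the thread: it assumes natural units ##k_B = 1## and ##\epsilon = 1##, and the helper names F, U, and S_numeric are just illustrative):

```python
# Minimal numerical check (natural units: k_B = 1, eps = 1) that the routes agree:
# F = -T ln(1 + e^{-eps/T}),  U = <E>,  S = -dF/dT,  and  F = U - T S.
import math

eps = 1.0  # energy of the upper state

def F(T):
    # Free energy from the partition function Z = 1 + e^{-eps/T}
    return -T * math.log(1.0 + math.exp(-eps / T))

def U(T):
    # Average energy <E> = eps e^{-eps/T} / (1 + e^{-eps/T})
    b = math.exp(-eps / T)
    return eps * b / (1.0 + b)

def S_numeric(T, h=1e-6):
    # Entropy from S = -dF/dT, estimated with a central difference
    return -(F(T + h) - F(T - h)) / (2.0 * h)

for T in (0.1, 0.5, 1.0, 5.0):
    S = S_numeric(T)
    # F and U - T*S should agree, since F = U - T S
    print(f"T = {T:4.1f}   F = {F(T):+.6f}   U - T*S = {U(T) - T * S:+.6f}")

# At large T the entropy approaches ln 2, matching the plateau in Figure 3.11
print("S at T = 1000:", round(S_numeric(1000.0), 6), "   ln 2:", round(math.log(2.0), 6))
```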
 
