Mean energy using the partition function


Homework Help Overview

The discussion revolves around a system of two energy levels, E_0 and E_1, populated by N particles at temperature T, focusing on deriving the average energy per particle using the partition function in a classical context.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants explore the correct formulation of the partition function for N particles versus one particle, questioning the necessity of including the factorial term N! in the expression. There is also discussion about the implications of temperature limits on average energy calculations.

Discussion Status

Participants are actively engaging with each other's reasoning, clarifying misunderstandings about the partition function and its application. Some guidance has been offered regarding the treatment of indistinguishable particles and the implications of temperature changes on average energy.

Contextual Notes

There is an ongoing debate about the assumptions regarding particle indistinguishability and the relevance of Stirling's approximation in the context of large N. Participants express uncertainty about the completeness of the problem statement.

patrickmoloney

Homework Statement


A system of two energy levels, ##E_0## and ##E_1##, is populated by ##N## particles at
temperature ##T##. The particles populate the levels according to the classical
(Maxwell-Boltzmann) distribution law.

(i) Write an expression for the average energy per particle.

Homework Equations

The Attempt at a Solution


The partition function of our system is ##z=\sum_s e^{-\beta E_s} = e^{-\beta E_0}+ e^{-\beta E_1}##, where ##\beta = \frac{1}{kT}##.
The probability of a particle occupying each of the two levels is

##P_0 = \frac{1}{z}e^{-\beta E_0}, \qquad P_1 = \frac{1}{z}e^{-\beta E_1}##

The average energy ##\overline{E}## is

##\overline{E}= -\frac{1}{z}\frac{\partial z}{\partial \beta} = \frac{E_0 +E_1}{e^{-\beta E_0}+ e^{-\beta E_1}}##

since ##\frac{\partial z}{\partial \beta }= \frac{\partial}{\partial \beta}\left(e^{-\beta E_0}+ e^{-\beta E_1}\right) = -(E_0 + E_1)##

is this a correct method to the problem?
 
You wrote down the partition function for one particle. You need the one for N particles.
 
vela said:
You wrote down the partition function for one particle. You need the one for N particles.
##z= \sum_s e^{-\beta E_s}## — is that what you mean? And then just use this to find ##\overline E##?
 
Sorry, I get what you mean now! We haven't covered that in class yet so here goes:

Letting ##z_1= e^{-\beta E_0}+ e^{-\beta E_1}##,

for ##N## particles we can thus use

##Z_N = \frac{(z_1)^N}{N!}##

and plugging in our value for ##z_1## we get

##Z_N = \frac{(e^{-\beta E_0}+ e^{-\beta E_1})^N}{N!}##
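As a quick numerical sketch of these two formulas (the energies, temperature, and ##N## below are arbitrary illustration values, not from the problem):

```python
import math

def z1(E0, E1, beta):
    """Single-particle partition function for a two-level system."""
    return math.exp(-beta * E0) + math.exp(-beta * E1)

def ZN(E0, E1, beta, N):
    """N indistinguishable classical particles: Z_N = z1^N / N!."""
    return z1(E0, E1, beta) ** N / math.factorial(N)

# Arbitrary example values: E0 = 0, E1 = 1, kT = 1 (beta = 1), N = 3
beta = 1.0
print(z1(0.0, 1.0, beta))     # 1 + e^{-1}
print(ZN(0.0, 1.0, beta, 3))  # (1 + e^{-1})^3 / 3!
```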
 
patrickmoloney said:
##Z_N = \frac{(e^{-\beta E_0}+ e^{-\beta E_1})^N}{N!}##
Yes, this looks right. However, it may be useful to write your definition of the average energy from post #1 in a different way:
$$\left<E\right>=-\frac{\partial}{\partial\beta}\text{ln}(Z_{N})$$
This will allow you to use Stirling's approximation to deal with the ##N!##, and the logarithm turns the power of ##N## into a simple factor.
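For reference, Stirling's approximation ##\ln N! \approx N\ln N - N## mentioned here can be checked numerically; this sketch (using Python's `math.lgamma`, where ##\ln N! = \mathrm{lgamma}(N+1)##) shows the relative error shrinking as ##N## grows:

```python
import math

# Stirling's approximation: ln N! ~ N ln N - N, improving as N grows
for N in (10, 100, 1000):
    exact = math.lgamma(N + 1)          # exact ln N!
    stirling = N * math.log(N) - N      # Stirling's approximation
    print(N, exact, stirling, (exact - stirling) / exact)
```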
 
There's no need for Stirling's approximation since ##N!## is a constant.

Patrick, you should take another look at your expression for ##\partial z/\partial\beta## as well. You didn't differentiate correctly.
 
vela said:
There's no need for Stirling's approximation since ##N!## is a constant.

Patrick, you should take another look at your expression for ##\partial z/\partial\beta## as well. You didn't differentiate correctly.
I understand what you mean now! ##Z_N = (Z_{sp})^N## for distinguishable particles.
 
vela said:
There's no need for Stirling's approximation since ##N!## is a constant.
What does ##N## being constant have to do with Stirling's approximation? The only requirement is that ##N## be a very large number, which is often assumed in these types of problems.
patrickmoloney said:
I understand what you mean now! ##Z_N = (Z_{sp})^N## for distinguishable particles.
Even for a classical system the ##N!## is necessary in order for the entropy to be extensive.
 
NFuller said:
What does ##N## being constant have to do with Stirling's approximation? The only requirement is that ##N## be a very large number, which is often assumed in these types of problems.

Even for a classical system the ##N!## is necessary for the entropy to be extensive.
Does that mean I should always use ##Z_N = \dfrac{(z_1)^N}{N!}##?
 
  • #10
patrickmoloney said:
Does that mean I should always use ##Z_N = \dfrac{(z_1)^N}{N!}##?
Not always, but the problem never said the particles were distinguishable; it only said the system is classical. I feel the problem could be more specific, but my intuition is that you should keep the ##N!##, since leaving it out leads to things like the Gibbs paradox.
 
  • #11
NFuller said:
What does ##N## being constant have to do with Stirling's approximation? The only requirement is that ##N## be a very large number, which is often assumed in these types of problems.
Nothing, but if the ##\log N!## term is going to disappear anyway when you differentiate, why bother approximating it in the first place?
 
  • #12
vela said:
Nothing, but if the ##\log N!## term is going to disappear anyway when you differentiate, why bother approximating it in the first place?
I see your point. However, I would still argue in favor of including it for pedagogical reasons. If the particles are indistinguishable, writing ##Z## without the ##N!## term is incorrect; if the problem has a second part asking for the entropy or free energy, then the OP would be led to the wrong answer.
 
  • #13
##\langle E \rangle= -\dfrac{\partial \ln Z}{\partial \beta}= \dfrac{E_1 e^{-E_1 \beta}}{1 + e^{-E_1 \beta}}##

if we let ##E_0 = 0## be our zero-energy level.
 
  • #14
patrickmoloney said:
##\langle E \rangle= -\dfrac{\partial \ln Z}{\partial \beta}= \dfrac{E_1 e^{-E_1 \beta}}{1 + e^{-E_1 \beta}}##

if we let ##E_0 = 0## be our zero-energy level.
This is the average energy of an individual particle. So if that is what you are looking for, then this is correct. For the average energy of the whole ensemble, you need a factor of ##N##.
$$\langle E \, \rangle= -\dfrac{\partial \ln Z_{N}}{\partial \beta}= \Bigg {(}\dfrac{NE_1 e^{-E_1 \beta}}{1 + e^{-E_1 \beta}}\Bigg{)}$$
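This closed form can be sanity-checked against a finite-difference derivative of ##\ln Z_N## with respect to ##\beta## (a sketch with arbitrary values of ##E_1##, ##\beta##, and ##N##, taking ##E_0 = 0##):

```python
import math

def lnZN(E1, beta, N):
    # ln Z_N = N ln(1 + e^{-beta E1}) - ln N!, with E0 = 0
    return N * math.log(1.0 + math.exp(-beta * E1)) - math.lgamma(N + 1)

def avg_energy_formula(E1, beta, N):
    # Closed form: <E> = N E1 e^{-beta E1} / (1 + e^{-beta E1})
    return N * E1 * math.exp(-beta * E1) / (1.0 + math.exp(-beta * E1))

# Arbitrary values for illustration
E1, beta, N, h = 2.0, 0.7, 5, 1e-6
numeric = -(lnZN(E1, beta + h, N) - lnZN(E1, beta - h, N)) / (2 * h)
print(numeric, avg_energy_formula(E1, beta, N))  # the two should agree closely
```

Note the ##\ln N!## term is constant in ##\beta##, so it drops out of the derivative either way, as discussed above.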
 
  • #15
Yep, it's just for one particle. What would happen to the average energy if the temperature ##T \to 0## or ##T \to \infty##? I assume that for ##T \to \infty## we would get

##\langle E \rangle = E_1##

since ##\dfrac{E_1 e^{0}}{1 + e^{0}} = E_1##, for ##\beta = \dfrac{1}{kT}##.
 
  • #16
Close, what is ##1+e^{0}##?
 
  • #17
NFuller said:
Close, what is ##1+e^{0}##?
Right! ##\dfrac{E_1}{2}##

The other one is kind of confusing me. I think as ##T \to 0## we have ##\dfrac{-E_1}{kT} \ll 0##. That's all I can deduce from that.
 
  • #18
As ##T\rightarrow0##, ##\beta\rightarrow\infty##. So what happens to ##e^{-\beta E_{1}}## and hence ##\langle E \rangle## in this limit?
 
  • #19
NFuller said:
As ##T\rightarrow0##, ##\beta\rightarrow\infty##. So what happens to ##e^{-\beta E_{1}}## and hence ##\langle E \rangle## in this limit?
##e^{-E_1 \beta} = \dfrac{1}{e^{E_1 \beta}}##

and ##\beta \to 0 \implies e^{-E_1 \beta} = 0##. But wouldn't that give us the same value ##\dfrac{E_1}{2}##?
 
  • #20
patrickmoloney said:
##\beta \to 0 \implies e^{-E_1 \beta} = 0##
I think you mean ##\beta \to \infty \implies e^{-E_1 \beta} =0##. With this in mind, what is ##E_{1}e^{-\beta E_{1}}## in this limit?
 
  • #21
NFuller said:
I think you mean ##\beta \to \infty \implies e^{-E_1 \beta} =0##. With this in mind, what is ##E_{1}e^{-\beta E_{1}}## in this limit?
Oh, sorry, yeah. That gives us ##\langle E \rangle= \dfrac{0}{1+0}=0##.

Thank you so much for that!
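Both limits worked out above can be confirmed numerically (a sketch with an arbitrary ##E_1##, again taking ##E_0 = 0##): the per-particle average tends to ##E_1/2## as ##T \to \infty## (##\beta \to 0##) and to ##0## as ##T \to 0## (##\beta \to \infty##).

```python
import math

def avg_E_per_particle(E1, beta):
    # <E> = E1 e^{-beta E1} / (1 + e^{-beta E1}), with E0 = 0
    return E1 * math.exp(-beta * E1) / (1.0 + math.exp(-beta * E1))

E1 = 1.5  # arbitrary level spacing for illustration
print(avg_E_per_particle(E1, 1e-9))  # high-T limit: approaches E1/2 = 0.75
print(avg_E_per_particle(E1, 1e3))   # low-T limit: approaches 0
```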
 
