How is the Variance in Particle Number Derived?

  • Context: Graduate
  • Thread starter: cryptist
  • Tags: Particle, Variance
Discussion Overview

The discussion centers on the derivation of the variance in particle number within Fermi-Dirac statistics, in particular the relationship between the mean particle number and the grand-canonical partition function. The scope covers both the statistical-mechanical theory and the mathematical steps of the derivation.

Discussion Character

  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant references an equation from Wikipedia regarding Fermi-Dirac statistics and seeks clarification on its derivation.
  • Another participant suggests expressing the mean particle number in terms of the grand-canonical partition function.
  • A third participant outlines the derivation of the variance in particle number, starting with the definition of the variance and relating it to the grand-canonical partition function.
  • This participant provides formulas for the mean particle number and its square, indicating that these can be derived from the grand-canonical partition function.
  • They emphasize the importance of taking derivatives at constant volume and temperature for the derivations to hold.
  • A later reply expresses gratitude for the provided explanation and formulas.

Areas of Agreement / Disagreement

The discussion contains no explicit disagreements; rather than debating, the participants build on one another's contributions, each supplying a different part of the derivation.

Contextual Notes

The derivation rests on the derivatives being taken at constant volume and temperature, and on the definitions adopted for the grand-canonical partition function and the Gibbs factors.

cryptist
I saw an equation on Wikipedia (http://en.wikipedia.org/wiki/Fermi-Dirac_statistics):

[itex]\displaystyle \langle ( \Delta N )^2 \rangle = k_B T \frac{\partial \langle N \rangle}{\partial \mu} = \langle N \rangle ( 1 - \langle N \rangle )[/itex]


Does anybody know how this is derived?
 
Express the mean particle number in terms of the grand-canonical partition function.
 
[itex]\langle N \rangle[/itex] is the Fermi-Dirac distribution, which is derived on that Wikipedia page. So you can perform the derivative yourself and verify the second equality.

The first equality can be derived as follows. First,

[itex]\displaystyle \langle ( \Delta N )^2 \rangle = \langle (N - \langle N \rangle )^2 \rangle = \langle N^2 - 2 \langle N \rangle N + \langle N \rangle^2 \rangle = \langle N^2 \rangle - \langle N \rangle^2[/itex].

Next, at constant volume and temperature, the grand-canonical partition function is the sum over all states [itex]s[/itex] of the Gibbs factors [itex]e^{- (e_s - \mu n_s ) / k_B T}[/itex] (see the Wikipedia page on the partition function if this is unfamiliar):

[itex]\displaystyle Z = \sum_s e^{- (e_s - \mu n_s )/ k_B T}[/itex].

Here [itex]e_s[/itex] and [itex]n_s[/itex] are the energy and occupation number of state [itex]s[/itex]. The Gibbs factor of a state measures the relative probability of finding the system in that state. Hence, by definition,

[itex]\displaystyle \langle N \rangle = \frac{\sum_s n_s e^{-(e_s - \mu n_s)/k_B T}}{\sum_s e^{- (e_s - \mu n_s ) / k_B T}}[/itex].

Given these formulas for [itex]Z[/itex] and [itex]\langle N \rangle[/itex], you should be able to show that

[itex]\displaystyle \langle N \rangle = k_B T \frac{1}{Z} \frac{\partial Z}{\partial \mu}[/itex].

I've taken it for granted that the derivatives are taken at constant volume and temperature.
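This identity is easy to check numerically. Below is a minimal sketch (the single-orbital toy system, the parameter values, and the function names are my own illustration, not part of the thread): a single fermionic orbital of energy [itex]\epsilon[/itex] has grand-canonical states [itex]n = 0, 1[/itex], and a central finite difference stands in for the derivative of [itex]Z[/itex] with respect to [itex]\mu[/itex].

```python
import math

# Toy system (assumed for illustration, not from the thread): a single
# fermionic orbital of energy eps, so the grand-canonical states are the
# occupation numbers n = 0 and n = 1.
kT = 1.0    # k_B T, arbitrary units
eps = 0.7   # orbital energy
mu = 0.3    # chemical potential

def Z(mu):
    # Z = sum over states of the Gibbs factor e^{-(e_s - mu n_s)/k_B T},
    # with e_s = n * eps and n_s = n.
    return sum(math.exp(-(n * eps - mu * n) / kT) for n in (0, 1))

def mean_N(mu):
    # <N> = (1/Z) * sum over states of n_s times the Gibbs factor.
    return sum(n * math.exp(-(n * eps - mu * n) / kT) for n in (0, 1)) / Z(mu)

# Check <N> = k_B T (1/Z) dZ/dmu, estimating the derivative by a
# central finite difference.
h = 1e-6
dZ_dmu = (Z(mu + h) - Z(mu - h)) / (2 * h)
lhs = mean_N(mu)
rhs = kT * dZ_dmu / Z(mu)
print(abs(lhs - rhs) < 1e-8)  # True: the two expressions agree
```

The same check works for any finite set of states; only the sums over (0, 1) change.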

Analogously, show that

[itex]\displaystyle \langle N^2 \rangle = (k_B T)^2 \frac{1}{Z} \frac{\partial^2 Z}{\partial \mu^2}[/itex].
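The second-derivative identity can be checked the same way. This sketch uses a slightly richer toy system, two independent orbitals, so that [itex]N^2 \neq N[/itex] (all names and parameter values are my own illustration):

```python
import math
from itertools import product

# Toy system (assumed for illustration): two independent fermionic
# orbitals with energies e1 and e2; the grand-canonical states are the
# occupation pairs (n1, n2) with each n_i in {0, 1}.
kT, e1, e2, mu = 1.0, 0.4, 1.1, 0.6
states = list(product((0, 1), repeat=2))

def gibbs(n1, n2, mu):
    # Gibbs factor e^{-(e_s - mu n_s)/k_B T}, with e_s = n1*e1 + n2*e2
    # and n_s = n1 + n2.
    return math.exp(-((n1 * e1 + n2 * e2) - mu * (n1 + n2)) / kT)

def Z(mu):
    return sum(gibbs(n1, n2, mu) for n1, n2 in states)

def mean_N2(mu):
    # <N^2> = (1/Z) * sum over states of n_s^2 times the Gibbs factor.
    return sum((n1 + n2) ** 2 * gibbs(n1, n2, mu) for n1, n2 in states) / Z(mu)

# Check <N^2> = (k_B T)^2 (1/Z) d^2Z/dmu^2 via a second-order
# central finite difference.
h = 1e-4
d2Z = (Z(mu + h) - 2 * Z(mu) + Z(mu - h)) / h**2
print(abs(mean_N2(mu) - kT**2 * d2Z / Z(mu)) < 1e-5)  # True
```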

Okay. At this point, I think you have all of the formulas you need to derive the first equality that you quoted from wikipedia. Just a little bit of ingenuity left. Good luck.
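For a single fermionic level, where [itex]\langle N \rangle[/itex] is the Fermi-Dirac distribution, the variance works out to [itex]\langle N \rangle (1 - \langle N \rangle)[/itex], and the whole chain can be confirmed numerically. A sketch (parameter values are my own illustration):

```python
import math

# Single fermionic level of energy eps (values are illustrative):
# <N> is then the Fermi-Dirac distribution.
kT, eps, mu = 1.0, 0.7, 0.3

def mean_N(mu):
    # Fermi-Dirac distribution: <N> = 1 / (e^{(eps - mu)/k_B T} + 1)
    return 1.0 / (math.exp((eps - mu) / kT) + 1.0)

n = mean_N(mu)
var_fd = n * (1.0 - n)  # <N>(1 - <N>)

# k_B T d<N>/dmu, via a central finite difference.
h = 1e-6
var_deriv = kT * (mean_N(mu + h) - mean_N(mu - h)) / (2 * h)

print(abs(var_fd - var_deriv) < 1e-8)  # True: both give the variance
```

Note that for a single level the occupation is 0 or 1, so [itex]N^2 = N[/itex] and [itex]\langle (\Delta N)^2 \rangle = \langle N \rangle - \langle N \rangle^2 = \langle N \rangle (1 - \langle N \rangle)[/itex] also follows directly.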
 
Thank you for the answer!
 
