Annoying things in statistical mechanics

SUMMARY

This discussion focuses on the complexities and nuances of statistical mechanics, specifically addressing the derivation of the Boltzmann distribution and Fermi-Dirac statistics. The participants highlight the reliance on the Taylor expansion of the entropy function to determine state probabilities and question the rationale behind using entropy instead of the degeneracy function. Additionally, the discussion raises concerns about the assumption that the energy of an unoccupied state can be considered zero in Fermi-Dirac statistics, suggesting a need for a deeper understanding of field theory in this context.

PREREQUISITES
  • Understanding of Boltzmann distribution and its derivation
  • Familiarity with entropy and degeneracy functions in statistical mechanics
  • Basic knowledge of Fermi-Dirac statistics
  • Concepts of Taylor expansion in mathematical physics
NEXT STEPS
  • Research the derivation of the Boltzmann distribution using Kittel and Kroemer's approach
  • Explore the relationship between entropy and degeneracy functions in statistical mechanics
  • Study Fermi-Dirac statistics and its implications in quantum mechanics
  • Investigate the role of field theory in statistical mechanics and its applications
USEFUL FOR

Students of statistical mechanics, physicists interested in thermodynamics, and researchers exploring quantum statistics will benefit from this discussion.

Kcant
I've been refreshing myself on some of the statistical mechanics I learned a couple years ago, using Kittel and Kroemer as a guide. However, I've come across a couple things that bother me:

1. When the Boltzmann distribution is derived, no real physics enters the picture. Essentially, the Taylor expansion of the entropy function is used to find the relative probability that two states are occupied. But entropy is just the logarithm of the degeneracy function, so why not just Taylor-expand the degeneracy function itself? What makes entropy special? I know that the degeneracy function is generally a fast-varying function of energy, and that entropy varies much more smoothly, but how can you know this a priori? How do you know that entropy varies sufficiently slowly to accurately approximate probabilities, and that you don't need some higher-order logarithm?
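The derivation being referred to can be sketched as follows (this is a sketch in roughly Kittel & Kroemer's notation, with \(\tau = k_B T\), not the book's exact steps):

```latex
% System exchanging energy with a reservoir of total energy U_0;
% g is the reservoir multiplicity and \sigma = \log g its entropy.
\frac{P(\varepsilon_1)}{P(\varepsilon_2)}
  = \frac{g(U_0 - \varepsilon_1)}{g(U_0 - \varepsilon_2)}
  = e^{\sigma(U_0 - \varepsilon_1) - \sigma(U_0 - \varepsilon_2)}
% First-order Taylor expansion of \sigma about U_0, with
% 1/\tau \equiv (\partial \sigma / \partial U)_N:
  \approx e^{-(\varepsilon_1 - \varepsilon_2)/\tau}
```

The question is why the first-order truncation of \(\sigma\), rather than of \(g\) itself, is the step that can be trusted.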

2. When deriving the Fermi-Dirac statistics, why can the energy of an unoccupied state be taken to be zero? Don't you really need some kind of field theory to know that?
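For context, the derivation presumably meant here is the Gibbs-sum route, where the assumption in question enters as the \(N = 0\) term of the sum (a sketch in Kittel & Kroemer's notation):

```latex
% Gibbs sum for a single orbital that can hold 0 or 1 fermion,
% with the energy of the empty orbital taken to be 0:
\mathcal{Z} = \sum_{N=0}^{1} e^{(N\mu - \varepsilon_N)/\tau}
            = 1 + e^{(\mu - \varepsilon)/\tau}
% Average occupancy (the Fermi-Dirac distribution):
\langle N \rangle
  = \frac{e^{(\mu - \varepsilon)/\tau}}{1 + e^{(\mu - \varepsilon)/\tau}}
  = \frac{1}{e^{(\varepsilon - \mu)/\tau} + 1}
```

Setting the empty orbital's energy to zero is what makes the \(N = 0\) term equal to 1.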
 
1. Entropy is defined as the logarithm of the multiplicity function largely because it makes the math much easier to work with. If you wanted to, you could work with the multiplicity function itself instead of its logarithm; the algebra would look different, but the physical results would be identical, so it's basically a matter of convenience. Because the mapping between the multiplicity function and its logarithm is bijective, you aren't changing the physics at all - just the ease of doing the math.
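As a numerical illustration of why truncating the Taylor series of the entropy works where truncating the multiplicity itself fails badly, here is a minimal sketch using the standard model two-state spin system (the choices of N, s0, and ds are illustrative, not from the thread):

```python
from math import comb, log, exp

N = 1000  # number of spins in a model two-state system (illustrative choice)

def g(s):
    """Multiplicity of the state with spin excess s: C(N, N/2 + s)."""
    return comb(N, N // 2 + s)

def sigma(s):
    """Entropy (in units of k_B): log of the multiplicity."""
    return log(g(s))

s0, ds = 100, 10  # expansion point and step (illustrative)

# Exact ratio g(s0 + ds) / g(s0)
exact = g(s0 + ds) / g(s0)

# First-order Taylor expansion of sigma, then exponentiate:
# sigma(s0 + ds) ~ sigma(s0) + sigma'(s0) * ds
dsigma = (sigma(s0 + 1) - sigma(s0 - 1)) / 2  # central-difference derivative
via_entropy = exp(dsigma * ds)

# First-order Taylor expansion of g itself:
# g(s0 + ds) ~ g(s0) + g'(s0) * ds
dg = (g(s0 + 1) - g(s0 - 1)) / 2
via_multiplicity = (g(s0) + dg * ds) / g(s0)

print(exact, via_entropy, via_multiplicity)
```

Running this, the entropy-based estimate lands within tens of percent of the exact ratio, while the linear expansion of g itself comes out negative - an unphysical "probability ratio" - because g changes by a large factor over a step across which sigma changes only gently. This illustrates the smoothness claim numerically, though it doesn't answer the a-priori part of the question.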

2. I don't know the answer to this question :( (I'm taking stat mech this semester and we haven't gotten to that yet)
 
