Annoying things in statistical mechanics

1. Sep 4, 2009

Kcant

I've been refreshing myself on some of the statistical mechanics I learned a couple of years ago, using Kittel and Kroemer as a guide. However, I've come across a couple of things that bother me:

1. When the Boltzmann distribution is derived, no real physics enters the picture. Essentially, the Taylor expansion of the entropy function is used to find the relative probability that two states are occupied. But entropy is just the logarithm of the degeneracy function, so why not just Taylor-expand the degeneracy function itself? What makes entropy special? I know that the degeneracy function is generally a fast-varying function of energy, and that entropy varies much more smoothly, but how can you know this a priori? How do you know that entropy varies sufficiently slowly to accurately approximate probabilities, and that you don't need some higher-order logarithm?
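For readers who don't have the book handy, the derivation I'm referring to goes roughly like this: a system exchanges energy with a reservoir of total energy $U_0$, the probability of a system state is proportional to the reservoir's degeneracy, and the reservoir entropy $\sigma = \log g_R$ is expanded to first order (Kittel & Kroemer use the dimensionless entropy $\sigma$ and the fundamental temperature $\tau = k_B T$):

```latex
\frac{P(\varepsilon_1)}{P(\varepsilon_2)}
  = \frac{g_R(U_0-\varepsilon_1)}{g_R(U_0-\varepsilon_2)}
  = e^{\sigma(U_0-\varepsilon_1)-\sigma(U_0-\varepsilon_2)}
  \approx e^{-(\varepsilon_1-\varepsilon_2)/\tau},
\qquad
\sigma(U_0-\varepsilon)\approx\sigma(U_0)-\varepsilon\,\frac{\partial\sigma}{\partial U},
\quad
\frac{1}{\tau}\equiv\frac{\partial\sigma}{\partial U}.
```

My question is about the step where $\sigma$, rather than $g_R$ itself, is the function whose first-order expansion is trusted.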

2. When deriving the Fermi-Dirac statistics, why can the energy of an unoccupied state be taken to be zero? Don't you really need some kind of field theory to know that?
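To pin down where the zero enters: in the Gibbs-sum (grand canonical) derivation that Kittel & Kroemer use, a single orbital of energy $\varepsilon$ holds $N = 0$ or $1$ fermions, and the unoccupied term is the leading $1 = e^{0}$ in the sum:

```latex
\mathcal{Z} = \sum_{N=0,1} e^{(N\mu - N\varepsilon)/\tau}
            = 1 + e^{(\mu-\varepsilon)/\tau},
\qquad
\langle N \rangle
  = \frac{e^{(\mu-\varepsilon)/\tau}}{1 + e^{(\mu-\varepsilon)/\tau}}
  = \frac{1}{e^{(\varepsilon-\mu)/\tau} + 1}.
```

So the question is really about why the $N = 0$ state can be assigned energy $0 \cdot \varepsilon = 0$.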

2. Sep 6, 2009

Monocles

1. Entropy is defined as the logarithm of the multiplicity function specifically because it makes the math much easier to work with. If you wanted to, you could work with the multiplicity function itself instead of its logarithm, and while the math would change, the physical results would be identical. So it's basically a matter of convenience: because the mapping between the multiplicity function and its logarithm is bijective, you aren't changing the physics any - just the ease of doing the math.
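One caveat worth adding, though: the two choices are only equivalent if you keep all orders of the expansion. Truncating at first order is far more accurate for $\sigma = \log g$ than for $g$ itself, because $g$ varies over many orders of magnitude on energy scales where $\sigma$ is nearly linear. Here is a minimal numeric sketch with a toy reservoir $g(U) = U^N$ (the specific numbers $N$, $U$, $\varepsilon$ are illustrative choices, not from the thread):

```python
import math

# Toy reservoir: degeneracy g(U) = U**N with N large (ideal-gas-like).
# Compare the exact ratio g(U-eps)/g(U) against first-order Taylor
# expansions of (a) the entropy log g and (b) g itself.
N, U, eps = 1000, 1.0, 0.002

exact = (1 - eps / U) ** N            # g(U-eps)/g(U), computed exactly
via_entropy = math.exp(-N * eps / U)  # expand sigma = log g to first order
via_degeneracy = 1 - N * eps / U      # expand g itself to first order

print(f"exact ratio:          {exact:.4f}")        # ~0.1351
print(f"entropy expansion:    {via_entropy:.4f}")  # ~0.1353
print(f"degeneracy expansion: {via_degeneracy:.4f}")  # -1.0, not even a probability
```

The entropy expansion stays accurate for $\varepsilon \ll U$, while the direct expansion of $g$ already fails (going negative) once $\varepsilon \gtrsim U/N$ - a far stricter condition, since $N$ is macroscopically large.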

2. I don't know the answer to this question :( (I'm taking stat mech this semester and we haven't gotten to that yet)