# Usage of statistical variables

1. Apr 28, 2007

### Manchot

My thermal and statistical mechanics class has been using Kittel and Kroemer's Thermal Physics as a textbook, and though it's an okay book, I find the notation extremely frustrating. My main beef with it is that I'm never quite sure what the context of certain quantities is. (If you have the book, this is especially prevalent in Chapter 3, wherein a great deal of the important theory is derived.) For example, when deriving the partition function, they consider a system in thermal contact with a reservoir at temperature tau. Next, from the partition function, they define the thermal energy U as the expectation value of the energy of the system eigenstates. Okay, I'm fine with that: Z and U are functions of the states of the system and of the temperature of the reservoir with which the system is in contact.

After this, I tend to get lost, mainly due to their usage of entropy. When they derive the thermodynamic identity, they start by saying that entropy is a function of U and of the volume V. But what exactly do they mean by "entropy"? When a system is in contact with a reservoir at a certain temperature, the only entropy/degeneracy that can be easily defined is that of the combined system. If I calculate that, I get:

$$\sigma = \ln g(E_{tot})$$
$$= \ln\left(\sum_{E_S} g_S(E_S)\,g_R(E_{tot}-E_S)\right)$$
$$= \ln\left(\sum_{E_S} g_S(E_S)\,e^{\sigma_R(E_{tot})-E_S/\tau}\right)$$
$$= \sigma_R(E_{tot}) + \ln\left(\sum_{E_S} g_S(E_S)\,e^{-E_S/\tau}\right)$$
$$= \sigma_R(E_{tot}) + \ln Z$$

The ln(Z) part is obviously correct, but this result doesn't have the U/tau term that you'd get if you calculated the system's entropy from the partition function. So, where does the discrepancy come from?
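For reference, a quick numeric check of that U/tau piece: for the canonical distribution $p_s = e^{-E_s/\tau}/Z$, the entropy $-\sum_s p_s \ln p_s$ equals $U/\tau + \ln Z$ identically. A minimal sketch in Python (the three-level spectrum and the value of tau are arbitrary, just for illustration):

```python
import math

# Arbitrary toy spectrum and temperature (K&K units: k_B = 1)
energies = [0.0, 1.0, 2.5]
tau = 0.8

# Canonical (Boltzmann) weights and partition function
weights = [math.exp(-e / tau) for e in energies]
Z = sum(weights)
probs = [w / Z for w in weights]

# Thermal energy U = <E> and Gibbs entropy sigma = -sum p ln p
U = sum(p * e for p, e in zip(probs, energies))
sigma = -sum(p * math.log(p) for p in probs)

# Identity: sigma = U/tau + ln Z  (equivalently F = U - tau*sigma = -tau ln Z)
print(sigma, U / tau + math.log(Z))
```

The same identity holds for any spectrum and any positive tau, since it is just the Legendre relation $F = U - \tau\sigma = -\tau \ln Z$ rearranged.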

Last edited: Apr 28, 2007
2. Apr 28, 2007

### arunma

Heh, I used Kittel and Kroemer back in college too. Personally I think you're very generous; I thought it was a horrible book. It's not even that useful as a reference, precisely because equations are never given in a clear context. I'm using this book to study for the thermal part of my PhD qualifier, and it's utterly miserable!

Anyway, I'll check their derivation of the partition function, and see if I can help you.

3. Apr 28, 2007

### Manchot

Yeah, thinking about it more, I don't even trust their derivation of the partition function. If you fix the temperature of the reservoir, it would seem to me that you're also fixing the energy of the system to a certain value. The entropy of the reservoir is a function of its energy; so is its derivative with respect to energy, and therefore so is its temperature. Assuming that relationship is one-to-one, fixing the temperature fixes the energy.
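In symbols, the chain of reasoning above is (K&K's conventions, with the reservoir at energy $E_R$):

$$\frac{1}{\tau} = \frac{\partial \sigma_R}{\partial E_R}$$

so if $\partial\sigma_R/\partial E_R$ is monotonic in $E_R$, a given tau corresponds to a single reservoir energy, and with the total energy fixed that would pin down the system's energy as well.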

4. May 1, 2007

### Manchot

So, does anyone have any insight?

5. May 1, 2007

### StatMechGuy

I don't even know where that derivation is coming from. The typical derivation I've seen along those lines involves a Taylor expansion where you define the entropy as
$$S = \ln \left ( \Omega(N,E) \right)$$
where Omega is the number of microstates. You can then break down the energy into the energy of the reservoir and the energy of your microsystem, and Taylor expand on the assumption that the energy of your microsystem is small.
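Spelled out, that expansion goes like this (a sketch in the notation above, with $k_B = 1$ and $E_s$ the microsystem energy, assumed small compared to the total $E$):

$$S_R(E - E_s) \approx S_R(E) - E_s\,\frac{\partial S_R}{\partial E} = S_R(E) - \frac{E_s}{\tau}$$

so the number of reservoir microstates compatible with the microsystem holding energy $E_s$ is $\Omega_R(E - E_s) \approx \Omega_R(E)\, e^{-E_s/\tau}$, and summing the Boltzmann factor $e^{-E_s/\tau}$ over microsystem states gives the partition function.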

This is one derivation of the partition function, and the physics behind it is that your particular ensemble is "small" compared to the reservoir it's in contact with. Another personal favorite derivation is to maximize the entropy subject to the constraint of fixed energy; I like to do it from the density operator in quantum mechanics. Then temperature drops out as a Lagrange multiplier. The primary problem with that derivation is that it assumes the entropy maximum principle.
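That maximization can be sketched in a few lines (assuming $k_B = 1$; $\alpha$ and $\beta$ are the Lagrange multipliers for normalization and fixed mean energy):

$$\delta\!\left[-\sum_n p_n \ln p_n - \alpha \sum_n p_n - \beta \sum_n p_n E_n\right] = 0$$
$$\Rightarrow\ -\ln p_n - 1 - \alpha - \beta E_n = 0$$
$$\Rightarrow\ p_n = \frac{e^{-\beta E_n}}{Z}, \qquad Z = \sum_n e^{-\beta E_n}$$

with $\beta$ identified afterward as $1/\tau$ by comparison with the thermodynamic identity.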

The truth is, there is no "rigorous" derivation of the partition function beyond Taylor expansion and saying that one thing is "large" compared to another. Completely rigorous statistical mechanics is done from the microcanonical ensemble, but working with that thing for too long can actually cause brain damage.