Usage of statistical variables

In summary, the notation in Kittel and Kroemer's Thermal Physics is hard to follow, and the posters question how trustworthy the book's derivation of the partition function really is.
  • #1
Manchot
My thermal and statistical mechanics class has been using Kittel and Kroemer's Thermal Physics as a textbook, and though it's an okay book, I find the notation extremely frustrating. My main beef with it is that I'm never quite sure what the context of certain quantities is. (If you have the book, this is especially prevalent in Chapter 3, wherein a great deal of the important theory is derived.) For example, when deriving the partition function, they consider a system in thermal contact with a reservoir at temperature tau. Next, from the partition function, they define the thermal energy U as the expectation value of the energy of the system eigenstates. Okay, I'm fine with that: Z and U are functions of the states of the system and of the temperature of the reservoir with which the system is in contact.
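(For concreteness, here's what I take those definitions to be, with the sum running over the system's eigenstates s of energy epsilon_s; hopefully I've got the book's conventions right:)

[tex]Z = \sum_s e^{-\epsilon_s/\tau}[/tex]
[tex]U = \langle\epsilon\rangle = \frac{1}{Z}\sum_s \epsilon_s e^{-\epsilon_s/\tau} = \tau^2\frac{\partial \ln Z}{\partial \tau}[/tex]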

After this, I tend to get lost, mainly due to their usage of entropy. When they derive the thermodynamic identity, they start by saying that entropy is a function of U and of the volume V. But what exactly do they mean by "entropy?" When a system is in contact with a reservoir at a certain temperature, the only entropy / degeneracy that can be easily defined is that of the combined system. If I calculate that, I get:

[tex]\sigma = \ln g(E_{tot})[/tex]
[tex]= \ln\left(\sum_{E_S} g_S(E_S)\,g_R(E_{tot}-E_S)\right)[/tex]
[tex]= \ln\left(\sum_{E_S} g_S(E_S)\,e^{\sigma_R(E_{tot})-E_S/\tau}\right)[/tex]
[tex]= \sigma_R(E_{tot}) + \ln\left(\sum_{E_S} g_S(E_S)\,e^{-E_S/\tau}\right)[/tex]
[tex]= \sigma_R(E_{tot}) + \ln Z[/tex]
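For comparison, here's the entropy of the system alone that I'd expect from the partition function route (using the book's relations F = -tau ln Z and sigma = -(dF/dtau) at constant volume, assuming I'm applying them correctly):

[tex]\sigma_S = -\left(\frac{\partial F}{\partial \tau}\right)_V = \ln Z + \tau\frac{\partial \ln Z}{\partial \tau} = \frac{U}{\tau} + \ln Z[/tex]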

The ln(Z) part is obviously correct, but this result doesn't have the U/tau factor that you'd get if you calculated it from the partition function. So, where does the discrepancy come from?
 
  • #2
Heh, I used Kittel and Kroemer back in college too. Personally I think you're very generous; I thought it was a horrible book. It's not even that useful as a reference, precisely because equations are never given in a clear context. I'm using this book to study for the thermal part of my PhD qualifier, and it's utterly miserable!

Anyway, I'll check their derivation of the partition function, and see if I can help you.
 
  • #3
Yeah, in thinking about it more, I don't even trust their derivation of the partition function. If you fix the temperature of the reservoir, it would seem to me that you're also fixing the energy of the system to a definite value. The entropy of the reservoir is a function of its energy, and so is its derivative with respect to energy, and therefore its temperature. Assuming that relationship is one-to-one, the temperature determines the energy, so fixing the temperature would seem to fix the energy as well.
 
  • #4
So, does anyone have any insight?
 
  • #5
I don't even know where that derivation is coming from. The typical derivation I've seen along those lines involves a Taylor expansion where you define the entropy as
[tex]S = \ln \left ( \Omega(N,E) \right)[/tex]
where Omega is the number of microstates. You can then break down the energy into the energy of the reservoir and the energy of your microsystem, and Taylor expand on the assumption that the energy of your microsystem is small.
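Schematically, with 1/tau defined as the energy derivative of the reservoir entropy, and higher-order terms dropped because the reservoir is large compared to the microsystem, the expansion goes something like:

[tex]\ln\Omega_R(E-\epsilon_s) \approx \ln\Omega_R(E) - \epsilon_s\frac{\partial \ln\Omega_R}{\partial E} = \ln\Omega_R(E) - \frac{\epsilon_s}{\tau}[/tex]
[tex]P(s) \propto \Omega_R(E-\epsilon_s) \approx \Omega_R(E)\,e^{-\epsilon_s/\tau} \quad\Rightarrow\quad Z = \sum_s e^{-\epsilon_s/\tau}[/tex]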

This is one derivation of the partition function, and the physics behind it is that your particular ensemble is "small" compared to the reservoir it's in contact with. Another personal favorite derivation is to maximize the entropy subject to the constraint of fixed average energy; I like to do it from the density operator in quantum mechanics. Then temperature drops out as a Lagrange multiplier. The primary problem with that derivation is that it assumes the entropy maximum principle.
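A rough sketch of that version, written with classical probabilities rather than the density operator (the operator version goes the same way with p_s replaced by rho and sums by traces): maximize
[tex]\sigma = -\sum_s p_s\ln p_s[/tex]
subject to [tex]\sum_s p_s = 1[/tex] and [tex]\sum_s p_s\epsilon_s = U[/tex]. Setting the derivative of [tex]-\sum_s p_s\ln p_s - \alpha\sum_s p_s - \beta\sum_s p_s\epsilon_s[/tex] with respect to each p_s to zero gives
[tex]p_s = \frac{e^{-\beta\epsilon_s}}{Z}, \qquad Z = \sum_s e^{-\beta\epsilon_s}[/tex]
with the multiplier beta identified as 1/tau after the fact.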

The truth is, there is no "rigorous" derivation of the partition function beyond Taylor expansion and saying that one thing is "large" compared to another. Completely rigorous statistical mechanics is done from the microcanonical ensemble, but working with that thing for too long can actually cause brain damage.
 

1. What are statistical variables?

Statistical variables are characteristics or attributes of a population or sample that can be measured or observed. They can be quantitative (numerical) or qualitative (categorical).

2. What is the importance of using statistical variables?

The use of statistical variables allows scientists to summarize and analyze data in a meaningful way. They provide a way to describe the characteristics of a population or sample and make inferences or predictions about larger populations.

3. How are statistical variables classified?

Statistical variables can be classified as either independent or dependent. Independent variables are manipulated or controlled by the scientist, while dependent variables are affected by changes in the independent variables.

4. What are some common statistical variables used in research?

Some common statistical variables used in research include age, gender, income, education level, and test scores. These variables are often used to describe characteristics of a population or to compare groups.

5. How do scientists choose which statistical variables to use in their research?

The choice of statistical variables depends on the research question and the type of data being collected. Scientists must carefully consider which variables are most relevant to their study and will provide meaningful insights into their research topic.
