
Learn Statistical Mechanics: Equilibrium Systems


This is the first of a multi-part series of articles intended to give a concise overview of statistical mechanics and some of its applications. These articles are by no means a comprehensive treatment of the field, but aim to give a clear line of reasoning from Boltzmann's equation to non-equilibrium statistical mechanics. It is hoped that these articles will help the reader better understand the framework of statistical mechanics as well as the concepts of entropy and the second law, which are notoriously slippery to fully grasp, even for practicing physicists.

Statistical mechanics aims to predict the macroscopic behavior of many-body systems by studying the configurations and interactions of the individual bodies. Statistical mechanics is often taught as an extension of thermodynamics, which is helpful because it gives a clearer understanding of thermodynamic quantities such as temperature and entropy. However, its application is not confined to thermodynamic problems. Indeed, it has been successfully applied to many problems outside of pure physics: a few examples include the behavior of stock markets, the spread of disease, information processing in neural nets, and self-segregation in neighborhoods.

Gibbs Entropy and the Boltzmann Equation

To begin, consider a collection of subsystems simultaneously occupying the phase space ##\Omega_{E}##, where ##\Omega_{E}## is the subspace of the total phase space with constant energy ##E##. The nature of the subsystems is left unspecified for the sake of generality. Now consider the positions and momenta to be discretized so that ##\Omega_{E}## is partitioned into ##l## discrete cells labelled ##\omega_{1},\omega_{2},\cdots\omega_{l}##. The state of each subsystem corresponds to a point ##P_{i}## in this phase space.

Let ##\omega_{i}## also denote the number of subsystems occupying the ##i##th cell, i.e. consistent with the ##i##th thermodynamic state. The number of subsystems in all the accessible states must sum to the total number of subsystems, which we likewise denote ##\Omega_{E}##:

$$\sum_{i=1}^{l}\omega_{i}=\Omega_{E}$$

The distribution ##D=\{\omega_{1},\omega_{2},\cdots\omega_{l}\}## describes the state of the whole system. The number of arrangements ##W(D)## consistent with a given distribution is

$$W(D)=\frac{\Omega_{E}!}{\omega_{1}!\omega_{2}!\cdots\omega_{l}!}$$

The claim crucial to the formulation of statistical mechanics is that the entropy ##S## of a system and the quantity ##W## are related. Low entropy states correspond to small ##W##, while high entropy states correspond to large ##W##; that is, high entropy states are realized by many more arrangements than low entropy states. Since the particle dynamics are treated as stochastic, arrangements are sampled at random, which naturally favors high entropy states over low entropy states. In the thermodynamic limit (##N## and ##V## going to infinity with the density ##\rho=N/V## held constant), the number of arrangements corresponding to high entropy states grows overwhelmingly larger than the number corresponding to low entropy states. This forces the system, at equilibrium, to be found in the maximum entropy state.
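To make this counting concrete, the sketch below (a minimal Python example; the occupation numbers are chosen arbitrarily and are not part of the original argument) evaluates ##W(D)## for a sharply peaked and an evenly spread distribution of 100 subsystems over 4 cells. Even at this small size, the evenly spread distribution is realized by roughly ##10^{51}## times more arrangements.

```python
from math import lgamma, exp

def log_W(distribution):
    """Natural log of the number of arrangements W(D) for a distribution
    D = {omega_1, ..., omega_l}:  ln W = ln(Omega_E!) - sum_i ln(omega_i!),
    computed with log-gamma to avoid overflowing factorials."""
    omega_E = sum(distribution)                      # total number of subsystems
    return lgamma(omega_E + 1) - sum(lgamma(w + 1) for w in distribution)

peaked  = [97, 1, 1, 1]     # nearly all subsystems crowded into one cell
uniform = [25, 25, 25, 25]  # subsystems spread evenly over the cells

print(exp(log_W(peaked)))   # ~9.7e5  arrangements
print(exp(log_W(uniform)))  # ~1.6e57 arrangements
```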

The function which maps ##W## to ##S## must meet certain criteria:

  • ##S## is a continuous function of ##W##
  • ##S## is a monotonically increasing function of ##W##: If ##W_{1}>W_{2}## then ##S(W_{1})>S(W_{2})##
  • ##S## is additive: ##S(W_{1}W_{2})=S(W_{1})+S(W_{2})##

The first two criteria follow from the arguments above for expressing ##S## in terms of ##W##. The third criterion is required to make the entropy an extensive quantity: when two separate subsystems are combined, the number of arrangements of the combined system is the product of the numbers of arrangements of the individual subsystems, and the entropy of the combined system must be the sum of their entropies.

The simplest function satisfying these criteria, as postulated by Boltzmann, is proportional to the logarithm of ##W##:

$$S=k\ln W=k\ln \frac{\Omega_{E}!}{\omega_{1}!\omega_{2}!\cdots\omega_{l}!}$$

Using Stirling's approximation, this can be written as

$$S=k\left[\Omega_{E}\ln \Omega_{E} -\Omega_{E} -\sum_{i=1}^{l}\omega_{i}\ln \omega_{i}+\sum_{i=1}^{l} \omega_{i}\right]$$
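As a brief aside (a numerical check not in the original derivation, with the values of ##n## chosen arbitrarily), the quality of Stirling's approximation ##\ln n!\approx n\ln n-n## improves rapidly with ##n##: the relative error is about 14% at ##n=10## but falls below 0.1% at ##n=1000##, so it is essentially exact for realistic occupation numbers.

```python
from math import lgamma, log

# Compare ln(n!) with Stirling's approximation n*ln(n) - n
for n in (10, 100, 1000):
    exact = lgamma(n + 1)                 # ln(n!)
    stirling = n * log(n) - n
    print(n, exact, stirling, abs(exact - stirling) / exact)
```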

Defining ##p_{i}=\omega_{i}/\Omega_{E}## as the probability of finding a subsystem in the ##i##th cell, and remembering that ##\sum_{i=1}^{l}p_{i}=1##, the bracketed expression reduces to ##-\Omega_{E}\sum_{i=1}^{l}p_{i}\ln p_{i}##. Dividing by the number of subsystems ##\Omega_{E}## then gives the entropy per subsystem, the Gibbs entropy formula

$$S=-k\sum_{i=1}^{l}p_{i}\ln p_{i}$$

Notice that ##S## takes on its minimum value for a delta function distribution ##p_{i}=\delta_{i,j}## and its maximum value for the uniform distribution ##p_{i}=1/l##. Thus ##S## is a measure of the spread of the distribution ##p_{i}##. I am reluctant to use the term disorder here since such wording is a source of much confusion. A state that looks ordered is not necessarily a low entropy state, nor is a disordered-looking one necessarily high entropy: examples include the spontaneous crystallization of supersaturated solutions and the formation of liquid crystal phases. Disorder is furthermore a qualitative term and thus subject to the opinion of the observer.
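The extremes can be made explicit with a short numerical sketch (in units of ##k##, with the number of cells and the intermediate distribution chosen arbitrarily):

```python
import numpy as np

def gibbs_entropy(p):
    """Gibbs entropy S/k = -sum_i p_i ln p_i, using the convention 0*ln(0) = 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

l = 8                                   # number of cells
delta   = np.eye(l)[0]                  # all probability in a single cell
mixed   = np.array([0.5, 0.2, 0.1, 0.1, 0.05, 0.03, 0.01, 0.01])
uniform = np.full(l, 1.0 / l)           # equal probability in every cell

print(gibbs_entropy(delta))             # 0.0            (minimum)
print(gibbs_entropy(mixed))             # ~1.48          (in between)
print(gibbs_entropy(uniform))           # ln(8) ~ 2.08   (maximum)
```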

The maximum entropy probability distribution has special importance here. When a system is initially in a non-equilibrium state, it will evolve in time until it reaches an equilibrium state (assuming one exists), where it will remain forever. The second law of thermodynamics states that the entropy of a closed system tends to a maximum. Thus, the equilibrium state is also a maximum entropy state.

Consider an isolated system with ##\Omega## accessible configurations. At equilibrium, the probability of being in a given state is expected to be constant in time. The system may fluctuate between different microstates, but these microstates must be consistent with the fixed macroscopic variables (pressure, volume, energy, etc.). There is nothing which would lead us to believe that one microstate is favored over another, so we invoke the principle of equal a priori probabilities and set ##p_{i}=1/\Omega##, the special case of a uniform probability distribution. This gives the Boltzmann entropy formula:

$$S=-k\sum_{i=1}^{\Omega}\frac{1}{\Omega}\ln\frac{1}{\Omega}=k\ln \Omega$$

The Boltzmann formula is therefore consistent with the maximum entropy state and is a valid description of the entropy of a system at thermodynamic equilibrium.
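As a simple illustration (an example added here, not part of the original text), consider an isolated collection of ##N## independent two-state units, each state being equally accessible. The number of microstates is ##\Omega=2^{N}##, so the Boltzmann formula gives

$$S=k\ln 2^{N}=Nk\ln 2$$

an entropy that grows linearly with ##N##, i.e. it is extensive, as required by the third criterion above.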

The Canonical Ensemble

To describe the equilibrium state using the canonical ensemble (##N##, ##V##, and ##T## constant), we search for the distribution ##p_{i}## which maximizes the Gibbs entropy under the constraints that the probabilities are normalized

$$\sum_{i}p_{i}=1$$

and the average energy is fixed

$$\left<E\right>=\sum_{i}p_{i}E_{i}=U$$

The probability distribution which maximizes the entropy is found using the method of Lagrange multipliers.

$$\mathcal{L}=\left(-k\sum_{i}p_{i}\ln p_{i}\right)+\lambda_{1}\left(\sum_{i}p_{i}-1\right)+\lambda_{2}\left(\sum_{i}p_{i}E_{i}-U\right)$$

$$0=\frac{\partial\mathcal{L}}{\partial p_{j}}=-k-k\ln p_{j}+\lambda_{1}+\lambda_{2}E_{j}$$

$$p_{j}=\exp\left(\frac{-k+\lambda_{1}+\lambda_{2}E_{j}}{k}\right)$$

##\lambda_{1}## is determined by the normalization condition on ##p_{j}##:

$$1=\sum_{j}p_{j}=\exp\left(\frac{-k+\lambda_{1}}{k}\right)\sum_{j}\exp\left(\frac{\lambda_{2}E_{j}}{k}\right)=\exp\left(\frac{-k+\lambda_{1}}{k}\right)Z$$

where ##Z## is defined as

$$Z=\sum_{j}\exp\left(\frac{\lambda_{2}E_{j}}{k}\right)$$

This sets ##\lambda_{1}## as

$$\exp\left(\frac{-k+\lambda_{1}}{k}\right)=\frac{1}{Z}$$

while the probability distribution becomes

$$p_{j}=\frac{1}{Z}\exp\left(\frac{\lambda_{2}E_{j}}{k}\right)$$

To determine ##\lambda_{2}##, we first rewrite ##S## in terms of ##Z## as

$$S=-k\sum_{j}p_{j}\ln p_{j}=-k\sum_{j}p_{j}\left(\frac{\lambda_{2}E_{j}}{k}-\ln Z\right)=-\lambda_{2}U+k\ln Z$$

and use the thermodynamic definition of temperature to find

$$\frac{1}{T}\equiv\frac{\partial S}{\partial U}=-\lambda_{2}$$

The canonical partition function ##Z## is thus

$$Z\equiv\sum_{i}\exp\left(-\beta E_{i}\right)$$

where ##\beta\equiv 1/kT##. The probability distribution and the entropy at equilibrium are

$$p_{i}=\frac{1}{Z}\exp\left(-\beta E_{i}\right)$$

$$S=\frac{U}{T}+k\ln Z$$
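This maximum entropy result can also be checked numerically. The sketch below (a minimal example with arbitrarily chosen energy levels and target average energy, in units where ##k=1##) maximizes the Gibbs entropy under the two constraints using a generic constrained optimizer and compares the result with the Boltzmann distribution ##p_{i}=e^{-\beta E_{i}}/Z##; the two agree to optimizer tolerance.

```python
import numpy as np
from scipy.optimize import minimize, brentq

E = np.array([0.0, 1.0, 2.0, 3.0])     # energy levels (chosen arbitrarily)
U = 1.2                                # target average energy (chosen arbitrarily)

def neg_entropy(p):
    """Negative Gibbs entropy -S/k; minimizing it maximizes the entropy."""
    p = np.clip(p, 1e-12, None)        # avoid log(0)
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},   # normalization
    {"type": "eq", "fun": lambda p: np.dot(p, E) - U},  # fixed <E> = U
]
p0 = np.full(len(E), 1.0 / len(E))     # start from the uniform distribution
res = minimize(neg_entropy, p0, bounds=[(0.0, 1.0)] * len(E),
               constraints=constraints)

# Boltzmann distribution with beta chosen so that <E> = U
avg_E = lambda b: np.dot(np.exp(-b * E), E) / np.sum(np.exp(-b * E))
beta = brentq(lambda b: avg_E(b) - U, 1e-6, 50.0)
p_boltz = np.exp(-beta * E) / np.sum(np.exp(-beta * E))

print(res.x)       # maximum entropy distribution found numerically
print(p_boltz)     # canonical distribution exp(-beta*E_i)/Z -- they agree
```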

The Partition Function and Thermodynamic Variables

With the above definition of the partition function and probability distribution, the relationships between ##Z## and other thermodynamic quantities can now be found. The average energy is

$$\left<E\right>=\sum_{i}p_{i}E_{i}=\frac{1}{Z}\sum_{i}\exp\left(-\beta E_{i}\right)E_{i}=-\frac{1}{Z}\frac{\partial Z}{\partial\beta}=-\frac{\partial}{\partial\beta}\ln Z$$

The heat capacity is

$$C_{v}=\frac{\partial\left<E\right>}{\partial T}=-k\beta^{2}\frac{\partial\left<E\right>}{\partial\beta}=k\beta^{2}\frac{\partial^{2}}{\partial\beta^{2}}\ln Z$$

The Helmholtz free energy is

$$F=U-TS=-kT\ln Z$$

which gives a more direct representation of the entropy as

$$S=-\frac{\partial F}{\partial T}=k\frac{\partial}{\partial T}\left(T\ln Z\right)$$

Clearly, many macroscopic quantities of interest can be found if the partition function is known. It is often said that if the partition function is known, then everything is known about the system. Thus the goal when studying the behavior of equilibrium systems is to find the partition function ##Z##, from which all other useful information can be obtained.
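To close, here is a minimal end-to-end sketch (assuming a two-level system with unit level spacing and units where ##k=1##; the temperature is chosen arbitrarily) that builds ##Z## and then extracts ##\langle E\rangle##, ##C_{v}##, ##F##, and ##S## from it using the relations above, checking the derivative formulas against direct averages.

```python
import numpy as np

k = 1.0                              # Boltzmann constant (units where k = 1)
E = np.array([0.0, 1.0])             # two-level system with unit spacing

def ln_Z(beta):
    """ln Z(beta) = ln sum_i exp(-beta * E_i)."""
    return np.log(np.sum(np.exp(-beta * E)))

T = 0.75                             # temperature (chosen arbitrarily)
beta = 1.0 / (k * T)
h = 1e-4                             # step for numerical derivatives

# <E> = -d(ln Z)/d(beta), by central difference
U = -(ln_Z(beta + h) - ln_Z(beta - h)) / (2 * h)

# C_v = k * beta^2 * d^2(ln Z)/d(beta)^2
Cv = k * beta**2 * (ln_Z(beta + h) - 2 * ln_Z(beta) + ln_Z(beta - h)) / h**2

# F = -kT ln Z  and  S = U/T + k ln Z
F = -k * T * ln_Z(beta)
S = U / T + k * ln_Z(beta)

# Cross-checks: <E> against sum_i p_i E_i, and F against U - T*S
p = np.exp(-beta * E) / np.sum(np.exp(-beta * E))
print(U, np.dot(p, E))               # the two estimates of <E> agree
print(F, U - T * S)                  # F = U - T S holds to numerical precision
print(Cv)                            # heat capacity of the two-level system
```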

Read Part 2: The Ideal Gas

 

 
