Definition of a system in Boltzmann entropy

AI Thread Summary
The discussion centers on the definitions of entropy (S) and the number of microstates (W) in the context of Boltzmann's entropy formula, S = k log(W). There is ambiguity regarding whether S refers to the entropy of a macrostate or the entire system, with implications for understanding the second law of thermodynamics. The key question is whether an increase in a system's entropy refers to the macrostate moving toward higher entropy or the overall entropy of a closed system. The conversation also touches on the limitations of classical descriptions of entropy and the necessity of approximations, such as coarse graining, in practical applications. Ultimately, the complexities of entropy definitions and their implications for thermodynamic laws lead to confusion, particularly when comparing Boltzmann's approach with that of Shannon and Gibbs.
timjdoom
Context

Boltzmann first defined his entropy as S = k log(W). This seems to be pretty consistently taught. However, the exact definitions of S & W seem to vary slightly.

Some say S is the entropy of a macrostate, while others describe it as the entropy of the system, where (in my mind) the system is the collection of all macrostates.

Now admittedly, it could be the definition of both. That is, if you want the entropy of a macrostate, W is just the count of microstates for that macrostate; if you want the entropy of the whole system, W is the count of all possible microstates. So in that sense it doesn't really matter — the toy counting example below illustrates both readings.
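Purely as a toy counting sketch (the two-halves-of-a-box model and all numbers here are made up for illustration; units with k = 1):

```python
from math import comb, log

# Toy "gas": N distinguishable particles, each in the left or right half
# of a box. A microstate is the full left/right assignment (2**N of them);
# a macrostate is just "n particles in the left half".
N = 10

def S_macrostate(n, k=1.0):
    # W = number of microstates realizing this macrostate: C(N, n)
    return k * log(comb(N, n))

def S_system(k=1.0):
    # W = count of *all* microstates of the system: 2**N
    return k * log(2 ** N)

for n in range(N + 1):
    print(f"n = {n:2d}   S_macro = {S_macrostate(n):.3f}")
print(f"S_system = {S_system():.3f}")  # = N*log 2, larger than any macrostate's S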

Question

However, it does matter when we say "a system's entropy increases". Does this mean
A) that the macrostate of a system tends toward the macrostate with the highest entropy (and therefore the current entropy of the system increases), or
B) that the entropy of the entire system increases?

In case B, for a single closed system the entropy would obviously not change, since it represents the whole of all possibilities for that system. However, it could increase by combining two systems (with entropies S1 and S2) into a new system (with entropy S3) such that S3 >= S1 + S2. E.g. a box of gas plus an empty box, or a box of two different types of ideal gas: combine either of those and the resulting new system will always have a higher entropy (but is fundamentally a different system). A worked version of the first example is sketched below.
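For concreteness, here is the textbook free-expansion version of the "box of gas + empty box" case (a sketch assuming an ideal gas of N particles, where each particle's accessible volume doubles when the boxes are joined, so the microstate count picks up a factor 2^N):

$$S_{\text{after}} = k \log(2^N W) = k \log W + N k \log 2 = S_{\text{before}} + N k \log 2 > S_{\text{before}} .$$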

So when we say in the 2nd law that "the entropy of a system always increases", do we mean that the entropy of the "meta" system increases (i.e. the probability distribution over microstates becomes more uniform), or that of the "instance" of that system (i.e. the microstate the system is in tends toward the more probable regions)?

Further context

Shannon (and Gibbs) entropies seem to be more consistent on this. Since there are no macrostates in their formulations, they explicitly sum or integrate over all possible microstates, which means by definition the entropy is for the whole system (see the sketch below).
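A minimal numerical sketch of that sum-over-all-microstates form (toy numbers, k = 1), showing that the uniform distribution recovers Boltzmann's k log W:

```python
from math import log

def gibbs_entropy(p, k=1.0):
    """S = -k * sum_i p_i * ln(p_i), summed over ALL microstates i."""
    return -k * sum(pi * log(pi) for pi in p if pi > 0)

W = 4
uniform = [1.0 / W] * W           # every microstate equally likely
peaked  = [0.7, 0.1, 0.1, 0.1]    # same microstates, non-uniform weights

print(gibbs_entropy(uniform))     # log(4) ~ 1.386: recovers k*log(W)
print(gibbs_entropy(peaked))      # ~ 0.940: less uniform => lower entropy
```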

I've heard (from Sean Carroll) that in theory the 2nd law doesn't apply with Gibbs's formulation, and that Gibbs's entropy actually implies dS/dt = 0. But experimentally and observationally we experience the 2nd law. So I'm now totally confused.
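For reference, the dS/dt = 0 statement is the standard Liouville-theorem argument (a sketch): under exact Hamiltonian evolution the fine-grained phase-space density ##\rho## is conserved along trajectories, so the Gibbs entropy

$$S_G = -k \int d\Gamma \, \rho \ln \rho$$

satisfies dS_G/dt = 0; entropy growth only appears after coarse graining, which is exactly the point the reply below elaborates.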

Thank you so much in advance!
 
Have a look at a good textbook on kinetic theory. My all-time favorite is Landau & Lifshitz vol. X (Physical Kinetics).

If you have a complete description of a closed system, there's no change in entropy. For a classical point-particle system that means solving the Liouville equation for the complete ##N##-body phase-space distribution function (a function of ##6N+1## independent arguments: the ##6N## phase-space coordinates plus time).
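Concretely, in a standard form (modulo sign conventions for the Poisson bracket), the Liouville equation for ##f_N## reads

$$\partial_t f_N + \sum_{i=1}^{N} \left( \frac{\partial H}{\partial \vec{p}_i} \cdot \frac{\partial f_N}{\partial \vec{x}_i} - \frac{\partial H}{\partial \vec{x}_i} \cdot \frac{\partial f_N}{\partial \vec{p}_i} \right) = 0 .$$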

This is obviously impossible for everyday matter, where ##N## is of the order of magnitude of the Avogadro number (i.e., about the number of atoms in 12 g of pure carbon-12), i.e., ##\sim 6 \cdot 10^{23}## particles. So you have to reduce the accuracy of your description by "coarse graining", i.e., you throw away some information which is (hopefully) not relevant for the description of the macroscopic observables of interest (which observables these are depends on the physical situation, and choosing them is an art of its own).
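As a toy numerical illustration of this point (the cell sizes and random numbers are made up for the demo; "coarse graining" here means averaging the distribution within macro-cells), the coarse-grained entropy can never be smaller than the fine-grained one:

```python
import numpy as np

def shannon(p):
    """Entropy -sum p ln p over the cells with nonzero probability."""
    p = p[p > 0]
    return -(p * np.log(p)).sum()

rng = np.random.default_rng(0)

# "Fine-grained" probabilities over 1000 micro-cells of phase space.
fine = rng.random(1000)
fine /= fine.sum()

# Coarse grain: lump each run of 100 micro-cells into one macro-cell and
# spread that macro-cell's probability uniformly back over its micro-cells.
coarse = fine.reshape(10, 100).mean(axis=1).repeat(100)

print(shannon(fine))    # fine-grained entropy
print(shannon(coarse))  # >= fine-grained: averaging only discards information
```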

The idea behind the Boltzmann equation is to consider only the one-particle distribution function (taking the simplest case of a gas consisting of only one kind of molecule), i.e., you are only interested in how many particles are in a phase-space cell and not in any more complicated particle correlations. That one-particle distribution function is obtained by integrating the full ##N##-particle distribution function over all but one particle's phase-space arguments.
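In formulas (schematic, up to normalization conventions):

$$f_1(t, \vec{x}_1, \vec{p}_1) = N \int \prod_{i=2}^{N} d^3 x_i \, d^3 p_i \; f_N(t, \vec{x}_1, \vec{p}_1, \ldots, \vec{x}_N, \vec{p}_N) .$$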

Now you can use the Liouville equation to derive an equation of motion for the one-particle distribution function. If you don't make any further approximations, you find that to formulate this equation you need the two-particle distribution function. Its equation of motion needs the three-particle distribution function, and so on. So you end up with a tower of ##N## coupled equations, which in the end requires the solution for the full ##N##-particle distribution function. This is the so-called BBGKY hierarchy (named after Bogoliubov, Born, Green, Kirkwood, and Yvon).
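The first rung of this ladder looks schematically as follows (pair potential ##V##, no external forces; sign conventions vary): the streaming of ##f_1## is driven by ##f_2##,

$$\left( \partial_t + \frac{\vec{p}_1}{m} \cdot \nabla_{\vec{x}_1} \right) f_1 = \int d^3 x_2 \, d^3 p_2 \; \nabla_{\vec{x}_1} V(\vec{x}_1 - \vec{x}_2) \cdot \nabla_{\vec{p}_1} f_2 ,$$

and the equation for ##f_2## contains ##f_3## in the same way, and so on up the tower.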

So you really have to throw away some information to formulate a feasible closed evolution equation for the one-particle distribution function alone, and that's the crucial point of Boltzmann's ingenuity: he cut the BBGKY hierarchy at the one-particle level by assuming that the two-particle distribution function can be approximated by the product of the corresponding one-particle distribution functions, i.e., he assumed that two-particle correlations are irrelevant and the two particles can be treated as statistically independent. With this step (known as the "molecular-chaos assumption") you throw away more and more information as the time evolution proceeds, and this leads to the H-theorem, i.e., the increase of the entropy, as well as to the Maxwell-Boltzmann distribution as the equilibrium distribution, which is determined by the maximum of the entropy (under the constraints due to the conservation laws).
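Schematically (conventions for ##H## differ by signs and factors):

$$f_2(t,1,2) \approx f_1(t,1)\, f_1(t,2), \qquad H(t) = \int d^3 x \, d^3 p \; f_1 \ln f_1, \qquad \frac{dH}{dt} \le 0 ,$$

i.e., ##S = -k H## never decreases, with equality exactly for the equilibrium (Maxwell-Boltzmann) distribution.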

All you need for this is the unitarity of the S-matrix used to calculate the transition-probability rates (cross sections) in the collision term of the Boltzmann equation. That is of course an argument using quantum theory, but you can derive the Boltzmann equation from quantum theory as well, which has the additional advantage that you can take quantum statistics (Bose or Fermi) into account, leading to the Boltzmann-Uehling-Uhlenbeck equation.

If you'd like to read about this in the relativistic context as well, I have a manuscript about it:

https://itp.uni-frankfurt.de/~hees/publ/kolkata.pdf
 