Definition of a system in Boltzmann entropy

In summary, the exact definitions of entropy and microstates vary slightly, with some defining entropy as the entropy of a macrostate and others as the entropy of the entire system. When we speak of a system's entropy increasing, that can mean either that the system tends toward the macrostate with the highest entropy or that the entropy of the overall system increases. Shannon and Gibbs entropies are more consistent in this regard, as they consider the whole system and do not involve macrostates. Boltzmann's approach involves approximations and the assumption of molecular chaos, leading to the H-theorem and the increase of entropy over time.
  • #1
timjdoom
Context

Boltzmann first defined his entropy as ##S = k \log W##. This seems to be pretty consistently taught. However, the exact definitions of ##S## and ##W## seem to vary slightly.

Some say ##S## is the entropy of a macrostate, while others describe it as the entropy of the system, where the definition of the system (in my mind) is the collection of all macrostates.

Now admittedly, it could be both: if you want the entropy of a macrostate, ##W## is just the count of microstates for that macrostate; if you want the entropy of the whole system, ##W## is the count of all possible microstates. So it doesn't really matter. (A toy illustration of the two counts is sketched below.)
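To make the two readings of ##W## concrete, here is a minimal sketch (my own toy model, not from the thread) of ##N## coins, where a macrostate is "number of heads" and a microstate is a full head/tail sequence. The variable names and the choice ##k_B = 1## are illustrative assumptions.

```python
from math import comb, log

N = 10        # toy "system": 10 two-state particles (coins)
k_B = 1.0     # Boltzmann constant set to 1 for illustration

# Reading 1: entropy of a single macrostate
# W = number of microstates with exactly n heads
for n in range(N + 1):
    W_macro = comb(N, n)
    print(f"macrostate n={n}: W={W_macro}, S={k_B * log(W_macro):.3f}")

# Reading 2: "entropy of the whole system"
# W = count of ALL possible microstates = 2^N
W_total = 2 ** N
print(f"whole system: W={W_total}, S={k_B * log(W_total):.3f}")
```

The ##n = N/2## macrostate carries most of the microstates, and for large ##N## its entropy approaches the whole-system value, which is presumably why the two readings are often used interchangeably.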

Question

However, it does matter when we say "a system's entropy increases". Does this mean
A) that the macrostate of a system tends toward the macrostate with the highest entropy (and therefore the current entropy of the system increases), or
B) that the entropy of the entire system increases?

In case B, for a single closed system the entropy would obviously not change, as it represents the whole of all possibilities for that system. However, it could increase by combining two systems (S1 and S2) to create a new system (S3) such that S3 ≥ S1 + S2, e.g. a box of gas plus an empty box, or a box of two different types of ideal gas; when you combine either of those, the resulting new system will always have a higher entropy (but is fundamentally a different system). A standard worked example of this is sketched below.
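As a standard worked example of the gas-plus-empty-box case (my addition, not from the post): let ##N## ideal-gas particles initially confined to volume ##V## expand to fill ##2V## once the partition is removed. Each particle's accessible volume doubles, so the microstate count picks up a factor ##2^N##:

$$\Delta S = k_B \ln\frac{W_{\text{after}}}{W_{\text{before}}} = k_B \ln 2^N = N k_B \ln 2 > 0.$$

The combined box is indeed a different system, and its equilibrium entropy exceeds the sum of the original two.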

So when we say "the entropy of a system always increases" in the 2nd law, do we mean that the entropy of the "meta" system increases (i.e. the probability distribution over microstates becomes more uniform), or that the entropy of an "instance" of that system increases (i.e. the microstate the system is in tends toward the more probable regions)?

Further context

Shannon (and Gibbs) entropies seem to be more consistent on this. Since there are no macrostates in their formulations, they explicitly sum or integrate over all possible microstates, which means by definition they apply to the whole system.
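For concreteness, here is a minimal sketch of the Gibbs/Shannon formula ##S = -k_B \sum_i p_i \ln p_i## summed over all microstate probabilities (my own illustration; the distributions are made up):

```python
import numpy as np

def gibbs_entropy(p, k_B=1.0):
    """S = -k_B * sum_i p_i * ln(p_i), summed over ALL microstates."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # the 0 * ln(0) terms are taken as 0
    return -k_B * np.sum(p * np.log(p))

# Four microstates: a peaked distribution vs. the uniform one
print(gibbs_entropy([0.7, 0.1, 0.1, 0.1]))      # ~0.940
print(gibbs_entropy([0.25, 0.25, 0.25, 0.25]))  # ln 4 ~ 1.386 (maximum)
```

The uniform distribution maximizes this entropy, matching the intuition that the distribution over microstates "becomes more uniform" as entropy grows.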

I've heard (from Sean Carroll) that in theory the 2nd law doesn't apply with Gibbs's formulation, and that Gibbs's formulation actually implies ##\mathrm{d}S/\mathrm{d}t = 0##. But experimentally and observationally we experience the 2nd law. So I'm now totally confused.

Thank you so much in advance!
 
  • #2
Have a look at a good textbook on kinetic theory. My all-time favorite is Landau & Lifshitz vol. X.

If you have a complete description of a closed system, there's no change in entropy. For a classical-point-particle system that means solving the Liouville equation for the complete ##N##-body phase-space distribution function (a function of ##6N+1## independent arguments: the phase-space coordinates and time).
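For reference (the standard form, not spelled out in the post above), the Liouville equation for the ##N##-body distribution ##\rho(\vec{x}_1,\vec{p}_1,\dots,\vec{x}_N,\vec{p}_N,t)## reads

$$\frac{\partial \rho}{\partial t} + \sum_{i=1}^{N}\left(\frac{\partial H}{\partial \vec{p}_i}\cdot\frac{\partial \rho}{\partial \vec{x}_i} - \frac{\partial H}{\partial \vec{x}_i}\cdot\frac{\partial \rho}{\partial \vec{p}_i}\right) = 0,$$

i.e. ##\rho## is constant along the exact phase-space trajectories. This is why the fine-grained (Gibbs) entropy is strictly conserved, matching the ##\mathrm{d}S/\mathrm{d}t = 0## statement mentioned in #1.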

This is obviously impossible for everyday matter, where ##N## is of the order of magnitude of the Avogadro number (i.e., about the number of atoms in 12 g of pure carbon-12), i.e., ##\sim 6 \cdot 10^{23}## particles. So you have to reduce the accuracy of your description by "coarse graining", i.e., you throw away some information which is (hopefully) not relevant for the description of the macroscopic observables of interest (which ones these are depends on the physical situation, and their choice is an art of its own).

The idea behind the Boltzmann equation is to consider only the one-particle distribution function (taking the simplest case of a gas consisting of only one kind of molecule), i.e., you are only interested in how many particles are in a phase-space cell and not in any more complicated particle correlations. That one-particle distribution function is given by integrating the full ##N##-particle distribution function over all but one of the particle phase-space arguments.
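Explicitly, with one common normalization convention (conventions differ by factors of ##N##, so take this as a sketch):

$$f_1(\vec{x}_1,\vec{p}_1,t) = N \int \prod_{i=2}^{N} \mathrm{d}^3 x_i \, \mathrm{d}^3 p_i \; \rho(\vec{x}_1,\vec{p}_1,\dots,\vec{x}_N,\vec{p}_N,t),$$

so that ##\int \mathrm{d}^3 x \, \mathrm{d}^3 p \, f_1 = N##.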

Now you can use the Liouville equation to derive an equation of motion for the one-particle distribution function. If you don't make any further approximations, you come to the conclusion that to formulate this equation you need the two-particle distribution function. Its equation of motion needs the three-particle distribution function, and so on. So you end up with a tower of ##N## coupled equations, which in the end requires the solution of the full ##N##-particle distribution function. This is the so-called BBGKY hierarchy (named after Bogoliubov, Born, Green, Kirkwood, and Yvon).
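The first rung of the hierarchy looks schematically like this (signs and prefactors depend on the normalization convention, so this is a sketch rather than a definitive form): for a pair potential ##V##,

$$\left(\partial_t + \frac{\vec{p}_1}{m}\cdot\nabla_{\vec{x}_1}\right) f_1(\vec{x}_1,\vec{p}_1,t) = \int \mathrm{d}^3 x_2 \, \mathrm{d}^3 p_2 \; \left[\nabla_{\vec{x}_1} V(\vec{x}_1-\vec{x}_2)\right]\cdot\nabla_{\vec{p}_1} f_2(\vec{x}_1,\vec{p}_1,\vec{x}_2,\vec{p}_2,t).$$

The left-hand side involves only ##f_1##, but the interaction term on the right drags in ##f_2##, and so on up the tower.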

So you really have to throw away some information to formulate a feasible closed evolution equation for the one-particle distribution function alone, and that's the crucial point of Boltzmann's ingenuity: he cut the BBGKY hierarchy at the one-point-function level by assuming that the two-particle distribution function can be approximated by the product of the corresponding one-particle distribution functions, i.e., he assumed that the two-particle correlations are irrelevant and you can treat the two particles as statistically independent. With this step (known as the "molecular-chaos assumption") you throw away more and more information as the time evolution proceeds, and this leads to the H-theorem, i.e., the increase of the entropy, as well as to the Maxwell-Boltzmann distribution as the equilibrium distribution, which is determined by the maximum entropy (under the constraints due to the conservation laws).
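In formulas (standard statements, condensed here as a sketch): the molecular-chaos assumption reads

$$f_2(\vec{x},\vec{p}_1,\vec{p}_2,t) \approx f_1(\vec{x},\vec{p}_1,t)\, f_1(\vec{x},\vec{p}_2,t),$$

and with it Boltzmann's H functional

$$H(t) = \int \mathrm{d}^3 x \, \mathrm{d}^3 p \; f_1 \ln f_1$$

satisfies ##\mathrm{d}H/\mathrm{d}t \le 0##, so the entropy ##S = -k_B H## never decreases, with equality in equilibrium (the Maxwell-Boltzmann distribution).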

All you need for this is the unitarity of the S-matrix used to calculate the transition-probability rates (cross sections) in the collision term of the Boltzmann equation. That is of course an argument from quantum theory, but you can also derive the Boltzmann equation directly from quantum theory, which has the additional advantage that you can take account of quantum statistics (Bose or Fermi), leading to the Boltzmann-Uehling-Uhlenbeck equation.
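Schematically (suppressing the phase-space measure and the arguments of the transition rate, so again a sketch, not a definitive form), the quantum-statistics corrections show up as Bose-enhancement/Pauli-blocking factors in the collision term:

$$C[f] \propto \int \left[ f_1' f_2' (1 \pm f_1)(1 \pm f_2) - f_1 f_2 (1 \pm f_1')(1 \pm f_2') \right] W_{12 \to 1'2'},$$

with the upper sign for bosons and the lower for fermions; dropping the ##\pm f## terms recovers the classical Boltzmann collision term.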

If you'd like to read about this in the relativistic context as well, I have a manuscript about that:

https://itp.uni-frankfurt.de/~hees/publ/kolkata.pdf
 

1. What is the definition of a system in Boltzmann entropy?

The system in Boltzmann entropy is the collection of particles or molecules under consideration, together with the set of microstates available to them; it is often, though not necessarily, taken to be in thermal equilibrium.

2. How is a system defined in Boltzmann entropy different from other definitions of a system?

The definition of a system in Boltzmann entropy is unique because it takes into account the microscopic behavior of particles and their interactions, rather than just the macroscopic properties of the system.

3. What is the significance of Boltzmann entropy in understanding systems?

Boltzmann entropy is significant because it allows us to quantify the disorder or randomness of a system at the microscopic level, which is crucial in understanding the behavior and properties of a system.

4. How is Boltzmann entropy related to thermodynamics?

Boltzmann entropy is a fundamental concept in thermodynamics, as it helps to explain the relationship between energy and temperature in a system. It also plays a crucial role in the second law of thermodynamics, which states that the total entropy of a closed system can never decrease over time.

5. Can Boltzmann entropy be used to predict the behavior of a system?

Yes, Boltzmann entropy can be used to predict the behavior of a system, as it provides a mathematical framework for understanding the distribution of particles within a system and how they will change over time. This allows us to make predictions about the overall behavior and properties of a system.
