Boltzmann vs. Gibbs entropy, negative energy

SUMMARY

The discussion centers on the comparison between Boltzmann and Gibbs entropy, highlighting the inadequacies of Boltzmann entropy in certain contexts. Participants assert that Gibbs entropy, particularly its quantum version via von Neumann entropy, provides a more comprehensive framework for understanding microcanonical ensembles. The conversation references specific equations from a preprint (arXiv:1304.2066) and critiques the authors' definitions of entropy, emphasizing the need for clarity in distinguishing between Gibbs and Boltzmann entropies. The consensus is that while Boltzmann entropy is commonly used, Gibbs entropy should be preferred for small systems or those with bounded Hamiltonians.

PREREQUISITES
  • Understanding of statistical mechanics principles, particularly microcanonical and canonical ensembles.
  • Familiarity with entropy definitions, including Boltzmann and Gibbs entropy.
  • Knowledge of quantum mechanics, specifically the von Neumann entropy and density operators.
  • Ability to interpret mathematical expressions related to statistical mechanics, such as density matrices and trace operations.
NEXT STEPS
  • Research the differences between Boltzmann and Gibbs entropy in statistical mechanics.
  • Learn about the von Neumann entropy and its applications in quantum statistical mechanics.
  • Explore the implications of using Gibbs entropy for small systems and bounded Hamiltonians.
  • Examine the role of the maximum-entropy principle in statistical mechanics and its derivation from kinetic theory.
USEFUL FOR

Physicists, particularly those specializing in statistical mechanics and quantum mechanics, as well as researchers interested in the foundational aspects of entropy and its applications in various physical systems.

  • #31
This call for a universal redefinition of the entropy of a closed system seems very strange to me too. For a typical thermodynamic system enclosed in a box, the area of the available phase-space hypersurface grows with energy (think of an ideal gas). For most many-particle systems, the old definition of entropy is consistent with classical thermodynamics and is well motivated by probabilistic ideas (counting accessible states...).

For a system whose density of states decreases with energy, this standard definition gives an entropy that does not grow with energy. For the authors this is somehow a sign of a defect in the concept, and they propose another definition of statistical entropy ##S_{new}(E)## that maintains monotonicity and which, they say, is generally better.
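For reference, these are the two definitions being contrasted, written as I understand them from the preprint (##\omega(E)## is the density of states, ##\Omega(E)## the number of states with energy at most ##E##, and ##\epsilon## a small fixed energy width):

$$S_{B}(E)=k_{B}\,\ln\!\bigl(\epsilon\,\omega(E)\bigr),\qquad S_{new}(E)=S_{G}(E)=k_{B}\,\ln\Omega(E),\qquad \Omega(E)=\int_{E_{min}}^{E}\omega(E')\,dE'.$$

Since ##\Omega(E)## is an integral of a non-negative density, it can never decrease with ##E##; that is exactly the monotonicity the authors demand, while ##\omega(E)## itself carries no such guarantee.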

But there is no theoretical justification for such a requirement when the system has a decreasing density of states. It is not clear that such systems fall under the purview of classical thermodynamics at all. If not, it makes no sense to impose the requirement of thermodynamic properties on the statistical entropy, and it is better to stay with the old definition for its other well-known nice properties.

For example, take a multi-level system ##\mu## with a decreasing density of states ##D(E)## and highest energy ##E_h##.

The standard statistical entropy ##S(E)## decreases with energy for this system, and when the energy is ##E_h## it is at its minimum. Now let us connect ##\mu## thermally to a large system ##B## with an ordinary density of states (increasing with energy) and Boltzmann temperature ##T##. Since in equilibrium the total system will have occupation probabilities that maximize the total multiplicity for the total energy of the combined system, the system ##\mu## will most probably transfer some heat to ##B## and itself fall to a lower average energy (and higher standard entropy).
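To spell out the maximization step (a standard reservoir argument, assuming ##B## is large enough to sit at a fixed Boltzmann temperature ##T##):

$$W(E_\mu)\propto\omega_\mu(E_\mu)\,\Omega_B(E_{tot}-E_\mu)\propto\omega_\mu(E_\mu)\,e^{-E_\mu/k_B T},\qquad \frac{\partial\ln W}{\partial E_\mu}=\frac{d\ln\omega_\mu}{dE_\mu}-\frac{1}{k_B T}.$$

Because ##\omega_\mu## decreases with energy the first term is negative, and for any ##T>0## the second term is negative too, so ##\ln W## only increases as ##E_\mu## is lowered: the most probable outcome is always that ##\mu## passes energy to ##B##.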

In other words, the system ##\mu## behaves as a universal source of heat, regardless of the temperature ##T## of the receptor. Such systems do not fall under classical thermodynamics; they do not have a thermodynamic temperature (in the original sense).
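A quick numerical sketch of the same point, using a toy model of my own (not from the thread): let ##\mu## be ##N## two-level units confined to the population-inverted half of its spectrum, so its multiplicity ##\binom{N}{n}## decreases as the excitation number ##n## grows, and treat the large bath canonically at temperature ##T##. For every positive ##T##, the most probable state of ##\mu## is the lowest energy it can reach:

```python
import numpy as np
from math import comb

# Toy illustration (my own construction): mu = N two-level units with energy
# E = n*de, restricted to the population-inverted region n > N/2, where the
# multiplicity C(N, n) -- i.e. the density of states -- decreases with n.
N, de = 40, 1.0
n_vals = np.arange(N // 2 + 1, N + 1)            # inverted region only
log_omega_mu = np.array([np.log(comb(N, int(n))) for n in n_vals])

for kT in (0.2, 1.0, 5.0, 50.0):                 # bath temperature k_B*T > 0
    # ln of the joint multiplicity: ln omega_mu(n) + ln Omega_B(E_tot - n*de);
    # a large bath contributes -n*de/(k_B*T) up to an n-independent constant.
    log_W = log_omega_mu - n_vals * de / kT
    n_star = n_vals[np.argmax(log_W)]
    print(f"k_B*T = {kT:5.1f}  ->  most probable excitation n* = {n_star}")
# Every temperature gives n* = 21 (= N/2 + 1), the lowest energy mu can reach
# here: mu sheds energy to B no matter how hot B is.
```

The bath temperature only changes how sharply the distribution is peaked, not where the peak sits, which is the "universal heat source" behaviour described above.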

It may be interesting to study such systems with the methods of statistical physics, and even to define some new statistical entropy concept and a new temperature concept for them, for convenience. It would not be the first such case (consider how many entropies there are already). Perhaps it is even possible to use the thermodynamic formalism for a closed collection of such systems.

But it seems doubtful that this necessitates changing the standard statistical entropy for the systems it was originally designed for.
 
