What Are the Differences Between Boltzmann and Gibbs Entropies?

Discussion Overview

This discussion revolves around the differences between Boltzmann and Gibbs entropies, exploring their definitions, implications, and the contexts in which they apply. Participants raise questions about the measurability of these entropy definitions and their relationships within different statistical ensembles.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant describes Boltzmann entropy for a microcanonical ensemble and Gibbs entropy for a canonical ensemble, questioning the measurability of various entropy definitions.
  • Another participant challenges the connection between Gibbs entropy, total entropy, and microstate entropy, suggesting that Gibbs entropy is a generalization of Boltzmann entropy under specific conditions.
  • Concerns are raised about the definition of temperature in relation to Gibbs entropy, with a participant noting that it cannot be defined for arbitrary probability distributions without specifying how probabilities depend on energy.
  • Some participants discuss the implications of using different ensembles and how Gibbs entropy can represent a more general case than Boltzmann entropy.
  • Clarifications are made regarding the assumptions behind the definitions of microstates and macrostates, with a focus on the probabilities associated with each.
  • One participant expresses confusion about the expression for microstate entropy and its derivation, leading to further elaboration on the relationship between macrostate and microstate probabilities.

Areas of Agreement / Disagreement

Participants express differing views on the relationships between the various entropy definitions, with no consensus reached on the correctness of specific claims or the interpretation of the expressions involved. The discussion remains unresolved regarding the precise connections and definitions of the entropies.

Contextual Notes

Participants note limitations in understanding the derivations and definitions of entropy, particularly regarding the assumptions made about probability distributions and the contexts of different ensembles.

WWCY
Hi everyone, I have a few questions I'd like to ask regarding what I have read/heard about these two definitions of entropy. I also believe that I have some misconceptions about entropy and as such I'll write out what I know while asking the questions in the hope someone can correct me. Thanks in advance!

Here is what I think I know about entropy:

For a microcanonical ensemble at thermal equilibrium with fixed energy ##E##, the entropy is given by ##S_B = k_B \log \Omega##, where ##\Omega## is the number of microstates associated with the macrostate of energy ##E##. However, systems are rarely isolated and are usually in contact with some sort of heat bath. If we use the idea of a canonical ensemble (a system in contact with a heat bath), we can calculate the entropy while taking the energy fluctuations into account. The total entropy of the system is then
$$S_{tot} = k_B \log N$$
where ##N## is the total number of accessible microstates. However, this quantity is not measurable, because we cannot measure the entropy associated with the freedom to occupy microstates within a given macrostate. That contribution is defined by the following expression
$$S_{micro} = \sum_{i} S_i P_i$$
where ##S_i## is the entropy associated with macrostate ##i## and ##P_i## is the probability that the system is found in that macrostate. The actual measurable quantity is the Gibbs entropy, defined as the total entropy less ##S_{micro}##. This is given by the expression
$$S_{G} = S_{tot} - S_{micro} = - k_B \sum_i P_i \log P_i$$
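As a quick numerical illustration of the Gibbs formula, here is a minimal Python sketch (the distributions are made-up numbers, used only for illustration, and ##k_B## is set to 1):

```python
import math

def gibbs_entropy(probs, k_B=1.0):
    """Gibbs entropy S_G = -k_B * sum_i p_i * log(p_i); zero-probability terms contribute nothing."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

N = 4
uniform = [1.0 / N] * N          # every microstate equally likely
skewed = [0.7, 0.2, 0.05, 0.05]  # arbitrary non-uniform distribution

print(gibbs_entropy(uniform))    # log(4), about 1.386
print(gibbs_entropy(skewed))     # strictly smaller
```

The uniform distribution saturates the bound ##S_G \le k_B \log N##, which is one way to read the ##S_{tot} = k_B \log N## expression above: it is the largest Gibbs entropy possible over ##N## accessible microstates.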

My questions are:

1. Did I get the description of the various definitions of entropy right?
2. Why are ##S_B##, ##S_{tot}## and ##S_{micro}## considered unmeasurable quantities? And what does it mean to "measure" entropy?
3. Why, and how, are we able to measure the quantity ##S_{G}##?
4. Does ##S_{G}## also tend to be maximised (with respect to appropriate constraints) in thermal equilibrium?
5. Temperature is defined by ##1/T = \partial S_B / \partial E## in the case of a microcanonical ensemble. Can I say something similar for the Gibbs entropy, i.e. ##1/T \propto \partial S_G / \partial E##?

Many thanks!
 
Since nobody else has responded, I will give my two cents. I don't feel that the connection between ##S_G##, ##S_{tot}##, and ##S_{micro}## is quite right.

The Gibbs entropy is, I think, a generalization of ##S_B##. The latter is the special case where ##p_i = \frac{1}{\Omega}## (all microstates are equally likely).

You can't define a temperature for an arbitrary probability distribution using ##S_G##, because ##S_G## is not necessarily a function of ##E##, unless you explicitly say how the probabilities ##p_i## depend on energy.
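For instance (a standard textbook derivation, added here for illustration, not from the thread): if the probabilities are the canonical ones, ##p_i = e^{-E_i/k_B T}/Z##, then

$$S_G = -k_B \sum_i p_i \log p_i = -k_B \sum_i p_i \left( -\frac{E_i}{k_B T} - \log Z \right) = \frac{\langle E \rangle}{T} + k_B \log Z,$$

which is a definite function of ##T## (and hence of ##\langle E \rangle##), and differentiating with respect to ##\langle E \rangle## recovers the familiar ##\partial S_G / \partial \langle E \rangle = 1/T## for this particular distribution.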

I'm not sure where your expression for ##S_{micro}## is coming from.
 
stevendaryl said:
Since nobody else has responded, I will give my two cents. I don't feel that the connection between ##S_G##, ##S_{tot}##, and ##S_{micro}## is quite right.

Do you mind elaborating on what is "not quite right"?

stevendaryl said:
The Gibbs entropy is, I think, a generalization of ##S_B##

Why would it be considered a generalisation though? Wasn't ##S_G## derived with a canonical ensemble in mind, while ##S_B## was derived under the assumption that the system is at fixed ##E##?

As for the expression for ##S_{micro}##, I'll try to get back to you on that the moment I get access to the text I was referencing (Concepts in Thermal Physics, Blundell).

Thank you for your assistance!
 
WWCY said:
Do you mind elaborating on what is "not quite right"?

Just that I don't think it's correct to say that ##S_{G} = S_{tot} - S_{micro}##.

WWCY said:
Why would it be considered a generalisation though? Wasn't ##S_G## derived with a canonical ensemble in mind, while ##S_B## was derived under the assumption that the system is at fixed ##E##?

The definition ##S_B = k \log \Omega## is the special case of ##S_G = -k \sum_j P_j \log P_j## when ##P_j = \frac{1}{\Omega}##. That's the probability distribution in which every state with the same energy is considered equally likely.
 
I'm with stevendaryl in not understanding where the expression ##S_{micro}## in the original post comes from. To elaborate a little on the Gibbs formula (the following is mostly my own restatements of stevendaryl's points):

The microcanonical ensemble is just one particular choice of ensemble, and the definition of entropy there, ##S = k_B \log \Omega## (##\Omega## is number of microstates), is unique to that ensemble.

But one is free to consider different ensembles, like the canonical or grand canonical ensemble, and consider the associated entropy there. The usefulness of the Gibbs entropy formula is that it reduces to the correct expression for the entropy in every ensemble. It is a totally general definition of an "entropy" in a probability distribution, and is more fundamental than the expression in a particular ensemble.
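As a concrete check of that claim for the canonical ensemble, here is a minimal Python sketch (the three energy levels and the temperature are arbitrary made-up numbers): it compares the Gibbs-formula entropy of the Boltzmann distribution with the thermodynamic entropy ##S = -\partial F / \partial T## obtained from the free energy ##F = -k_B T \log Z##.

```python
import math

def canonical_stats(energies, T, k_B=1.0):
    """Return (Gibbs entropy, Helmholtz free energy) for discrete levels at temperature T."""
    beta = 1.0 / (k_B * T)
    weights = [math.exp(-beta * E) for E in energies]
    Z = sum(weights)                                   # partition function
    probs = [w / Z for w in weights]                   # Boltzmann distribution
    S_gibbs = -k_B * sum(p * math.log(p) for p in probs)
    F = -k_B * T * math.log(Z)                         # Helmholtz free energy
    return S_gibbs, F

levels = [0.0, 1.0, 2.5]   # hypothetical three-level system
T = 0.8

S_gibbs, _ = canonical_stats(levels, T)

# Thermodynamic entropy S = -dF/dT, via a central finite difference
h = 1e-5
F_plus = canonical_stats(levels, T + h)[1]
F_minus = canonical_stats(levels, T - h)[1]
S_thermo = -(F_plus - F_minus) / (2 * h)

print(S_gibbs, S_thermo)   # the two values agree
```

The agreement is exact up to finite-difference error, since ##S = -\partial F/\partial T## and the Gibbs formula are algebraically identical for canonical weights.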
 
Thanks for the responses!

I can see that the Gibbs formula is capable of describing more general situations, but one thing I can't understand is this

stevendaryl said:
The definition ##S_B = k \log \Omega## is the special case of ##S_G = -k \sum_j P_j \log P_j## when ##P_j = \frac{1}{\Omega}##. That's the probability distribution in which every state with the same energy is considered equally likely.

##P_j## in general describes the probability of the system occupying macrostate ##j##. If, say, we label these macrostates by energies ##E_j## that are not equal, we can't say that the equation ##S_G = -k \sum_j P_j \log P_j = k \log \Omega## describes the entropy associated with a given macrostate with fixed energy, can we? It seems to me that this only "looks" like the Boltzmann entropy, though it actually describes a system with equally probable macrostates rather than microstates.
 
WWCY said:
##P_j## in general describes the probability of the system occupying macrostate ##j##.

Well, in my comment, I was talking about the probability of the system being in a particular MICROSTATE. The assumption (or maybe it's true by definition) behind the formula ##S = k \log \Omega## is that you fix ##E, V, N## (the total energy, volume and number of particles). Then the assumption is that every microstate with that same ##E, V, N## is equally likely, so the probability of being in a particular state ##j## is just ##\frac{1}{\Omega}##, where ##\Omega## is the number of states with that given ##E, V, N##. In that case: ##\sum_j p_j \log\frac{1}{p_j} = \sum_j \frac{1}{\Omega} \log \Omega = \log \Omega##

In that analysis, the probability of being in a particular MACROSTATE is 1 (since the macrostate is defined by E,V,N).

I guess you could also talk about the entropy of a situation where there is a probability ##P_j## of being in a particular macrostate ##j##, as well. In that case, the entropy is ... okay, I think I understand where your ##S_{micro}## is coming from! Sorry.

Let ##P(j)## be the probability of being in macrostate ##j##. Let ##P(\mu | j)## be the conditional probability of being in microstate ##\mu##, given that the system is in macrostate ##j##. Then the probability of being in macrostate ##j## and microstate ##\mu## is given by:

##P(j) P(\mu | j)##

The associated entropy is given by:

##S = k \sum_j \sum_\mu P(j) P(\mu | j) \log\frac{1}{P(j) P(\mu | j)}##
##= k \sum_j \sum_\mu P(j) P(\mu | j) \log\frac{1}{P(j)} + k \sum_j \sum_\mu P(j) P(\mu | j) \log\frac{1}{P(\mu | j)}##

(where I've used the property of logarithms that ##\log\frac{1}{XY} = \log\frac{1}{X} + \log\frac{1}{Y}##)

For the first term on the right-hand side, the sum over ##\mu## can be done to give 1 (since ##\sum_\mu P(\mu | j) = 1##).
For the second term, we can write ##S_j = k \sum_\mu P(\mu | j) \log\frac{1}{P(\mu | j)}##. So the expression simplifies to:

##S = k \sum_j P(j) \log\frac{1}{P(j)} + \sum_j P(j) S_j##
##= S_{macro} + S_{micro}##

So I now understand your original post. Sorry for taking so long.

But note that this is ##S = S_{macro} + S_{micro}##, not ##S_G = S_{macro} + S_{micro}##. All three quantities are computed with the same formula ##S = k \sum_j P_j \log\frac{1}{P_j}##, just applied to different probability distributions.
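The decomposition above can be checked numerically. A minimal Python sketch (the macrostate probabilities and the conditional microstate distributions are made-up numbers, with ##k## set to 1):

```python
import math

def H(probs):
    """Entropy sum_i p_i log(1/p_i), i.e. the Gibbs formula with k = 1."""
    return sum(p * math.log(1.0 / p) for p in probs if p > 0)

# Hypothetical setup: macrostate probabilities P(j) and, within each
# macrostate, conditional microstate probabilities P(mu | j).
P_macro = [0.6, 0.4]
P_cond = [
    [0.5, 0.5],          # microstates within macrostate 0
    [0.2, 0.3, 0.5],     # microstates within macrostate 1
]

# Joint probabilities P(j, mu) = P(j) * P(mu | j)
joint = [Pj * Pmu for Pj, cond in zip(P_macro, P_cond) for Pmu in cond]

S_total = H(joint)
S_macro = H(P_macro)
S_micro = sum(Pj * H(cond) for Pj, cond in zip(P_macro, P_cond))

print(S_total, S_macro + S_micro)   # equal: S = S_macro + S_micro
```

The equality is exact for any choice of distributions; it is the chain rule for entropy written out for the joint probabilities ##P(j)P(\mu|j)##.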

WWCY said:
If, say, we label these macrostates by energies ##E_j## that are not equal, we can't say that the equation ##S_G = -k \sum_j P_j \log P_j = k \log \Omega## describes the entropy associated with a given macrostate with fixed energy, can we? It seems to me that this only "looks" like the Boltzmann entropy, though it actually describes a system with equally probable macrostates rather than microstates.

I think I was confused by the micro versus macro language. If you have a system at constant temperature, then I would say that in that case, the macrostate is determined by the temperature, rather than by the energy.

But in a certain sense, you have two levels of micro- versus macro- going on. There is one level where you specify T and let the energy be uncertain. There is another level where you specify E and let the microstate be uncertain.
 
