What Are the Differences Between Boltzmann and Gibbs Entropies?

SUMMARY

This discussion clarifies the distinctions between Boltzmann entropy (S_B) and Gibbs entropy (S_G), emphasizing that S_B is applicable in a microcanonical ensemble with fixed energy, while S_G generalizes this concept for systems in thermal contact with a heat bath. The total entropy (S_tot) is defined as S_tot = k_B log(N), where N represents the number of accessible microstates. The Gibbs entropy is measurable and is expressed as S_G = -k_B ∑ P_i log(P_i), where P_i is the probability of the system occupying a specific macrostate. The conversation also addresses misconceptions regarding the relationship between these entropy definitions and their applicability in different ensembles.

PREREQUISITES
  • Understanding of microcanonical and canonical ensembles in statistical mechanics.
  • Familiarity with the definitions of Boltzmann entropy (S_B) and Gibbs entropy (S_G).
  • Knowledge of probability distributions and their application in thermodynamic systems.
  • Basic grasp of thermodynamic concepts such as temperature and energy fluctuations.
NEXT STEPS
  • Study the derivation and implications of Boltzmann entropy (S_B) in microcanonical ensembles.
  • Explore Gibbs entropy (S_G) and its applications in canonical ensembles.
  • Investigate the relationship between entropy and temperature in various thermodynamic systems.
  • Examine the role of probability distributions in defining macrostates and microstates in statistical mechanics.
USEFUL FOR

Students and professionals in physics, particularly those specializing in statistical mechanics, thermodynamics, and entropy analysis. This discussion is beneficial for anyone seeking to deepen their understanding of entropy definitions and their applications in different thermodynamic contexts.

WWCY
Hi everyone, I have a few questions I'd like to ask regarding what I have read/heard about these two definitions of entropy. I also believe that I have some misconceptions about entropy and as such I'll write out what I know while asking the questions in the hope someone can correct me. Thanks in advance!

Here is what I think I know about entropy:

For a microcanonical ensemble at thermal equilibrium and with fixed energy E, its entropy is given by ##S_B = k_B \log \Omega##, with ##\Omega## the number of microstates associated with the fixed macrostate of energy E. However, systems are rarely isolated and are usually in contact with some sort of heat bath. If we use the idea of a canonical ensemble (a system in contact with a heat bath), we are able to calculate the entropy while taking into account the energy fluctuations. The total entropy of the system is then
$$S_{tot} = k_B \log N$$
where N is the total number of accessible microstates. However, this quantity (##S_{tot}##) is not measurable, as we are unable to measure the entropy associated with the freedom to occupy microstates within a macrostate. That latter contribution is defined by the following expression
$$S_{micro} = \sum_{i} S_i P_i$$
where ##S_i## is the entropy associated with macrostate ##i## and ##P_i## is the probability that the system is found in the aforementioned macrostate. The actual measurable quantity is the Gibbs Entropy, defined by the total entropy less ##S_{micro}##. This is given by the expression
$$S_{G} = S_{tot} - S_{micro} = - k_B \sum_i P_i \log P_i$$
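To make this concrete, here is a rough numerical sketch of how these quantities fit together, assuming (as in the textbook picture) that all N microstates are equally likely, so that ##P_i = \Omega_i / N## and ##S_i = k_B \log \Omega_i##, with ##\Omega_i## the number of microstates belonging to macrostate ##i## (the numbers below are arbitrary):

```python
import numpy as np

k_B = 1.0  # work in units where k_B = 1

# Arbitrary example: microstates grouped into macrostates i,
# with Omega_i microstates belonging to macrostate i.
Omega = np.array([2, 8, 40, 10])     # Omega_i, microstates per macrostate
N = Omega.sum()                      # total number of accessible microstates
P = Omega / N                        # P_i = Omega_i / N (all microstates equally likely)

S_tot = k_B * np.log(N)              # S_tot = k_B log N
S_i = k_B * np.log(Omega)            # S_i, entropy within each macrostate
S_micro = np.sum(P * S_i)            # S_micro = sum_i P_i S_i
S_G = -k_B * np.sum(P * np.log(P))   # Gibbs entropy

print(S_tot - S_micro, S_G)          # the two numbers agree
```

With those assumptions the relation ##S_G = S_{tot} - S_{micro}## comes out exactly.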

My questions are:

1. Did I get the description of the various definitions of entropy right?
2. Why are ##S_B##, ##S_{tot}## and ##S_{micro}## considered as unmeasurable quantities? And what does it mean to "measure" entropy?
3. Why, and how are we able to measure the quantity ##S_{G}##?
4. Does ##S_{G}## also tend to maximise itself (with respect to appropriate constraints) in thermal equilibrium?
5. Temperature is defined by ##1/T = \partial _E S_B## in the case of a microcanonical ensemble. Can I say something similar for the Gibbs entropy, i.e. ##1/T \propto \partial _E S_G##?

Many thanks!
 
Since nobody else has responded, I will give my two cents. I don't feel that the connection between ##S_G##, ##S_{tot}##, and ##S_{micro}## is quite right.

The Gibbs entropy is, I think, a generalization of ##S_B##. The latter is the special case where ##p_i = \frac{1}{\Omega}## (all microstates are equally likely).

You can't define a temperature for an arbitrary probability distribution using ##S_G##, because ##S_G## is not necessarily a function of ##E##, unless you explicitly say how the probabilities ##p_i## depend on energy.

I'm not sure where your expression for ##S_{micro}## is coming from.
 
stevendaryl said:
Since nobody else has responded, I will give my two cents. I don't feel that the connection between ##S_G##, ##S_{tot}##, and ##S_{micro}## is quite right.

Do you mind elaborating on what is "not quite right"?

stevendaryl said:
The Gibbs entropy is I think a generalization of ##S_B##

Why would it be considered a generalisation though? Wasn't the ##S_G## derived with a canonical ensemble in mind, while the ##S_B## was derived with the assumption that the system was at fixed ##E##?

As for the expression for ##S_{micro}##, I'll try to get back to you on that the moment I get access to the text I was referencing (Concepts in Thermal Physics, Blundell).

Thank you for your assistance!
 
WWCY said:
Do you mind elaborating on what is "not quite right"?

Just that I don't think it's correct to say that ##S_{G} = S_{tot} - S_{micro}##.

WWCY said:
Why would it be considered a generalisation though? Wasn't the ##S_G## derived with a canonical ensemble in mind, while the ##S_B## was derived with the assumption that the system was at fixed ##E##?

The definition ##S_B = k \log(\Omega)## is the special case of ##S_G = -k \sum_j P_j \log(P_j)## when ##P_j = \frac{1}{\Omega}##. That's the probability distribution in which every state with the same energy is considered equally likely.
 
I'm with stevendaryl in not understanding where the expression ##S_{micro}## in the original post comes from. To elaborate a little on the Gibbs formula (the following is mostly my own restatements of stevendaryl's points):

The microcanonical ensemble is just one particular choice of ensemble, and the definition of entropy there, ##S = k_B \log \Omega## (with ##\Omega## the number of microstates), is unique to that ensemble.

But one is free to consider different ensembles, like the canonical or grand canonical ensemble, and consider the associated entropy there. The usefulness of the Gibbs entropy formula is that it reduces to the correct expression for the entropy in every ensemble. It is a totally general definition of an "entropy" in a probability distribution, and is more fundamental than the expression in a particular ensemble.
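For instance (just a rough numerical check with made-up energy levels), feeding the canonical distribution ##P_i = e^{-E_i/k_B T}/Z## into the Gibbs formula reproduces the thermodynamic entropy ##S = (\langle E \rangle - F)/T## with ##F = -k_B T \log Z##:

```python
import numpy as np

k_B, T = 1.0, 0.7                      # units with k_B = 1; arbitrary temperature
E = np.array([0.0, 0.5, 1.3, 2.0])     # made-up energy levels

beta = 1.0 / (k_B * T)
Z = np.sum(np.exp(-beta * E))          # canonical partition function
P = np.exp(-beta * E) / Z              # canonical (Boltzmann) distribution

S_gibbs = -k_B * np.sum(P * np.log(P))   # Gibbs entropy of this distribution
U = np.sum(P * E)                        # mean energy <E>
F = -k_B * T * np.log(Z)                 # Helmholtz free energy
S_thermo = (U - F) / T                   # thermodynamic entropy

print(S_gibbs, S_thermo)                 # the two agree
```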
 
Thanks for the responses!

I can see that the Gibbs formula is capable of describing more general situations, but one thing I can't understand is this

stevendaryl said:
The definition ##S_B = k \log(\Omega)## is the special case of ##S_G = -k \sum_j P_j \log(P_j)## when ##P_j = \frac{1}{\Omega}##. That's the probability distribution in which every state with the same energy is considered equally likely.

##P_j## in general describes the probability of the system occupying macrostate ##j##. If, say, we label these macrostates by energies ##E_j## that are not equal, we can't say that the equation ##S_G = -k \sum_j P_j \log(P_j) = k \log(\Omega)## describes the entropy associated with a given macrostate with fixed energy, can we? It seems to me that this only "looks" like the Boltzmann entropy, though it actually describes a system with equally probable macrostates rather than microstates.
 
WWCY said:
##P_j## in general describes the probability of the system occupying macrostate J.

Well, in my comment, I was talking about the probability of the system being in a particular MICROSTATE. The assumption (or maybe it's true by definition) behind the formula ##S = k \ln \Omega## is that you fix E, V, N (the total energy, volume and number of particles). Then the assumption is that every microstate with that same E, V, N is equally likely. Then the probability of being in a particular state ##j## is just ##\frac{1}{\Omega}##, where ##\Omega## is the number of states with that given E, V, N. In that case: ##\sum_j p_j \ln(\frac{1}{p_j}) = \sum_j \frac{1}{\Omega} \ln(\Omega) = \ln(\Omega)##
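As a quick numerical restatement of that special case (the value of ##\Omega## below is arbitrary, nothing physical about it), the uniform distribution over ##\Omega## microstates gives back ##k \ln \Omega##:

```python
import numpy as np

k = 1.0                                # Boltzmann's constant, set to 1 here
Omega = 1000                           # number of microstates with the given E, V, N
P = np.full(Omega, 1.0 / Omega)        # every microstate equally likely

S_G = -k * np.sum(P * np.log(P))       # Gibbs entropy of the uniform distribution
S_B = k * np.log(Omega)                # Boltzmann entropy

print(S_G, S_B)                        # identical (up to floating-point error)
```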

In that analysis, the probability of being in a particular MACROSTATE is 1 (since the macrostate is defined by E,V,N).

I guess you could also talk about the entropy of a situation where there is a probability ##P_j## of being in a particular macrostate ##j##, as well. In that case, the entropy is ... okay, I think I understand where your ##S_{micro}## is coming from! Sorry.

Let ##P(j)## be the probability of being in macrostate ##j##. Let ##P(\mu | j)## be the conditional probability of being in microstate ##\mu##, given that the system is in macrostate ##j##. Then the probability of being in macrostate ##j## and microstate ##\mu## is given by:

##P(j) P(\mu | j)##

The associated entropy is given by:

##S = k \sum_j \sum_\mu P(j) P(\mu | j) \log\left(\frac{1}{P(j) P(\mu | j)}\right)##
##= k \sum_j \sum_\mu P(j) P(\mu | j) \log\left(\frac{1}{P(j)}\right) + k \sum_j \sum_\mu P(j) P(\mu | j) \log\left(\frac{1}{P(\mu | j)}\right)##

(where I've used the property of logarithms that ##\log(\frac{1}{XY}) = \log(\frac{1}{X}) + \log(\frac{1}{Y})##)

For the first term on the right side of the equals, the sum over ##\mu## can be done to give 1. (Since ##\sum_\mu P(\mu | j) = 1##).
For the second term on the right side, we can write ##S_j = k \sum_\mu P(\mu | j) \log(\frac{1}{P(\mu | j)})##. So the expression simplifies to:

##S = k \sum_j P(j) \log\left(\frac{1}{P(j)}\right) + \sum_j P(j) S_j##
## = S_{macro} + S_{micro}##

So I now understand your original post. Sorry for taking so long.

But this is not ##S_G = S_{macro} + S_{micro}##. All three are using the formula ##S = k \sum_j P_j \log(\frac{1}{P_j})##, but for different probabilities.
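Here is a small numerical check of that bookkeeping (the probabilities are made up, purely for illustration): the same formula is applied to the joint, macro and conditional distributions, and the joint entropy splits into ##S_{macro} + S_{micro}##:

```python
import numpy as np

k = 1.0

P_macro = np.array([0.3, 0.7])              # P(j): made-up macrostate probabilities
P_cond = [np.array([0.5, 0.5]),             # P(mu|j) for macrostate j = 0
          np.array([0.2, 0.3, 0.5])]        # P(mu|j) for macrostate j = 1

def gibbs(p):
    """Gibbs entropy -k * sum p log(p) of a probability distribution."""
    return -k * np.sum(p * np.log(p))

# joint distribution P(j, mu) = P(j) * P(mu | j), flattened into one array
P_joint = np.concatenate([Pj * Pc for Pj, Pc in zip(P_macro, P_cond)])

S_total = gibbs(P_joint)                    # entropy of the joint distribution
S_macro = gibbs(P_macro)                    # entropy over macrostates
S_micro = sum(Pj * gibbs(Pc)                # sum_j P(j) S_j
              for Pj, Pc in zip(P_macro, P_cond))

print(S_total, S_macro + S_micro)           # the two agree
```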

WWCY said:
If, say, we label these macrostates by energies ##E_j## that are not equal, we can't say that the equation ##S_G = -k \sum_j P_j \log(P_j) = k \log(\Omega)## describes the entropy associated with a given macrostate with fixed energy, can we? It seems to me that this only "looks" like the Boltzmann entropy, though it actually describes a system with equally probable macrostates rather than microstates.

I think I was confused by the micro versus macro language. If you have a system at constant temperature, then I would say that in that case, the macrostate is determined by the temperature, rather than by the energy.

But in a certain sense, you have two levels of micro- versus macro- going on. There is one level where you specify T and let the energy be uncertain. There is another level where you specify E and let the microstate be uncertain.
 
