Boltzmann Entropy Formula – Derivation

In summary: the Boltzmann entropy is a measure of the probability of a macrostate, where W is the number of microstates for a given macrostate.
  • #1
Dario56
The Boltzmann entropy is defined by: $$S = k_B \ln W$$ where ##W## is the weight of the configuration which has the maximum number of microstates.

This equation is used everywhere in statistical thermodynamics, and I saw it in the derivation of the Gibbs entropy. However, I can't find a derivation of this equation or an explanation of where it comes from. Do you know a source?
 
  • Like
Likes Delta2
  • #2
Your definition is not exact. In statistical mechanics, one defines the Boltzmann entropy as ##S=k_B\ln W##, where ##W## is the number of microstates compatible with some macroscopic parameters ##(E,V,N)##.
 
  • Like
Likes dextercioby
  • #3
Lord Jestocost said:
Your definition is not exact. In statistical mechanics, one defines the Boltzmann entropy as ##S=k_B\ln W##, where ##W## is the number of microstates compatible with some macroscopic parameters ##(E,V,N)##.
Do you maybe know where a derivation of this equation can be found? I can't find it anywhere.
 
  • #5
Lord Jestocost said:
On http://www.energyandentropy.com/page/index.html, Harvey S. Leff briefly sketches the thinking behind the definition of Boltzmann's entropy.
So it is actually a definition; it cannot be proved from other principles/theorems?
 
  • Like
Likes Philip Koeck
  • #6
Delta2 said:
So it is actually a definition; it cannot be proved from other principles/theorems?
I would also like to find out more about that.
It seems clear that entropy is related to some probability or weight ##W##, but what exactly does ##W## stand for, and why?
 
  • Like
Likes Delta2
  • #7
I believe Boltzmann was thinking of probability when he chose the letter W (Wahrscheinlichkeit), but clearly it's not a number between 0 and 1.
It's more like something proportional to the probability of a macrostate.
As far as I know it's simple for indistinguishable particles: W is just the number of ways a macrostate can be realized. Entropy defined like that becomes an extensive quantity.

Robert Swendsen, mentioned in the article shared by Lord Jestocost, also wants an extensive definition of statistical entropy for distinguishable particles. He defines entropy as ##S = k_B \ln\big(W f(N)\big)## in order to make it extensive.
For example, ##f(N)## could be ##1/N!##. Other researchers do the same.
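As a rough sanity check of why ##f(N)=1/N!## restores extensivity (a back-of-the-envelope sketch keeping only the spatial part of the ideal-gas count, ##W_d \propto V^n## for ##n## distinguishable particles in volume ##V##): without the correction,
$$S_d = k_B \ln W_d \propto n \ln V,$$
so doubling the system (##n \to 2n##, ##V \to 2V##) gives ##2n\ln(2V) \neq 2\,n\ln V##. With the ##1/n!## correction and Stirling's approximation ##\ln n! \approx n\ln n - n##,
$$S = k_B \ln\frac{W_d}{n!} \propto n \ln\frac{V}{n} + n,$$
which exactly doubles when ##n## and ##V## are doubled.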

I believe the only justification is that statistical entropy has the same properties as thermodynamic entropy. I don't know of any derivation from something else.
 
  • Like
Likes Dario56
  • #8
I think the most intuitive definition of entropy is via information theory, based on Shannon's definition of the measure of information. At least for me it was a revelation when I attended a special lecture about statistical physics from an information-theoretical point of view. A good textbook is

A. Katz, Principles of Statistical Mechanics, W. H. Freeman and Company, San Francisco and London (1967).

The point is that thermodynamics can be derived from the underlying microscopic physics using statistical methods. It may sound a bit strange, but one should be aware that from a conceptual point of view quantum statistical physics is simpler than classical statistical physics. The latter can be easily derived from the former as an appropriate approximation for situations where this approximation is justified (i.e., occupation numbers for each microstate that are much smaller than 1 on average; e.g., for large temperatures and small densities in the equilibrium case).
 
  • Like
  • Informative
Likes dextercioby, Philip Koeck, hutchphd and 1 other person
  • #9
vanhees71 said:
... quantum statistical physics is simpler than classical statistical physics. The latter can be easily derived from the former as an appropriate approximation for situations where this approximation is justified (i.e., occupation numbers for each microstate that are much smaller than 1 on average; e.g., for large temperatures and small densities in the equilibrium case).
Do you mean "... occupation numbers for each quantum state (or single particle state) that are much smaller than 1 on average ..."?

I'm not sure I can see how low occupation turns quantum particles into classical ones.
For example photons in a cavity should remain indistinguishable and non-classical even at low occupancy, or not?

If we apply the low-occupancy limit to the expression for W, we don't get the same result for distinguishable and indistinguishable particles either. The results differ by a factor of N!.
(See https://www.researchgate.net/publication/330675047_An_introduction_to_the_statistical_physics_of_distinguishable_and_indistinguishable_particles_in_an_ideal_gas)
 
  • #10
Take an ideal gas. For bosons (upper sign) and fermions (lower sign) you have
$$f=\frac{1}{\exp[\vec{p}^2/(2mT)-\mu/T] \mp 1}.$$
If the exponential is much larger than ##1##, i.e., if ##\mu<0## with ##|\mu|/T \gg 1## (dilute limit), you can neglect the ##\mp 1## and get the classical Maxwell-Boltzmann distribution.
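A minimal numerical sketch of this limit (the parameter values are arbitrary illustrative choices, with units chosen so that ##k_B = 1##):

```python
import numpy as np

# Sketch of the dilute limit: when exp[(eps - mu)/T] >> 1, the
# Bose-Einstein and Fermi-Dirac occupation numbers both reduce to
# the Maxwell-Boltzmann form. All values below are arbitrary.
T = 1.0                           # temperature (k_B = 1 units)
mu = -10.0                        # chemical potential, |mu|/T >> 1
eps = np.linspace(0.0, 5.0, 6)    # single-particle energies p^2/(2m)

x = np.exp((eps - mu) / T)        # exponential factor, >> 1 here
f_be = 1.0 / (x - 1.0)            # Bose-Einstein
f_fd = 1.0 / (x + 1.0)            # Fermi-Dirac
f_mb = 1.0 / x                    # Maxwell-Boltzmann limit

# Relative deviations are of order f_mb itself, i.e. << 1:
print(np.max(np.abs(f_be - f_mb) / f_mb))
print(np.max(np.abs(f_fd - f_mb) / f_mb))
```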

The ##1/N!## is a feature, not a bug. We had this discussion already some time ago in this forum!
 
  • #11
vanhees71 said:
Take an ideal gas. For bosons (upper sign) and fermions (lower sign) you have
$$f=\frac{1}{\exp[\vec{p}^2/(2mT)-\mu/T] \mp 1}.$$
If the exponential is much larger than ##1##, i.e., if ##\mu<0## with ##|\mu|/T \gg 1## (dilute limit), you can neglect the ##\mp 1## and get the classical Maxwell-Boltzmann distribution.

The ##1/N!## is a feature, not a bug. We had this discussion already some time ago in this forum!
No problem with that.
The original question was how one can motivate ##S = k \ln W##, and I would say a major part of this question is what ##W## stands for.
If ##W## is the number of microstates for a given macrostate, then there is a factor of ##N!## between distinguishable and indistinguishable particles in the low-occupancy limit.
This means that ##S## becomes non-extensive for distinguishable particles.

I can see several possible solutions for this, for example:
  • There are no distinguishable particles.
  • ##W## is not the number of microstates.
  • The Boltzmann formula has to be modified.
  • Entropy doesn't need to be extensive.
 
  • #12
Delta2 said:
So it is actually a definition; it cannot be proved from other principles/theorems?
Well, yes and no. To derive a formula for entropy, one must first define entropy somehow. Often it is convenient to start from the Boltzmann formula as a definition, but it's not necessary. For example, one can start from the Shannon formula
$$S=-\sum_{i=1}^N p_i \ln p_i$$
as a definition and then obtain the Boltzmann formula as follows. A priori all the probabilities ##p_i## are the same, so from the constraint ##\sum_{i=1}^N p_i=1## it follows that ##p_i=1/N##. Inserting this into the Shannon formula, one gets
$$S=\ln N$$
which is almost the Boltzmann formula. To get the Boltzmann formula one has to change the units, i.e. redefine entropy as ##S=k\ln N##, and define ##W=N##.
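A minimal numerical check of this step (the value of ##N## is an arbitrary choice):

```python
import numpy as np

# Sketch: the Shannon entropy of a uniform distribution over N
# microstates equals ln N, reproducing the step in the post above.
N = 1000
p = np.full(N, 1.0 / N)               # equal a priori probabilities
S_shannon = -np.sum(p * np.log(p))
print(S_shannon, np.log(N))           # both ~ 6.9078
```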
 
  • Like
Likes vanhees71
  • #13
Demystifier said:
Well, yes and no. To derive a formula for entropy, one must first define entropy somehow. Often it is convenient to start from the Boltzmann formula as a definition, but it's not necessary. For example, one can start from the Shannon formula
$$S=-\sum_{i=1}^N p_i \ln p_i$$
as a definition and then obtain the Boltzmann formula as follows. A priori all the probabilities ##p_i## are the same, so from the constraint ##\sum_{i=1}^N p_i=1## it follows that ##p_i=1/N##. Inserting this into the Shannon formula, one gets
$$S=\ln N$$
which is almost the Boltzmann formula. To get the Boltzmann formula one has to change the units, i.e. redefine entropy as ##S=k\ln N##, and define ##W=N##.
What do ##N## and ##p_i## stand for in Shannon's formula?
 
  • #14
Philip Koeck said:
What do ##N## and ##p_i## stand for in Shannon's formula?
##N## is the number of different microscopic states, ##p_i## are the probabilities of those states.
 
  • Like
Likes Philip Koeck
  • #15
Demystifier said:
##N## is the number of different microscopic states, ##p_i## are the probabilities of those states.
So N was really W to start with?
 
  • #16
Philip Koeck said:
So N was really W to start with?
Yes.
 
  • Like
Likes Philip Koeck
  • #17
If you apply Boltzmann's formula to an ideal gas of distinguishable particles you get an entropy that's not extensive, I believe.

A quick explanation: For an ideal gas of indistinguishable particles you get an extensive entropy.
At low occupancy ##W_d = n!\,W_i##, where ##W_d## is the number of microstates for distinguishable and ##W_i## the same for indistinguishable particles; ##n## is the number of particles.
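A quick combinatorial sketch of that factor (using the standard counting formulas; the values of ##n## and ##g## are arbitrary illustrative choices):

```python
from math import comb, factorial

# Sketch of the low-occupancy claim W_d = n! * W_i:
# n distinguishable particles over g states: W_d = g**n.
# n bosons over g states: W_i = C(g + n - 1, n) ~ g**n / n! when g >> n.
n, g = 5, 10**6
W_d = g**n
W_i = comb(g + n - 1, n)
print(W_d / (factorial(n) * W_i))     # -> approximately 1.0 for g >> n
```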

This means that ##S## can't be extensive for an ideal gas of distinguishable particles if ##S = k \ln W_d##.

Do you have a solution to this problem? (Or is it a problem?)
 
  • #18
Philip Koeck said:
If you apply Boltzmann's formula to an ideal gas of distinguishable particles you get an entropy that's not extensive, I believe.
I don't.
Philip Koeck said:
A quick explanation: For an ideal gas of indistinguishable particles you get an extensive entropy.
At low occupancy ##W_d = n!\,W_i##, where ##W_d## is the number of microstates for distinguishable and ##W_i## the same for indistinguishable particles; ##n## is the number of particles.
So far so good.
Philip Koeck said:
This means that ##S## can't be extensive for an ideal gas of distinguishable particles if ##S = k \ln W_d##.
By your formula the entropy increases with ##n##, which implies that it is "extensive" in the sense that it is not intensive. Perhaps by "not extensive" you mean that it does not grow linearly with ##n##. However, by Stirling's formula, ##\ln n!## grows approximately linearly with ##n##. For large ##n## we have the Stirling approximation
$$\ln n! \approx n \ln n - n$$
so the derivative with respect to ##n## is ##\ln n##, which is a slowly growing function and can be approximated by a constant. For example, when ##n## doubles, ##\ln n## increases by ##\ln 2 = 0.69##, which is a totally negligible number compared to ##n## when ##n## is large.
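A quick numerical check of Stirling's approximation (illustrative values of ##n##):

```python
import math

# Sketch: Stirling's approximation ln n! ~ n ln n - n for large n,
# so the ln n! offset grows (almost) linearly in n, as argued above.
for n in (10, 100, 1000):
    exact = math.lgamma(n + 1)          # ln(n!)
    stirling = n * math.log(n) - n
    # the gap grows only like ln n, negligible compared to n
    print(n, exact, stirling, exact - stirling)
```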
 
  • Like
Likes Philip Koeck
  • #19
Demystifier said:
I don't.

So far so good.

By your formula the entropy increases with ##n##, which implies that it is "extensive" in the sense that it is not intensive. Perhaps by "not extensive" you mean that it does not grow linearly with ##n##. However, by Stirling's formula, ##\ln n!## grows approximately linearly with ##n##. For large ##n## we have the Stirling approximation
$$\ln n! \approx n \ln n - n$$
so the derivative with respect to ##n## is ##\ln n##, which is a slowly growing function and can be approximated by a constant. For example, when ##n## doubles, ##\ln n## increases by ##\ln 2 = 0.69##, which is a totally negligible number compared to ##n## when ##n## is large.
Thanks! That's a really helpful answer.
So the only thing that changes is that S gets an offset that is roughly proportional to n.
In total S remains roughly proportional to n (that's what I meant by extensive) even if you decide to apply Boltzmann's formula to distinguishable particles (aerosols maybe?).
 
  • Like
Likes vanhees71 and Demystifier

1. What is the Boltzmann Entropy Formula?

The Boltzmann Entropy Formula is a mathematical equation that relates the amount of disorder or randomness in a system to the number of possible microstates that the system can have. It is often used in thermodynamics and statistical mechanics to calculate the entropy of a system.

2. How is the Boltzmann Entropy Formula derived?

The Boltzmann Entropy Formula is derived from the fundamental principles of statistical mechanics, which describe the behavior of a large number of particles. It is based on the concept that the entropy of a system is proportional to the logarithm of the number of possible microstates that the system can have.
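For instance, the microstate counts of two independent subsystems multiply, and the logarithm turns that product into the sum expected of entropy:
$$S = k\ln(W_1 W_2) = k\ln W_1 + k\ln W_2 = S_1 + S_2.$$
This additivity is the main reason the logarithm appears in the formula.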

3. What are the variables in the Boltzmann Entropy Formula?

The variables in the Boltzmann Entropy Formula are the Boltzmann constant (k), which sets the scale relating energy to temperature, and the number of possible microstates (W, often also written Ω) that the system can have. The formula uses the natural logarithm (ln), and the Boltzmann constant gives the entropy units of joules per kelvin (J/K).
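As a minimal numeric illustration (the value of W is an arbitrary choice):

```python
import math

# Sketch: entropy of a system with W = 10**23 accessible microstates.
k_B = 1.380649e-23        # Boltzmann constant in J/K (exact SI value)
W = 10**23
S = k_B * math.log(W)     # k ln W ~ 7.3e-22 J/K
print(S)
```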

4. How is the Boltzmann Entropy Formula related to the Second Law of Thermodynamics?

The Boltzmann Entropy Formula is related to the Second Law of Thermodynamics, which states that the total entropy of an isolated system will always increase over time. This is because the formula shows that as the number of accessible microstates increases, so does the entropy of the system. Therefore, over time, the system becomes more disordered and its entropy increases.

5. What are the practical applications of the Boltzmann Entropy Formula?

The Boltzmann Entropy Formula has many practical applications in various fields, including thermodynamics, statistical mechanics, and information theory. It is used to calculate the entropy of a system, which can help in understanding the behavior of physical systems and predicting their future states. It is also used in the study of information and communication systems, where it relates to the amount of uncertainty or randomness in a given message or signal.
