- #1


This equation, ##S = k \ln W##, is used everywhere in statistical thermodynamics, and I saw it in the derivation of the Gibbs entropy. However, I can't find a derivation of this equation or where it comes from. Do you know a source?


- Thread starter: Dario56


- #2


- #3


Do you perhaps know where a derivation of this equation can be found? I can't find it anywhere.

- #4


- #5


So it is actually a definition; it cannot be proved from other principles/theorems?

- #6


I would also like to find out more about that.

It seems clear that entropy is proportional to some probability or weight W, but what exactly does W stand for and why?

- #7


It's more like something proportional to the probability of a macro-state.

As far as I know it's simple for indistinguishable particles: W is just the number of ways a macro state can be realized. Entropy defined like that becomes an extensive quantity.

Robert Swendsen, mentioned in the article shared by Lord Jestocost, also wants an extensive definition of statistical entropy for distinguishable particles. He defines entropy as ##S = k \ln(f(N)\,W)##, where ##f(N)## is a function of the particle number chosen so that ##S## comes out extensive.

For example, ##f(N)## could be ##1/N!##. Other researchers do the same.

I believe the only justification is that statistical entropy has the same properties as thermodynamic entropy. I don't know of any derivation from something else.
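The counting interpretation of ##W## above can be made concrete with a small sketch (not from the thread; the function names and example numbers are my own). For distinguishable particles, ##W## for a macrostate given by occupation numbers ##(n_1, n_2, \dots)## of the energy levels is the multinomial coefficient ##N!/(n_1!\,n_2!\cdots)##, and ##S = k \ln W## follows directly:

```python
from math import factorial, log

K_B = 1.380649e-23  # Boltzmann constant in J/K

def multiplicity(occupations):
    """Number of ways W to realize a macrostate of distinguishable
    particles with occupation numbers (n1, n2, ...) of the levels:
    W = N! / (n1! * n2! * ...)."""
    w = factorial(sum(occupations))
    for n in occupations:
        w //= factorial(n)  # sequential exact division stays an integer
    return w

def boltzmann_entropy(occupations):
    """S = k ln W for the given macrostate."""
    return K_B * log(multiplicity(occupations))

# 4 particles split 2/1/1 over three levels: W = 4!/(2! 1! 1!) = 12
print(multiplicity((2, 1, 1)))  # -> 12
```

Defined this way, ##W## depends only on the occupation numbers, which is exactly the "number of ways a macro-state can be realized" mentioned above.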

- #8


A. Katz, Principles of Statistical Mechanics, W. H. Freeman and Company, San Francisco and London (1967).

The point is that thermodynamics can be derived from the underlying microscopic physics using statistical methods. It may sound a bit strange, but one should be aware that, from a conceptual point of view, quantum statistical physics is simpler than classical statistical physics. The latter can easily be derived from the former as an appropriate approximation in situations where this approximation is justified (i.e., occupation numbers for each microstate that are much smaller than 1 on average; e.g., for large temperatures and small densities in the equilibrium case).

- #9


Do you mean "... occupation numbers for each quantum state (or single-particle state) that are much smaller than 1 on average ..."?

I'm not sure I can see how low occupation turns quantum particles into classical ones.

For example, photons in a cavity should remain indistinguishable and non-classical even at low occupancy, shouldn't they?

If we apply the low-occupancy limit to the expression for W, we don't get the same result for distinguishable and indistinguishable particles either. The results differ by a factor of N!.

(See https://www.researchgate.net/publication/330675047_An_introduction_to_the_statistical_physics_of_distinguishable_and_indistinguishable_particles_in_an_ideal_gas)

- #10


For bosons (upper sign) and fermions (lower sign), the mean occupation number of a momentum mode is

$$f=\frac{1}{\exp[\vec{p}^2/(2mT)-\mu/T] \mp 1}.$$

If the exponential is much larger than ##1##, i.e., if ##\mu<0## with ##|\mu|/T \gg 1## (dilute limit) you can neglect the ##\pm 1## and get the classical Maxwell-Boltzmann distribution.

The ##1/N!## is a feature, not a bug. We had this discussion already some time ago in this forum!
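The dilute limit can be checked numerically. A minimal sketch (function names and parameter values are my own, with energies and the chemical potential measured in units of ##T##):

```python
from math import exp

def f_be(eps, mu):
    """Bose-Einstein mean occupation number (eps, mu in units of T)."""
    return 1.0 / (exp(eps - mu) - 1.0)

def f_fd(eps, mu):
    """Fermi-Dirac mean occupation number (eps, mu in units of T)."""
    return 1.0 / (exp(eps - mu) + 1.0)

def f_mb(eps, mu):
    """Classical Maxwell-Boltzmann limit."""
    return exp(-(eps - mu))

# Dilute limit: mu < 0 with |mu|/T >> 1, so occupations are << 1
eps, mu = 1.0, -10.0  # hypothetical values
for f in (f_be, f_fd, f_mb):
    print(f.__name__, f(eps, mu))
```

With these values all three occupations agree to a relative accuracy of about ##e^{\mu/T - \epsilon/T}##, i.e. the ##\mp 1## is indeed negligible.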

- #11


No problem with that.


The original question was how one can motivate ##S = k \ln W##, and I would say a major part of this question is what ##W## stands for.

If W is the number of microstates for a given macrostate, then there is a factor of N! difference between distinguishable and indistinguishable particles in the low-occupancy limit.

This means that S becomes non-extensive for distinguishable particles.

I can see several possible solutions for this, for example:

- There are no distinguishable particles.
- W is not the number of microstates.
- The Boltzmann formula has to be modified.
- Entropy doesn't need to be extensive.


- #12


Well, yes and no. To derive a formula for entropy, one must first define entropy somehow. Often it is convenient to start from the Boltzmann formula as a definition, but it's not necessary. For example, one can start from the Shannon formula

$$S=-\sum_{i=1}^N p_i \ln p_i$$

as a definition and then obtain the Boltzmann formula as follows. A priori all the probabilities ##p_i## are the same, so from the constraint ##\sum_{i=1}^N p_i=1## it follows that ##p_i=1/N##. Inserting this into the Shannon formula, one gets

$$S=\ln N$$

which is almost the Boltzmann formula. To get the Boltzmann formula one has to change the units, i.e. redefine entropy as ##S=k\ln N##, and define ##W=N##.
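This reduction is easy to verify numerically. A minimal sketch (function names are my own), in units where ##k = 1##:

```python
from math import log

def shannon_entropy(probs):
    """S = -sum_i p_i ln p_i, in units where k = 1."""
    return -sum(p * log(p) for p in probs if p > 0)

# A priori all probabilities are equal: p_i = 1/N for N microstates
N = 1000
uniform = [1.0 / N] * N

# The Shannon entropy of the uniform distribution reduces to ln N,
# i.e. the Boltzmann formula with W = N (and k = 1)
print(shannon_entropy(uniform), log(N))
```

Both numbers agree (up to floating-point error), matching the step from ##S=-\sum_i p_i \ln p_i## with ##p_i = 1/N## to ##S = \ln N##.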

- #13


What do ##N## and ##p_i## stand for in Shannon's formula?

- #14


##N## is the number of different microscopic states, and the ##p_i## are the probabilities of those states.

- #15


So ##N## was really ##W## to start with?

- #16


Yes.

- #17


If you apply Boltzmann's formula to an ideal gas of distinguishable particles, you get an entropy that's not extensive, I believe.

A quick explanation: for an ideal gas of indistinguishable particles you get an extensive entropy.

At low occupancy ##W_d = n!\,W_i##, where ##W_d## is the number of microstates for distinguishable particles, ##W_i## is the same for indistinguishable particles, and ##n## is the number of particles.

This means that ##S## can't be extensive for an ideal gas of distinguishable particles if ##S = k \ln W_d##.

Do you have a solution to this problem? (Or is it a problem?)
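The ##n!## relation between the two counts can be verified by brute force for small, hypothetical numbers of particles and single-particle states (a sketch, not from the thread):

```python
from itertools import product
from math import factorial

def count_microstates(n_particles, n_states, distinguishable):
    """Brute-force count of microstates in the low-occupancy regime,
    i.e. with at most one particle per single-particle state."""
    configs = set()
    count = 0
    for assignment in product(range(n_states), repeat=n_particles):
        if len(set(assignment)) < n_particles:
            continue  # some state is multiply occupied -> skip
        count += 1
        # for identical particles only the occupied set of states matters
        configs.add(frozenset(assignment))
    return count if distinguishable else len(configs)

n, m = 3, 10  # hypothetical: 3 particles, 10 single-particle states
w_d = count_microstates(n, m, True)
w_i = count_microstates(n, m, False)
print(w_d, w_i, w_d == factorial(n) * w_i)  # -> 720 120 True
```

Here ##W_d = 10 \cdot 9 \cdot 8 = 720## and ##W_i = \binom{10}{3} = 120##, so ##W_d = 3!\,W_i## as claimed.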

- #18


I don't.

So far so good.

By your formula the entropy increases with ##n##, which implies that it is "extensive" in the sense that it is not intensive. Perhaps by "not extensive" you mean that it does not grow linearly with ##n##. However, by the Stirling formula, ##\ln n!## grows approximately linearly with ##n##. For large ##n## we have the Stirling approximation

$$\ln n! = n \ln n - n,$$

so its derivative with respect to ##n## is ##\ln n##, which is a slowly growing function and can be approximated by a constant. For example, when ##n## doubles, ##\ln n## increases by ##\ln 2 = 0.69##, which is totally negligible compared to ##n## when ##n## is large.
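A quick numerical check of the Stirling approximation and of the "doubling" argument (a sketch; function names are my own):

```python
from math import lgamma, log

def ln_factorial(n):
    """Exact ln(n!) via the log-gamma function: ln n! = lgamma(n + 1)."""
    return lgamma(n + 1)

def stirling(n):
    """Leading-order Stirling approximation: ln n! ~ n ln n - n."""
    return n * log(n) - n

# The relative error of the approximation shrinks as n grows
for n in (10**2, 10**4, 10**6):
    print(n, (ln_factorial(n) - stirling(n)) / ln_factorial(n))

# Doubling n changes the slope ln n only by ln 2 ~ 0.69
print(log(2 * 10**6) - log(10**6))  # -> 0.6931...
```

The neglected next term in Stirling's series is ##\tfrac{1}{2}\ln(2\pi n)##, which is why the relative error decays with ##n##.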

- #19


Thanks! That's a really helpful answer.

So the only thing that changes is that S gets an offset that is roughly proportional to n.

In total S remains roughly proportional to n (that's what I meant by extensive) even if you decide to apply Boltzmann's formula to distinguishable particles (aerosols maybe?).
