Reading course in statistical physics

Discussion Overview

The discussion revolves around concepts from statistical physics, specifically focusing on the interpretation of quantum and classical states, the properties of indistinguishable particles, and the implications of equilibrium in statistical mechanics as presented in the textbook "Statistical Physics" by Reif. Participants are examining specific sections and equations from the text, raising questions about the definitions and probabilities associated with different states of a system.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants question whether classical descriptions require more information than quantum descriptions, noting a discrepancy in the text regarding the number of required coordinates.
  • There is a discussion about the nature of quantum states, particularly the distinction between specifying position coordinates and momentum in quantum mechanics versus classical mechanics.
  • Participants explore the implications of indistinguishable particles on the probabilities of different states, particularly in the context of bosons and fermions.
  • One participant proposes that for a system of two indistinguishable particles in two states, the probabilities of the possible states should be equal, while another suggests an alternative distribution based on their indistinguishability (a worked count of the standard cases is sketched after this list).
  • There is a debate about whether the statements regarding equilibrium and probabilities in the text are postulates or can be derived, with a participant seeking a more general proof of these concepts.
  • Some participants reference the maximum-entropy principle and its relation to equilibrium, discussing its derivation and implications in statistical mechanics.
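
For orientation, here is the standard count for two particles in two single-particle states ##a## and ##b## (an illustration of the contested point, not an endorsement of either position in the thread):

  • Distinguishable (Maxwell-Boltzmann): four microstates ##aa, ab, ba, bb##; equal weights give probabilities ##1/4, 1/2, 1/4## for the occupations (2,0), (1,1), (0,2).
  • Indistinguishable bosons (Bose-Einstein): ##ab## and ##ba## are the same microstate, leaving three microstates ##aa, ab, bb##, each with probability ##1/3##.
  • Indistinguishable fermions (Fermi-Dirac): double occupancy is excluded, so only ##ab## remains and the (1,1) occupation has probability 1.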

Areas of Agreement / Disagreement

Participants express differing views on the probabilities associated with indistinguishable particles and the interpretation of equilibrium in statistical mechanics. There is no consensus on the correct probability distribution for the states discussed, and the nature of the postulates in the textbook remains contested.

Contextual Notes

Participants note that the discussion is limited to the specific cases presented in the textbook and that the interpretations may vary based on the definitions and assumptions made regarding indistinguishable particles and equilibrium conditions.

  • #31
vanhees71 said:
From this
$$Y' \partial_x \sigma = X' \partial_y \sigma,$$
and this is a function of ##(x,y)##, which I can write in the form ##X' Y'/\tau##. This shows that for functions of two independent variables, for any inexact differential there is always an integrating factor, ##\tau##, such that
$$\mathrm{d} G/\tau=\mathrm{d} \sigma$$
is an exact differential.
Got it. The two expressions are equal, so I can replace both with a third expression.
The integrating factor ##\tau##, which in general depends on ##x## and ##y##, is what makes that replacement work. Something like that (the argument is sketched below).

It will be interesting to read in the next chapter how this is applied to dS = d-Q / T.
(Couldn't find a crossed-out d.)
Somehow one also has to realize that this equation is only valid for a d-Q in a reversible process.
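
Spelling out the two-variable argument in the notation of the quote (a minimal sketch, not taken from the book): write the inexact differential as
$$\mathrm{d} G = X'(x,y)\,\mathrm{d} x + Y'(x,y)\,\mathrm{d} y.$$
If ##\mathrm{d} G/\tau = \mathrm{d}\sigma## is to be exact, then ##\partial_x \sigma = X'/\tau## and ##\partial_y \sigma = Y'/\tau##, and eliminating ##\tau## gives exactly the quoted relation
$$Y' \partial_x \sigma = X' \partial_y \sigma = \frac{X' Y'}{\tau}.$$
Conversely, a suitable ##\sigma## always exists in two variables: the solution curves of ##\mathrm{d} G = 0## can be labelled as ##\sigma(x,y) = \text{const}##, and along such a curve ##\mathrm{d}\sigma = 0## whenever ##\mathrm{d} G = 0##, so ##(\partial_x \sigma, \partial_y \sigma) = (X', Y')/\tau## for some function ##\tau(x,y)##. With three or more independent variables such a factor need not exist. In the thermodynamic application ##1/T## presumably plays the role of ##1/\tau##, giving ##\mathrm{d}S = \delta Q_{\mathrm{rev}}/T## (with ##\delta## standing in for the crossed-out d), valid only for reversible processes.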
 
  • #32
I have a rather general question about the definition of entropy used in chapter 5:
##S = k \ln \Omega##, where ##\Omega## is the number of available microstates.
Boltzmann wrote ##W## rather than ##\Omega##, and I believe this stood for probability (Wahrscheinlichkeit).
Obviously ##\Omega## is not a number between 0 and 1, so it's more like something proportional to a probability.

The probability would be the number of available microstates divided by the total number of microstates (including those that are not available).
Now for distinguishable particles both of these numbers are larger than for indistinguishable particles, by a factor of ##N!## (where ##N## is the number of particles) in the case of low occupancy.

Would it therefore make sense to use the following definition of entropy for distinguishable particles, so that this "probability" ##W## is counted correctly?
##S = k \ln(\Omega/N!)## for distinguishable particles at low occupancy.
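
A small counting check of that factor (under the stated low-occupancy assumption, i.e. the number ##g## of available single-particle states greatly exceeds ##N##): distinguishable particles can be distributed in ##\Omega_{\text{dist}} = g^N## ways, while for indistinguishable particles permutations of the particles do not produce new microstates, so ##\Omega_{\text{indist}} \approx g^N/N!## as long as multiple occupancy is negligible. The two entropies then differ by
$$S_{\text{dist}} - S_{\text{indist}} \approx k \ln N! \approx k\,(N\ln N - N),$$
which is exactly the term that the proposed ##S = k\ln(\Omega/N!)## removes; dividing by ##N!## is the usual "corrected Boltzmann counting" that makes the entropy extensive and avoids the Gibbs paradox.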
 
  • #33
Philip Koeck said:
Would it therefore make sense to use the following definition of entropy for distinguishable particles, so that this "probability" ##W## is counted correctly?
##S = k \ln(\Omega/N!)## for distinguishable particles at low occupancy.
Maybe I should move this to a new post.
 
