Phase Cell Elementary Volume

In summary: the Gibbs paradox is a puzzle in classical statistical mechanics that concerns the statistical counting of states for particles that cannot be told apart. It is named after the physicist Josiah Willard Gibbs, who first posed the paradox in the 1870s.
  • #1
Sheldon Cooper
Hello,
I'm having this confusion: in the Boltzmann approach to statistical mechanics, the phase space is divided into small phase cells whose magnitude is of the order of ##h^f##, but Boltzmann also assumed that the smallest phase cell must contain a large number of atoms. Doesn't this seem contradictory, since ##h \sim 10^{-34}\,\mathrm{J\,s}## and for a simple monatomic gas the number of degrees of freedom is ##f = 3##? The phase-cell volume then seems to be of order ##10^{-102}##, but the atomic diameter is of the Angstrom order. Thanks in advance.
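(For concreteness, a quick numerical check of that order-of-magnitude estimate; a sketch of my own, using ##\hbar## for the ##10^{-102}## figure:)

```python
# Order-of-magnitude check of the phase-cell volume h^f for f = 3.
hbar = 1.054571817e-34   # reduced Planck constant, J*s
h = 6.62607015e-34       # Planck constant, J*s
print(f"hbar^3 = {hbar**3:.2e} (J*s)^3")  # ~1.17e-102, the figure quoted above
print(f"h^3    = {h**3:.2e} (J*s)^3")     # ~2.91e-100 if h is used instead
```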
 
  • #2
That's of course not Boltzmann! He had big trouble with this point: in statistical mechanics you need some physical measure of phase-space cells, but there is no such thing available in classical physics. So this riddle was solved only with the discovery of modern quantum theory.

Take the example of an ideal gas and consider particles in some finite volume. We take a cube of length ##L##, because the shape is not important in the "thermodynamic limit" we'll take later. Then to keep the calculation simple, consider periodic boundary conditions, i.e., we describe free particles in the space of square integrable wave functions which are periodic with period ##L## in all three directions.

As a complete set of single-particle states we can use the momentum eigenstates, which are given by the plane waves
$$u_{\vec{p}}(\vec{x})=\frac{1}{L^{3/2}} \exp(\mathrm{i} \vec{p} \cdot \vec{x}/\hbar).$$
Now to fulfill the periodic boundary conditions, the momenta are discrete, taking the values
$$\vec{p}(\vec{n})=\frac{2 \pi \hbar}{L} \vec{n}, \quad \vec{n} \in \mathbb{Z}^3.$$
Now we want to make the volume very large, taking the limit ##L \rightarrow \infty##. For very large ##L## the ##\vec{p}## become quasi continuous. So it is more convenient to count the number of states in a given momentum interval ##\mathrm{d}^3 \vec{p}##, and obviously that's
$$\mathrm{d} \rho=\mathrm{d}^3 \vec{p} \frac{L^3}{(2 \pi \hbar)^3}=\frac{V}{(2 \pi \hbar)^3} \mathrm{d}^3 \vec{p}.$$
That's where the phase-space measure ##\mathrm{d}^3 \vec{x}\, \mathrm{d}^3 \vec{p}/(2 \pi \hbar)^3## comes from in classical statistics, where you borrow this argument from quantum mechanics.
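A quick numerical check of this counting (a sketch of my own, not from the thread): the number of lattice points ##\vec{n} \in \mathbb{Z}^3## with ##|\vec{n}| \le n_{\max}## should approach the continuum estimate ##\frac{4}{3} \pi n_{\max}^3##, which in momentum language is just ##V \cdot \frac{4}{3}\pi p_{\max}^3/(2\pi\hbar)^3##.

```python
# Compare the exact count of discrete momentum states |n| <= n_max
# with the continuum (phase-space) estimate (4/3) pi n_max^3.
import itertools
from math import pi

def lattice_state_count(n_max):
    """Number of integer vectors n in Z^3 with |n| <= n_max."""
    r = range(-n_max, n_max + 1)
    return sum(1 for n in itertools.product(r, r, r)
               if n[0]**2 + n[1]**2 + n[2]**2 <= n_max**2)

for n_max in (5, 10, 20, 40):
    exact = lattice_state_count(n_max)
    cont = 4.0 / 3.0 * pi * n_max**3
    print(f"n_max={n_max:3d}  lattice={exact:7d}  "
          f"continuum={cont:9.0f}  ratio={exact/cont:.4f}")
```

The ratio tends to 1 as ##n_{\max}## grows, which is the quasi-continuous limit used above.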

Quantum mechanics also solves the riddle of the Gibbs paradox, because you also borrow the idea of indistinguishability of particles of the same sort from quantum theory, which introduces the crucial factor ##1/N!## (where ##N## is the number of particles considered) when using the counting method to get the number of microstates ##\Omega## for a given macro state in Boltzmann's equation ##S=k_B \ln \Omega##. For details, see my lecture notes on transport theory (although it's relativistic, there's not much difference to the non-relativistic case here):

http://fias.uni-frankfurt.de/~hees/publ/kolkata.pdf
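As a compressed illustration of how the ##1/N!## works for the ideal gas (standard textbook steps in my own notation, not a quotation from the notes): with the phase-space measure above, the canonical partition function is
$$Z_N=\frac{1}{N!}\left[\frac{V}{(2\pi\hbar)^3}\int \mathrm{d}^3 p \, \mathrm{e}^{-p^2/(2m k_B T)}\right]^N=\frac{1}{N!}\left(\frac{V}{\lambda^3}\right)^N, \qquad \lambda=\sqrt{\frac{2\pi\hbar^2}{m k_B T}},$$
and with Stirling's formula ##\ln N! \simeq N \ln N - N## the entropy ##S=\partial_T (k_B T \ln Z_N)## becomes
$$S=N k_B \left[\ln\frac{V}{N\lambda^3}+\frac{5}{2}\right],$$
which is extensive. Dropping the ##1/N!## instead gives ##S=N k_B[\ln(V/\lambda^3)+3/2]##, which is not.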
 
  • #3
Thanks for the explanation :)
It really helped a lot.
 
  • #4
Sheldon Cooper said:
Hello,
I'm having this confusion: in the Boltzmann approach to statistical mechanics, the phase space is divided into small phase cells whose magnitude is of the order of ##h^f##, but Boltzmann also assumed that the smallest phase cell must contain a large number of atoms. Doesn't this seem contradictory, since ##h \sim 10^{-34}\,\mathrm{J\,s}## and for a simple monatomic gas the number of degrees of freedom is ##f = 3##? The phase-cell volume then seems to be of order ##10^{-102}##, but the atomic diameter is of the Angstrom order. Thanks in advance.

Boltzmann worked mostly, I think, with ##\mu## space, not phase space. ##\mu## space is a six-dimensional space where each molecule is represented by a point whose coordinates are the position and momentum components of the molecule. When Boltzmann divided a continuous volume or energy range into cells, he meant each cell/interval to contain many molecules, so that the continuum description (density of molecules per unit volume/interval) works well. Boltzmann never used the Planck constant ##h##.

Phase space is a different thing: it is ##6N##-dimensional, and all the molecules (the whole system) are represented by just one point. This allows for a more general and powerful theory.

The division of the phase space into cells of volume ##h^{3N}## was introduced as a later modification, for reasons unclear to me. In most calculations of classical statistical physics the presence and value of ##h## have no impact on the expected averages of physical quantities.
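(To spell out that last claim, a remark of my own: in a canonical ensemble average the cell size cancels between numerator and denominator,
$$\langle A \rangle = \frac{\int \mathrm{d}^{3N}q \, \mathrm{d}^{3N}p \, A \, \mathrm{e}^{-\beta H}/h^{3N}}{\int \mathrm{d}^{3N}q \, \mathrm{d}^{3N}p \, \mathrm{e}^{-\beta H}/h^{3N}},$$
so the value of ##h## drops out of every such expectation value.)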
 
  • #5
vanhees71 said:
in statistical mechanics you need some physical measure of phase-space cells

Could you give an example?

Quantum mechanics also solves the riddle of the Gibbs paradox, because you also borrow the idea of indistinguishability of particles of the same sort from quantum theory, which introduces the crucial factor ##1/N!## (where ##N## is the number of particles considered) when using the counting method to get the number of microstates ##\Omega## for a given macro state in Boltzmann's equation ##S=k_B \ln \Omega##.
Which riddle is that? Please explain in your own words; do not link to Wikipedia.
 
  • #6
The phase space (either of a single particle, often called ##\mu## space as you've explained above, or of the entire system of ##N## particles, called ##\Gamma## space, which I think goes back to the famous article by Ehrenfest & Ehrenfest) is a continuous space of states, and thus the probability to find a particle or the system of particles in a specific configuration is described by a probability distribution, i.e., a function with dimension ##\text{action}^{-3}## or ##\text{action}^{-3N}##. In classical physics the only place where the action occurs is in Hamilton's principle of least action, but there is no specific natural measure of it. So to define the entropy, which is related to the logarithm of the probabilities (a la Shannon and Jaynes), you need to introduce an arbitrary phase-space measure. In quantum theory one introduces Planck's constant (in modern physics usually the modified Planck constant ##\hbar=h/(2\pi)##).

The Gibbs paradox is the following. If in the naive Boltzmann statistics you treat the particles as individually distinguishable, it makes a difference whether particle 1 is in phase-space cell 1 and particle 2 is in phase-space cell 2, or whether particle 1 is in cell 2 and particle 2 is in cell 1. Now consider a box which is divided by some diaphragm, so that particles of a gas in the left half of the box cannot cross to the right half. Suppose further that the gas in both halves is in thermal equilibrium at the same temperature and pressure, i.e., in the same macro state. If you now take out the diaphragm without adding any energy, momentum, etc. to the box, the macrostate of the gas doesn't change, i.e., its temperature and pressure stay as before. But if you calculate the entropy using the Boltzmann statistics under the assumption of distinguishable identical particles, the entropy gets bigger when the diaphragm is taken away, because now it is possible for the particles to change from one half of the box to the other, which they couldn't before. This is of course wrong, because entropy by definition is a state variable and must not depend on the history of the system which brought it into the equilibrium situation discussed.
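To put numbers on this (a sketch of my own, using the ideal-gas entropy with thermal wavelength ##\lambda##): with distinguishable counting one has ##S=Nk_B[\ln(V/\lambda^3)+3/2]##, so removing the diaphragm changes the entropy by
$$\Delta S = S(2N,2V) - 2S(N,V) = 2Nk_B \ln 2 > 0,$$
while with the ##1/N!## included one has ##S=Nk_B[\ln(V/(N\lambda^3))+5/2]##, which depends only on the density ##N/V##, so that ##\Delta S = 0##.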

Indeed, doing the semiclassical statistics and taking into account the indistinguishability of identical particles (atoms, molecules...), you get the correct entropy formula, which is an extensive state variable as it should be (the Sackur-Tetrode formula). For details, see the manuscript (about relativistic transport theory, but it's the same for non-relativistic statistical physics):

http://fias.uni-frankfurt.de/~hees/publ/kolkata.pdf
 
  • #7
vanhees71 said:
The phase space (either of a single particle, often called ##\mu## space as you've explained above, or of the entire system of ##N## particles, called ##\Gamma## space, which I think goes back to the famous article by Ehrenfest & Ehrenfest) is a continuous space of states, and thus the probability to find a particle or the system of particles in a specific configuration is described by a probability distribution, i.e., a function with dimension ##\text{action}^{-3}## or ##\text{action}^{-3N}##. In classical physics the only place where the action occurs is in Hamilton's principle of least action, but there is no specific natural measure of it.

I think phase space was introduced and used a lot by Gibbs before the Ehrenfests' article.

Probability is a dimensionless number, so if it is to be given by

$$
\int \rho \, dq_1 dp_1 \cdots dq_{3N} dp_{3N},
$$

the probability distribution ##\rho## has to have dimension ##(\mathrm{kg\,m^2\,s^{-1}})^{-3N}##.


So to define the entropy, which is related to the logarithm of the probabilities (a la Shannon and Jaynes), you need to introduce an arbitrary phase-space measure. In quantum theory one introduces Planck's constant (in modern physics usually the modified Planck constant ##\hbar=h/(2\pi)##).

Why do we need to introduce an arbitrary phase-space measure? What is wrong with the formula

$$
I[\rho] = -\int \rho \ln \rho \, dq_1 dp_1 \cdots dq_{3N} dp_{3N}
$$

? The logarithm is a dimensionless number regardless of the units in which the ##q##'s and ##p##'s are measured, and so ##I## is dimensionless. The only effect units have on ##I## is an arbitrary shift of its value, and an additive constant makes no difference in the use of ##I##.
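(A quick check of that claim, my own computation: under a change of units ##q \to q/c##, ##p \to p/c'##, the normalized distribution becomes ##\rho' = (cc')^{3N} \rho##, and

$$
I[\rho'] = I[\rho] - 3N \ln(cc'),
$$

so for fixed ##N## the value of ##I## indeed shifts only by an additive constant.)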
But if you calculate the entropy using the Boltzmann statistics under the assumption of distinguishable identical particles, the entropy gets bigger when the diaphragm is taken away, because now it is possible for the particles to change from one half of the box to the other, which they couldn't before. This is of course wrong, because entropy by definition is a state variable and must not depend on the history of the system which brought it into the equilibrium situation discussed.

You are comparing two different entropies. It is unfortunate that we call many different things by the same name.

The first entropy ##A## is defined as the logarithm of the volume of phase space compatible with the macrostate. This indeed gets bigger as the constraint is removed, and there is no reason why this should not happen in classical physics. The quantity ##A## is not additive: ##A(kU,kV,kN)## is not necessarily equal to ##kA(U,V,N)##.

The second entropy ##B## is the thermodynamic entropy defined by the integral of ##dQ/T##, which by convention (it does not follow from any physical law) is assumed to be additive. That is, by convention ##B(kU,kV,kN) = kB(U,V,N)##. This assumption is very useful in practice.

A connection between these two entropies can be made: an additive quantity ##A'## can be defined from ##A## by subtracting an ##N##-dependent term (equivalently, by dividing the phase-space volume by a factor ##f(N)##), which gives the same function of the macroscopic variables as ##B##:

$$
B = A' = A - \ln f(N).
$$

The presence of the factor ##f(N)## in this formula is necessitated by the convention imposed on ##B##. This is a macroscopic convention; it has no consequences for the distinguishability of microscopic particles in principle.
 
  • #8
Any formula where a dimensionful quantity appears under a logarithm, exponential, trigonometric function, etc. is a priori wrong, for obvious reasons!

The correctly defined statistical entropy is identical with the thermodynamic one, and it's additive too (for the case where Gibbs statistics is applicable, which is not the case for systems that are correlated via long-range interactions, but that's another story).
 
  • #9
vanhees71 said:
Any formula where a dimensionful quantity appears under a logarithm, exponential, trigonometric function, etc. is a priori wrong, for obvious reasons!

If there are obvious reasons, could you please give them? I do not see how taking the logarithm of a dimensionful quantity creates any problem.
True, this makes the value of the entropy dependent on the units chosen, but this dependence is just an additive constant. Introducing an arbitrary unit of action into the logarithm does not change that.

The correctly defined statistical entropy is identical with the thermodynamic one, and it's additive too

If you wish the statistical entropy to give the same value as the thermodynamic entropy, that is possible by introducing the ##N##-dependent factor. But the value of the thermodynamic entropy and its additivity are purely conventional. I see no implication for fundamental distinguishability of particles.
 
  • #10
How do you define the logarithm of a dimensionful quantity? It's easier to explain for the exponential function, which is defined by
$$\exp x=\sum_{k=0}^{\infty} \frac{x^k}{k!}.$$
If now ##x## is, say, of dimension length, what sense do you make of this infinite series? You'd have to add quantities which are dimensionless (##k=0##), of dimension length (##k=1##), ##\mathrm{length}^2## (##k=2##), and so on. It's simply not defined!

For the logarithm you can use the series
$$\ln(1+x)=\sum_{k=0}^{\infty} \frac{(-1)^k}{k+1} x^{k+1},$$
from which it's also clear that ##x## must be dimensionless.

The point is that particles are indistinguishable, and thus you have an additional ##1/N!## in the counting rules a la Boltzmann. See

http://fias.uni-frankfurt.de/~hees/publ/kolkata.pdf

p. 30.
 
  • #11
vanhees71 said:
How do you define the logarithm of a dimensionful quantity? It's easier to explain for the exponential function, which is defined by
$$\exp x=\sum_{k=0}^{\infty} \frac{x^k}{k!}.$$
If now ##x## is, say, of dimension length, what sense do you make of this infinite series?
The only sense I see in it is ##e^{x}## evaluated on the numerical value of ##x##, which depends on the unit of length chosen. It is an artificial quantity with no obvious utility.

For the logarithm you can use the series
$$\ln(1+x)=\sum_{k=0}^{\infty} \frac{(-1)^k}{k+1} x^{k+1},$$
from which it's also clear that ##x## must be dimensionless.

The above series gives the value of the logarithm only for ##1+x## in the interval ##(0,2]##; for other values it fails to converge.

This is not important for the definition of entropy, because we have other methods to calculate the logarithm for any positive argument. The chosen unit of the argument has an effect on the value of the function, but not on its dependence on ##x##. For any ##k##,

$$
\ln (kx) = \ln x + \ln k
$$

and since additive constants do not matter in thermodynamics, the definitions

$$
A = \ln \Gamma
$$
and
$$
I = \int -\rho\,\ln\rho\,d\Gamma
$$
are fine for any unit of ##q,p## chosen. The units only shift the values of these quantities by a constant, which is immaterial.
The point is that particles are indistinguishable, and thus you have an additional ##1/N!## in the counting rules a la Boltzmann. See

http://fias.uni-frankfurt.de/~hees/publ/kolkata.pdf

p. 30.

No, this is not the point of our discussion. I doubt your claim that quantum mechanics solves the riddle of the Gibbs paradox. I do not think there is any riddle; there is only a trivial mismatch between two formulae from different theories. Invoking fundamental indistinguishability of particles as a way to remove this mismatch is misguided: not because particles are distinguishable, but because the mismatch is just a convention with no consequences for the values of probabilities or averages of measurable quantities, and it needs no removal.
 
  • #12
Again, how do you define the logarithm of a dimensionful quantity?

Again, I can't help it if you don't want to read a simple derivation of the additivity of the Gibbs entropy using the concept of indistinguishability of particles. It's a well-known fact since Boltzmann that the additional factor ##1/N!## leads to the correct additive entropy. If you don't believe my manuscript, look it up on Wikipedia:

https://en.wikipedia.org/wiki/Gibbs_paradox

The disadvantage of this article, however, is that they also use dimensionful quantities as arguments in logarithms, which doesn't make any sense, as clearly demonstrated in my previous posting to this thread.
 
  • #13
vanhees71 said:
Again, how do you define the logarithm of a dimensionful quantity?

The general definition of the logarithm is well known; it is defined for any positive number. For example, ##\ln(13\,\mathrm{J\,s}) = 2.565##. True, the result depends on the unit chosen. True, the dependence is just an additive constant, which does not matter in thermodynamics.
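(A concrete illustration of that unit dependence, with my own numbers: the same action expressed in SI and CGS units gives logarithms that differ by the constant ##\ln 10^7##.)

```python
# The same physical action, 13 J*s, expressed in two unit systems.
from math import log

x_si = 13.0        # action in J*s
x_cgs = 13.0e7     # the same action in erg*s (1 J*s = 1e7 erg*s)

print(log(x_si))               # 2.5649...
print(log(x_cgs))              # 18.6830...
print(log(x_cgs) - log(x_si))  # 16.1181... = log(1e7), a constant shift
```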

Again, I can't help it if you don't want to read a simple derivation of the additivity of the Gibbs entropy using the concept of indistinguishability of particles. It's a well-known fact since Boltzmann that the additional factor ##1/N!## leads to the correct additive entropy. If you don't believe my manuscript, look it up on Wikipedia:

https://en.wikipedia.org/wiki/Gibbs_paradox

The disadvantage of this article, however, is that they also use dimensionful quantities as arguments in logarithms, which doesn't make any sense, as clearly demonstrated in my previous posting to this thread.

I know this derivation. Indeed, indistinguishability is used in textbooks as a crucial element when introducing the statistical-physics formula for entropy, ##\ln (W/N!)##. I know it leads to the conventional additive entropy, which is often thought to be "the correct one".

My point is that the whole argument from indistinguishability is a misconception, because there never was a "correct entropy" in the first place to be obtained via the statistical-physics method. Entropy can be defined as additive or non-additive in thermodynamics; it is just a convention.

Still, I read it again. It is the same misconception as I thought.

I claim this whole argument is misguided and unneeded. My reasons are clearly stated above. If you are willing to question what you think you know, address the arguments and I will respond. Linking to Wikipedia at this point just says that you are not interested in learning anything that conflicts with what you think you know.
 
  • #14
From the text linked above:

Following Boltzmann and Planck the entropy of the system for a given distribution of the ##N## particles in phase space, given by the numbers ##N_j## of particles in the phase-space cells ##\mathrm{d}^6 \xi_j##, is defined as

$$S = \sum_j \ln \mathrm{d}\Gamma_j = \sum_j [N_j \ln G_j - \ln(N_j!)] \simeq \sum_j [N_j \ln G_j - N_j(\ln N_j - 1)].~~~~~~(1.5.3)$$

[...]

Thus the entropy of a dilute gas is given in terms of the phase-space distribution by the semiclassical expression
$$
S(t) = -\frac{g}{(2\pi\hbar)^3} \int \mathrm{d}^3\vec{x} \, \mathrm{d}^3\vec{p} \, f(x,p)\{\ln [f(x,p)] - 1\}.~~~~~~(1.5.8)
$$

Not important for my point, but just so you know: what you defined is a variant of Boltzmann's ##H## function with the opposite sign. In general this is not equal to the thermodynamic entropy, which needs to be defined over the whole phase space, not the 6D ##\mu## space.

There are cases where your function decreases while the system is approaching equilibrium. See my post

https://www.physicsforums.com/threads/deriving-boltzmanns-distribution.781150/#post-4911643
 
  • #15
You simply cannot define the logarithm of a dimensionful quantity; what you've written is self-contradictory. The exponential of a number is a number, not the dimensionful argument of the logarithm. I also can't help you if you don't recognize the valid definitions of entropy in statistical mechanics. What I've given is the correct definition of the entropy, and it is precisely the thermodynamic entropy for the equilibrium case.
 

