
Phase Cell Elementary volume

  1. Oct 16, 2015 #1
    Hello,
I am confused about the following point in Boltzmann's approach to statistical mechanics: the phase space is divided into small phase cells whose magnitude is of the order of ##h^f##, but Boltzmann also assumed that the smallest phase cell must contain a large number of atoms. Doesn't this seem contradictory, since ##h \sim 10^{-34}## (in SI units) and for a simple monatomic gas the number of degrees of freedom is ##f = 3##? The phase-cell volume then seems to be of the order of ##10^{-102}##, while the atomic diameter is of the order of an Angstrom. Thanks in advance.
     
  3. Oct 17, 2015 #2

    vanhees71

    Science Advisor
    2016 Award

That's of course not Boltzmann! He had big trouble with exactly this point: in statistical mechanics you need some physical measure for the size of phase-space cells, but no such thing is available in classical physics. This riddle was solved only with the discovery of modern quantum theory.

Take the example of an ideal gas and consider particles in some finite volume. We take a cube of side length ##L##, because the shape is not important in the "thermodynamic limit" we'll take later. Then, to keep the calculation simple, consider periodic boundary conditions, i.e., we describe free particles in the space of square-integrable wave functions that are periodic with period ##L## in all three directions.

    As a complete set of single-particle states we can use the momentum eigenstates, which are given by the plane waves
    $$u_{\vec{p}}(\vec{x})=\frac{1}{L^{3/2}} \exp(\mathrm{i} \vec{p} \cdot \vec{x}/\hbar).$$
    Now to fulfill the periodic boundary conditions, the momenta are discrete, taking the values
    $$\vec{p}(\vec{n})=\frac{2 \pi \hbar}{L} \vec{n}, \quad \vec{n} \in \mathbb{Z}^3.$$
    Now we want to make the volume very large, taking the limit ##L \rightarrow \infty##. For very large ##L## the ##\vec{p}## become quasi continuous. So it is more convenient to count the number of states in a given momentum interval ##\mathrm{d}^3 \vec{p}##, and obviously that's
$$\mathrm{d} \rho=\mathrm{d}^3 \vec{p} \frac{L^3}{(2 \pi \hbar)^3}=\frac{V}{(2 \pi \hbar)^3} \mathrm{d}^3 \vec{p}.$$
That's where the phase-space measure ##\mathrm{d}^3 \vec{x}\, \mathrm{d}^3 \vec{p}/(2 \pi \hbar)^3## comes from in classical statistics, where you borrow this argument from quantum mechanics.
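As a quick numerical check of this counting (a sketch of my own, not from the post; the box size, cutoff, and ##\hbar=1## units are arbitrary choices):

```python
# Count the discrete momenta p(n) = (2*pi*hbar/L) n, n in Z^3, inside a sphere
# |p| < p_max, and compare with the continuum estimate V/(2*pi*hbar)^3 times
# the momentum-space volume (4/3) pi p_max^3.
import numpy as np

hbar = 1.0      # work in units with hbar = 1
L = 30.0        # box length; the agreement improves as L grows
p_max = 2.0     # momentum cutoff

dp = 2 * np.pi * hbar / L              # momentum spacing
n_max = int(np.ceil(p_max / dp))       # lattice cutoff per direction

n = np.arange(-n_max, n_max + 1)
nx, ny, nz = np.meshgrid(n, n, n, indexing="ij")
p2 = (dp * nx) ** 2 + (dp * ny) ** 2 + (dp * nz) ** 2

exact = int(np.count_nonzero(p2 < p_max ** 2))
estimate = L**3 / (2 * np.pi * hbar) ** 3 * (4.0 / 3.0) * np.pi * p_max**3

print(exact, estimate)  # both ~3.6e3 here; the ratio tends to 1 as L -> infinity
```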

Quantum mechanics also solves the riddle of Gibbs' paradox, because you also borrow the idea of the indistinguishability of particles of the same sort from quantum theory. This introduces the crucial factor ##1/N!## (where ##N## is the number of particles considered) when using the counting method to get the number of microstates ##\Omega## for a given macrostate in Boltzmann's equation ##S=k_B \ln \Omega##. For details, see my lecture notes on transport theory (although they are relativistic, there's not much difference from the non-relativistic case here):

    http://fias.uni-frankfurt.de/~hees/publ/kolkata.pdf
     
  4. Oct 17, 2015 #3
Thanks for the explanation :)
It really helped a lot.
     
  5. Oct 18, 2015 #4

    Jano L.

    Gold Member

Boltzmann worked mostly, I think, with ##\mu## space, not phase space. ##\mu## space is a six-dimensional space in which each molecule is represented by a point whose coordinates are the position and momentum components of that molecule. When Boltzmann divided a continuous volume or energy range into cells, he meant each cell/interval to contain many molecules, so that the continuum description (density of molecules per unit volume/interval) works well. Boltzmann never used the Planck constant ##h## anywhere.
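As a numerical illustration of this coarse-graining (a sketch of my own, not from this thread; the 1D gas, unit box, and Maxwellian momenta are assumptions made for simplicity):

```python
# Sketch: molecules as points in a 2D mu space (position x, momentum p) for a
# 1D ideal gas, binned into coarse cells. The occupancies show that every cell
# can hold many molecules, so a continuum density per cell is meaningful.
import numpy as np

rng = np.random.default_rng(0)
N = 10**6                       # number of molecules
x = rng.uniform(0.0, 1.0, N)    # positions in a box of unit length
p = rng.normal(0.0, 1.0, N)     # Maxwellian momenta (units with m k T = 1)

# 20 x 20 coarse cells covering the box and the momentum range (-3, 3)
H, _, _ = np.histogram2d(x, p, bins=20, range=[[0.0, 1.0], [-3.0, 3.0]])

print(H.mean(), H.min())        # ~2.5e3 and ~1e2: every cell holds many molecules
```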

Phase space is a different thing: it is ##6N##-dimensional, and all the molecules (the whole system) are represented by just one point. This allows for a more general and powerful theory.

The division of phase space into cells of volume ##h^{3N}## was introduced as a later modification, for reasons unclear to me. In most calculations of classical statistical physics, the presence and value of ##h## have no impact on the expected averages of physical quantities.
     
  6. Oct 18, 2015 #5

    Jano L.

    Gold Member

Could you give an example?

    Which riddle is that? Please explain in your own words, do not link to Wikipedia.
     
  7. Oct 18, 2015 #6

    vanhees71

    Science Advisor
    2016 Award

The phase space (either of a single particle, often called ##\mu## space as you've explained above, or of the entire system of ##N## particles, called ##\Gamma## space, which I think goes back to the famous article by Ehrenfest & Ehrenfest) is a continuous space of states. Thus the probability to find a particle, or the system of particles, in a specific configuration is described by a probability distribution, i.e., a function with dimension ##\text{action}^{-3}## or ##\text{action}^{-3N}##. In classical physics the only place where the action occurs is in Hamilton's principle of least action, but there is no specific natural measure for it. So to define the entropy, which is related to the logarithm of the probabilities (à la Shannon and Jaynes), you need to introduce an arbitrary phase-space measure. In quantum theory one introduces Planck's constant (in modern physics usually the modified Planck constant ##\hbar=h/(2 \pi)##).

The Gibbs paradox is the following. If in the naive Boltzmann statistics you treat the particles as individually distinguishable, it makes a difference whether particle 1 is in phase-space cell 1 and particle 2 is in phase-space cell 2, or particle 1 is in cell 2 and particle 2 is in cell 1. Now consider a box which is divided by a diaphragm, so that particles of a gas in the left half of the box cannot cross to the right half. Suppose further that the gas in both halves is in thermal equilibrium at the same temperature and pressure, i.e., in the same macrostate.

If you now take out the diaphragm without adding any energy, momentum, etc. to the box, the macrostate of the gas doesn't change, i.e., its temperature and pressure stay as before. But if you calculate the entropy using Boltzmann statistics under the assumption of distinguishable identical particles, the entropy gets bigger when the diaphragm is taken away, because it is now possible for the particles to move from one half of the box to the other, which they couldn't do before. This is of course wrong, because entropy is by definition a state variable and must not depend on the history by which the system was finally brought into the equilibrium situation discussed.

Indeed, doing the semiclassical statistics while taking into account the indistinguishability of identical particles (atoms, molecules, ...), you get the correct entropy formula, which is an extensive state variable as it should be (the Sackur-Tetrode formula). For details, see the manuscript (about relativistic transport theory, but the same holds in non-relativistic statistical physics):

    http://fias.uni-frankfurt.de/~hees/publ/kolkata.pdf
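To make the two bookkeepings above concrete, here is a small numerical sketch (my own, not from the manuscript; the constant ##c## standing in for the temperature-dependent terms and the toy units are assumptions):

```python
# Gibbs paradox for an ideal gas, using Stirling's approximation
# ln N! ~ N ln N - N. Distinguishable particles: S = N k (ln V + c).
# Indistinguishable particles: subtract k ln N!.
import math

k = 1.0            # Boltzmann constant in toy units
N, V = 1000.0, 1.0 # particles and volume in one half of the box
c = 10.0           # temperature-dependent constant; drops out of differences

def S_dist(N, V):
    return N * k * (math.log(V) + c)

def S_indist(N, V):
    return S_dist(N, V) - k * (N * math.log(N) - N)  # Stirling for ln N!

# removing the diaphragm: (N, V) + (N, V) -> (2N, 2V)
dS_dist = S_dist(2 * N, 2 * V) - 2 * S_dist(N, V)
dS_indist = S_indist(2 * N, 2 * V) - 2 * S_indist(N, V)

print(dS_dist / (N * k))    # 2 ln 2 != 0: spurious entropy of "mixing"
print(dS_indist / (N * k))  # 0 (up to rounding): extensive, no paradox
```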
     
  8. Oct 19, 2015 #7

    Jano L.

    Gold Member

I think phase space was introduced and used a lot by Gibbs before the Ehrenfests' article.

Probability is a dimensionless number, so if it is to be given by

$$
\int\,\rho\,dq_1dp_1...dq_{3N}dp_{3N}
$$
the probability distribution ##\rho## has to have dimension ##(\mathrm{kg\,m^2\,s^{-1}})^{-3N}##.


Why do we need to introduce an arbitrary phase-space measure? What is wrong with the formula

    $$
    I[\rho] = \int -\rho\ln \rho\, dq_1dp_1...dq_{3N}dp_{3N}
    $$
? The logarithm is a dimensionless number regardless of the units in which the ##q##'s and ##p##'s are measured, and ##I## is dimensionless. The only effect the units have on ##I## is an arbitrary shift of its value, and an additive constant makes no difference in the use of ##I##.
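A numerical sketch of this claim (my own; a 1D Gaussian stands in for ##\rho##, and the unit change is modeled by rescaling its width by ##k##):

```python
# Rescaling the unit of a coordinate changes I = -int rho ln rho dx only by
# the additive constant ln k (per dimension), as checked here by quadrature.
import numpy as np

def diff_entropy(sigma, n=200001):
    # -integral of rho ln rho for a 1D Gaussian of width sigma (Riemann sum)
    x = np.linspace(-10.0 * sigma, 10.0 * sigma, n)
    rho = np.exp(-x**2 / (2.0 * sigma**2)) / (sigma * np.sqrt(2.0 * np.pi))
    return float(np.sum(-rho * np.log(rho)) * (x[1] - x[0]))

k = 1000.0                    # unit conversion factor
I1 = diff_entropy(1.0)        # width in the original units
I2 = diff_entropy(k * 1.0)    # same distribution in the rescaled units

print(I2 - I1, np.log(k))     # both ~6.9078: the shift is just ln k
```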



You are comparing two different entropies. It is unfortunate that we call many different things by the same name.

The first entropy, ##A##, is defined as the logarithm of the volume of phase space compatible with the macrostate. This indeed gets bigger as the constraint is removed, and there is no reason why this should not happen in classical physics. The quantity ##A## is not additive; ##A(kU,kV,kN)## is not necessarily equal to ##kA(U,V,N)##.

The second entropy, ##B##, is the thermodynamic entropy defined by the integral of ##dQ/T##, which by convention (it does not follow from any physical law) is assumed to be additive. That is, by convention, ##B(kU,kV,kN) = kB(U,V,N)##. This assumption is very useful in practice.

A connection between these two entropies can be made. An additive quantity ##A'## can be defined from ##A## and an ##N##-dependent divisor of the phase-space volume, giving the same function of the macroscopic variables as ##B##:

$$
B = A' = A - \ln f(N).
$$

The presence of the factor ##f(N)## in this formula is necessitated by the convention imposed on ##B##. This is a macroscopic convention. It has no consequences for the distinguishability of microscopic particles in principle.
     
  9. Oct 19, 2015 #8

    vanhees71

    Science Advisor
    2016 Award

Any formula where a dimensionful quantity appears under a logarithm, exponential, trigonometric function, etc., is a priori wrong, for obvious reasons!

The correctly defined statistical entropy is identical with the thermodynamic one, and it's additive too (for the cases where Gibbs statistics is applicable, which excludes systems correlated via long-range interactions, but that's another story).
     
  10. Oct 19, 2015 #9

    Jano L.

    Gold Member

If there are obvious reasons, could you please give them? I do not see how taking the logarithm of a dimensionful quantity creates any problem.
True, this makes the value of the entropy dependent on the units chosen, but this dependence is just an additive constant. Introducing an arbitrary unit of action into the logarithm does not change that.

If you wish the statistical entropy to give the same value as the thermodynamic entropy, that's possible by introducing the ##N##-dependent factor. But the value of the thermodynamic entropy and its additivity are purely conventional. I see no implication for the fundamental distinguishability of particles.
     
  11. Oct 19, 2015 #10

    vanhees71

    Science Advisor
    2016 Award

How do you define the logarithm of a dimensionful quantity? It's easier to explain for the exponential function, which is defined by
$$\exp x=\sum_{k=0}^{\infty} \frac{x^k}{k!}.$$
If ##x## were, say, of dimension length, what sense would you make of this infinite series? You'd have to add quantities which are dimensionless (##k=0##), of dimension length (##k=1##), of dimension ##\mathrm{length}^2## (##k=2##), and so on. It's simply not defined!
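As a side illustration (my own, not from the post): units libraries enforce exactly this rule. Assuming the pint package is available:

```python
# With pint, exp() of a dimensionful quantity is rejected, while exp() of a
# dimensionless ratio is fine, mirroring the series argument above.
import numpy as np
import pint

ureg = pint.UnitRegistry()
x = 3.0 * ureg.meter

print(np.exp(x / (1.0 * ureg.meter)))   # dimensionless ratio: e^3 ~ 20.09
try:
    np.exp(x)                           # a length under an exponential
except pint.DimensionalityError as err:
    print("rejected:", err)
```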

For the logarithm you can use the series
$$\ln(1+x)=\sum_{k=0}^{\infty} \frac{(-1)^k}{k+1} x^{k+1},$$
from which it's also clear that ##x## must be dimensionless.

The point is that particles are indistinguishable, and thus you get an additional factor of ##1/N!## in the counting rules à la Boltzmann. See

    http://fias.uni-frankfurt.de/~hees/publ/kolkata.pdf

    p. 30.
     
  12. Oct 19, 2015 #11

    Jano L.

    Gold Member

The only sense I can make of it is ##e^{x}##, whose value depends on the unit of length chosen. It is an artificial quantity with no obvious utility.

The above formula gives the value of the logarithm only for ##1+x## in the interval ##(0,2]##; for other values the series diverges.

This is not important for the definition of entropy, because we have other methods to calculate the logarithm for any positive argument. The chosen unit of the argument has an effect on the value of the function, but not on its dependence on ##x##. For any ##k>0##,

    $$
    \ln (kx) = \ln x + \ln k
    $$

    and since additive constants do not matter in thermodynamics, the definitions

    $$
    A = \ln \Gamma
    $$
    and
    $$
    I = \int -\rho\,\ln\rho\,d\Gamma
    $$
are fine for any units of ##q## and ##p##. The units only shift the values of these quantities by a constant, which is immaterial.


No, this is not the point of our discussion. I am doubting your claim that quantum mechanics solves the riddle of Gibbs' paradox. I do not think there is any riddle; there is only a trivial mismatch between two formulae from different theories. Invoking fundamental indistinguishability of particles as a way to remove this mismatch is misguided, not because particles are distinguishable, but because the mismatch is just a convention with no consequences for the values of probabilities or averages of measurable quantities, and it needs no removal.
     
  13. Oct 20, 2015 #12

    vanhees71

    Science Advisor
    2016 Award

    Again, how do you define the logarithm of a dimensionful quantity?

Again, I can't help it if you don't want to read a simple derivation of the additivity of the Gibbs entropy using the concept of indistinguishability of particles. It has been well known since Boltzmann that the additional factor ##1/N!## leads to the correct additive entropy. If you don't believe my manuscript, look it up on Wikipedia:

    https://en.wikipedia.org/wiki/Gibbs_paradox

The disadvantage of this article, however, is that it also uses dimensionful quantities as arguments of logarithms, which doesn't make any sense, as demonstrated in my previous posting in this thread.
     
  14. Oct 20, 2015 #13

    Jano L.

    Gold Member

The general definition of the logarithm is well known; it is defined for any positive number. For example, ##\ln (13\,\mathrm{J\,s}) = 2.565##. True, the result depends on the unit chosen. True, the dependence is just an additive constant, which does not matter in thermodynamics.

I know this derivation. Indeed, indistinguishability is used in textbooks as a crucial element when introducing the statistical-physics formula for entropy, ##\ln (W/N!)##. I know it leads to the conventional additive entropy, which is often thought to be "the correct one".

My point is that the whole argument from indistinguishability is a misconception, because there never was a "correct entropy" in the first place to be obtained via the statistical-physics method. Entropy can be defined as additive or non-additive in thermodynamics. It is just a convention.

Still, I read it again. It is the same misconception I thought it was.

    I claim this whole argument is misguided and unneeded. My reasons are clearly stated above. If you are willing to question what you think you know, address the arguments and I will respond. Linking to Wikipedia at this point just says that you are not interested in learning anything that conflicts with what you think you know.
     
  15. Oct 20, 2015 #14

    Jano L.

    Gold Member

    From the text linked above:

Not important for my point, but just so you know: what you defined is a variant of Boltzmann's ##H## function with the opposite sign. In general this is not equal to the thermodynamic entropy, which needs to be defined over the whole phase space, not the six-dimensional ##\mu## space.

    There are cases where your function decreases while the system is approaching equilibrium. See my post

    https://www.physicsforums.com/threads/deriving-boltzmanns-distribution.781150/#post-4911643
     
    Last edited: Oct 20, 2015
  16. Oct 21, 2015 #15

    vanhees71

    Science Advisor
    2016 Award

You simply cannot define the logarithm of a dimensionful quantity. What you've written is self-contradictory: the exponential of a number is a number, not the dimensionful argument of the logarithm. I also can't help you when you don't recognize the valid definitions of entropy in statistical mechanics. What I've given is the correct definition of the entropy, and it is precisely the thermodynamical entropy for the equilibrium case.
     