
The background measure in Boltzmann measure

  1. Jun 13, 2013 #1
    Suppose a set [itex]X[/itex] describes the possible states of some system, and suppose a function [itex]x\mapsto E(x)[/itex] gives the energy of each state. At temperature [itex]T[/itex] the Boltzmann measure, which is the probability measure describing the state of the system, is obtained by the formula

    [tex]
    dp(x) = \frac{1}{Z(T)} e^{-\frac{E(x)}{k_{\textrm{B}}T}} d\mu(x)
    [/tex]

    where [itex]Z(T)[/itex] is defined by

    [tex]
    Z(T) = \int\limits_X e^{-\frac{E(x)}{k_{\textrm{B}}T}} d\mu(x)
    [/tex]

    and where [itex]\mu[/itex] IS SOME MYSTERIOUS BACKGROUND MEASURE, which seems to be avoided in all physics literature.

    For example, if we want to derive the Maxwell-Boltzmann distribution for particles in a gas, we write [itex]x=v[/itex] (since in this model the state of a particle is described by its velocity (or momentum)), set [itex]E(v)=\frac{1}{2}m\|v\|^2[/itex] and [itex]\mu=m_3[/itex], where [itex]m_3[/itex] is the ordinary three-dimensional Lebesgue measure.
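
    For concreteness, with [itex]\mu=m_3[/itex] the normalization is a Gaussian integral and gives the familiar form

    [tex]
    Z(T) = \int\limits_{\mathbb{R}^3} e^{-\frac{m\|v\|^2}{2k_{\textrm{B}}T}}\, dm_3(v) = \left(\frac{2\pi k_{\textrm{B}}T}{m}\right)^{3/2},
    \qquad
    dp(v) = \left(\frac{m}{2\pi k_{\textrm{B}}T}\right)^{3/2} e^{-\frac{m\|v\|^2}{2k_{\textrm{B}}T}}\, dm_3(v).
    [/tex]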

    Another example: in the Ising model we have [itex]X=\{-1,+1\}^L[/itex], where [itex]L[/itex] is a set whose elements describe lattice points. (Let's assume that [itex]L[/itex] is finite.) Then we define [itex]\mu[/itex] as the counting measure, so that [itex]\mu(\{x\})=1[/itex] for each state and, for arbitrary [itex]A\subset X[/itex], [itex]\mu(A)[/itex] is the number of elements (states) in [itex]A[/itex]. The energy function [itex]E[/itex] is defined using information about which lattice points are neighbours.
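
    As a worked instance of the same structure (the nearest-neighbour coupling [itex]J[/itex] below is the standard illustrative choice), the partition function is then a plain sum over the states, i.e. an integral against the counting measure:

    [tex]
    E(x) = -J\sum_{\langle i,j\rangle} x_i x_j,
    \qquad
    Z(T) = \sum_{x\in\{-1,+1\}^L} e^{-\frac{E(x)}{k_{\textrm{B}}T}} = \int\limits_X e^{-\frac{E(x)}{k_{\textrm{B}}T}}\, d\mu(x).
    [/tex]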

    So the Boltzmann measure consists of two parts. One part is the function [itex]e^{-E/(k_{\textrm{B}}T)}[/itex], and the other is some background measure. The Boltzmann measure is obtained when the background measure is weighted with this function, which depends on the energy and the temperature.

    Every time I try to read about statistical physics, I only find discussion about the function [itex]e^{-E/(k_{\textrm{B}}T)}[/itex], but not about the background measure.

    Suppose I define a measure [itex]\mu[/itex] by the formula [itex]d\mu(x)=(1 + e^{-\|x\|} + \sin(\|x\|))dm_3(x)[/itex] (the constant keeps the density non-negative), and then claim that my Maxwell-Boltzmann measure is

    [tex]
    dp(x)\sim e^{-\frac{m\|x\|^2}{2k_{\textrm{B}}T}} d\mu(x)
    [/tex]

    Why is this wrong? I calculated "the function" correctly, and then weighted "some measure" with "the correct function".

    How do we determine the correct background measure, which is then weighted with the function?

    I have found this topic very difficult and frustrating. Every time I have attempted to ask about the background measure, people change the topic to the function. Even professors. I explain carefully that I have understood where [itex]e^{-E/(k_{\textrm{B}}T)}[/itex] comes from, but that I have not understood where [itex]\mu[/itex] comes from. Then people stare at me as if I were dumb and respond "the derivation of [itex]e^{-E/(k_{\textrm{B}}T)}[/itex] was explained right there!". Apparently physicists don't like questions to which they don't know the answer?
     
  3. Jun 13, 2013 #2

    atyy


    Last edited: Jun 13, 2013
  4. Jun 15, 2013 #3
    Many books derive the Maxwell-Boltzmann distribution by discretizing the velocity or momentum space. The discretization is done so that the points are uniformly distributed, like [itex](\Delta p) \mathbb{Z}^3\subset\mathbb{R}^3[/itex] with some small [itex]\Delta p\in\mathbb{R}[/itex]. This uniform discretization is essentially equivalent to the Lebesgue-measure choice.

    But this is not the only way to discretize three-dimensional space. It is also possible to discretize it so that the points are denser in some regions, and then we would get a different Boltzmann measure in the end. So my question is equivalent to the question: how do you know the correct discretization?
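
    To see the ambiguity concretely, here is a minimal numerical sketch (in Python; the one-dimensional energy [itex]E(v)=\frac{1}{2}v^2[/itex] with [itex]m=k_{\textrm{B}}T=1[/itex] and the cubic map [itex]v=u^3[/itex] are just illustrative choices): two discretizations of the same velocity axis, each with equal weight per point, give different partition functions, because equal weight per point implicitly selects a background measure.

    [code]
    import numpy as np

    # 1D free particle in units where m = k_B*T = 1, so E(v) = v**2/2
    E = lambda v: 0.5 * v**2

    du = 1e-3
    u = np.arange(-10.0, 10.0, du)

    # Discretization 1: uniform grid in v. Equal weight per point
    # (times the spacing) is a Riemann sum against Lebesgue measure dv.
    Z1 = np.sum(np.exp(-E(u))) * du   # -> sqrt(2*pi) ~ 2.5066

    # Discretization 2: v = u**3 makes the grid denser near v = 0.
    # Equal weight per point now approximates the measure
    # dmu(v) = dv/(3|v|^(2/3)), which is NOT Lebesgue measure in v.
    v = u**3
    Z2 = np.sum(np.exp(-E(v))) * du   # integral of exp(-u**6/2) du ~ 2.08

    print(Z1, Z2)  # two different "partition functions" for the same E
    [/code]

    Asking which discretization is "correct" is exactly asking which background measure is correct.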

    Now it is a simple thing to declare firmly "it is the Lebesgue measure, believe it!" when we know the answer in advance. But what if we are dealing with a problem whose answer we don't know in advance? For example, I once asked (I won't bother digging up the old thread now) what happens if you have a large collection of harmonic oscillators interacting somehow. Nobody was able to give an answer, and I believe it is because nobody knows the background measure in this case. We also don't know the final answer, so we can't solve the background measure backwards and pretend that it had been obvious from the start.
     
  5. Jun 16, 2013 #4

    vanhees71


    The most straightforward way to understand classical statistics is to rely on quantum theory :-).

    Start with free particles and separate out a large cubic box (quantization volume) assuming periodic boundary conditions. Then you count the number of states in a given phase-space volume [itex]L^3 \mathrm{d}^3 \vec{p}[/itex]. It's given by [tex]L^3 \frac{\mathrm{d}^3 \vec{p}}{(2 \pi \hbar)^3}.[/tex]
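    To make the counting explicit: periodic boundary conditions restrict the allowed momenta to the lattice
    [tex]\vec{p} = \frac{2\pi\hbar}{L}\,\vec{n}, \qquad \vec{n}\in\mathbb{Z}^3,[/tex]
    so each single-particle state occupies a cell of volume [itex](2\pi\hbar/L)^3[/itex] in momentum space, which gives the count above.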
    Thus the correct measure in classical phase space is in fact uniform in phase-space:
    [tex]\frac{\mathrm{d}^3 \vec{x} \mathrm{d}^3 \vec{p}}{(2 \pi \hbar)^3}.[/tex]
    This implies that the entropy (as a functional of the one-particle phase-space distribution), in the classical limit (i.e., neglecting the restrictions from the Bose or Fermi statistics of identical particles), reads
    [tex]S[f]=-k_{\text{B}} \int \frac{\mathrm{d}^3 \vec{x} \mathrm{d}^3 \vec{p}}{(2 \pi \hbar)^3} f(\vec{x},\vec{p}) \left [\ln[(2 \pi \hbar)^3 f(\vec{x},\vec{p})]-1 \right ].[/tex]
    For details, see Landau & Lifshitz, vol. V.
     
  6. Jun 17, 2013 #5

    atyy


    It's the same answer for interacting oscillators, as long as one knows what the canonically conjugate coordinates are. See vanhees71's post for the derivation from quantum mechanics. To some extent one can insist on a purely classical view (not entirely, because of the Gibbs paradox); in that case one notes that the entropy is not invariant under arbitrary changes of coordinates, so one should specify the coordinates. When one restricts to canonical coordinates, the determinant of the Jacobian is 1, which ensures the formula is covariant under canonical changes of coordinates. So the basic idea is that one should be able to do the "same physics" in any choice of canonical coordinates.
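
    A short worked version of that statement: under a general change of variables [itex]y=\phi(x)[/itex] a density picks up a Jacobian factor,

    [tex]
    \rho(x)\,\mathrm{d}x = \rho(\phi^{-1}(y))\,\left|\det D\phi^{-1}(y)\right|\mathrm{d}y,
    [/tex]

    so "uniform" is coordinate-dependent. For a canonical transformation [itex](q,p)\mapsto(Q,P)[/itex], however, [itex]\left|\det\frac{\partial(Q,P)}{\partial(q,p)}\right| = 1[/itex], hence [itex]\mathrm{d}q\,\mathrm{d}p = \mathrm{d}Q\,\mathrm{d}P[/itex]: the flat phase-space measure is the same in every canonical chart.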
     
  7. Jun 18, 2013 #6
    I have not checked this myself, and don't have a reference, but I believe that if you write a computer simulation where classical particles fly around in a box, mostly freely and occasionally interacting, the particles will eventually obey the Maxwell-Boltzmann distribution.

    The mechanisms behind physical reality are a mystery, but if a computer simulation has been written to be classical, then we know that the result cannot be coming from quantum mechanics. This is why I would like to study the classical theory as such.
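
    For what it's worth, here is a minimal sketch of such a simulation in Python. To keep it short it replaces free flight in a box by random pairwise elastic collisions of equal-mass particles (that simplification is mine; it is a Kac-style collision model rather than molecular dynamics). The collision rule alone conserves energy and momentum and drives the velocities towards the Maxwell-Boltzmann distribution.

    [code]
    import numpy as np

    rng = np.random.default_rng(0)
    N = 5000

    # Deliberately non-Maxwellian start: every particle has speed 1
    v = rng.normal(size=(N, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)

    for _ in range(20 * N):
        i, j = rng.choice(N, size=2, replace=False)
        n = rng.normal(size=3)
        n /= np.linalg.norm(n)  # random collision axis
        # Elastic collision of equal masses: exchange the velocity
        # components along n; energy and momentum are conserved.
        dv = np.dot(v[i] - v[j], n) * n
        v[i] = v[i] - dv
        v[j] = v[j] + dv

    # Each velocity component should now be nearly Gaussian with
    # variance ~1/3 (fixed by the initial energy), i.e. the speeds
    # close to a Maxwell-Boltzmann distribution.
    print(v.var(axis=0))
    [/code]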

    When I originally explained my oscillator question years ago, I meant that the oscillation velocities of the individual oscillators are somehow much greater than the rate at which the oscillators interact with each other.

    So it doesn't make sense to describe the states of the individual oscillators with position and momentum parameters, because those keep changing constantly. Instead, the energy of an individual oscillator remains constant while the oscillator stays on its orbit, until it interacts with the other oscillators. So IMO the most natural "index" to describe the state of an oscillator is the energy itself. The index set is then [itex][0,\infty[[/itex], and its elements can be denoted by [itex]E[/itex].

    The Boltzmann distribution for the state of an individual oscillator will be

    [tex]
    dp(E) \sim e^{-\frac{E}{k_{\textrm{B}}T}} d\mu(E)
    [/tex]

    with some [itex]\mu[/itex]. The uniform (Lebesgue) measure [itex]\mu([a,b])=b-a[/itex] for all [itex]0\leq a < b[/itex] is the most obvious choice. Is that correct in the end?

    The orbits with higher energies are longer. Does that mean that we should give more weight to higher energies?
     
  8. Jun 18, 2013 #7

    vanhees71


    No, it's not correct. As I pointed out earlier, the uniform a priori probabilities are for the phase-space variables as inferred from quantum theory. For a free gas the energy-momentum relation for each non-relativistic particle is
    [tex]E=\frac{\vec{p}^2}{2m}=\frac{P^2}{2m}.[/tex]
    The correct measure is
    [tex]\mathrm{d}^3 \vec{p}=4 \pi P^2 \mathrm{d} P.[/tex]
    Now
    [tex]\mathrm{d} E=\mathrm{d} P \frac{P}{m},[/tex]
    i.e.,
    [tex]\mathrm{d}^3 \vec{p}=4 \pi \mathrm{d} E m P=4 \pi \mathrm{d} E m \sqrt{2mE}.[/tex]
     
  9. Jun 18, 2013 #8

    atyy


    Would you be able to define the system's Hamiltonian, and its canonically conjugate variables?

    Incidentally, not all deterministic systems (classical or quantum) that are described by a Hamiltonian will "thermalize" and reach the Maxwell-Boltzmann distribution or whatever statistical mechanics prescribes. In particular, "integrable" systems with enough constants of motion will never thermalize. I don't know how far from integrability one has to go to get a deterministic system ending up in a random looking "thermal state".

    In the classical case, one conjecture is that the system has to be chaotic. There's a discussion of this in http://arxiv.org/abs/0807.1268 .

    In the quantum case, some interesting discussions are http://arxiv.org/abs/1007.3957 and http://arxiv.org/abs/1108.0928 .

    It's not enough to include an interaction. A famous case in which a nonlinearity, added to represent interactions between different modes, was insufficient for making the system end up in a "random" state is the Fermi-Pasta-Ulam-Tsingou simulation: http://en.wikipedia.org/wiki/Fermi–Pasta–Ulam_problem .
     
    Last edited: Jun 18, 2013
  10. Jun 18, 2013 #9
    I forgot to specify the dimension of the oscillators. Your formulas are for three dimensions? Was my guess correct for the one-dimensional case?

    The energies [itex]\hbar\omega(n + \frac{1}{2})[/itex] of the quantum states would seem to hint towards uniformity.
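
    That hint can be checked against the flat phase-space measure of post #4. For a one-dimensional oscillator [itex]H = \frac{p^2}{2m} + \frac{1}{2}m\omega^2 q^2[/itex], the orbit of energy [itex]E[/itex] is an ellipse enclosing the phase-space area

    [tex]
    A(E) = \oint p\,\mathrm{d}q = \frac{2\pi E}{\omega},
    \qquad
    \mathrm{d}A = \frac{2\pi}{\omega}\,\mathrm{d}E,
    [/tex]

    so in one dimension the uniform phase-space measure does induce a uniform measure [itex]d\mu(E)\propto dE[/itex] on the energy axis, consistent with the equal spacing [itex]\hbar\omega[/itex] of the quantum levels.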
     
  11. Jun 18, 2013 #10
    In the beginning I could define the whole system as a large collection of independent harmonic oscillators. But then I get rid of the position and momentum variables, and simplify the description so that only the index [itex]E[/itex] remains for each oscillator. This should be OK since, for example, in the Ising model too we don't have positions or momenta, only indices for the different states.
     
  12. Jun 18, 2013 #11

    atyy


    Spin is a quantum mechanical variable, so in these cases the most common practice is to consider the quantum case and then take the classical limit, as vanhees71 recommends. In the Ising case there is no measure problem, since everything is discrete. But the most common measure for the classical xy model, which is continuous, comes from considering an appropriate limit of the quantum case.

    Alternatively, if one really wishes to define a classical "statistical mechanics" problem without a Hamiltonian, then one must of course specify the full partition function, including the measure, in order to have a well-defined problem in the first place, as you indicated. If the problem is not in Hamiltonian form with canonically conjugate variables, then there are also no dynamics without further specification.

    Going quite far from physics, there is the concept of a Gibbs measure, which can be defined under certain sorts of stochastic dynamics, or as the maximum entropy distribution given certain constraints (http://arxiv.org/abs/1302.5007). For example the usual Ising model distribution can be defined as the maximum entropy distribution given certain expectation values (p11 of http://arxiv.org/abs/1302.5007).
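
    A sketch of that maximum-entropy construction, with the background measure made explicit (finite state space, mean energy as the only constraint): maximizing the relative entropy

    [tex]
    S[p] = -\sum_{x\in X} p(x) \ln\frac{p(x)}{\mu(x)}
    \quad\textrm{subject to}\quad
    \sum_{x\in X} p(x) = 1, \qquad \sum_{x\in X} p(x) E(x) = U,
    [/tex]

    with Lagrange multipliers gives [itex]p(x)\propto \mu(x)\, e^{-\beta E(x)}[/itex]. The background measure enters as the reference measure of the entropy; with [itex]\mu[/itex] the counting measure this reduces to the usual [itex]p(x)\propto e^{-\beta E(x)}[/itex].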
     
    Last edited: Jun 18, 2013
  13. Jun 18, 2013 #12
    A Hamiltonian for free particles in a box is

    [tex]
    H(x_1,x_2,\ldots, p_1,p_2,\ldots) = \sum_{n=1}^N H_n(x_n,p_n)
    [/tex]

    [tex]
    H_n(x_n,p_n) = \left\{\begin{array}{ll}
    +\infty,\quad\quad & x_n\notin [-R,R]^3\\
    \frac{\|p_n\|^2}{2m},\quad\quad &x_n\in [-R,R]^3\\
    \end{array}\right.
    [/tex]

    Now this Hamiltonian doesn't imply any statistical distribution, since the particles don't interact, but the Maxwell-Boltzmann distribution is usually derived with the discretization technique without discussing the actual interactions. So it seems that the precise form of the interactions doesn't matter very much.

    I was hoping that perhaps the oscillator problem too could be solved without needing to specify the interactions.

    If some interaction is added to the Hamiltonian of particles in the box, is it possible to use the given interaction in some relevant way?
     
  14. Jun 18, 2013 #13

    atyy



    Your idea is that the interactions plus deterministic dynamics are what lead to the Maxwell-Boltzmann distribution, and that this will be the case no matter what the interactions are?
     
  15. Jun 18, 2013 #14
    I believe that the collection of different interactions which lead to the same statistics is large.
     
  16. Jun 18, 2013 #15

    atyy


    I think it's very tricky to derive the statistical mechanics distributions from deterministic dynamics.

    In statistical mechanics, we can write thermal distributions even for interacting integrable systems.

    However, we know from the dynamics of these systems that the statistical mechanics prediction is false. This is mathematically true, and can be demonstrated in real physical systems, e.g. http://lib.semi.ac.cn:8080/tsh/dzzy/wsqk/Nature/vol440/440-900.pdf .

    Since statistical mechanics is a theory with many successful predictions, it seems that most systems with large numbers of particles that we encounter in real life cannot be integrable. But what is the exact mathematical condition for "not integrable"? I'm not sure the answer is known, and can only point you to the references in post #8.
     
  17. Jun 19, 2013 #16

    atyy


    @jostpuur: Googling around, I found some papers that you may find interesting. They address classical models of interacting oscillators placed on lattices, and check whether the deterministic (chaotic) Hamiltonian dynamics makes the system end up looking as predicted by the microcanonical ensemble. In the paper on the xy model, they have to add a term to the usual Hamiltonian used in statistical mechanics in order to get deterministic Hamiltonian dynamics.

    There's a comment in the review by Casetti: "This is similar to what happens in the classical statistical mechanics of Hamiltonian systems of the form (1), where the momenta can be integrated out and the statistical measure can be defined on the configuration space alone. We remark that this is true for both the microcanonical and the canonical ensemble."

    So even when the momenta are not explicitly stated, one can still take the background measure to be Lebesgue measure over the canonical coordinates.
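
    The "integrating out" mentioned there is just a Gaussian integral: for a Hamiltonian of the form [itex]H(q,p) = \sum_{n=1}^N \frac{p_n^2}{2m} + V(q)[/itex],

    [tex]
    Z = \int e^{-\beta H(q,p)}\, \mathrm{d}^N q\, \mathrm{d}^N p
    = \left(\frac{2\pi m}{\beta}\right)^{N/2} \int e^{-\beta V(q)}\, \mathrm{d}^N q,
    [/tex]

    so the momenta contribute only a temperature-dependent prefactor, and the background measure remaining on configuration space is Lebesgue measure in the canonical [itex]q[/itex]'s.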

    Caiani et al, Geometry of dynamics and phase transitions in classical lattice phi^4 theories
    Leoncini et al, Hamiltonian Dynamics and the Phase Transition of the XY Model
    Casetti et al, Geometric approach to Hamiltonian dynamics and statistical mechanics
     
    Last edited: Jun 19, 2013