
Microstates - statistical mechanics

  1. Feb 17, 2006 #1
    I understand that a microstate is one possible distribution of energies making up the system's total energy. But I can't understand why, in a monatomic gas for example (where there is only the translational kinetic energy of the atoms), there is a finite number of states. Surely that would mean the atoms can only travel at discrete velocities (i.e. have discrete kinetic energies).
    If the atoms in a gas are moving all about, accelerating and decelerating, how can that be?

    Can someone point me towards a good (web) reference.

    Cheers.
     
  3. Feb 18, 2006 #2

    mma

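    The standard single-particle starting point, the "particle in a box" referred to later in the thread, gives discrete allowed energies [tex]E_n = n^2 h^2 / (8 m L^2)[/tex], n = 1, 2, 3, .... A minimal sketch (the atom mass and box length below are illustrative choices, not from the thread):

```python
h = 6.626e-34   # Planck's constant, J*s
m = 6.646e-27   # mass of a helium atom, kg (illustrative choice)
L = 1e-2        # box length, m (illustrative: a 1 cm box)

def box_level(n):
    """Energy of the n-th level of a particle in a 1-D box: n^2 h^2 / (8 m L^2)."""
    return n**2 * h**2 / (8 * m * L**2)

# The allowed kinetic energies are discrete, though for a macroscopic box
# the spacing is astronomically small -- which is why the gas looks
# continuous classically.
for n in (1, 2, 3):
    print(n, box_level(n))  # levels grow as n^2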

  4. Feb 18, 2006 #3
    OK. But how is this applied to a monatomic gas (with many particles).

    What is the mechanism by which one atom can only transfer energy to another in discrete amounts? Classically, surely it could transfer any fraction of its energy to another in a collision, giving rise to a smooth distribution of possible energies for each atom.

    Why is it discrete? I am finding it hard to visualise.
     
  5. Feb 18, 2006 #4

    mma

    User Avatar

    I'm afraid that we can't describe the mechanism of the energy transfer between the atoms. QM is capable only to determine the possible stationary states, and the probabilities of the transitions from one stationary state to the other.
     
  6. Feb 18, 2006 #5
    Do you know an equivalent reference of "particle in the box" for a many particle system (monatomic gas)?
     
  7. Feb 18, 2006 #6

    mma

    User Avatar

  8. Feb 18, 2006 #7

    Galileo

    User Avatar
    Science Advisor
    Homework Helper

    You have to do some discretization (also in the classical ideal gas) to label a microstate. The cut your phase-space volume into tiny boxes. since your box is bounded, you'll have a finite (but very large) number of them. Your 'velocity space' is still infinite, but because a constraint of the system is that the total energy must be E (thus holds for each microstate) you'll only have a finite number of choices.
     
  9. Feb 18, 2006 #8

    mma

    User Avatar

    I think that the notion of microstate isn't appropriate in the classical model. I know, for example Landau uses the method you mentioned, but it is really unnecessary. Entropy of a given probability density [tex]f(x)[/tex] can be defined simply by [tex]\int_{X}f(x)logf(x) dx[/tex]
    (see e.g. M.C. Mackey. "The dynamic origin of increasing entropy", Rev. Mod. Phys. (1989) 61, 981-1016. WARNING: This is a 7 Mb pdf file)
     
    Last edited: Feb 18, 2006
  10. Feb 18, 2006 #9

    Physics Monkey

    User Avatar
    Science Advisor
    Homework Helper

    Hi mma,

    I think one does have to be a little careful defining the entropy of a continuous random variable. Part of the trouble with just defining [tex] S = - \int f(x) \,\ln{ f(x) } \, dx [/tex] is that it isn't reparameterization invariant. Imagine, for example, a Gaussian distribution [tex] f_x(x) [/tex] and consider rescaling the random variable [tex] y = a x [/tex]. The probability distribution for y is [tex] f_y(y) = f_x(y/a) / a [/tex] so that [tex] \int f_y(y) \, dy = \int f_x(x) \, dx= 1 [/tex]. However, the entropy associated with [tex] f_y [/tex] is different from the entropy one calculates with [tex] f_x [/tex]. If I think about about a physical system now, I seem to be lead to the uncomfortable conclusion that the entropy can be anything I want.

    This highlights the special role played by phase space in classical statistical mechanics. In the Hamiltonian formulation the allowed coordinate transformations are the so called canonical transformations which have the property that they preserve the volume element of phase space. It is really this special invariance that allows you to define a meaningful entropy for continuous probability distributions over phase space. Ultimately, Planck's constant sets the scale for phase space volume, but classical discretization schemes let one avoid these subtleties associated with continuous distributions.
     
    Last edited: Feb 18, 2006
  11. Feb 19, 2006 #10

    mma

    User Avatar

    Yes. The entropy defined by [tex] f_y(y) [/tex] is [tex]S_y = - \int f_y(y) \,\ln{ f_y(y) } \, dy = - \int f(x) \,\ln{( f(x)/a )} \, dx = S_x + a[/tex]. That is, the two entropies differ only by an additive constant. But it seems to me that this difference has no physical significance, because only the differential of entropy has any role in thermodynamics, its absolute value not.
    It is very common, that a physical quantity is determined only up to an additive or multiplicative constant, e.g. the potentials are also such quantities.
     
    Last edited: Feb 19, 2006
Know someone interested in this topic? Share this thread via Reddit, Google+, Twitter, or Facebook

Have something to add?