# Microstates - statistical mechanics

1. Feb 17, 2006

### allanm1

I understand that a microstate is one possible distribution of energies that makes up the system's total energy. But I can't understand why, in a monatomic gas for example (where there is only the translational kinetic energy of the atoms), there is a finite number of states. Surely that would mean the atoms can only travel at discrete velocities (i.e. have discrete kinetic energies).
If the atoms in a gas are moving about, accelerating and decelerating,
how can that be?

Can someone point me towards a good (web) reference?

Cheers.

2. Feb 18, 2006

### mma

http://en.wikipedia.org/wiki/Particle_in_a_box
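
To see how discrete the particle-in-a-box levels actually are, here is a minimal numeric sketch (my own, not from the thread). The formula E_n = n^2 h^2 / (8 m L^2) is the standard one-dimensional result; the helium mass and 1 cm box length are illustrative assumptions:

```python
import math

# Energy levels of a particle in a 1-D box: E_n = n^2 h^2 / (8 m L^2).
# Illustrative values (assumptions, not from the thread): a helium atom
# in a 1 cm box.
h = 6.626e-34      # Planck's constant, J*s
m = 6.6e-27        # mass of a helium atom, kg
L = 1e-2           # box length, m

def energy(n):
    """Energy of the n-th stationary state, in joules."""
    return n**2 * h**2 / (8 * m * L**2)

for n in range(1, 4):
    print(f"E_{n} = {energy(n):.3e} J")

# The level spacing is astronomically small, which is why the spectrum
# looks continuous classically -- yet the states are still countable.
```

The spacing E_2 - E_1 comes out around 10^-37 J for these values, far below anything measurable, but the states remain discrete and countable.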

3. Feb 18, 2006

### allanm1

OK, but how is this applied to a monatomic gas (with many particles)?

What is the mechanism by which one atom can only transfer energy to another in discrete amounts? Classically, surely it could transfer any fraction of its energy to another in a collision, giving rise to a smooth distribution of possible energies for each atom.

Why is it discrete? I am finding it hard to visualise.

4. Feb 18, 2006

### mma

I'm afraid we can't describe the mechanism of the energy transfer between the atoms. QM is capable only of determining the possible stationary states, and the probabilities of the transitions from one stationary state to another.

5. Feb 18, 2006

### allanm1

Do you know an equivalent reference to "particle in a box" for a many-particle system (a monatomic gas)?

6. Feb 18, 2006

### mma

Yes, of course:
http://en.wikipedia.org/wiki/Gas_in_a_box

7. Feb 18, 2006

### Galileo

You have to do some discretization (also in the classical ideal gas) to label a microstate. You cut your phase-space volume into tiny cells. Since your box is bounded, you'll have a finite (but very large) number of them. Your 'velocity space' is still infinite, but because a constraint of the system is that the total energy must be E (and this holds for each microstate), you'll only have a finite number of choices.
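
A toy count along these lines (my own illustration; treating each particle's energy as an integer number of cells is an assumption, not part of the post) shows how the total-energy constraint makes the number of microstates finite:

```python
from itertools import product

# Toy model: each of N particles holds an integer number of "energy
# cells"; a microstate is an assignment of cells to particles whose
# total equals E.  Coarse-graining is what makes this count finite.
def count_microstates(n_particles, total_energy):
    return sum(
        1
        for cells in product(range(total_energy + 1), repeat=n_particles)
        if sum(cells) == total_energy
    )

# For N particles sharing E cells the count is C(E + N - 1, N - 1),
# e.g. N = 3, E = 4 gives C(6, 2) = 15:
print(count_microstates(3, 4))
```

The brute-force count agrees with the combinatorial formula C(E + N - 1, N - 1) for distributing E indistinguishable cells among N particles.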

8. Feb 18, 2006

### mma

I think the notion of a microstate isn't appropriate in the classical model. I know that Landau, for example, uses the method you mentioned, but it is really unnecessary. The entropy of a given probability density $$f(x)$$ can be defined simply by $$S = - \int_{X} f(x) \ln f(x) \, dx$$
(see e.g. http://www.cnd.mcgill.ca/bios/mackey/pdf_pub/dynamic_1989.pdf WARNING: this is a 7 MB PDF file)
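
As a sanity check of this definition (my own sketch, not from the linked paper), one can evaluate the integral numerically for a Gaussian density and compare against the known closed form (1/2) ln(2*pi*e*sigma^2):

```python
import math

# Numerically integrate S = -int f(x) ln f(x) dx for a Gaussian with
# standard deviation sigma, via a simple Riemann sum.  The integration
# bounds and step count are arbitrary choices wide/fine enough for a
# good approximation.
def gaussian_entropy_numeric(sigma, lo=-50.0, hi=50.0, n=200001):
    dx = (hi - lo) / (n - 1)
    s = 0.0
    for i in range(n):
        x = lo + i * dx
        f = math.exp(-x * x / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))
        if f > 0.0:
            s -= f * math.log(f) * dx
    return s

sigma = 2.0
analytic = 0.5 * math.log(2 * math.pi * math.e * sigma**2)
print(gaussian_entropy_numeric(sigma), analytic)   # the two agree closely
```

The numeric value matches the analytic differential entropy to several decimal places, confirming the definition is well posed for a fixed parameterization.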

9. Feb 18, 2006

### Physics Monkey

Hi mma,

I think one does have to be a little careful defining the entropy of a continuous random variable. Part of the trouble with just defining $$S = - \int f(x) \,\ln{ f(x) } \, dx$$ is that it isn't reparameterization invariant. Imagine, for example, a Gaussian distribution $$f_x(x)$$ and consider rescaling the random variable $$y = a x$$. The probability distribution for y is $$f_y(y) = f_x(y/a) / a$$ so that $$\int f_y(y) \, dy = \int f_x(x) \, dx = 1$$. However, the entropy associated with $$f_y$$ is different from the entropy one calculates with $$f_x$$. If I think about a physical system now, I seem to be led to the uncomfortable conclusion that the entropy can be anything I want.
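
A quick numeric illustration of this non-invariance (my own sketch, not from the post): for a Gaussian, the rescaling y = a*x shifts the differential entropy by ln(a), so the value depends on the chosen parameterization:

```python
import math

# Closed-form differential entropy of a Gaussian N(0, sigma^2):
# S = (1/2) ln(2*pi*e*sigma^2).
def gaussian_entropy(sigma):
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

sigma, a = 1.0, 3.0
S_x = gaussian_entropy(sigma)
S_y = gaussian_entropy(a * sigma)   # y = a*x is Gaussian with std a*sigma
print(S_y - S_x, math.log(a))       # the shift equals ln(a)
```

The entropy difference is exactly ln(a): rescaling the coordinate changes the entropy even though the underlying physical situation is unchanged.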

This highlights the special role played by phase space in classical statistical mechanics. In the Hamiltonian formulation the allowed coordinate transformations are the so-called canonical transformations, which have the property that they preserve the volume element of phase space. It is really this special invariance that allows you to define a meaningful entropy for continuous probability distributions over phase space. Ultimately, Planck's constant sets the scale for phase space volume, but classical discretization schemes let one avoid these subtleties associated with continuous distributions.

10. Feb 19, 2006

### mma

Yes. The entropy defined by $$f_y(y)$$ is $$S_y = - \int f_y(y) \,\ln{ f_y(y) } \, dy = - \int f_x(x) \,\ln{( f_x(x)/a )} \, dx = S_x + \ln a$$. That is, the two entropies differ only by an additive constant. But it seems to me that this difference has no physical significance, because only the differential of entropy plays a role in thermodynamics, not its absolute value.
It is very common that a physical quantity is determined only up to an additive or multiplicative constant; the potentials, for example, are also such quantities.
