Microstates - statistical mechanics

AI Thread Summary
Microstates represent specific distributions of energy within a system, and in the context of a monatomic gas, the finite number of states arises from the discretization of phase space, even though the atoms' velocities are continuous variables. This discretization is needed to label microstates, and the total-energy constraint limits the choices available. The discussion highlights the difficulty of visualizing energy transfer between atoms, which is governed by quantum mechanics and results in discrete energy exchanges rather than continuous ones. Concerns are raised about defining entropy for continuous variables, emphasizing the special role of phase space in classical statistical mechanics. Ultimately, the conversation underscores the subtleties of applying classical models in statistical mechanics and the role of quantum mechanics in understanding energy distributions.
allanm1
I understand that a microstate is one possible distribution of energies that make up the system's total energy. But I can't understand why, in a monatomic gas for example (where there is only translational kinetic energy of atoms), there is a finite number of states. Surely that would mean that the atoms can only travel at discrete velocities (i.e. discrete kinetic energies). If the atoms in a gas are moving all about, accelerating and decelerating, how can that be?

Can someone point me towards a good (web) reference.

Cheers.
 
http://en.wikipedia.org/wiki/Particle_in_a_box
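For a concrete sense of how confinement makes the energy spectrum discrete, here is a minimal sketch of the standard 1D particle-in-a-box levels, E_n = n^2 h^2 / (8 m L^2). The choice of a helium atom in a 1 cm box is just an illustrative assumption, not something from the thread:

```python
# Energy levels of a particle in a 1D box: E_n = n^2 h^2 / (8 m L^2).
# Illustrative numbers: a helium atom confined to a 1 cm box.
h = 6.62607015e-34   # Planck constant, J*s
m = 6.646e-27        # mass of a helium-4 atom, kg (approximate)
L = 1e-2             # box length, m

def energy(n):
    """Energy of the n-th stationary state (n = 1, 2, 3, ...)."""
    return n**2 * h**2 / (8 * m * L**2)

# The spectrum is discrete: only these values are allowed.
levels = [energy(n) for n in (1, 2, 3)]
spacing = energy(2) - energy(1)   # gap between the first two levels
```

For macroscopic L the spacing is astronomically small, which is why the velocities of gas atoms look continuous in practice even though the allowed states are discrete.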
 
OK. But how is this applied to a monatomic gas (with many particles).

What is the mechanism by which one atom can only transfer energy to another in discrete amounts? Classically, surely it could transfer any fraction of its energy to another in a collision, giving rise to a smooth distribution of possible energies for each atom.

Why is it discrete? I am finding it hard to visualise.
 
allanm1 said:
What is the mechanism by which one atom can only transfer energy to another in discrete amounts?

I'm afraid that we can't describe the mechanism of the energy transfer between the atoms. QM is only capable of determining the possible stationary states, and the probabilities of transitions from one stationary state to another.
 
Do you know an equivalent reference to "particle in a box" for a many-particle system (a monatomic gas)?
 
allanm1 said:
Do you know an equivalent reference to "particle in a box" for a many-particle system (a monatomic gas)?

Yes, of course
http://en.wikipedia.org/wiki/Gas_in_a_box
:smile:
 
allanm1 said:
I understand that a microstate is one possible distribution of energies that make up the system's total energy. But I can't understand why, in a monatomic gas for example (where there is only translational kinetic energy of atoms), there is a finite number of states.
You have to do some discretization (also in the classical ideal gas) to label a microstate. You cut your phase-space volume into tiny cells; since your box is bounded, you'll have a finite (but very large) number of them. Your 'velocity space' is still infinite, but because a constraint of the system is that the total energy must be E (this holds for each microstate), you'll only have a finite number of choices.
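The counting argument above can be sketched numerically. As a toy assumption (not from the thread), suppose coarse-graining leaves each labelled particle with a whole number of identical "energy units"; the total-energy constraint then leaves only finitely many assignments:

```python
from itertools import product

def count_microstates(n_particles, total_energy):
    """Brute-force count of assignments of whole energy units to
    labelled particles whose energies sum to the fixed total.
    Each particle could a priori hold 0..total_energy units, but the
    constraint sum(E_i) = E cuts this down to a finite count."""
    return sum(
        1
        for energies in product(range(total_energy + 1), repeat=n_particles)
        if sum(energies) == total_energy
    )

# 3 particles sharing 4 units: C(4+3-1, 3-1) = 15 distinct microstates
n_states = count_microstates(3, 4)
```

This is the stars-and-bars count C(E+N-1, N-1); the point is only that a bounded phase space plus a fixed total energy yields a finite (if huge) number of labelled microstates.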
 
Galileo said:
You have to do some discretization (also in the classical ideal gas) to label a microstate. You cut your phase-space volume into tiny cells; since your box is bounded, you'll have a finite (but very large) number of them. Your 'velocity space' is still infinite, but because a constraint of the system is that the total energy must be E (this holds for each microstate), you'll only have a finite number of choices.

I think that the notion of microstate isn't appropriate in the classical model. I know that, for example, Landau uses the method you mentioned, but it is really unnecessary. The entropy of a given probability density f(x) can be defined simply by S = - \int_{X}f(x)\ln f(x) \, dx
(see e.g. http://www.cnd.mcgill.ca/bios/mackey/pdf_pub/dynamic_1989.pdf WARNING: This is a 7 Mb pdf file)
 
Hi mma,

I think one does have to be a little careful defining the entropy of a continuous random variable. Part of the trouble with just defining S = - \int f(x) \,\ln{ f(x) } \, dx is that it isn't reparameterization invariant. Imagine, for example, a Gaussian distribution f_x(x) and consider rescaling the random variable y = a x. The probability distribution for y is f_y(y) = f_x(y/a) / a so that \int f_y(y) \, dy = \int f_x(x) \, dx= 1. However, the entropy associated with f_y is different from the entropy one calculates with f_x. If I think about a physical system now, I seem to be led to the uncomfortable conclusion that the entropy can be anything I want.

This highlights the special role played by phase space in classical statistical mechanics. In the Hamiltonian formulation the allowed coordinate transformations are the so-called canonical transformations, which have the property that they preserve the volume element of phase space. It is really this special invariance that allows you to define a meaningful entropy for continuous probability distributions over phase space. Ultimately, Planck's constant sets the scale for phase-space volume, but classical discretization schemes let one avoid these subtleties associated with continuous distributions.
 
Physics Monkey said:
Hi mma,

I think one does have to be a little careful defining the entropy of a continuous random variable. Part of the trouble with just defining S = - \int f(x) \,\ln{ f(x) } \, dx is that it isn't reparameterization invariant. Imagine, for example, a Gaussian distribution f_x(x) and consider rescaling the random variable y = a x. The probability distribution for y is f_y(y) = f_x(y/a) / a so that \int f_y(y) \, dy = \int f_x(x) \, dx= 1. However, the entropy associated with f_y is different from the entropy one calculates with f_x.

Yes. The entropy defined by f_y(y) is S_y = - \int f_y(y) \,\ln{ f_y(y) } \, dy = - \int f_x(x) \,\ln{( f_x(x)/a )} \, dx = S_x + \ln a. That is, the two entropies differ only by an additive constant. But it seems to me that this difference has no physical significance, because only the differential of the entropy plays any role in thermodynamics, not its absolute value.
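The additive shift can be checked with the closed-form differential entropy of a Gaussian, S = (1/2) ln(2 pi e sigma^2); rescaling by y = a x turns a unit-variance Gaussian into one with standard deviation a, and the entropies should differ by exactly ln a. A minimal sketch (the choice a = 3 is arbitrary):

```python
import math

def gaussian_entropy(sigma):
    """Differential entropy of a Gaussian with standard deviation sigma:
    S = 0.5 * ln(2 * pi * e * sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

a = 3.0
S_x = gaussian_entropy(1.0)   # entropy of f_x, a unit Gaussian
S_y = gaussian_entropy(a)     # entropy of the rescaled variable y = a x
shift = S_y - S_x             # should equal ln(a)
```

So the two entropies indeed differ by the constant ln a, consistent with the substitution x = y/a in the integral above.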
It is very common that a physical quantity is determined only up to an additive or multiplicative constant; the potentials, for example, are such quantities.
 