Is the number of microstates of a gas just equivalent to pressure?

  1. I am quite confused about this area.

    First, entropy does not seem to contain any reference to volume. So suppose we theoretically set the entropy of gas samples A and B to be the same, but with the samples in different volumes. If A is in a larger volume, wouldn't it be able to exhibit a larger number of microstates? Yet the Boltzmann equation gives the same result for both, as it also appears to ignore volume.

    I would also be interested to know whether the concept of microstates is actually useful at all, or whether it is just a bystander in real-world physics.

  2. jcsd
  3. mfb

    Staff: Mentor

    If both samples have the same type and number of molecules, their temperatures will be different, which changes the number of available microstates as well.

    Depends on the system. If you consider systems where that number is easy to calculate (some spins, or whatever), it can be quite useful.
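    A quick numeric sketch of the point above (my own illustration, assuming a monatomic ideal gas; all numbers are made up for the example): holding S and N fixed while changing V forces T to change, via the isentropic relation T V^(2/3) = const.

    ```python
    # Isentropic relation for a monatomic ideal gas: T * V**(2/3) = constant.
    # If sample B has the same entropy and particle number as sample A but
    # twice the volume, B must be colder. (Illustrative numbers only.)
    T_A = 300.0                          # K, temperature of sample A
    V_ratio = 2.0                        # sample B occupies twice the volume
    T_B = T_A * V_ratio ** (-2.0 / 3.0)  # temperature forced on sample B
    print(round(T_B, 1))                 # -> 189.0
    ```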
  4. Sorry, still confused. Let's say we have 100 molecules of gas in each of two differently sized boxes, both with the same average molecular speed. I would have thought this would give the same temperature reading, since interactions with a thermometer would be the same, just less frequent in the larger box, so it would give the same readout.

    So E, S and T are all the same, but Boltzmann states should be different.

    (Let's assume a radiatively reflective housing to eliminate infrared heat loss.)
  5. mfb

    Staff: Mentor

    But then you get a different number of microstates, and a different entropy.
  6. Yes. You have different volumes, same temperature, same number of particles, so entropy is not the same. Also, what do you mean by "Boltzmann states"?
  7. Sorry for the basic confusion there. I had always imagined microstates visually, making the volume and locations of the particles part of the calculation, whereas according to one source I have just read it is actually just about the distribution of energies, not about the volume. And it follows that this is true from:

    Entropy = energy over temperature (S = Q/T) - nothing to do with volume
    Entropy = k log W - nothing to do with volume either

    (Though I'm not 100% sure how W is assessed.)

    (If we are being finicky about it, in reality the volume does make a difference, since gravitation reduces the most probable energy farther from the Earth.)

    An important exception is if we vaporise a few hundred atoms of gold: in a confined space there is the likelihood of, on average, a nice normal distribution of energies among the atoms; however, if we distribute these atoms in a sufficiently large space in which they will not collide, they will evidently retain their initial energies. So the law of entropy seems to break down here.
  8. I have just been thinking about this a little more, and it seems that volume does make a difference to the number of available microstates: multi-atom collisions become less likely in a less dense substance, and it is in such collisions that the momenta of two atoms along one axis might add up to produce a high value that is otherwise rarely produced (?).
  9. mfb

    Staff: Mentor

    I have no idea what you mean here; you don't have to consider any collisions.

    More volume -> more states for the particles at same energy -> more microstates for a given temperature.
    It is as simple as that.

    They are related via derivatives; you do not get an absolute entropy value from "energy over temperature".

    Volume influences W.
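    A toy count (my own sketch, not from the thread) makes the chain "more volume -> more states -> more microstates" concrete: divide the box into M equal cells and count placements of N distinguishable particles. Then W = M^N and S = k ln W = N k ln M, so doubling the volume (cells) adds exactly N k ln 2, regardless of the arbitrary cell size chosen.

    ```python
    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def lattice_entropy(n_particles, n_cells):
        # W = n_cells ** n_particles placements, so S = k ln W = N k ln M
        return n_particles * k_B * math.log(n_cells)

    N = 100                              # molecules, as in the example above
    S_small = lattice_entropy(N, 10_000)  # arbitrary cell count for box 1
    S_large = lattice_entropy(N, 20_000)  # double the volume -> double the cells
    delta_S = S_large - S_small           # = N k ln 2, independent of cell size
    print(round(delta_S / (N * k_B * math.log(2)), 6))  # -> 1.0
    ```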
  10. Brilliant; thanks; that is a lot clearer now.
  11. A good "reality check" when thinking of entropy is the Sackur-Tetrode equation for the entropy of an ideal gas: [tex]S=kN\ln\left[\left(\frac{V}{N}\right)\left(\frac{Um}{N}\right)^c\phi\right][/tex] where V is volume, N is number of particles, U is internal energy, m is mass per particle, [itex]\phi[/itex] is a universal constant, and c is dimensionless specific heat at constant volume (e.g. 3/2 for a monatomic gas). You can use PV=NkT and U=cNkT to see how entropy varies with other parameters. For the case we were discussing, we were thinking of entropy as a function of N, T, and V, so using U=cNkT, the entropy is [tex]S=kN\ln\left[\frac{V}{N}\left(mckT\right)^c\phi\right][/tex]
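    Plugging numbers into that last expression is a quick sanity check (a sketch only: the mass is illustrative, and [itex]\phi[/itex] is set to 1 here since it cancels when comparing two states). Doubling V at fixed N and T raises S by exactly kN ln 2:

    ```python
    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def sackur_tetrode(N, T, V, m, c=1.5, phi=1.0):
        # S = k N ln[ (V/N) * (m c k T)**c * phi ]   (c = 3/2 for monatomic)
        # phi is a universal constant; it drops out of entropy differences,
        # so it is set to 1 for this illustration.
        return k_B * N * math.log((V / N) * (m * c * k_B * T) ** c * phi)

    N = 100
    T = 300.0      # K
    m = 6.6e-26    # kg, roughly the mass of an argon atom (illustrative)

    S1 = sackur_tetrode(N, T, 1.0e-3, m)  # 1 litre
    S2 = sackur_tetrode(N, T, 2.0e-3, m)  # 2 litres, same N and T
    print(round((S2 - S1) / (k_B * N * math.log(2)), 6))  # -> 1.0
    ```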