## Counting the number of configurations (Entropy)

Hi all,

Entropy is usually computed in the 6N-dimensional phase space. But ...
In "Scientific American", Roger Balian takes one liter of gas in a cube and writes that he can replace the continuous volume by Q = 10^100 discrete sites after eliminating the velocities (he says quantum mechanics makes this possible).
He then counts the number of possible placements of the N molecules among these sites and obtains the entropy.

How can he do that?
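For what it's worth, here is a minimal back-of-the-envelope sketch of what such a site counting could look like (my own reading, not Balian's actual calculation): with Q sites and N indistinguishable molecules, W ≈ Q^N / N!, and Stirling's approximation gives S = k_B ln W ≈ N k_B [ln(Q/N) + 1]. The value N ≈ 2.7×10^22 (one liter of ideal gas at STP) is my assumption for illustration:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def site_counting_entropy(Q, N):
    """Entropy of N indistinguishable molecules placed on Q >> N sites.

    W = Q^N / N!  (each molecule picks a site; divide by N! for
    indistinguishability), and Stirling gives ln W ~ N*(ln(Q/N) + 1).
    """
    return N * k_B * (math.log(Q / N) + 1.0)

Q = 1e100   # Balian's number of sites for one liter (from the article)
N = 2.7e22  # ~ number of molecules in 1 L of ideal gas at STP (assumption)

S = site_counting_entropy(Q, N)
print(f"S = {S:.1f} J/K,  S per molecule = {S / (N * k_B):.1f} k_B")
```

Notice that the answer depends entirely on the chosen Q, which is presumably where the quantum-mechanical justification for the number of sites has to come in.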

I guess he is considering very fast processes, in which modifications of the cube's walls do not alter the pattern of velocities; only the number of available positions changes. So, to evaluate the entropy variation, one only has to monitor the number of available positions. I am not sure.

Best wishes

DaTario
Surprisingly, the distance between two sites comes out close to the Planck length! Is it really correct to ignore the speeds in that counting?
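That spacing claim is easy to check numerically (this little check is mine, not from the article): Q = 10^100 sites in V = 1 L gives a cubic-lattice spacing d = (V/Q)^(1/3), which lands within a factor of a few of the Planck length:

```python
# Spacing of a cubic lattice of Q sites filling volume V, vs. Planck length.
V = 1e-3              # one liter, in m^3
Q = 1e100             # Balian's number of sites
l_planck = 1.616e-35  # Planck length, m

d = (V / Q) ** (1.0 / 3.0)  # nearest-neighbor spacing of the site lattice
print(f"site spacing d = {d:.2e} m,  d / l_planck = {d / l_planck:.1f}")
```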

IMO, if you are doing the compression with high velocity compared with the mean velocity of the particles in the gas and by small steps, it seems one can well defend this procedure.

Best wishes

DaTario

> **Quote by naima:** Hi all, Entropy uses the 6N-dimensional phase space. But ... Roger Balian in "Scientific American" takes one liter of gas in a cube and writes: I can replace the continuous volume by Q = 10^100 sites after eliminating the velocities (he says this is possible with quantum mechanics). He then counts the number of possible places for the N molecules and gets the entropy. How can he do that?
Is the article you're referring to available online? There is a general principle in statistical mechanics that you can get the number of quantum states in a given energy range by calculating the classical phase-space volume and dividing by h^M, where h is Planck's constant and M is the number of degrees of freedom (3N for N atoms of a monatomic gas). Perhaps that's what he's talking about?
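To put a number on that principle: dividing the classical phase-space volume by h^(3N) (with the N! Gibbs factor) gives the Sackur–Tetrode formula, S/(N k_B) = ln[V/(N λ³)] + 5/2, where λ = h/√(2π m k_B T) is the thermal de Broglie wavelength. A sketch for one liter of N2 at 300 K and 1 atm (my choice of gas and conditions, translational entropy only):

```python
import math

h = 6.62607015e-34   # Planck constant, J s
k_B = 1.380649e-23   # Boltzmann constant, J/K

def sackur_tetrode(V, N, T, m):
    """Translational entropy of an ideal gas from phase-space counting:
    S/(N k_B) = ln(V / (N * lam^3)) + 5/2."""
    lam = h / math.sqrt(2.0 * math.pi * m * k_B * T)  # thermal wavelength, m
    return N * k_B * (math.log(V / (N * lam**3)) + 2.5)

V = 1e-3                # one liter, m^3
T = 300.0               # K
P = 101325.0            # Pa
N = P * V / (k_B * T)   # ideal-gas particle number, ~2.4e22
m = 4.65e-26            # mass of one N2 molecule, kg

S = sackur_tetrode(V, N, T, m)
print(f"S = {S:.2f} J/K  ({S / (N * k_B):.1f} k_B per molecule)")
```

Per mole this works out to roughly 150 J/(mol·K), which is in the right neighborhood of the tabulated translational entropy of nitrogen; rotational states would add the rest.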
