gsingh2011
I'm trying to relate an analogy from Brian Greene about entropy microstates/macrostates to the real world. In the analogy, you have 100 coins that you flip. The microstate is which particular coins landed heads up. The macrostate is the total number of coins that are heads up. So a low entropy configuration would be when all the coins are heads up. There is only one microstate that corresponds to that macrostate. But there is a very large number of microstates that correspond to the macrostate where 50 coins are heads up, and that is a high entropy configuration.
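To make the counting concrete: the number of microstates for the macrostate "k coins heads up" is just the binomial coefficient C(100, k). A quick sketch in plain Python (standard library only) shows how lopsided the counts are:

```python
from math import comb

n = 100  # total number of coins
# Each macrostate is "k coins heads up"; its microstate count is the
# binomial coefficient C(n, k): the number of ways to choose which
# particular coins land heads.
for k in (100, 99, 50):
    print(f"{k:>3} heads up -> {comb(n, k):.3e} microstates")
```

This prints 1 microstate for all heads, 100 for 99 heads, and about 1.009e+29 for 50 heads, so the 50-heads macrostate overwhelmingly dominates.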
So relating this to a gas in a box, let me know if this understanding is correct. If all of the particles of a gas in a box are contained within a small cubic region in one corner, an external observer would measure a particular pressure and temperature for the gas. And while there may be other configurations (where a configuration means the positions and velocities of each particle) that would yield the same pressure/temperature readings, there aren't that many of them. However, if the gas is spread out throughout the box and you measure the pressure/temperature, there is a very large number of other configurations of the particles that produce the same readings.
Is that a correct interpretation of entropy?
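To put rough numbers on the "there aren't that many of them" intuition, here's a toy coarse-grained count. It looks at positions only, divides the box into equal cells, and uses a made-up particle number N; all of that is an illustrative assumption, not a full phase-space calculation:

```python
from math import log

# Toy coarse-grained count, positions only: treat the box as a set of
# equal cells and ignore velocities. If the corner region is 1/8 of the
# box's volume, each of the N particles has 8x fewer cells available,
# so the spread-out gas has 8**N times more position microstates.
N = 1_000  # made-up particle number for illustration (a real gas has ~1e23)
ratio_log10 = N * log(8, 10)
print(f"Omega_spread / Omega_corner ~ 10**{ratio_log10:.0f}")

# Boltzmann's formula S = k_B * ln(Omega) turns the ratio into an entropy gap:
k_B = 1.380649e-23  # J/K
delta_S = k_B * N * log(8)
print(f"Delta S ~ {delta_S:.2e} J/K for N = {N}")
```

Even for this tiny N the microstate ratio is about 10^903, which is why the spread-out macrostate dominates so completely and why the confined gas counts as low entropy.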