Does a probability distribution correctly describe entropy?

AI Thread Summary
The discussion centers on the inadequacy of the statistical mechanics explanation of entropy, particularly regarding how probability distributions relate to fluctuations in systems like gas molecules. Participants express concern that organized arrangements can be misrepresented as random fluctuations, questioning the existence of significant fluctuations that could lead to new potential for work. There is a call for clarity on how probability distributions change as systems deviate from equilibrium, especially in non-equilibrium states. The conversation highlights the challenge of defining entropy for non-equilibrium systems and the need for a coherent mathematical framework to describe these fluctuations. Overall, the thread seeks a deeper understanding of entropy's relationship with probability in fluctuating systems.
Charlie313
The colloquial statistical mechanics explanation of entropy as if it is caused by probability is dissatisfying to me, in part because it allows highly organized (i.e. with a real potential for work) arrangements to appear as 'random fluctuations', though with very low probability. But as far as I know (not a physicist!) we don't even see tiny, less improbable but still significant fluctuations toward 'new' potential for work, much less the big, super-improbable ones. Is there a constraint on the fluctuations of 'random' systems like gas molecules in a box that would not appear if we simply add the probabilities at equilibrium?

Another way of asking the same question: are there experimentally supported equations for how the probability distribution for a volume of gas or other entropically constrained system changes as the system begins to fluctuate away from maximum (i.e. equilibrium) entropy and toward some significant potential for work? My dissatisfaction with the statistical 'explanation' is in part because the arrangements of molecules in a box of gas are self-interacting, so that any shift in a counter-entropic direction, and toward 'free' work, should (at least to my layman's thinking) change the probability distributions in nonlinear ways that might reduce 'very highly improbable' to zero probability. Mathematical answers are welcome ('are there equations?'), but I am a visual and intuitive thinker, not a mathematical one, so translations into non-math or intuitive concepts would be greatly appreciated. Thanks!
 
Charlie313 said:
But as far as I know (not a physicist!) we don't even see tiny, less improbable but still significant fluctuations toward 'new' potential for work
We see fluctuations as large as expected with the limited number of observations.

There is a difference between seeing a 1 in a billion event (easy if you look every nanosecond) and a 1 in 10^10^10 event, but there is no practical difference between 1 in 10^10^10 and 1 in 10^10^20 - we won't see either.
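To put rough numbers on that point, here is a back-of-envelope sketch (an editorial illustration, not from the thread), assuming one observation per nanosecond:

```python
from math import log10

# Expected wait (as log10 of years) before seeing an event with
# probability 10**p_log10 per observation, checking once per nanosecond.
def log10_years_to_expect(p_log10):
    obs_per_year = 1e9 * 3600 * 24 * 365   # ~3.2e16 observations per year
    # expected number of observations until the event occurs is ~1/p
    return -p_log10 - log10(obs_per_year)

print(log10_years_to_expect(-9))       # negative: a 1-in-a-billion event
                                       # shows up in about a second
print(log10_years_to_expect(-10**10))  # ~1e10: never, for any practical purpose
```

The 1-in-a-billion event is a matter of waiting a second; the 1 in 10^10^10 event would take vastly longer than the age of the universe, and raising the exponent further changes nothing observable.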
 
Thanks mfb for the response and that's a good point. (playing the license plate game w/ mfb, I got mondo-freaking brilliant, or something to that effect :D)

Still wondering if someone can point me toward discussion of how the probability distributions change for gas in a box as it hypothetically fluctuates away from equilibrium and toward the 'very small but real' arrangement that noticeably reduces entropy and increases potential--or even in a non-equilibrium arrangement, like hot on one side, evolving toward equilibrium; how 'smooth' is the curve of decreasing or increasing entropy and how does it change as we move the system toward or away from equilibrium?

Of course it will have molecule-level fluctuations, since molecules are the fundamental unit in which the system's randomness and probability structure is defined. And there will be little coincidences where, say, little groups or waves of hot molecules move toward the hot side, temporarily 'reducing entropy' on a very local scale, but not for the whole system.

What I am trying to figure out (in my somewhat impaired way) is what effect a larger-scale movement toward one of those very rare (1 in 10^10^10 or rarer) large-scale fluctuations toward lower entropy, one that would allow 'new' work to be gotten out of the system, would have on the probability distribution itself. I think the M-B distribution only applies at equilibrium; what formalism, if any, describes the changing distribution as the system as a whole is moving toward or (very improbably) away from equilibrium? (trying to think of search phrases that might catch that). Thanks again!
 
To get all atoms at the same side of the room, you don't need a deviation from MB. In the limit of an ideal gas, the molecules are collision-free (or at least the collision timescales are longer than relevant, with collisions only at the walls), so each one is in either half of the room with 50% probability, independent of its velocity. 10 atoms give you a 1/512 probability to have all at the same side, 20 atoms lead to roughly 1 in 500,000, 100 atoms to about 1 in 0.5*10^30, and so on. With 20 atoms that is something you can wait for, with 100 atoms it is not, and with 10^30 atoms it just doesn't happen, although there is a non-zero chance.

Even a different temperature doesn't need a deviation from MB. You just need the faster atoms at one side by chance.

A deviation from MB is possible as well, but different from the two scenarios above.
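The arithmetic behind these probabilities can be checked in a few lines of Python (a sketch added editorially, not part of the original exchange):

```python
# Each ideal-gas atom is in the left or right half independently with
# probability 1/2, so P(all N on the same side) = 2 * (1/2)**N = 2**(1 - N).
def p_all_same_side(n):
    return 2.0 ** (1 - n)

print(p_all_same_side(10))   # 1/512
print(p_all_same_side(20))   # ~1.9e-6, i.e. roughly 1 in 500,000
print(p_all_same_side(100))  # ~1.6e-30, i.e. roughly 1 in 0.6*10^30
# For n ~ 1e30 atoms the probability underflows even double precision:
# non-zero in principle, never observed in practice.
```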
 
Charlie313 said:
Still wondering if someone can point me toward discussion of how the prob distributions change for gas in a box as it hypothetically fluctuates away from equilibrium and toward the 'very small but real' arrangement that noticeably reduces entropy and increases potential

If you are thinking of a gas as collection of particles, each of which has a definite position and velocity at a given time, there is no probability involved and it has no defined entropy. It's like thinking of a "fair coin" that has already been tossed and definitely landed heads. That's why statistical mechanics is forced to use the tortuous language of "ensembles" of systems.

If we want to talk about "fluctuations", we must specify exactly what is fluctuating. Trying to speak of "probability" as a general abstraction is not mathematically coherent. One must specify what events are in the probability space (i.e. "probability" must be the probability of some set of events.) So how would you formulate your question so it has a clear meaning? What probability are you asking about?

Charlie313 said:
--or even in a non-equilibrium arrangement, like hot on one side, evolving toward equilibrium; how 'smooth' is the curve of decreasing or increasing entropy and how does it change as we move the system toward or away from equilibrium?

Thermodynamic entropy is not defined for (an ensemble of) gases that are not in equilibrium. So the first problem would be to invent a definition for entropy in non-equilibrium situations.
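One way to make the question well-posed, in the spirit of specifying a probability space as suggested above, is to pick an explicit coarse-graining. As a hypothetical illustration (an editorial construction, not from the thread): record only how many atoms sit in the left half of the box, and assign each macrostate a Boltzmann entropy S = k ln Ω from its count of microstates.

```python
from math import lgamma, log

# Coarse-grain the box: the macrostate "n_left atoms in the left half" has
# Omega = C(N, n_left) microstates, Boltzmann entropy S/k = ln Omega, and
# probability Omega / 2**N under the uniform measure on microstates.

def ln_binom(n, k):
    # ln C(n, k) via log-gamma, numerically stable for large n
    return lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)

def entropy_over_k(n_total, n_left):
    return ln_binom(n_total, n_left)

def ln_prob(n_total, n_left):
    return ln_binom(n_total, n_left) - n_total * log(2)

N = 10**6
print(entropy_over_k(N, N // 2))           # entropy is maximal at the even split
print(entropy_over_k(N, N // 2 + 10**4))   # a 1% shift lowers S/k by ~200...
print(ln_prob(N, N // 2 + 10**4))          # ...and has probability ~e^-200
```

On this toy probability space the curve of entropy versus imbalance is perfectly smooth, and the probability of a macroscopically useful fluctuation falls off like a Gaussian in the imbalance, which is one quantitative reading of why 'new potential for work' never appears.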
 