
Does a probability distribution correctly describe entropy?

  1. Mar 14, 2017 #1
    The colloquial statistical-mechanics explanation of entropy as something driven by probability is dissatisfying to me, in part because it allows highly organized arrangements (i.e. ones with a real potential for work) to appear as 'random fluctuations', albeit with very low probability. But as far as I know (not a physicist!) we don't even see tiny, less improbable but still significant fluctuations toward 'new' potential for work, much less the big, super-improbable ones. Is there a constraint on the fluctuations of 'random' systems like gas molecules in a box that would not appear if we simply add up the probabilities at equilibrium?

    Another way of asking the same question: are there experimentally supported equations for how the probability distribution of a volume of gas (or another entropically constrained system) changes as the system begins to fluctuate away from maximum (i.e. equilibrium) entropy and toward some significant potential for work? My dissatisfaction with the statistical 'explanation' comes in part from the fact that the molecules in a box of gas interact with one another, so any shift in a counter-entropic direction, toward 'free' work, should (at least to my layman's thinking) change the probability distribution in nonlinear ways that might reduce 'very highly improbable' to zero probability. Mathematical answers are welcome ('are there equations?'), but I am a visual and intuitive rather than a mathematical thinker, so translations into non-mathematical or intuitive concepts would be greatly appreciated. Thanks!
     
  3. Mar 14, 2017 #2

    mfb


    Staff: Mentor

    We see fluctuations as large as expected with the limited number of observations.

    There is a difference between seeing a 1 in a billion event (easy if you look every nanosecond) and a 1 in 10^(10^10) event, but there is no practical difference between 1 in 10^(10^10) and 1 in 10^(10^20) - we won't see either.
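
    To put rough numbers on that (a back-of-the-envelope sketch; the observation rate, the specific probabilities, and the helper name are mine, not mfb's): the expected waiting time for an event of probability p checked r times per second is about 1/(r*p).

```python
# Expected waiting time for an event of probability p observed r times per
# second is roughly 1 / (r * p).  For astronomically small p we work with
# base-10 logarithms so nothing underflows.
import math

SECONDS_PER_YEAR = 3.156e7

def log10_wait_years(log10_p, rate_per_second=1e9):
    """log10 of the expected waiting time (in years) for an event with
    log10(probability) = log10_p, checked rate_per_second times per second."""
    return -log10_p - math.log10(rate_per_second) - math.log10(SECONDS_PER_YEAR)

for log10_p in (-9, -20, -1e10):   # p = 1e-9, 1e-20, 10^(-10^10)
    print(f"log10(p) = {log10_p:g}: expected wait ~ 10^{log10_wait_years(log10_p):.1f} years")
```

    With one check per nanosecond, a 1 in 10^9 event shows up within about a second, a 1 in 10^20 event takes a few thousand years, and a 1 in 10^(10^10) event simply never happens in practice.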
     
  4. Mar 15, 2017 #3
    Thanks mfb for the response, and that's a good point. (Playing the license plate game w/ mfb, I got mondo-freaking brilliant, or something to that effect :D)

    Still wondering if someone can point me toward a discussion of how the probability distributions change for gas in a box as it hypothetically fluctuates away from equilibrium and toward the 'very small but real' arrangement that noticeably reduces entropy and increases potential--or even in a non-equilibrium arrangement, like hot on one side, evolving toward equilibrium. How 'smooth' is the curve of decreasing or increasing entropy, and how does it change as we move the system toward or away from equilibrium?

    Of course it will have molecule-level fluctuations, since molecules are the fundamental unit in which the system's randomness and probability structure is defined. And there will be little coincidences where, say, small groups or waves of hot molecules move toward the hot side, temporarily 'reducing entropy' on a very local scale, but not for the whole system.

    What I am trying to figure out (in my somewhat impaired way) is what effect a larger-scale movement toward one of those very rare (1 in 10^(10^10) or rarer) fluctuations toward lower entropy, one that would allow 'new' work to be gotten out of the system, would have on the probability distribution itself. I think the Maxwell-Boltzmann distribution only applies at equilibrium; what formalism, if any, describes the changing distribution as the system as a whole moves toward or (very improbably) away from equilibrium? (Trying to think of search phrases that might catch that.) Thanks again!
     
    Last edited: Mar 15, 2017
  5. Mar 16, 2017 #4

    mfb


    Staff: Mentor

    To get all atoms at the same side of the room, you don't need a deviation from MB. In the limit of an ideal gas the molecules are collision-free (or at least the collision timescales are longer than relevant, with collisions only at the walls), so each molecule is in either half of the room with 50% probability, independent of its velocity. 10 atoms give you a 1/512 probability of all being on the same side, 20 atoms lead to roughly 1/500,000, 100 atoms to about 1 in 0.5*10^30, and so on. With 20 atoms that is something you can wait for, with 100 atoms it is not, and with 10^30 atoms it just doesn't happen, although there is a non-zero chance.
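
    A quick sanity check of those numbers (a sketch under the same assumption of independent 50/50 placement per atom; the function name is mine): the probability that all N atoms end up in the same half, either half, is 2*(1/2)^N = (1/2)^(N-1).

```python
# Probability that all N independently placed atoms sit in the same half of
# the box (either half): 2 * (1/2)^N = (1/2)^(N-1).
from fractions import Fraction

def p_all_same_side(n_atoms):
    return Fraction(1, 2) ** (n_atoms - 1)

for n in (10, 20, 100):
    p = p_all_same_side(n)
    print(f"N = {n:3d}: p = 1 in {float(1 / p):.3g}")
```

    This reproduces 1/512 for 10 atoms, about 1/500,000 for 20, and about 1 in 6*10^29 for 100, matching the orders of magnitude above.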

    Even a different temperature doesn't need a deviation from MB. You just need the faster atoms at one side by chance.

    A deviation from MB is possible as well, but different from the two scenarios above.
     
  6. Mar 16, 2017 #5

    Stephen Tashi

    Science Advisor

    If you are thinking of a gas as a collection of particles, each of which has a definite position and velocity at a given time, there is no probability involved and it has no defined entropy. It's like thinking of a "fair coin" that has already been tossed and definitely landed heads. That's why statistical mechanics is forced to use the tortuous language of "ensembles" of systems.

    If we want to talk about "fluctuations", we must specify exactly what is fluctuating. Trying to speak of "probability" as a general abstraction is not mathematically coherent. One must specify what events are in the probability space (i.e. "probability" must be the probability of some set of events.) So how would you formulate your question so it has a clear meaning? What probability are you asking about?

    Thermodynamic entropy is not defined for (an ensemble of) gases that are not in equilibrium. So the first problem would be to invent a definition for entropy in non-equilibrium situations.
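
    One way to make the ensemble point concrete (a toy sketch with made-up distributions, not anything specific from this thread): in the Gibbs form S = -k_B * sum_i p_i ln p_i, entropy is a property of a probability distribution over microstates, so a single, definitely-known microstate has zero entropy.

```python
# Gibbs/Shannon entropy of a distribution over microstates:
#   S = -k_B * sum_i p_i * ln(p_i)
# A uniform ensemble over 8 microstates has S = k_B * ln(8); a definitely
# known microstate (all probability on one state) has S = 0.
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def gibbs_entropy(probs):
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

n_states = 8
uniform = [1 / n_states] * n_states        # maximum-uncertainty ensemble
known   = [1.0] + [0.0] * (n_states - 1)   # one definite microstate

print(gibbs_entropy(uniform))  # k_B * ln(8) ~ 2.87e-23 J/K
print(gibbs_entropy(known))    # 0.0
```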
     