
Entropy reversals

  1. Jun 1, 2012 #1
    I would like to make a stand regarding the topic of entropy reversal. Entropy CAN in fact be reduced in a closed system, and this happens spontaneously according to the fluctuation theorem. It was published in a well-known scientific journal over a decade ago, and it had been established by many notable physicists before that.

    Take for instance the case of a particle suspended in a liquid. The particle will rise and fall in the liquid column, gaining and losing gravitational potential energy, which comes from the thermal energy of Brownian motion. The system will cool when the particle rises and warm up when the particle descends. No matter how large the number of particles, at equilibrium there is an equal chance that the system's entropy will momentarily increase or decrease, i.e. that the system will heat or cool.
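
    Here is a rough sketch of the picture I have in mind (toy numbers, and the Metropolis rule is just a stand-in for the real Brownian dynamics):

    # Toy Metropolis sampler (parameters assumed) for a colloidal particle's
    # height z in gravity, in contact with a bath at ~300 K.  When the particle
    # rises by dz, the bath supplies heat m*g*dz, so the bath's entropy changes
    # by -m*g*dz/T: negative on the way up, positive on the way down.
    import math, random

    kT = 4.1e-21           # thermal energy at ~300 K, joules
    mg = 1.0e-17           # buoyant weight of the particle, newtons (assumed)
    z, step = 1e-4, 1e-5   # height and trial step size, metres

    random.seed(1)
    for i in range(10):
        dz = random.uniform(-step, step)
        if z + dz < 0:
            continue       # keep the particle above the container floor
        # Accept downhill moves always; uphill moves with Boltzmann probability
        if dz <= 0 or random.random() < math.exp(-mg * dz / kT):
            z += dz
            print(f"step {i}: z = {z:.3e} m, bath dS/k = {-mg * dz / kT:+.4f}")

    Every accepted upward move is a step during which the bath's entropy drops.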
     
  3. Jun 2, 2012 #2
    The biggest problem I see is that your definition is flawed.

    If your "system" does not include the particle, then you have 2 open systems--one being the particle, the other being its surrounding/the rest of the liquid. All they're doing is exchanging entropy back and forth.

    If your system includes the particle + the rest of the liquid, then in this closed system the total entropy would not decrease.

    In either case, the entropy of a closed system does not decrease. I doubt any well-known scientific journal or notable physicist would have overlooked such a contradiction in definitions.
     
    Last edited: Jun 2, 2012
  4. Jun 2, 2012 #3
    How about Jean Perrin? The guy who got a Nobel Prize for validating Einstein's work experimentally? Over 100 years ago he used essentially this same example in his book on the molecular nature of heat, decades after Brownian motion was first described by a botanist.

    It does though; that is exactly what the fluctuation theorem says.


    Yes, the particle + the rest of the liquid. The entropy does decrease, because high-entropy energy (the thermal energy of the liquid) is transformed into low-entropy energy (the gravitational potential energy of the particle).
     
  5. Jun 2, 2012 #4
    Before trudging into muddy waters I'd suggest reading Feynman's Lectures on Physics, chapters 44, 45, and 46.
     
  6. Jun 2, 2012 #5

    russ_watters


    Staff: Mentor

    Yes. The 2nd law is statistical in nature and so there can be small, local fluctuations where entropy momentarily decreases.
    http://en.wikipedia.org/wiki/Fluctuation_theorem
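
    A toy count of why those fluctuations stay small and local (numbers are mine): put N non-interacting particles in a box and ask how often they all spontaneously crowd into the left half.

    # Toy estimate: N independent particles, each equally likely to be in the
    # left or right half of a box.  A full spontaneous compression into one
    # half is an entropy drop of N*k*ln(2); its probability dies exponentially.
    import math

    for N in (10, 100, 1000):
        p = 0.5 ** N                     # chance all N are in the left half
        dS = -N * math.log(2)            # entropy change, in units of k
        print(f"N = {N:4d}: P = {p:.3e}, dS/k = {dS:7.1f}")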

    So what? Why do you need to "make a stand" over this? What is your point? It isn't your intention to restart an argument you created 8 months ago, is it?
     
  7. Jun 2, 2012 #6

    Ken G

    Gold Member

    There are even simpler examples that are quite central to thermodynamics. For example, put a two-level atom, with energy difference E, in contact with a thermal reservoir at temperature T. The atom has an e^(E/kT) higher probability of being in the lower level than the upper level, because the reservoir has that same factor more states available if it keeps the energy E rather than giving it up. That is what sets the probability of exciting the atom: the fact that the reservoir has a higher probability of being in a state of higher entropy. That's really all the second law ever meant.
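
    For concreteness, here is that ratio with some assumed numbers (a splitting of roughly an eighth of an eV, room temperature):

    # Quick check of the population ratio quoted above (values assumed for
    # illustration): a two-level atom with splitting E in contact with a
    # reservoir at temperature T has P(lower)/P(upper) = exp(E/kT).
    import math

    k = 1.380649e-23       # Boltzmann constant, J/K
    E = 2.0e-20            # level splitting, J (assumed, ~0.125 eV)
    T = 300.0              # reservoir temperature, K

    ratio = math.exp(E / (k * T))          # P(lower) / P(upper)
    p_upper = 1.0 / (1.0 + ratio)          # normalized occupation of upper level
    print(f"P(lower)/P(upper) = {ratio:.1f}, P(upper) = {p_upper:.4f}")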
     
  8. Jun 2, 2012 #7
    A friend of mine always argued that "we" are evidence of a significant decrease in entropy. The fact that humans, animals, plants, etc. can arise from the universe seemingly spontaneously, while the universe around us is essentially empty, can be interpreted this way (we are composed of very organized structures).

    Another notion (not mine) along these lines is that if the entropy in one part of the universe decreases (say, on Earth), the entropy in another part must increase.
    This argument is even more controversial, as the implication is that the total entropy of the universe is somewhat constant, and when entropy increases in one part of the universe, it must decrease in another. That would mean there is much more life out there than most of us think.

    Interesting notions.

    Cheers,
     
  9. Jun 2, 2012 #8


    Here is an example of the fluctuation theorem: two otherwise isolated bodies are in contact, one warmer than the other. There is some probability, depending on how long the contact lasts, that energy will flow from the cooler side to the warmer side. In other words, the closed system transitions from a more entropic distribution of energy over its microstates into a less entropic one.
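
    To make that countable, take a toy model of my own choosing, two small Einstein solids sharing 20 quanta: the even split has the most microstates, but lopsided, lower-entropy splits keep a finite probability.

    # Two small Einstein solids share q = 20 energy quanta.  The multiplicity
    # of one solid with N oscillators and q quanta is C(q + N - 1, q).  The
    # most probable split maximizes total multiplicity, but less entropic
    # splits -- energy piled up on one side -- still have finite probability.
    from math import comb, log

    N_A, N_B, q = 5, 5, 20

    weights = [comb(qa + N_A - 1, qa) * comb((q - qa) + N_B - 1, q - qa)
               for qa in range(q + 1)]
    Z = sum(weights)

    for qa in (10, 14, 18):              # equal split vs. lopsided splits
        p = weights[qa] / Z
        S = log(weights[qa])             # entropy of the split, in units of k
        print(f"q_A = {qa:2d}: P = {p:.4f}, S/k = {S:.2f}")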
     
    Last edited: Jun 2, 2012
  10. Jun 2, 2012 #9

    Ken G

    Gold Member

    Your friend is making the single most common error in thermodynamics: failing to account for the entire system. They do not understand entropy. It is completely consistent with thermodynamics that ordered systems emerge spontaneously; all that is required is that the system be in contact with other systems that become very highly disordered. This principle plays out constantly, even in applications that have nothing at all to do with life or humans.

    I'll give you a simple example: rolling a loaded die. A loaded die has a weight in it, so it tends to come up the same way every time, because that lets the weight sit lower (releasing gravitational energy as heat into the environment). Coming up the same way every time is a highly ordered, low-entropy result, but it happens spontaneously (if, say, an earthquake shook the die) because the rest of the system gains entropy (in the form of heat). This has nothing to do with life; it holds on a lifeless planet as much as it does on ours, but life is a kind of analogous equivalent of the "loaded die."
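
    A little Monte Carlo of that picture, with made-up energy penalties: faces that leave the internal weight sitting high cost a few kT, so the outcomes are biased and their Shannon entropy falls well below a fair die's.

    # Toy loaded die: each shake relaxes toward the weighted face with a
    # Boltzmann-like bias, so the outcome distribution is low-entropy even
    # though the process is entirely thermal.  Energy costs are assumed.
    import math, random

    random.seed(0)
    faces = range(1, 7)
    # Assumed energy cost (in units of kT) of landing each face; face 6
    # lets the internal weight sit lowest, so it costs nothing.
    energy = {1: 3.0, 2: 3.0, 3: 3.0, 4: 3.0, 5: 3.0, 6: 0.0}
    boltz = {f: math.exp(-energy[f]) for f in faces}
    Z = sum(boltz.values())

    counts = {f: 0 for f in faces}
    for _ in range(10000):
        r, acc = random.random() * Z, 0.0
        for f in faces:                  # sample a face with Boltzmann weight
            acc += boltz[f]
            if r <= acc:
                counts[f] += 1
                break

    probs = [counts[f] / 10000 for f in faces]
    H = -sum(p * math.log2(p) for p in probs if p > 0)   # Shannon entropy, bits
    print(f"counts: {counts}, outcome entropy = {H:.2f} bits (fair die: 2.58)")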
     
  11. Jun 3, 2012 #10
    Unfortunately I do not own the Feynman Lectures. Does this by any chance have to do with Feynman's ratchet? If I understand right, Feynman analyzed a Brownian ratchet and then showed that it cannot extract net work, because the pawl that is supposed to block reverse motion jiggles thermally itself and is inherently too 'fuzzy' to hold the wheel.

    It seems that a lot of people are not aware of the fluctuation theorem, which is a shame. It is always useful to be able to contradict a big law of physics with concrete evidence. :)

    I am not sure I understand. I know a little about energy levels in atoms, and it is true that lower energy levels are inherently more populated than higher ones. The equation you give shows the probability of the atom being in either energy state. What do you mean by 'the reservoir has that same factor more states available if it keeps the energy E rather than giving it up'?

    Yes. It doesn't happen often, but it does now and then. The question is whether or not the system can be monitored and stopped at the lower entropy state.
     
  12. Jun 3, 2012 #11
    Yes, but monitoring the system would require a machine that gathers vast amounts of data. That data would eventually need to be deleted, and deleting it generates heat, restoring the entropy balance. Thus your argument is flawed: all computing machines generate heat when erasing bits of data, so entropy still increases overall.
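
    For the record, the going rate here is Landauer's bound, which is tiny but strictly nonzero; a quick computation at room temperature:

    # Landauer's bound: erasing one bit at temperature T dissipates at least
    # k*T*ln(2) of heat into the surroundings, raising their entropy by at
    # least k*ln(2).  That is what pays the entropy debt the measurements
    # would otherwise leave unpaid.
    import math

    k = 1.380649e-23                 # Boltzmann constant, J/K
    T = 300.0                        # room temperature, K
    E_bit = k * T * math.log(2)      # minimum heat per erased bit

    print(f"Minimum dissipation per erased bit at {T:.0f} K: {E_bit:.2e} J")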

    What you are describing is some type of perpetual motion system.
     
  13. Jun 3, 2012 #12
    Entropy of information has been used to 'defeat' Maxwell's demon, but I think it is unimaginative to consider digital storage as the only method of monitoring. Comparisons can be made and actions taken without storing any information at all, using physical systems. For instance, if we assume that the gas Maxwell's demon is sorting is an ionic gas/plasma, we know the particles will exert local electric and magnetic fields. These could be used in conjunction with a nano-diode that would bias the passage of ions in one direction only, without ever storing a bit of information!
     
  14. Jun 3, 2012 #13
    Hmmm.

    - D.J. Evans & D.J. Searles (2002), "The Fluctuation Theorem," Advances in Physics, Vol. 51, No. 7, pp. 1529–1585.

    I would say that the fluctuation theorem clarifies the second law of thermodynamics, rather than contradicts it. Even Denis Evans - you know, that Denis Evans - has this gem on his professional page:

    :cool:
     
  15. Jun 3, 2012 #14

    Ken G

    Gold Member

    That last statement is the reason the atom is more likely to be in the lower level. It is purely a matter of counting states in the reservoir, and noting by what factor that number is lowered if the energy E leaves the reservoir to excite the atom. This is the meaning of temperature, in fact. My point is that the atom has only one state either way, so its entropy is zero either way, but the reservoir has higher entropy if it keeps the energy E -- yet it will sometimes be found to give up that energy. This is because the second law is only a general rule of thumb that graduates to a "law" only when you are contrasting vastly different numbers of states, not just the e^(E/kT) ratio that appears when you remove a tiny energy E from the reservoir. In the latter type of situation, i.e. using a vast reservoir to excite a single atom, no one in their right mind would expect the second law to apply. So we have to understand that law in order to use it properly.
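
    That counting claim can be checked numerically with a finite stand-in reservoir (an Einstein solid; the model choice is mine): remove one quantum and compare the drop in the number of microstates with the Boltzmann factor.

    # An Einstein solid with N oscillators and q energy quanta has
    # Omega = C(q+N-1, q) microstates.  Giving up one quantum of energy E
    # cuts Omega by roughly the Boltzmann factor exp(E/kT), with temperature
    # defined by 1/kT = dS/dE = ln(1 + N/q) per quantum.
    from math import comb, log, exp

    N, q = 1000, 5000                # reservoir size and energy (assumed)

    ratio = comb(q + N - 1, q) / comb(q + N - 2, q - 1)  # Omega(q)/Omega(q-1)
    kT = 1.0 / log(1 + N / q)                            # in units of one quantum
    print(f"microstate ratio:  {ratio:.5f}")
    print(f"exp(E/kT):         {exp(1.0 / kT):.5f}")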
    But this is all very elementary, people who really understand thermodynamics are way past this kind of issue. It does bear repeating from time to time, to be sure, but it sounds like you are overinterpreting it, and putting too much stock into a relatively trivial application of what statistical mechanics is all about.
     
  16. Jun 3, 2012 #15

    Andy Resnick

    Science Advisor
    Education Advisor

    AFAIK, it cannot. Measurement and transmission of information is associated with a flow of entropy (or negentropy if you prefer). See, for example, "Maxwell's demon" and the solution to the Gibbs paradox.
     
  17. Jun 5, 2012 #16
    Yes!

    I know that his experiment (the one with a bead in a laser beam) doesn't show an overall decrease in entropy, but rather momentary decreases in the entropy of a macro-scale system. I don't think an overall decrease is impossible, though, and I have a hunch that he could prove that as well if he really wanted to, although it might not get published.

    Okay, I think I see what you mean. That is exactly the point: on small time and energy scales there can be reversals. It is much like a liquid and the partial pressure of its gas phase. That pressure is only constant when evaporation and condensation are in equilibrium, so plenty of small-scale 'fluctuations' still occur whenever more molecules simultaneously evaporate than condense. It just doesn't affect the macro-scale properties of the gas.

    Still, no one has made a machine that sucks up thermal energy and creates electricity. It seems that the fluctuation theorem would lend credence to its feasibility. Wouldn't you agree?

    I just don't buy into the 'entropy of information' idea. Sure, it is a real concept when working with digital systems. But consider a scale tipping: the scale does not need any information about what it is holding in order to tip one way or the other. It simply responds physically to the forces exerted by gravity. This looks like a system that operates without any information transfer, unless I do not wholly understand the concept.
     
  18. Jun 5, 2012 #17
    To clarify, it's more like a part of the cool side may be warmer than a part of the warm side; hence kinetic energy flows from that part of the cool side that's actually warmer to that part of the warm side that's actually cooler. This only applies to a part of the system. If considering the entire system or any system in entirety, kinetic energy does not flow from cooler to warmer.

    It may still be possible for entropy to decrease, but it doesn't last long. Most of the time, entropy is rising until it plateaus at a maximum. An alternative scenario may be 0 entropy from beginning to end. Anyways, my original post was confusing and not 100% correct.
     
  19. Jun 5, 2012 #18
    Entropy is the fulfillment/usage/occupation of degrees of freedom. It literally is information.

    A balance tips until it exhausts information (minimizes potential energy, maximizes kinetic energy), at which point it no longer works like a balance should and starts to oscillate microscopically around equilibrium. This is entropy taking its course. Whatever information you gain from the balance, the balance and its surroundings must lose. Same with a spring scale, in which case potential and kinetic energy keep converting into one another. As long as the scale is working, it must release entropy one way or another. The minute it can no longer release entropy, it will get stuck, and its reading will no longer be updated.

    You might say, well if time were to stand still, then I still have some information. But the discussion is moot if time were to stand still, because stopping time by definition stops everything. So yeah, if time were to stop, then physics might be different.
     
    Last edited: Jun 5, 2012
  20. Jun 5, 2012 #19

    Andy Resnick

    Science Advisor
    Education Advisor

    You don't have to- science does not conform to your opinion.
     
  21. Jun 5, 2012 #20
    Yes, that is correct in just about all cases. But once at the plateau there is an equal probability of momentary entropy increase and decrease.

    Are degrees of freedom literally information? I do not see it that way. Degrees of freedom count the number of possible configurations of a system. Information is harder to define.
    I say information is a man-made concept used to describe man-made objects that store knowledge. To equate a man-made concept with a physical quantity, such as entropy, is surely false.

    Even if entropy does increase once the balance tips, no information is required for it to 'decide' which way to tip. If we have two objects of equal weight on opposite sides of the balance, it will 'know' not to tip without expending any entropy.


    Entropy of information is a part of information theory, not physical science. Please take a look at the wiki:

    I REALLY don't think physicists or chemists would be measuring standard entropy based on human experiments.



    I declare that there are long-lived, macro-scale exceptions to the second law of thermodynamics.
     