A naive question about entropy.

  1. Feb 17, 2008 #1
    I've constantly heard professors answer questions about entropy with a statement along the lines of: even if the entropy of a system is decreasing, the entropy of the universe is increasing. How can we discuss the entropy of the universe if information can't be communicated between all points in space at the same time? Or rather, how does relativity mix with entropy, if at all?

    edit: I just realized that this might have more to do with relativity than with thermodynamics. Moderators feel free to move this post.
    Last edited: Feb 17, 2008
  2. Feb 17, 2008 #2

    Claude Bile

    Science Advisor

    You can make arguments based on the homogeneity (and isotropy) of the universe: if a law holds at one point, it must hold at every other point. In other words, the behaviour of a thermodynamic system shouldn't depend on where in the universe the system happens to be sitting.

  3. Feb 17, 2008 #3
    I always understood entropy as a measure of the number of ways a system can be arranged. Since the universe is expanding, there are constantly more ways in which things can be arranged, so its entropy must be increasing.
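
    For reference, the counting picture above is Boltzmann's statistical definition of entropy (a standard textbook formula, not something introduced in this thread): with k_B Boltzmann's constant and Omega the number of microstates compatible with the macrostate,

```latex
% Boltzmann's statistical entropy.
% k_B   : Boltzmann's constant
% Omega : number of microstates compatible with the given macrostate
\[
  S = k_B \ln \Omega
\]
```

    More accessible arrangements (larger Omega) means larger S; whether expansion increases the entropy that is actually realised, or only the maximum entropy that is available, is a subtler cosmological question.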
  4. Feb 17, 2008 #4


    Staff: Mentor

    I have generally heard it expressed that the entropy of any closed system never decreases. I guess the extension to the universe just comes from treating the universe as a whole as one big closed system.
  5. Feb 18, 2008 #5


    Staff Emeritus
    Science Advisor
    Gold Member

    You should take this statement as less grand than it is made to sound. What is indeed considered a true law is that the entropy of a sufficiently closed system never decreases. So if you have a subsystem in which entropy is decreasing, that means it is not closed enough: it is interacting with some part of its neighbourhood (for instance, exchanging heat or matter with it). So you now have to include that neighbourhood in the system. If you keep enlarging the system this way, then, for the process at hand, you will sooner or later reach a point where it is big enough that no interaction relevant to that process still makes it "open".

    A silly example: consider water in a bucket on a cold day. It freezes. The entropy of the water has decreased! OK, but you realise that your bucket is not "closed": the water lost some heat to the soil and the air. So you now include a chunk of soil and air. As far as the cooling bucket is concerned, that will do. The soil will heat up a bit, the air will heat up a bit, and their increase in entropy will be larger than the decrease in entropy of the water.
    But of course, that soil and air are exposed to other processes (the Earth's atmosphere, the Sun, ...), so it might be that their entropy is also decreasing overall. That, however, no longer has anything to do with the bucket. So for the process of freezing water, it is sufficient to include just the soil and air of the immediate neighbourhood; in full generality, you would have to include the Earth's atmosphere, the Earth, the Sun, the solar system, the galaxy, ....
    Only, from a certain point onwards, you realise that those larger processes no longer have anything to do with the freezing bucket. That is what is meant by a sufficiently closed system.
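
    A rough numerical sketch of the bucket example (in Python, with illustrative numbers that are my own assumptions rather than anything from this thread: 1 kg of water freezing at its normal freezing point, the soil and air treated as one cold reservoir at 263 K, and the textbook latent heat of fusion of about 334 kJ/kg):

```python
# Entropy bookkeeping for the freezing-bucket example.
# Assumed illustrative values: 1 kg of water freezing at its normal
# freezing point, surroundings (soil + air) modelled as a single
# reservoir at a slightly lower temperature.

m = 1.0            # kg of water (assumption)
L_f = 334e3        # J/kg, latent heat of fusion of water
T_water = 273.15   # K, temperature at which the water gives off heat
T_surr = 263.15    # K, temperature of the cold surroundings (assumption)

Q = m * L_f                    # heat released by the water as it freezes

dS_water = -Q / T_water        # water's entropy drops: heat leaves at 273 K
dS_surr = +Q / T_surr          # surroundings' entropy rises: same heat enters at 263 K
dS_total = dS_water + dS_surr  # "water + immediate neighbourhood" system

print(f"dS_water = {dS_water:+.0f} J/K")
print(f"dS_surr  = {dS_surr:+.0f} J/K")
print(f"dS_total = {dS_total:+.0f} J/K (positive, as the second law requires)")
```

    The gain of the surroundings outweighs the loss of the water precisely because the same amount of heat Q leaves the water at 273 K but arrives in the surroundings at the lower temperature of 263 K, and Q/263 > Q/273.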
  6. Feb 18, 2008 #6


    Science Advisor
    Homework Helper
    Gold Member

    To answer your question: entropy obeys a local balance law (it can be produced locally, but never destroyed), so no superluminal transmission of information is needed. Entropy is created wherever a flux of energy, matter, momentum, etc. arises in response to a gradient, and these fluxes always propagate at or slower than the speed of light.
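
    One standard way to write down this locality (the notation below is the usual non-equilibrium thermodynamics convention, added here for illustration rather than taken from the thread): with s the entropy per unit volume, J_s the entropy flux, and sigma the local entropy production rate,

```latex
% Local entropy balance: entropy can flow (J_s) and be produced (sigma >= 0),
% but it is never destroyed at any point.
\[
  \frac{\partial s}{\partial t} + \nabla \cdot \mathbf{J}_s = \sigma,
  \qquad \sigma \ge 0 .
\]
% Example: pure heat conduction with heat flux J_q gives
\[
  \sigma = \mathbf{J}_q \cdot \nabla\!\left(\frac{1}{T}\right) \ge 0 ,
\]
% non-negative because heat flows from hot to cold, i.e. J_q points
% along grad(1/T).
```

    Integrating sigma over any region gives a non-negative entropy production there, so the global statement about "the entropy of the universe" is just the sum of these local, causal contributions; nothing has to be communicated faster than light.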