
If entropy always increases, how can information be conserved?

  1. Jul 3, 2011 #1
    Many scientists believe information is conserved, but it seems that in an isolated system entropy isn't. These two things seem incompatible to me. Would anyone care to enlighten me about this? Thank you.
     
  3. Jul 3, 2011 #2
    The relationship between entropy and information is subtle and complex. And I'm still pretty sure nobody exactly understands entropy yet.

    Suppose I give you a box of gas and ask you what you think the distribution of the gas is. A logical guess is that it's equally dispersed, right? That would not be a surprising answer...it doesn't carry a lot of information, yet entropy is at maximum.

    Now let's put it in a really strong gravitational field: now the "most likely", least-information state would be "clumpy", maybe like the universe....again entropy is maximum....so what you observe as "information" and "entropy" are related, but not always in an apparent way.
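    To make the "equally dispersed means maximum entropy" point concrete, here's a small sketch (my own toy example, not from the thread) that computes the Shannon entropy of a few made-up occupancy distributions for a box divided into four cells:

    ```python
    import math

    def shannon_entropy(probs):
        """Shannon entropy in bits: H = -sum(p * log2(p)), with 0*log(0) = 0."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Gas equally dispersed over 4 cells: the least surprising answer,
    # and the maximum-entropy distribution for 4 states.
    uniform = [0.25, 0.25, 0.25, 0.25]

    # Gas clumped mostly into one cell: learning this tells you a lot,
    # and the distribution's entropy is well below the maximum.
    clumped = [0.97, 0.01, 0.01, 0.01]

    print(shannon_entropy(uniform))  # 2.0 (bits, the maximum for 4 states)
    print(shannon_entropy(clumped))  # well below 2.0
    ```

    The uniform distribution always maximizes Shannon entropy over a fixed set of states, which is the sense in which "equally dispersed" and "maximum entropy" line up.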

    Measurement is an extraction of information from a particle, and it does not come for free: extracting or processing that information increases the entropy of the universe. So the stored information goes down and the entropy goes up.
    This example provides a little insight into what seems like a possible contradiction: Nature attempts to dissipate stored information just as it attempts to increase entropy; the ideas are two sides of the same coin.

    But there are some very confusing aspects to this...such as the fact that only erasure, in a computer for example, costs energy...you can add bits or multiply bits, for example, or negate bits, without increasing energy/entropy. Check out Landauer's principle. Reversible operations don't increase entropy; irreversible ones (like erasure) do. I'm not sure I REALLY understand that at all...I guess that's why it's a "principle".
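    Landauer's principle puts an actual number on the cost of erasure: erasing one bit must dissipate at least k_B * T * ln(2) of heat. A quick back-of-the-envelope calculation (room temperature assumed to be 300 K):

    ```python
    import math

    K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in SI since 2019)

    def landauer_limit(temperature_kelvin):
        """Minimum heat dissipated by erasing one bit: k_B * T * ln(2)."""
        return K_B * temperature_kelvin * math.log(2)

    # At room temperature (~300 K), erasing a single bit must dissipate at least:
    print(landauer_limit(300.0))  # ~2.87e-21 joules
    ```

    That's a tiny amount of energy per bit, which is why it was only measured experimentally quite recently, but it is strictly greater than zero — and reversible operations have no such floor.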

    In general, information as the answer to some question should reduce your uncertainty about which among possible answers is correct. But a source of a message should have HIGH uncertainty, otherwise you'll know what the message contains, say bits 1,1,1,1,1,1,1,1,1,1,1,1,1...........two sides of the same coin.
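    The "source should have HIGH uncertainty" point can be seen by estimating the per-symbol entropy of two hypothetical bit streams (my own illustration): a constant stream like the 1,1,1,1,... example above carries zero information per bit, while a balanced stream carries a full bit per symbol.

    ```python
    import math
    from collections import Counter

    def empirical_entropy(bits):
        """Per-symbol Shannon entropy (in bits) estimated from symbol frequencies."""
        counts = Counter(bits)
        n = len(bits)
        return sum((c / n) * math.log2(n / c) for c in counts.values())

    constant = "1" * 16   # you already know every bit: zero entropy per symbol
    balanced = "10" * 8   # half ones, half zeros: one full bit per symbol

    print(empirical_entropy(constant))  # 0.0
    print(empirical_entropy(balanced))  # 1.0
    ```

    A source you can predict perfectly tells you nothing; maximum uncertainty at the source means maximum information per received symbol.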

    One way to gain some insight is to find a reference that discusses, say, dropping marbles into a box: first a pair that are identical, and then some that are distinguishable....

    A good book on all this is Charles Seife's DECODING THE UNIVERSE.

    This might be of interest:

    Maxwell's demon: http://en.wikipedia.org/wiki/Maxwell's_demon

    (Had physicists stumped for over 100 years)

    and
    http://en.wikipedia.org/wiki/Reversible_computing
     