If entropy always increases, how can information be conserved?

Summary
The discussion explores the relationship between entropy and information, highlighting the apparent contradiction between the conservation of information and the increase of entropy in isolated systems. It emphasizes that while maximum entropy can occur in states with minimal information, such as a uniformly dispersed gas, the extraction of information from particles leads to an increase in entropy. The conversation touches on Landauer's principle, which states that erasing information requires energy and increases entropy, while reversible operations do not. The complexity of understanding these concepts is acknowledged, with references to Maxwell's Demon and the nuances of information theory. Ultimately, the dialogue underscores that information and entropy are interconnected, representing different aspects of the same fundamental principles.
Battlemage!
Many scientists believe information is conserved, but it seems that in an isolated system entropy isn't. These two things seem incompatible to me. Would anyone care to enlighten me about this? Thank you.
 
The relationship between entropy and information is subtle and complex, and I'm still pretty sure nobody fully understands entropy yet.

Suppose I give you a box of gas and ask you what you think the distribution of the gas is. A logical guess is equally dispersed, right? That would not be a surprising answer...it doesn't carry a lot of information, yet entropy is at a maximum.
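To make the "uniform = maximum entropy" point concrete, here's a minimal Python sketch (my own toy illustration, nothing rigorous): divide the box into a handful of cells and compute the Shannon entropy of where a gas particle might be. The cell count and the "clumped" numbers are arbitrary choices.

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Divide the box into 8 cells and compare two ways the gas could be spread out.
uniform = [1 / 8] * 8  # equally dispersed: the least surprising guess
clumped = [0.65, 0.15, 0.10, 0.05, 0.05, 0.0, 0.0, 0.0]  # gas piled in a corner

print(shannon_entropy(uniform))  # 3.0 bits -- the maximum for 8 cells (log2 8)
print(shannon_entropy(clumped))  # ~1.58 bits -- lower entropy
```

The uniform distribution maximizes the entropy: it is the state that tells you the least about where any particular particle is.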

Now let's put it in a really strong gravitational field: now the "most likely", least-information state would be "clumpy", maybe like the universe...again entropy is at a maximum...so what you observe as "information" and "entropy" are related, but not always in an apparent way.

Measurement extracts information from a particle, and that extraction does not come for free...acquiring or processing that information increases the entropy of the universe. So information goes down and entropy goes up.
This example provides a little insight into what seems like a possible contradiction: Nature attempts to dissipate stored information just as it attempts to increase entropy; the ideas are two sides of the same coin.

But there are some very confusing aspects to this...such as the fact that only erasure, in a computer for example, necessarily costs energy. You can add bits or multiply bits, for example, or negate bits, without increasing energy/entropy, provided the operation is implemented reversibly. Check out Landauer's principle: reversible operations don't have to increase entropy, but irreversible ones (like erasure) do. I'm not sure I REALLY understand that at all...I guess that's why it's a "principle".
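For a sense of scale, Landauer's bound is easy to evaluate numerically. Here's a quick sketch (the constants are standard; the 1 GB figure is just my own illustration):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln(2).
E_bit = k_B * T * math.log(2)
print(f"Minimum energy to erase one bit at {T} K: {E_bit:.3e} J")  # ~2.87e-21 J

# Erasing a gigabyte (8e9 bits) at the Landauer limit:
print(f"Erasing 1 GB at the limit: {E_bit * 8e9:.3e} J")  # ~2.3e-11 J
```

That's many orders of magnitude below what real hardware dissipates per bit, so the bound is a matter of principle rather than engineering practice.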

In general, information, as the answer to some question, should reduce your uncertainty about which among the possible answers is correct. But a source of messages should have HIGH uncertainty; otherwise you'd already know what the message contains, say the bits 1,1,1,1,1,1,1,1,1,1,1,1,1...two sides of the same coin.
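Shannon's source entropy makes this "two sides" remark precise: a source that always emits 1 carries zero bits per symbol, while a fair coin carries one bit. A minimal sketch (my own illustration):

```python
import math

def binary_entropy(p):
    """Entropy in bits/symbol of a source that emits 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # completely predictable source: nothing to learn
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(1.0))  # 0.0    -- the 1,1,1,1,... source above
print(binary_entropy(0.5))  # 1.0    -- fair coin: maximally uncertain source
print(binary_entropy(0.9))  # ~0.469 -- biased source, somewhere in between
```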

One way to gain some insight is to find a reference that discusses, say, dropping marbles into a box...say a pair that are identical, and separately some that are different...
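For the marbles idea, you can count the arrangements directly and see why identical vs. different matters. A toy sketch (my own illustration, counting microstates Boltzmann-style but measuring entropy in bits, log2(W), instead of k ln W):

```python
import math

def microstates(n_marbles, n_cells, distinguishable):
    """Count the arrangements of marbles dropped into the cells of a box."""
    if distinguishable:
        return n_cells ** n_marbles  # each labeled marble independently picks a cell
    # identical marbles: only the occupation numbers matter ("stars and bars")
    return math.comb(n_marbles + n_cells - 1, n_cells - 1)

for dist in (True, False):
    W = microstates(2, 2, dist)
    label = "different" if dist else "identical"
    print(f"2 {label} marbles, 2 cells: W = {W}, log2(W) = {math.log2(W):.3f} bits")
# different:  W = 4, 2.000 bits
# identical:  W = 3, ~1.585 bits
```

Swapping two identical marbles doesn't produce a new arrangement, so the identical pair has fewer microstates and lower entropy than the different pair.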

A good book on all this is Charles Seife's DECODING THE UNIVERSE.

This might be of interest:

Maxwell's Demon: http://en.wikipedia.org/wiki/Maxwell's_demon

(Had physicists stumped for over 100 years)

and
http://en.wikipedia.org/wiki/Reversible_computing
 
