If entropy always increases, how can information be conserved?

In summary, the concepts of entropy and information are related, but not always in an obvious way. While entropy is often associated with disorder and randomness, information can also be seen as a measure of uncertainty. Extracting and processing information increases the entropy of the universe, but reversible operations do not. The relationship between information and entropy is subtle and has been debated in physics for many years; Landauer's principle and Maxwell's demon are two famous examples that illustrate it.
  • #1
Battlemage!
Many scientists believe information is conserved, but it seems that in an isolated system entropy isn't. These two things seem incompatible to me. Would anyone care to enlighten me about this? Thank you.
 
  • #2
The relationship between entropy and information is subtle and complex. And I'm still pretty sure nobody fully understands entropy yet.

Suppose I give you a box of gas and ask you what you think the distribution of the gas is. A logical guess is that it's spread out evenly, right? That would not be a surprising answer... it doesn't carry much information, yet the entropy is at a maximum.

Now let's put it in a really strong gravitational field: now the "most likely", least-informative state would be "clumpy", maybe like the universe... and again the entropy is at a maximum. So what you observe as "information" and "entropy" are related, but not always in an apparent way.
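
A minimal sketch of the first point, in Python with a made-up eight-cell box (the cell count and the "clumpy" probabilities are illustrative assumptions, not from the thread): the flat, evenly dispersed distribution is exactly the one that maximizes the Shannon entropy.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

uniform = [1 / 8] * 8            # gas spread evenly over 8 cells
clumpy = [0.79] + [0.03] * 7     # most of the gas piled into one cell

print(shannon_entropy(uniform))  # 3.0 bits -- the maximum for 8 cells
print(shannon_entropy(clumpy))   # ~1.3 bits -- well below the maximum
```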

Measurement is an extraction of information from a particle, and it does not come for free: extracting or processing that information increases the entropy of the universe. So information goes down and entropy goes up.
This example provides a little insight into what seems like a possible contradiction: nature tends to dissipate stored information just as it tends to increase entropy; the ideas are two sides of the same coin.

But there are some very confusing aspects to this... such as the fact that only erasure, in a computer for example, costs energy: you can add bits, multiply bits, or negate bits without increasing energy/entropy. Check out Landauer's principle. Reversible operations don't increase entropy; irreversible ones, like erasure, do. I'm not sure I REALLY understand that at all... I guess that's why it's a "principle".
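
For a sense of the scale Landauer's principle sets, here is a quick back-of-the-envelope calculation (the room-temperature value of 300 K is my assumption): erasing one bit must dissipate at least k_B·T·ln 2 of heat.

```python
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
T = 300.0                 # assumed room temperature, K

# Landauer bound: minimum heat dissipated per erased bit
E_per_bit = k_B * T * math.log(2)

print(f"{E_per_bit:.2e} J per erased bit")       # ~2.87e-21 J
print(f"{E_per_bit * 8e9:.2e} J to erase 1 GB")  # ~2.3e-11 J at the bound
```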

In general, information, as the answer to some question, should reduce your uncertainty about which of the possible answers is correct. But a source of a message should have HIGH uncertainty, otherwise you'd already know what the message contains, say bits 1,1,1,1,1,1,1,1,1,1,1,1,1... again, two sides of the same coin.
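
The same Shannon measure makes that point concrete (the sample sequences below are made up for illustration): a message you can predict in advance carries zero bits per symbol, while a fair-coin-like stream carries the maximum one bit per symbol.

```python
from collections import Counter
import math

def entropy_per_symbol(message):
    """Empirical Shannon entropy, in bits per symbol, of a sequence."""
    n = len(message)
    counts = Counter(message)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

boring = [1] * 16                                         # 1,1,1,1,... no surprise at all
coin = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 0, 1, 0]  # looks like fair coin flips

print(entropy_per_symbol(boring))  # 0.0 bits/symbol
print(entropy_per_symbol(coin))    # 1.0 bits/symbol
```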

One way to gain some insight is to find a reference that discusses, say, dropping marbles in a box: a pair that are identical, and separately some that are distinguishable.

A good book on all this is Charles Seife's Decoding the Universe.

This might be of interest:

Maxwell's demon: http://en.wikipedia.org/wiki/Maxwell's_demon

(It had physicists stumped for over 100 years.)

and
http://en.wikipedia.org/wiki/Reversible_computing
 

1. How can entropy increase if information is conserved?

Entropy is a measure of disorder or randomness in a system, while information (in the conservation sense) refers to the microscopic details that specify the system's exact state. When entropy increases, less of the system's energy remains usable for work, but that does not mean the underlying information is destroyed: it becomes spread into fine-grained correlations and microscopic degrees of freedom that are hard to access in practice, rather than being erased.

2. Doesn't the second law of thermodynamics contradict the conservation of information?

The second law of thermodynamics states that the entropy of an isolated system tends to increase over time, because energy naturally disperses into less organized forms. An open system, which exchanges energy and matter with its surroundings, can even see its local entropy decrease. In either case, what grows is the coarse-grained thermodynamic entropy, a measure of how little microscopic detail remains practically accessible, so the second law does not contradict the conservation of the underlying microscopic information.
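
In symbols (the standard textbook statements, added here for reference): the total entropy of an isolated system never decreases, while a system exchanging heat with surroundings at temperature $T$ obeys the Clausius inequality,

$$\Delta S_{\text{isolated}} \ge 0, \qquad dS \ge \frac{\delta Q}{T},$$

so a subsystem's entropy can fall provided the entropy of its surroundings rises at least as much.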

3. How does the concept of negentropy relate to the conservation of information?

Negentropy, or negative entropy, refers to the creation of local order, i.e., a reduction of entropy within a system. Living organisms are the classic example: they use a steady supply of low-entropy energy to maintain order and complexity, and they export entropy (as heat and waste) to their surroundings. Local order, and the information it embodies, can therefore be maintained even while the total entropy of system plus surroundings keeps increasing.

4. Can information be destroyed as entropy increases?

In an isolated system whose dynamics are reversible (unitary in quantum mechanics, Hamiltonian in classical mechanics), the microscopic information specifying the exact state is not destroyed: in principle, the earlier state could be reconstructed from the later one. What increases is the thermodynamic entropy, which describes the coarse-grained macroscopic state; the microscopic information is not lost, it just becomes scrambled into details we can no longer track in practice.

5. How does the concept of entropy apply to information theory?

In information theory, entropy measures the uncertainty of a source, equivalently, the average amount of information gained per symbol when the source's output is learned. A high-entropy source delivers more information per message, not less, and receiving the message resolves that uncertainty without destroying anything. In this sense, information-theoretic entropy quantifies information rather than consuming it, and it connects to thermodynamic entropy through the statistics of microstates.
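
The two uses of the word line up through essentially the same formula (standard definitions, added here for reference): Shannon's entropy of a probability distribution and the Gibbs entropy of statistical mechanics differ only by Boltzmann's constant and the base of the logarithm,

$$H(X) = -\sum_i p_i \log_2 p_i \ \text{(bits)}, \qquad S = -k_B \sum_i p_i \ln p_i \ \text{(J/K)}.$$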
