Question about Entropy/Shannon Information

  • Thread starter celal777
In summary, one replier argues that the entropy has increased because of the random action, but doubts that the information has increased.
  • #1
celal777
Hello,

I would like second (and further) opinions on comments I received from a physicist in response to my observations on a certain experimental setup designed to make a particular point about information theory in the context of entropy.

The experimental narrative is found on pages 9 & 10 of the following document:
http://www.colorado.edu/philosophy/vstenger/Found/04Message.pdf [Broken]

A diagram of what is discussed is given at: http://www.colorado.edu/philosophy/vstenger/Found/042Barmag.pdf [Broken]

What follows is the discussion that ensued between "me" and "my respondent":

ME :
1) You started out with a more highly ordered state (magnets in contact N-S & S-N).
2) You opened the window.
3) Magnets scattered, so the magnets are in a more disordered state,
i.e. (entropy is increased = information is decreased).
4) Result: yet you say "information in the system has increased by
one bit" (p. 10), i.e. (entropy is decreased = information is increased).

How is this possible? Why are the conclusions in #3 and #4
different?

The bits at the end of the experiment tell me there is more
information, yet the system is also more disordered, hence there is
more entropy, which by definition means there is less information. The
contradiction between #3 and #4 above remains, doesn't it? What am I
missing here?

MY RESPONDENT:
You are using an intuitive definition of entropy rather than a
quantitative one, assuming the system is less orderly. Since entropy is
negative information, the entropy has decreased and so the system is
more orderly.
==========

My comments and questions for the Physics Help Forum:
-------------------------------------------------------------------------
Why is my respondent calling my interpretation "intuitive"? Does he mean "wrong"?
Is the system's final state really "more orderly" as he says?
Is it possible that his experiment is really not demonstrating what he thinks it demonstrates, either in terms of entropy or information theory?
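
To make the counting concrete, here is a minimal Python sketch of the state-counting as I understand the paper's setup. The probabilities are my own assumption: before the window opens the stuck pair has two equally likely joint orientations, and after scattering each magnet independently has two.

Code:
import math

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumption (mine, not the paper's): stuck pair = 2 equally likely
# joint states; scattered magnets = 4 equally likely joint states.
before = [1/2, 1/2]             # stuck pair: N-up or S-up
after = [1/4, 1/4, 1/4, 1/4]    # scattered: each magnet N-up or S-up

print(shannon_entropy_bits(before))  # 1.0
print(shannon_entropy_bits(after))   # 2.0 -> the "one bit" increase

On that counting the Shannon entropy goes from 1 bit to 2 bits, which seems to be where the paper's "one bit" comes from; but then entropy and information increase together, which is exactly what confuses me.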

Many Thanks in advance for all comments received,

Celal Berker
London, England
 
  • #2
Celal,

I think your respondent is wrong when saying that the entropy has decreased. It has increased because it's the result of random action.

But I doubt that the information has increased with the scattering. Wouldn't there be some way to retrieve the orientation of both magnets when they are on top of each other?

And it is true that, in general, entropy decreases information. It may increase the quantity of data, but not necessarily the information. Noise on transmission cables, for example, will usually corrupt some of the bits. They may still remain as data, but the correct information has decreased, as Shannon's law predicts. In your experiment, suppose the magnets fall in the opposite orientation. Isn't that a corruption of the data, and hence a decrease in information?
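
To put a number on that, here is a rough Python sketch that models the noisy cable as a binary symmetric channel which flips each bit with probability p. This is my own toy model, not anything from the paper; the information that survives per transmitted bit is the channel capacity C = 1 - H2(p).

Code:
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Capacity of a binary symmetric channel with flip probability p:
# the useful information carried per bit of transmitted data.
for p in (0.0, 0.01, 0.1, 0.5):
    print(f"flip prob {p}: {1 - h2(p):.3f} bits of information per data bit")

At p = 0.5 the output is pure noise: the quantity of data is unchanged, but no information gets through.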

Regards
 
  • #4
As I already said, I don't know if 4 bits of information result in the final system. They are random and don't make any sense. The original bits are lost in the event, so there is also a resulting loss from 2 bits to 0 bits of useful information.
I'm not an expert in Information Theory and its formulas, but if they relate entropy to any pattern of bits, I begin to doubt them.
 

1. What is entropy?

Entropy is a measure of the disorder or randomness of a system. In information theory, it is a measure of the uncertainty or unpredictability of a message or data. In thermodynamics, it is a measure of the amount of energy in a system that is unavailable for work.
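
As a minimal illustration of the information-theoretic sense, here is a short Python sketch (the toy distributions are my own choices) showing how entropy tracks unpredictability:

Code:
import math

def entropy_bits(probs):
    # Shannon entropy H(X) = -sum over x of p(x) * log2(p(x))
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # fair coin: 1.0 bit (maximally uncertain)
print(entropy_bits([0.9, 0.1]))  # biased coin: ~0.47 bits (more predictable)
print(entropy_bits([1.0]))       # certain outcome: 0.0 bits (no uncertainty)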

2. How is entropy related to information?

In information theory, entropy is used to quantify the amount of information contained in a message or data. The higher the entropy, the more uncertain or unpredictable the message is, and thus the more information it contains. This is because a message with high entropy requires more bits to represent it, while a message with low entropy can be compressed and represented with fewer bits.
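
A quick demonstration of the compression point, using only Python's standard zlib module: a high-entropy (random) byte string barely compresses, while a low-entropy (repetitive) string of the same length shrinks to almost nothing. The exact sizes vary from run to run; the contrast is what matters.

Code:
import os
import zlib

n = 10_000
high_entropy = os.urandom(n)  # unpredictable bytes
low_entropy = b"A" * n        # one symbol repeated

print(len(zlib.compress(high_entropy)))  # close to n: little redundancy to remove
print(len(zlib.compress(low_entropy)))   # a few dozen bytes: highly redundant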

3. What is the connection between entropy and disorder?

In both information theory and thermodynamics, entropy is closely related to disorder. In information theory, high entropy signifies high randomness or unpredictability, while low entropy means high predictability. In thermodynamics, higher entropy corresponds to a greater number of microscopic arrangements compatible with the same macroscopic state, which is what "disorder" loosely describes.

4. How is Shannon Information related to entropy?

Shannon Information, also known as Shannon Entropy, is a measure of the average amount of information contained in a message. It is derived from entropy and is often used in information theory to quantify the amount of uncertainty or randomness in a message. The higher the Shannon Information, the more uncertain or unpredictable the message is.
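
As a sketch, the formula can be applied to the empirical symbol frequencies of a message; this back-of-the-envelope estimate is one common use of it, not the only one:

Code:
import math
from collections import Counter

def message_entropy_bits(msg):
    """Shannon entropy per symbol, estimated from empirical frequencies."""
    counts = Counter(msg)
    n = len(msg)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(message_entropy_bits("aaaaaaaa"))  # 0.0 bits: fully predictable
print(message_entropy_bits("abababab"))  # 1.0 bit per symbol
print(message_entropy_bits("abcdefgh"))  # 3.0 bits per symbol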

5. What are some real-life applications of entropy and Shannon Information?

Entropy and Shannon Information have various applications in different fields. In computer science and data compression, they are used to optimize data storage and communication. In physics and chemistry, they are used to understand the thermodynamic properties of systems and chemical reactions. In biology, they are used to study genetic information and DNA sequences. In finance, they are used to analyze stock market data. In short, entropy and Shannon Information have widespread applications in fields that deal with information and uncertainty.
