
Maximum Entropy Information System

  1. May 1, 2015 #1
    Imagine a maximum entropy information system: This system would hold meaningful information, not just random noise, but still be of maximum possible entropy in the sense that you could randomly change the order of the smallest bits of information in it without actually changing the overall information it contains. You can toss and dice the information bits as much as you want, the overall information stays the same. Kinda like taking all the words of War and Peace, putting them into a bucket, stirring properly, emptying the bucket on the blank pages and ending up with exactly the same exciting story.

    It is irrelevant what the system is made of; whether it's physical or logical doesn't matter for this thought experiment. It is also irrelevant whether such a system is possible or not, let's just assume it does exist.

    Now, the thing I can't wrap my head around: What would be the implications of such a system? What other properties would the known properties imply? What would be the theoretical requirements for such a system to be possible in the first place? What would the properties of the system tell us about the properties of the information bits it contains? I would suspect there must be symmetries involved?

    Any thoughts on this are very appreciated!
     
  3. May 1, 2015 #2

    anorlunda

    Science Advisor
    Gold Member

    Professor Susskind offered an easy system to think about.

    Consider a system with six possible states. (A die will do.)

    If we know what state it is in, we have maximum knowledge and minimum entropy.

    If we know nothing about the state, we have maximum entropy and zero knowledge.

    Several intermediate values of entropy are possible.
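
    To put rough numbers on that (just a toy illustration of my own, using Shannon's formula H = -sum(p_i * log2(p_i)) over the six faces):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits, skipping zero-probability states."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Maximum knowledge: we are certain the die shows, say, a 4.
certain = [0, 0, 0, 1, 0, 0]
# Partial knowledge: we only know the face is even.
even_only = [0, 1/3, 0, 1/3, 0, 1/3]
# No knowledge: all six faces equally likely.
uniform = [1/6] * 6

print(shannon_entropy(certain))    # 0.0 bits    (minimum entropy)
print(shannon_entropy(even_only))  # ~1.585 bits (an intermediate value)
print(shannon_entropy(uniform))    # ~2.585 bits (maximum entropy, log2(6))
```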

    So, considering a simple die, what about the implications you were asking in the OP?
     
  4. May 1, 2015 #3
    Thanks for your reply anorlunda! I realize I need to rephrase my question:

    Let's assume we have a closed information system s that contains a single piece of information, let's say the sequence {A,B,C}. It is made of information bits, let's call them mBits. They are the smallest bits of information possible; the information they contain cannot be broken down any further. s contains n mBits that, as a whole, hold the information {A,B,C}. The special property of mBits is that the order in which they are stored is irrelevant. You can change their order in any way you want, and the information stored in s will always be {A,B,C}, at least as long as you don't remove or add one. What exactly mBits are is irrelevant for this thought experiment; we just assume they have this property.
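
    To make the "order is irrelevant" property concrete, here is a toy sketch of what I have in mind (the decode function and the labels are made up purely for illustration): a reader that only ever sees *which* mBits are present, as a multiset, decodes every ordering to the same thing.

```python
from collections import Counter
from random import shuffle

# A toy stand-in for the closed system s: its mBits, order irrelevant.
s = ["A", "B", "C"]

def decode(mbits):
    """Read s out as a multiset (bag) of mBits, so any reordering decodes
    to the same result; only adding or removing an mBit would change it."""
    return Counter(mbits)

reference = decode(s)
for _ in range(10):
    shuffled = list(s)
    shuffle(shuffled)
    assert decode(shuffled) == reference   # holds for every permutation of s
```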

    My conclusions are:
    - Because you can change s in any possible way without losing any of the information it contains, s is of maximum entropy
    - If we want to decode the information s contains we need to interpret s in some way. The interpretation inevitably has lower entropy than s.

    Are my conclusions correct? If not, what am I missing? If yes: Is s symmetrical? And if it is, in what way?
     
  5. May 2, 2015 #4

    anorlunda

    Science Advisor
    Gold Member

    http://en.m.wikipedia.org/wiki/Information_theory

    http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6773024

    I don't know what you mean by change s in any possible way.

    The information and entropy contained in a system have nothing to do with decoding or interpretation.

    I suggest that you study information theory a little bit. The first link above takes you to the Wikipedia article on information theory. The second link takes you to Claude Shannon's seminal paper A Mathematical Theory of Communication, which is quite easy to read. From those you will see how entropy is defined in information theory.
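
    As a concrete starting point (a minimal sketch of my own, not taken from Shannon's paper): the entropy of a source is computed from the probabilities of its symbols alone, for instance from their relative frequencies in a message, regardless of how anyone later decodes or interprets it.

```python
from collections import Counter
from math import log2

def per_symbol_entropy(message):
    """Empirical Shannon entropy in bits per symbol, H = -sum(p_i * log2(p_i)),
    where p_i is the relative frequency of symbol i in the message."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(per_symbol_entropy("AAAAAA"))  # 0.0    -- a single repeated symbol, no uncertainty
print(per_symbol_entropy("ABCABC"))  # ~1.585 -- three equally likely symbols
```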
     
  6. May 2, 2015 #5

    anorlunda

    Science Advisor
    Gold Member

  7. May 2, 2015 #6
    Thanks anorlunda! Although I knew about information theory, you made me realize that I still mix up entropy as a measure of disorder with entropy as it is used in information theory.

    I guess, after all, what I'm trying to grasp is the following: How would one describe a system S in terms of entropy [as a measure of disorder] and symmetry, if S does not appear to change to an outside observer when you change the internal order (spatial or logical) of its constituents? S is made up of non-uniform bits which have a high information entropy [as used in information theory].
     
  8. May 2, 2015 #7

    anorlunda

    Science Advisor
    Gold Member

    By symmetry you may be thinking of combinations as opposed to permutations. It makes me think that the field of statistical mechanics may satisfy some of your curiosity, because it may well be related to what you're thinking about. May I suggest the video course on Statistical Mechanics by Leonard Susskind. He is an excellent teacher and the math is not difficult. I'm pretty sure you would enjoy the whole course, and it may answer your questions. Lecture 1 is embedded below. At the end of each lecture, it will give you a link to the next in the series.
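
    As a small taste of how statistical mechanics counts things (my own toy numbers, reusing your {A,B,C} example with two mBits of each kind): all the orderings that look identical from outside form one "macrostate", and the logarithm of their count W measures how much arrangement information is hidden inside.

```python
from math import factorial, log2

# Toy macrostate: six mBits, two each of A, B and C (made-up counts).
counts = {"A": 2, "B": 2, "C": 2}
n = sum(counts.values())

# Number of orderings (microstates) that all present the same multiset:
# the multinomial coefficient W = n! / (n_A! * n_B! * n_C!).
W = factorial(n)
for c in counts.values():
    W //= factorial(c)

print(W)         # 90 indistinguishable orderings
print(log2(W))   # ~6.49 bits of arrangement information hidden from outside
```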

     
  9. May 3, 2015 #8
    Thanks so much for taking your time, anorlunda! You've given me very valuable input and pushed me in the right direction; I had some interesting insights into my pet system. Professor Susskind's lectures will be next.
     