Maximum Entropy Information System

In summary, a maximum entropy information system would be a physical or logical system that contains meaningful information while still being at maximum possible entropy. It would not matter what the system is made of, and the properties of the system would tell us about the properties of the information bits it contains.
  • #1
jktales
Imagine a maximum entropy information system: This system would hold meaningful information, not just random noise, but still be of maximum possible entropy in the sense that you could randomly change the order of the smallest bits of information in it without actually changing the overall information it contains. You can toss and dice the information bits as much as you want, the overall information stays the same. Kinda like taking all the words of War and Peace, putting them into a bucket, stirring properly, emptying the bucket on the blank pages and ending up with exactly the same exciting story.

It is irrelevant what the system is made of, if it's physical or logical doesn't matter for this thought experiment. It is also irrelevant whether such a system is possible or not, let's just assume it does exist.

Now, the thing I can't wrap my head around: What would be the implications of such a system? What other properties would the known properties imply? What would be the theoretical requirements for such a system to be possible in the first place? What would the properties of the system tell us about the properties of the information bits it contains? I would suspect there must be symmetries?

Any thoughts on this very appreciated!
 
  • #2
Professor Susskind offered an easy to think about system.

Consider a system with six possible states. (A die will do.)

If we know what state it is in, we have maximum knowledge and minimum entropy.

If we know nothing about the state, we have maximum entropy and zero knowledge.

Several intermediate values of entropy are possible.

So, considering a simple die, what about the implications you were asking in the OP?
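The three levels of knowledge above can be checked with a quick Shannon-entropy calculation. A minimal stdlib-only sketch; the "even face" case is an illustrative intermediate state of my own choosing, not from the post:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero terms."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Known state: probability 1 on one face -> minimum entropy, 0 bits
known = [1, 0, 0, 0, 0, 0]
# Complete ignorance: uniform over six faces -> maximum entropy, log2(6) bits
uniform = [1/6] * 6
# Partial knowledge: we only know the face is even -> an intermediate value
partial = [0, 1/3, 0, 1/3, 0, 1/3]

print(shannon_entropy(known))    # 0.0
print(shannon_entropy(uniform))  # log2(6), about 2.585
print(shannon_entropy(partial))  # log2(3), about 1.585
```

Every intermediate value between 0 and log2(6) bits is reachable by some distribution over the six faces.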
 
  • #3
Thanks for your reply anorlunda! I realize I need to rephrase my question:

Let's assume we have a closed information system s that contains a single piece of information, say the sequence {A,B,C}. It is made of information bits, let's call them mBits. They are the smallest possible units of information; what they contain cannot be broken down further. s contains n mBits that, as a whole, hold the information {A,B,C}. The special property of mBits is that the order in which they are stored is irrelevant: you can rearrange them in any way you want, and the information stored in s will always be {A,B,C}, at least as long as you don't remove or add one. What exactly mBits are is irrelevant for this thought experiment; we just assume they have this property.

My conclusions are:
- Because you can change s in any possible way without losing any of the information it contains, s is of maximum entropy
- If we want to decode the information s contains we need to interpret s in some way. The interpretation inevitably has lower entropy than s.

Are my conclusions correct? If not, what am I missing? If yes: Is s symmetrical? And if it is, in what way?
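One way to make the order-independence concrete is to model s as a multiset of mBits: the decoded content is the multiset itself, so every permutation decodes identically. A toy sketch; the `decode` function and bit labels are made up for illustration:

```python
import random
from collections import Counter

def decode(mbits):
    """Hypothetical interpretation of s: read off the multiset of mBits,
    ignoring their order entirely."""
    return Counter(mbits)

s = ["A", "B", "C"]        # one internal ordering of the mBits
shuffled = s[:]
random.shuffle(shuffled)   # "toss and dice" the bits

# Any permutation carries the same information {A, B, C}
assert decode(s) == decode(shuffled)
```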
 
  • #4
http://en.m.wikipedia.org/wiki/Information_theory

http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6773024

I don't know what you mean by "change s in any possible way."

The information and entropy contained in a system have nothing to do with decoding or interpretation.

I suggest that you study information theory a little bit. The first link above takes you to the Wikipedia article on information theory. The second link takes you to Claude Shannon's seminal paper A Mathematical Theory of Communication, which is quite easy to read. From those you will see how entropy is defined in information theory.
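As a small illustration of Shannon's definition, the entropy of a source estimated from symbol frequencies depends only on those frequencies, so the estimate itself is permutation-invariant. A stdlib-only sketch; the sample string is arbitrary:

```python
import math
import random
from collections import Counter

def empirical_entropy(text):
    """Per-symbol Shannon entropy (bits) from a string's letter frequencies."""
    counts = Counter(text)
    n = len(text)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

msg = "a mathematical theory of communication"
chars = list(msg)
random.shuffle(chars)
scrambled = "".join(chars)

# Entropy depends only on symbol frequencies, so shuffling leaves it unchanged
assert abs(empirical_entropy(msg) - empirical_entropy(scrambled)) < 1e-12
```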
 
  • #6
Thanks anorlunda! Although I knew about information theory, you made me realize that I still mix up entropy as a measure of disorder with entropy as it is used in information theory.

I guess, after all, what I'm trying to grasp is the following: how would one describe, in terms of entropy [as a measure of disorder] and symmetry, a system S that does not appear to change to an outside observer when you change the internal order (spatial or logical) of its constituents? S is made up of non-uniform bits which have a high information entropy [as used in information theory].
 
  • #7
By symmetry you may be thinking of combinations as opposed to permutations. That makes me think the field of statistical mechanics may satisfy some of your curiosity, because it may well be related to what you're thinking about. May I suggest the video courses on Statistical Mechanics by Leonard Susskind. He is an excellent teacher and the math is not difficult. I'm pretty sure you would enjoy the whole course, and it may answer your questions. Lecture 1 is embedded below. At the end of each lecture, it will give you a link to the next in the series.
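The combinations-versus-permutations point can be made quantitative: in statistical mechanics the entropy of a macrostate counts how many orderings (microstates) realize the same symbol counts, S = ln W with W = n!/(n1!·n2!·...). A stdlib-only sketch:

```python
from math import factorial, log

def multiplicity(counts):
    """W = n! / (n1! * n2! * ...): the number of distinct orderings
    of a multiset with the given symbol counts."""
    n = sum(counts)
    w = factorial(n)
    for c in counts:
        w //= factorial(c)
    return w

# Three distinct mBits {A, B, C}: 3! = 6 orderings, all one macrostate
w = multiplicity([1, 1, 1])
print(w)       # 6
print(log(w))  # Boltzmann-style entropy (with k = 1): ln 6
```

The more orderings that map to the same observed state, the higher this entropy, which is exactly the OP's "reorder without changing the information" property.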

 
  • #8
Thanks so much for taking the time, anorlunda! You've given me very valuable input and pushed me in the right direction; I've had some interesting insights into my pet system. Professor Susskind's lectures will be next.
 

Related to Maximum Entropy Information System

1. What is Maximum Entropy Information System?

Maximum Entropy Information System (MEIS) is a statistical framework used to analyze and model complex systems with limited data. It is based on the principle of maximum entropy, which states that the most unbiased probability distribution for a given set of constraints is the one with the highest entropy.
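For a concrete instance of the principle, Jaynes' classic die problem finds the maximum-entropy distribution on faces 1–6 subject to a mean constraint; the solution has the exponential form p_i ∝ exp(λ·i). A stdlib-only sketch that solves for the multiplier λ by bisection (the function name and bracketing interval are our own choices):

```python
import math

def maxent_die(target_mean, lo=-5.0, hi=5.0, iters=100):
    """Maximum-entropy distribution on die faces 1..6 with a fixed mean:
    p_i proportional to exp(lam * i); lam is found by bisection, using
    the fact that the constrained mean increases monotonically in lam."""
    faces = range(1, 7)

    def mean(lam):
        w = [math.exp(lam * i) for i in faces]
        z = sum(w)
        return sum(i * wi for i, wi in zip(faces, w)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * i) for i in faces]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_die(4.5)  # distribution tilts toward the high faces
```

With target mean 3.5 the multiplier comes out zero and the distribution is uniform, recovering the unconstrained maximum-entropy case.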

2. How is MEIS used in scientific research?

MEIS is commonly used in fields such as physics, biology, ecology, and economics to analyze and model complex systems. It is particularly useful in situations where there is limited data available, as it allows for the incorporation of prior knowledge and constraints to make predictions and draw conclusions.

3. What are the advantages of using MEIS?

MEIS has several advantages, including its ability to handle complex and high-dimensional data, its robustness to noise and outliers, and its ability to incorporate prior knowledge and constraints. It also provides a principled approach to data analysis and allows for the quantification of uncertainty in predictions.

4. Are there any limitations to using MEIS?

One limitation of MEIS is that it requires a large amount of computing power and can be computationally expensive. It also relies on the choice of appropriate constraints and the accuracy of the prior knowledge used. Additionally, MEIS may not be suitable for analyzing small datasets or datasets with a large number of variables.

5. What are some real-world applications of MEIS?

MEIS has been used in a variety of applications, including image and signal processing, natural language processing, and financial modeling. It has also been applied to study complex systems such as climate change, ecological systems, and protein folding. MEIS has shown promising results in these areas and continues to be an active area of research.
