General relativity and quantum mechanics are fundamentally about information.
... Charles Seife
... at the end of the universe entropy is at a maximum...information will be at a minimum.
The holographic principle is a property of quantum gravity and string theories which states that the description of a volume of space can be thought of as encoded on a boundary to the region—preferably a light-like boundary like a gravitational horizon.
[The prior one is especially weird: just where is information 'stored'? Every time I expand the horizon, that's where the information resides...on the bounding area, not within the volume as intuition would suggest.]
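That area scaling can be made concrete with the Bekenstein–Hawking formula (not quoted in the post above, but it's the standard statement): a black hole's entropy, and so its information capacity, grows with the horizon *area* measured in Planck units, not with the enclosed volume:

```latex
S_{BH} \;=\; \frac{k_B c^3 A}{4 G \hbar} \;=\; \frac{k_B\, A}{4\,\ell_p^2},
\qquad
\ell_p = \sqrt{\frac{G\hbar}{c^3}}
```

Roughly one bit per Planck-area patch of the horizon, which is why the "information lives on the surface" intuition keeps coming up.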
Leonard Susskind in his book THE BLACK HOLE WAR (his controversy with Stephen Hawking)
has some really interesting insights on information and horizons...like
the horizon of a black hole is "stringy"...it can be described in terms of quantum strings...hidden information is proportional to the total LENGTH of a string!... Hawking radiation can be viewed as string bits breaking loose from just outside the horizon...due to quantum fluctuations...
a perspective akin to virtual-particle pairs producing the Hawking radiation, with its release of irretrievably scrambled information.
Also, you have to wonder, given digital sampling theory, in which periodic samples of a continuous message at an appropriate sample rate can completely reproduce the information content, whether we really understand what is around us. Add 'compression' techniques to that, and it becomes more interesting...what is it about our universe that causes it to have 'unnecessary' [redundant] information...like maybe we have in our genes?? [I strongly suspect those 'unnecessary' genes have a LOT to tell us when we can decode their purposes.]
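A toy illustration of that redundancy point (my own sketch, not from any of the sources above): a lossless compressor like Python's `zlib` removes exactly the kind of 'unnecessary' information being described, so a highly repetitive message shrinks to almost nothing, while near-random data, which has no redundancy to squeeze out, stays essentially the same size:

```python
import os
import zlib

# A highly redundant message: one 10-byte pattern repeated 1000 times.
redundant = b"ABCDEFGHIJ" * 1000

# Near-random bytes of the same length: almost no redundancy to remove.
random_data = os.urandom(len(redundant))

compressed_redundant = zlib.compress(redundant)
compressed_random = zlib.compress(random_data)

# The redundant message collapses to a small fraction of its size;
# the random data barely compresses at all (zlib even adds a few
# bytes of header overhead).
print(len(redundant), "->", len(compressed_redundant))
print(len(random_data), "->", len(compressed_random))
```

The gap between the two compressed sizes is a crude measure of how much of the original was genuinely informative versus merely repeated.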
I found a [peer reviewed] paper for another discussion which proclaimed:
at submicroscopic scales there is no distinction between analog and digital...they are one and the same...
finally, have you tried to tackle this:
http://en.wikipedia.org/wiki/Entropy_(information_theory)
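The core formula in that entropy article is Shannon's H = -Σ pᵢ log₂ pᵢ, measured in bits. A minimal sketch of it (my own illustration), measuring the entropy of a message's symbol distribution:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy of the message's symbol distribution, in bits per symbol."""
    counts = Counter(message)
    n = len(message)
    # H = -sum(p_i * log2(p_i)) over every distinct symbol.
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A fair coin sequence carries exactly 1 bit per symbol...
print(shannon_entropy("HT" * 500))    # 1.0

# ...while a perfectly predictable message carries 0 bits per symbol.
print(shannon_entropy("A" * 1000))    # 0 (Python may display it as -0.0)
```

Maximum entropy means every symbol is a surprise; zero entropy means the next symbol tells you nothing you didn't already know, which is why the two notions are so easy to tangle up.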
If information isn't difficult enough to get a firm grasp on, entropy is worse.
Maybe the rule is that instead of entropy inevitably increasing we should be looking more
closely at its analog: that information is inevitably decreasing.