Intuition for information theory

nigels
#1
May24-11, 11:03 AM
P: 27
Hi, although I studied info theory briefly in the past, revisiting it now I find myself scratching my head over some of its counter-intuitive logic.

For instance, I understand that the amount of uncertainty associated with a symbol is correlated with the amount of information the symbol carries. Hence, each symbol in English (a 64-symbol alphabet: upper- and lowercase letters, a space, and punctuation marks) carries about 6 bits' worth of information (2^6 = 64). However, the textbook also says that "no more than this amount of info can be contained in an English message, because we can in fact encode any such message in binary form using 6 binary digits per symbol".

What this means is that, in a two-symbol message, if each symbol contains 6 bits of info, I can't sum the two up and say the whole message contains 12 bits of info. Why is this? What part of my intuition needs to be tweaked? How should I think about this in general?
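(To make the textbook's claim concrete for myself, here's a toy Python sketch of the 6-binary-digits-per-symbol encoding; the exact 64-symbol alphabet below is made up just for illustration:)

[code]
import string

# a made-up 64-symbol alphabet: 52 letters, a space, and 11 punctuation marks
alphabet = string.ascii_letters + " " + ".,;:!?'\"()-"
assert len(alphabet) == 64

# each symbol maps to a fixed 6-binary-digit code (2**6 = 64 codes)
codes = {ch: format(i, "06b") for i, ch in enumerate(alphabet)}

# a two-symbol message therefore fits in exactly 12 binary digits
message = "Hi"
encoded = "".join(codes[ch] for ch in message)
print(encoded, len(encoded))   # 12 binary digits
[/code]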

Thank you very much for your help.
schip666!
#2
May24-11, 04:37 PM
P: 595
Quote by nigels:
What this means is that, in a two-symbol message, if each symbol contains 6 bits of info, I can't sum the two up and say the whole message contains 12 bits of info. Why is this? What part of my intuition needs to be tweaked? How should I think about this in general?
Why can't you sum them up?

2^6 * 2^6 = 2^12

It's simple combinatorics. If you have six bits, you have 64 possible values for each character "slot". With two slots -- 64 in the first and 64 in the second -- you have 64*64 possible combinations. Shannon's information theory then says that you have a maximum of 12 bits of information.
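Here's a minimal Python sketch of that counting argument (just illustrative arithmetic, nothing from any particular textbook):

[code]
import math

# 64 possible symbols per slot -> log2(64) = 6 bits per slot
symbols_per_slot = 64
bits_per_symbol = math.log2(symbols_per_slot)    # 6.0

# two slots: 64 * 64 = 4096 equally likely two-symbol messages
two_symbol_messages = symbols_per_slot ** 2      # 4096
bits_for_two = math.log2(two_symbol_messages)    # 12.0

print(bits_per_symbol, bits_for_two)             # 6.0 12.0
[/code]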

The interesting thing is to then compare that value against the probabilities in the stream of characters you are _actually_ getting, to see whether your system has any (non-random) order to it. The funny thing is that "information" in this usage really means "randomness". That leads to the confusion where folks interpret "information" as "meaning", when it's closer to the opposite...
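Here's a minimal sketch of that measurement in Python, assuming you just want the per-symbol entropy of whatever characters you have observed (the sample string is only a stand-in for a real stream):

[code]
from collections import Counter
from math import log2

def empirical_entropy(text):
    # Shannon entropy of the observed symbol frequencies, in bits per symbol
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# English prose typically comes out well under the 6-bit maximum,
# because its symbol frequencies are far from uniform
print(empirical_entropy("the quick brown fox jumps over the lazy dog"))
[/code]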
navaburo
#3
May25-11, 12:46 AM
P: 6
Quote by nigels:
"no more than this amount of info can be contained in an English message, because we can in fact encode any such message in binary form using 6 binary digits per symbol".
What is meant here is that no more than 6 bits PER SYMBOL can be contained in an English message, not 6 bits total.

(What I find more interesting is the amount of entropy in an English message; that is, there is really a lot less than 6 bits of info carried by each symbol ON AVERAGE. This is because there are common patterns that repeat with high probability. For example, say you have just read a "t"; you would not be terribly surprised to read an "h", since so many words contain the "th" combination. However, you would be surprised if an "m" followed a "t". I find that amusing, anyway.)
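A quick way to see that skew for yourself, sketched in Python with a made-up sample sentence standing in for a real corpus:

[code]
from collections import Counter

# count which characters follow 't' in a sample text
# (any real English corpus would show the same skew, only more strongly)
text = "the truth is that these thoughts together settle the matter"
followers = Counter(b for a, b in zip(text, text[1:]) if a == "t")
total = sum(followers.values())

# 'h' dominates, so an 'h' after a 't' carries little surprise;
# an 'm' after a 't' would be far less probable, hence more informative
for ch, count in followers.most_common():
    print(repr(ch), count / total)
[/code]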

schip666!
#4
May25-11, 12:19 PM
P: 595
Quote by navaburo:
(What I find more interesting is the amount of entropy in an English message; that is, there is really a lot less than 6 bits of info carried by each symbol ON AVERAGE. This is because there are common patterns that repeat with high probability. For example, say you have just read a "t"; you would not be terribly surprised to read an "h", since so many words contain the "th" combination. However, you would be surprised if an "m" followed a "t". I find that amusing, anyway.)
That is, in fact, the key to data compression...
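A tiny demo of that in Python, with zlib standing in for any general-purpose compressor:

[code]
import os
import zlib

english = b"the quick brown fox jumps over the lazy dog " * 100
random_bytes = os.urandom(len(english))

# redundant English text shrinks dramatically...
print(len(english), "->", len(zlib.compress(english)))
# ...while incompressible random bytes barely shrink at all
print(len(random_bytes), "->", len(zlib.compress(random_bytes)))
[/code]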
nigels
#5
Jun8-11, 06:58 PM
P: 27
Thank you all for the wonderfully helpful responses! It all makes sense now. :)

