# intuition for information theory

by nigels
Tags: bits, code, entropy, information theory, signal
 P: 25 Hi, although I've studied info theory briefly in the past, now that I'm revisiting it I find myself scratching my head over its counter-intuitive logic.

For instance, I understand that the amount of uncertainty associated with a symbol is correlated with the amount of information the symbol carries. Hence, each symbol in English (upper- and lowercase letters, a space, and punctuation marks, about 64 symbols in all) carries about 6 bits worth of information (2^6 = 64). However, the textbook also says that "no more than this amount of info can be contained in an English message, because we can in fact encode any such message in binary form using 6 binary digits per symbol". What this seems to mean is that, in a 2-symbol message where each symbol carries 6 bits of info, I can't sum the two up and say the whole message contains 12 bits of info. Why is this? What part of my intuition needs to be tweaked? How should I think about this in general? Thank you very much for your help.
P: 595
 Quote by nigels What this seems to mean is that, in a 2-symbol message where each symbol carries 6 bits of info, I can't sum the two up and say the whole message contains 12 bits of info. Why is this? What part of my intuition needs to be tweaked? How should I think about this in general?
Why can't you sum them up?

2^6 * 2^6 = 2^12

It's simple combinatorics. If you have six bits, you have 64 possible values for each character "slot". With two slots you have 64 choices in the first and 64 in the second, so 64*64 = 4096 possible combinations. Shannon information theory then says the message carries a maximum of 12 bits of information, since log2(4096) = 12.
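The slot-counting argument above can be checked directly. A minimal Python sketch (the 64-symbol alphabet is the one from the question; nothing here is specific to English):

```python
import math

# Six bits per symbol gives 2**6 = 64 possible symbols per slot.
symbols_per_slot = 2 ** 6

# Two slots: 64 choices in the first times 64 in the second.
two_slot_messages = symbols_per_slot ** 2   # 64 * 64 = 4096

# Bits needed to pick one message out of all equally likely ones:
bits = math.log2(two_slot_messages)

print(bits)  # 12.0
```

So the bits do add: 6 bits per slot times two slots is 12 bits, because the message counts multiply while their logarithms add.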

The interesting thing is to then compare that maximum to the probabilities in the stream of characters you are _actually_ getting, to see whether your source has any (non-random) order to it. The funny thing is that "information" might better have been called "randomness" in this usage. That leads to the confusion where people take "information" to mean "meaning", but it's closer to the opposite...
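For concreteness, here is a small sketch of that comparison. The skewed distribution is made up for illustration (not real English letter frequencies); the uniform one gives the 6-bit maximum:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum p * log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform over 64 symbols: every symbol equally likely.
uniform = [1 / 64] * 64

# A skewed, made-up distribution: a few symbols dominate.
skewed = [0.5, 0.25, 0.125] + [0.125 / 61] * 61

print(entropy(uniform))  # 6.0 bits per symbol, the maximum
print(entropy(skewed))   # well under 6 bits per symbol
```

Any non-uniformity in the actual stream pulls the entropy below the 6-bit ceiling, which is the "order" being measured.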
P: 6
 Quote by nigels "no more than this amount of info can be contained in an English message, because we can in fact encode any such message in binary form using 6 binary digits per symbol".
What is meant here is that no more than 6-bits PER SYMBOL can be contained in an English message, not 6-bits total.

(What I find more interesting is the amount of entropy in an English message; that is, how there is really a lot less than 6 bits of info carried by each symbol ON AVERAGE. This is because there are common patterns that repeat with high probability. For example, say you have just read a "t"; you would not be terribly surprised to read an "h", since so many words contain the "th" combination. However, you would be surprised if an "m" followed the "t". I find that amusing anyways.)
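That "surprise" can be put in numbers with surprisal, -log2 p. A sketch with hypothetical probabilities for the letter following a "t" (illustrative values, not measured English statistics):

```python
import math

# Hypothetical conditional probabilities P(next letter | previous = 't').
p_after_t = {"h": 0.30, "e": 0.15, "o": 0.10, "m": 0.005}

# Surprisal in bits: likely letters carry little information,
# unlikely ones carry a lot.
for letter, p in p_after_t.items():
    print(letter, -math.log2(p))  # 'h' is about 1.7 bits, 'm' about 7.6
```

Averaging surprisal over the true conditional distribution gives the conditional entropy, which for English is well below 6 bits per letter.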

P: 595


 Quote by navaburo (What I find more interesting is the amount of entropy in an English message; that is, how there is really a lot less than 6-bits of info carried by each symbol in a message ON AVERAGE. This is because there are common patterns that repeat with high probability. For example, say you have just read a "t"; you would not be terribly surprised to read an "h", since so many words contain the "th" combination. However, you would be surprised if an "m" followed a "t". I find that amusing anyways.)
That is, in fact, the key to data compression...
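A quick illustration of that connection, using Python's standard zlib compressor on toy data (not any particular scheme from the thread): repetitive English-like text shrinks a lot, while random bytes barely shrink at all.

```python
import random
import zlib

random.seed(0)
# Repetitive English-like text vs. uniformly random bytes of equal length.
english = b"the quick brown fox jumps over the lazy dog " * 50
noise = bytes(random.randrange(256) for _ in range(len(english)))

print(len(english), len(zlib.compress(english)), len(zlib.compress(noise)))
```

The compressor exploits exactly the kind of statistical structure described above; the random stream has none, so there is nothing to squeeze out.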
 P: 25 Thank you all for the wonderfully helpful responses! It all makes sense now. :)
