
#1
May 24, 2011, 11:03 AM

P: 25

Hi, I studied info theory briefly in the past, and now that I'm revisiting it I find myself scratching my head over its counterintuitive logic.
For instance, I understand that the amount of uncertainty associated with a symbol is correlated with the amount of information the symbol carries. Hence, each symbol in English (64 upper- and lowercase characters, a space, and punctuation marks) carries about 6 bits of information (2^6 = 64). However, the textbook also says that "no more than this amount of info can be contained in an English message, because we can in fact encode any such message in binary form using 6 binary digits per symbol". I read this as meaning that, in a two-symbol message where each symbol carries 6 bits, I can't just sum the two and say the whole message contains 12 bits of info. Why is this? What part of my intuition needs to be tweaked? How should I think about this in general? Thank you very much for your help.



#2
May 24, 2011, 04:37 PM

P: 595

2^6 * 2^6 = 2^12. It's simple combinatorics: with six bits you have 64 possible values for each character "slot". With two slots you have 64 choices in the first and 64 in the second, so 64 * 64 possible combinations. Shannon's information theory then says you have a maximum of 12 bits of information, so you can in fact sum the two. The interesting part is comparing that maximum to the probabilities in the stream of characters you are _actually_ getting, to see whether your system has any (non-random) order to it. The funny thing is that "information" in this usage would better have been called "randomness". That leads to the confusion where people interpret "information" as "meaning", when it's closer to the opposite...
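A minimal sketch of the combinatorics above (the variable names are just for illustration): each slot in a 64-symbol alphabet carries log2(64) = 6 bits, and two independent slots multiply the number of possible messages, so the bits add.

```python
import math

alphabet_size = 64

# One slot: 64 equally likely values -> log2(64) = 6 bits.
bits_per_symbol = math.log2(alphabet_size)

# Two slots: 64 * 64 = 4096 possible messages -> log2(4096) = 12 bits,
# which is exactly 6 + 6: counts multiply, so bits (logs) add.
two_symbol_messages = alphabet_size * alphabet_size
bits_per_pair = math.log2(two_symbol_messages)

print(bits_per_symbol, bits_per_pair)  # → 6.0 12.0
```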



#3
May 25, 2011, 12:46 AM

P: 6

(What I find more interesting is the amount of entropy in an English message; that is, how there is really a lot less than 6 bits of info carried by each symbol ON AVERAGE. This is because there are common patterns that repeat with high probability. For example, say you have just read a "t": you would not be terribly surprised to read an "h", since so many words contain the "th" combination. However, you would be surprised if an "m" followed a "t". I find that amusing anyway.)
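The point above can be checked numerically: a sketch that estimates the per-symbol Shannon entropy of a text from its empirical letter frequencies (the sample string and function name are my own, just for illustration). Because real English letters are far from equiprobable, the result comes out well below the 6-bit maximum; a bigram model that conditions on the previous letter (capturing patterns like "th") would drop it further still.

```python
import math
from collections import Counter

def entropy_bits(text):
    """Empirical Shannon entropy per symbol, in bits: -sum(p * log2(p))."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

sample = "the theory of information measures the uncertainty of the message"

# Well below log2(64) = 6, since the symbol distribution is skewed.
print(entropy_bits(sample))
```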



#4
May 25, 2011, 12:19 PM

P: 595

intuition for information theory 



#5
Jun 8, 2011, 06:58 PM

P: 25

Thank you all for the wonderfully helpful responses! It all makes sense now. :)


