Intuition for information theory

AI Thread Summary
Understanding information theory reveals that the uncertainty of a symbol correlates with the information it carries, typically quantified as about 6 bits per English character. The bits of independent symbols do add: two 6-bit symbols allow 2^6 * 2^6 = 2^12 combinations, so a two-symbol message can carry up to 12 bits. However, while each symbol can carry up to 6 bits, the actual information content of English text is usually less, because predictable patterns in the language reduce its entropy. The discussion highlights the distinction between "information" and "meaning," emphasizing that information theory quantifies randomness rather than semantic content. Ultimately, recognizing these principles is crucial for grasping concepts like data compression.
nigels
Hi, although I've studied info theory briefly in the past, now revisiting it, I seem to be scratching my head trying to understand the counter-intuitive logic of it.

For instance,

I understand that

the amount of uncertainty associated with a symbol is correlated with the amount of information the symbol carries. Hence, each symbol in English (roughly 64 symbols in all: upper- and lowercase letters, a space, and punctuation marks) carries about 6 bits worth of information (2^6 = 64). However, the textbook also says that "no more than this amount of info can be contained in an English message, because we can in fact encode any such message in binary form using 6 binary digits per symbol".

What this means is that, in a 2-symbol message, if each symbol contains 6 bits of info, I can't sum the two up and say the whole message contains 12 bits of info. Why is this? What part of my intuition needs to be tweaked? How should I think about this in general?

Thank you very much for your help.
 
nigels said:
What this means is that, in a 2-symbol message, if each symbol contains 6 bits of info, I can't sum the two up and say the whole message contains 12 bits of info. Why is this? What part of my intuition needs to be tweaked? How should I think about this in general?

Why can't you sum them up?

2^6 * 2^6 = 2^12

It's simple combinatorics. If you have six bits, you have 64 possible values for each character "slot". With two slots, 64 choices in the first and 64 in the second, you have 64*64 possible combinations. Shannon information theory then says that you have a maximum of 12 bits of information.
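To make the arithmetic concrete, here is a minimal Python sketch (the 64-symbol alphabet size is the one assumed in this thread):

```python
import math

alphabet_size = 64                          # the 64-symbol English alphabet from the thread
bits_per_symbol = math.log2(alphabet_size)  # 6.0 bits

# Two independent slots: the combinations multiply, so the bits add.
two_symbol_messages = alphabet_size ** 2    # 64 * 64 = 4096
bits_for_two = math.log2(two_symbol_messages)

print(bits_per_symbol)  # 6.0
print(bits_for_two)     # 12.0
```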

The interesting thing is then to compare that maximum to the probabilities in the stream of characters you are _actually_ getting, to see whether your system has any (non-random) order to it. The funny thing is that "information" in this usage really means "randomness". This leads to the confusion where folks interpret "information" as "meaning", but it's closer to the opposite...
 
nigels said:
"no more than this amount of info can be contained in an English message, because we can in fact encode any such message in binary form using 6 binary digits per symbol".

What is meant here is that no more than 6 bits PER SYMBOL can be contained in an English message, not 6 bits total.
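As a rough illustration of that fixed-length encoding, here is a hypothetical sketch (the particular choice of 64 symbols below is an assumption, not the textbook's alphabet):

```python
import string

# A hypothetical 64-symbol alphabet: 52 letters, a space, and 11 punctuation marks.
ALPHABET = string.ascii_letters + " " + ".,;:!?'\"()-"
assert len(ALPHABET) == 64

def encode(message):
    """Encode each symbol as its 6-bit index in ALPHABET."""
    return "".join(format(ALPHABET.index(ch), "06b") for ch in message)

print(encode("Hi"))  # 12 binary digits: exactly 6 per symbol
```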

(What I find more interesting is the amount of entropy in an English message; that is, how there is really a lot less than 6 bits of info carried by each symbol in a message ON AVERAGE. This is because there are common patterns that repeat with high probability. For example, say you have just read a "t"; you would not be terribly surprised to read an "h", since so many words contain the "th" combination. However, you would be surprised if an "m" followed a "t". I find that amusing anyways.)
 
navaburo said:
(What I find more interesting is the amount of entropy in an English message; that is, how there is really a lot less than 6 bits of info carried by each symbol in a message ON AVERAGE. This is because there are common patterns that repeat with high probability. For example, say you have just read a "t"; you would not be terribly surprised to read an "h", since so many words contain the "th" combination. However, you would be surprised if an "m" followed a "t". I find that amusing anyways.)

That is, in fact, the key to data compression...
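As a rough sketch of why that works (assuming that symbol frequencies in a short sample approximate the source, and ignoring context effects like the "th" pattern, which lower the entropy even further):

```python
import math
from collections import Counter

def empirical_entropy(text):
    """Per-symbol entropy (in bits) estimated from symbol frequencies in `text`."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

sample = "the theory of information treats the text as a random source"
print(empirical_entropy(sample))  # noticeably less than the 6-bit ceiling
print(math.log2(64))              # 6.0, the per-symbol maximum from the thread
```

A compressor such as Huffman or arithmetic coding exploits exactly this gap: frequent symbols and patterns get shorter codes, so the average code length lands near the entropy rather than at 6 bits per symbol.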
 
Thank you all for the wonderfully helpful responses! It all makes sense now. :)
 