nigels
Hi, although I studied information theory briefly in the past, now that I'm revisiting it I find myself scratching my head over some of its counter-intuitive logic.
For instance, I understand that the amount of uncertainty associated with a symbol corresponds to the amount of information the symbol carries. Hence, each symbol in English (an alphabet of 64 symbols: the upper- and lowercase letters, a space, and punctuation marks) carries about 6 bits of information, since 2^6 = 64. However, the textbook also says that "no more than this amount of info can be contained in an English message, because we can in fact encode any such message in binary form using 6 binary digits per symbol".
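To make that concrete, here's a quick Python sketch I tried (the exact punctuation set is my own guess, chosen just to reach 64 symbols): with 64 equally likely symbols, each one takes log2(64) = 6 bits, and a fixed-width code uses exactly 6 binary digits per symbol.

```python
import math
import string

# A hypothetical 64-symbol alphabet: 52 letters, a space, and 11 punctuation
# marks (this exact punctuation set is my own assumption, chosen to reach 64).
ALPHABET = string.ascii_letters + " " + ".,;:!?'\"-()"
assert len(ALPHABET) == 64

# With 64 equally likely symbols, each symbol carries log2(64) = 6 bits.
print(math.log2(len(ALPHABET)))  # 6.0

def encode(message: str) -> str:
    """Encode each symbol as a fixed-width 6-bit binary string."""
    return "".join(format(ALPHABET.index(ch), "06b") for ch in message)

# A two-symbol message encodes to exactly 2 * 6 = 12 binary digits.
code = encode("Hi")
print(code, len(code))  # 100001001000 12
```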
What this seems to mean is that, in a two-symbol message, if each symbol carries 6 bits of info, I can't simply sum the two and say the whole message contains 12 bits of info. Why is this? What part of my intuition needs to be tweaked? How should I think about this in general?
Thank you very much for your help.