
I'm a 4th-year computer engineering student who has just finished an information theory class. It was nice, and I thought I had a grip on all the concepts...

But today I tried to explain the basics of information theory to my girlfriend, and I failed to do so. The sticking point was (not entropy, can you imagine that) the quantity of information.

I started with the basic example: a toss of a fair coin. All was clear — while the coin is spinning in the air, we don't know what will occur, and the entropy is one bit.

When the coin lands, we receive one bit of information. Everything is nice and clear: there is a link between the symbol and a quantity of information.
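Just so it's clear what I mean, this is the entropy formula as it was taught in class, in a quick Python sketch (nothing fancy, just the standard H = -Σ p·log2(p)):

```python
import math

def entropy(probs):
    """Shannon entropy of a distribution, in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin -> 1.0 bit
print(entropy([0.9, 0.1]))  # biased coin -> about 0.47 bits
```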

But then came the unfair coin, with a higher chance of landing heads, and all was lost.

At first I claimed (as I was taught at school), "The amount of information gets higher as the probability of occurrence gets lower." But after I said it out loud a couple of times, it just didn't feel right anymore.

When I flip an unfair coin (with a higher chance of heads) and tails occurs, why don't I end up with 1 bit of information once the uncertainty is resolved?

And then again, if tails has a probability of 0.01 and tails occurs, I end up with more bits of information than if the same event had happened with probability 0.5.
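To make that concrete, here is the self-information (surprisal) formula from class, I(x) = -log2 p(x), as a few lines of Python — this is the definition I'm asking about, so the numbers below are just what the formula gives:

```python
import math

def surprisal(p):
    """Self-information of an outcome with probability p, in bits: -log2(p)."""
    return -math.log2(p)

print(surprisal(0.5))   # fair-coin outcome -> 1.0 bit
print(surprisal(0.01))  # rare tails -> about 6.64 bits
```

So resolving the same binary uncertainty gives very different amounts of information depending on the probability of the outcome, which is exactly what I can't motivate intuitively.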

Does anyone know the motivation behind this definition of the quantity of information?

Thanks