SW VandeCarr
John Creighto said: For each value N can take, the new random variable would be log_2(N) and have the same probability as that of N.
I think we are in agreement here. However, take a look at my post 15 and see if you agree with it. The OP expressed satisfaction with it in post 17.
In any case, I was taking the point of view that if I risked $1 to win $512 given a 1/1024 probability of guessing right, then the surprisal value of learning that I won would be -log_2(1/1024) = 10 bits. To me, surprisal, entropy, and information are all essentially the same thing in this context; they are all calculated in the same way. If you read through the thread, it's clear that I agree that a known value exists with P = 1 and that its information value is 0. However, there is (I believe) something important about first learning a result, which is informative and perhaps surprising.
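A minimal sketch of that arithmetic in Python (the function name surprisal_bits is mine, purely for illustration): surprisal is -log_2(p), so the 1/1024 event carries exactly 10 bits, while a certain event (P = 1) carries 0.

```python
import math

def surprisal_bits(p: float) -> float:
    """Surprisal (self-information) of an event with probability p, in bits."""
    return -math.log2(p)

# The $1-to-win-$512 bet: guessing right has probability 1/1024,
# so learning that you won is worth 10 bits of surprisal.
print(surprisal_bits(1 / 1024))  # 10.0

# A known value (P = 1) has zero information value.
print(surprisal_bits(1.0))       # 0.0
```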