Can probability zero events convey infinite information?

  • Context: Graduate
  • Thread starter: The UPC P
  • Tags: Base, Bits, Logarithm

Discussion Overview

The discussion revolves around the concept of information theory as it relates to probability, particularly focusing on events with probability zero and their implications for information content. Participants explore theoretical aspects, mathematical reasoning, and conceptual clarifications regarding how probabilities relate to information encoding.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants propose that the amount of information in an event can be quantified as ##I = -\log_2(p)##, where ##p## is the probability of the event occurring.
  • It is noted that as the probability of an event decreases, the information contained in that event increases, leading to the idea that an event with probability zero could convey infinite information.
  • One participant questions the validity of probabilities being other than real numbers within the range [0,1], presenting hypothetical scenarios that challenge conventional definitions of probability.
  • Another participant asserts that a probability of 1 means an event will certainly occur, and thus an event cannot have probability 1 and not happen.
  • There is a discussion about the distinction between probability zero and events that cannot happen, with an example involving a dart hitting a point on a dartboard illustrating how probability zero can still convey infinite information.
  • Some participants express skepticism about the practical occurrence of events with probability zero conveying infinite information in real-world scenarios.

Areas of Agreement / Disagreement

Participants express differing views on the interpretation of probabilities, particularly regarding the implications of probability zero and the nature of information conveyed by such events. There is no consensus on the validity of hypothetical scenarios presented, and the discussion remains unresolved.

Contextual Notes

Some arguments rely on unconventional interpretations of probability and information, which may not align with standard definitions. The discussion includes speculative scenarios that challenge established concepts without reaching definitive conclusions.

The UPC P said:
I know that if you have x states, then you need log2(x) bits to encode them. For example, a coin has 2 states, and you need 1 bit, which is log2(2). It also works for numbers between 0 and 1: for example, if you halve the number of states, you need to add log2(1/2) bits, which is -1.

So what does log2(i) mean? How can you have i states and encode them in log2(i) bits?

Also, on a related note, why does it require negatively infinite bits to encode 0 states (since log2(0) = -∞)?
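The state-counting relationship described in the post can be sketched numerically (a minimal illustration, not part of the original thread; the function name is my own):

```python
import math

def bits_needed(states: float) -> float:
    """Bits needed to encode `states` equally likely states: log2(states)."""
    return math.log2(states)

print(bits_needed(2))    # a coin has 2 states -> 1.0 bit
print(bits_needed(8))    # e.g. three coins, 8 states -> 3.0 bits
print(bits_needed(0.5))  # halving the number of states -> -1.0 bit
```

Note that `math.log2` raises a `ValueError` at 0 and for negative inputs, which mirrors the puzzle in the question: the formula simply has no real value there.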
 
Actually, we can quantify the amount of information in an event (the bits we need) as ##I = -\log_2(p)## where ##p## is the probability of that event occurring. Notice that the less likely an event, the more information contained in it.

Thus, the amount of information required to encode the result of a coin flip is ##-\log_2(1/2)## because the probability of an event (heads, for instance) is ##1/2##.

It makes no sense to evaluate it at ##p = i##, because probabilities must be real numbers on ##[0,1]##.

You get an infinite amount of information in an event if the probability of that event occurring is ##0##.
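The self-information formula above can be sketched as follows (an illustrative example, not from the thread; the function name is my own):

```python
import math

def information_bits(p: float) -> float:
    """Self-information I = -log2(p) of an event with probability p."""
    if not 0 <= p <= 1:
        raise ValueError("probabilities must be real numbers in [0, 1]")
    if p == 0:
        # The limit of -log2(p) as p -> 0+ is +infinity:
        # a probability-zero event would carry infinite information.
        return math.inf
    return -math.log2(p)

print(information_bits(0.5))   # coin flip -> 1.0 bit
print(information_bits(0.25))  # rarer event -> 2.0 bits
print(information_bits(0.0))   # inf
```

The explicit range check reflects the point made above: evaluating the formula at a complex or out-of-range "probability" is rejected rather than given a value.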
 
Why can probabilities not be other numbers? For example, if something were guaranteed to happen twice, its probability of happening once would be 2, so when it happens the information would be -log2(2), which means that when you see an event you know is going to happen twice actually happening, you lose one bit of information.

Here is another example of my question: if some event has probability 1 but then the opposite happens, then the event that occurred had probability -1 (for example, if I think I am going to lose my bike but instead I get an extra bike, then I thought my bike count would go down by 1 with probability 1, but instead it went down by 1 with probability -1, and since -1 * -1 = 1 I got an extra bike), so the information in it is -log2(-1) ≈ -4.53236014i, which means I actually lost an imaginary number of bits.
 
No, that isn't how probability works. A basic property of a probability function ##P## is that ##0\leq P(A) \leq 1## for any event ##A##.

If something were guaranteed to happen twice, then the probability of it happening at least once is still 1.

Also, an event cannot have probability 1 and not happen. By definition probability 1 means the event will happen.
 
And what about probability 0? Can probability 0 happen so that one gets infinite information?
 
Yes, probability 0 indicates that the event in question will never happen, so that there is an infinite amount of information contained in that event (though I doubt this ever shows up in the real world).

An example of information in events:
There's more information in the statement "it snowed in Miami on July 4" than there is in the statement "it snowed in New York City on December 25," because the former is much less likely, and the latter is practically a given.
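The comparison above can be made concrete with surprisal values (the probabilities below are purely hypothetical, chosen only to illustrate the direction of the comparison):

```python
import math

# Hypothetical probabilities, for illustration only
p_snow_miami_july = 1e-9       # snow in Miami on July 4: vanishingly unlikely
p_snow_nyc_december = 0.25     # snow in NYC on December 25: fairly common

def surprisal(p: float) -> float:
    """Information (in bits) conveyed by an event of probability p."""
    return -math.log2(p)

print(surprisal(p_snow_miami_july))   # ≈ 29.9 bits: very informative
print(surprisal(p_snow_nyc_december)) # 2.0 bits: not much news
```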
 
The UPC P said:
And what about probability 0? Can probability 0 happen so that one gets infinite information?
There is a distinction that can be drawn between probability zero and "cannot happen". One article that discusses this is https://en.wikipedia.org/wiki/Almost_surely

For instance, if you throw a dart at an ideal dartboard, the probability that it hits any particular chosen point is zero. Yet it must strike at some point. If you were to write down the x and y coordinates at which it strikes in binary, you would get a pair of unending binary strings. Choose the point of impact just right, and one of these could match a .pdf of the Encyclopaedia Britannica and the other could be an ASCII rendition of the complete works of William Shakespeare.

In that sense, the impact point of an ideal dart conveys infinite information. [Real darts in the real world do not make that much information available]
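The unending-binary-string idea can be sketched like this (a toy illustration, not from the thread; in reality one samples a float of finite precision, so we fake the ideal case by drawing bits one at a time):

```python
import random

def binary_prefix(n_bits: int, rng: random.Random) -> str:
    """First n_bits of the binary expansion of an idealized uniform
    coordinate in [0, 1): each bit is an independent fair coin flip."""
    return "".join(str(rng.randint(0, 1)) for _ in range(n_bits))

# Any finite prefix narrows the strike point down to an interval,
# but never pins it to a single point: the full expansion is unending.
rng = random.Random(0)
print("0." + binary_prefix(32, rng))
```

No finite prefix exhausts the coordinate, which is the sense in which the ideal impact point carries unbounded information.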
 
