Poisson distribution with conditional probability


Discussion Overview

The discussion revolves around computing conditional probabilities within the context of a Poisson distribution. Participants explore the implications of the memorylessness property and its applicability to Poisson distributions, particularly in calculating probabilities like P(X > x1 | X > x2) where x1 > x2.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant questions how to compute P(X > x1 | X > x2) for a Poisson distribution and whether the memorylessness property applies.
  • Another participant suggests that if x1 > x2, then P(X > x1 ∩ X > x2) should equal P(X > x1), but seeks confirmation.
  • A participant asserts that events in a Poisson distribution are independent, yet challenges the independence of specific probabilities like P(X >= 70) and P(X >= 80), suggesting that knowing at least 70 events have occurred affects the probability of at least 80 events occurring.
  • Further clarification is provided regarding the relationship between subsets of events and their probabilities, referencing Bayes' theorem in the context of conditional probabilities.
  • A participant cites a claim from a Wikipedia article about memorylessness, noting that only geometric distributions are memoryless among discrete probability distributions, which they find surprising.

Areas of Agreement / Disagreement

Participants express differing views on the independence of events in a Poisson distribution and the implications of memorylessness. The discussion remains unresolved regarding the application of these concepts to the specific conditional probabilities in question.

Contextual Notes

Participants reference the memorylessness property and its relation to different types of distributions, highlighting potential limitations in understanding how these properties apply to Poisson distributions. There is an ongoing exploration of the implications of conditional probabilities without reaching a consensus.

Woolyabyss
Hi guys,
I have a question about computing conditional probabilities of a Poisson distribution.
Say we have a Poisson distribution ##P(X = x) = e^{-\lambda}\lambda^x/x!## where X is the number of events that occur.
My question is: how would we compute P(X > x1 | X > x2), or more specifically P(X > x1 ∩ X > x2), with x1 > x2?
I originally thought that P(X > x1 ∩ X > x2) = P(X > x1), but I recently read about the memorylessness property of exponential distributions and I'm not sure whether it applies to Poisson distributions.
 
Woolyabyss said:
I originally thought that P(X > x1 ∩ X > x2) = P(X > x1)

If x1 > x2 then ##\{X: X > x1, X > x2\}## is the same event as ##\{X:X>x1\}## , isn't it?
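This set identity is easy to confirm numerically. A minimal sketch (the rate ##\lambda## and thresholds below are made up for illustration), sampling Poisson variates with Knuth's multiplication algorithm:

```python
import math
import random

def sample_poisson(lam, rng):
    """Knuth's algorithm: multiply uniforms until the product drops below exp(-lam)."""
    limit, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= limit:
            return k
        k += 1

rng = random.Random(0)
samples = [sample_poisson(6.0, rng) for _ in range(100_000)]

x1, x2 = 8, 5  # hypothetical thresholds with x1 > x2
both = sum(1 for x in samples if x > x1 and x > x2)
just_x1 = sum(1 for x in samples if x > x1)
# The two counts are identical: X > x1 already implies X > x2,
# so {X > x1, X > x2} is the same event as {X > x1}.
```

The counts agree exactly on every run, because the equality is a set identity rather than a distributional fact.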
 
Stephen Tashi said:
If x1 > x2 then ##\{X: X > x1, X > x2\}## is the same event as ##\{X:X>x1\}## , isn't it?
Yes. Also, I know that the events X = x of a Poisson distribution are independent of one another, but surely P(X >= 70) and P(X >= 80), for example, can't be, because given that at least 70 events happen, the probability that at least 80 events happen would change, no?
 
Woolyabyss said:
but surely P(X >= 70) and P(X >= 80) for example can't be, because given that at least 70 events happen, the probability that at least 80 events happen would change, no?

But my remark wasn't about the independence of events. If ##A \subset B ## then ##Pr(A \cap B) = Pr(A)##.

As far as independence goes, in most cases if ##A \subset B## then ##A## and ##B## are not independent events. Exceptions would be cases like ##Pr(A) = Pr(B) = 0 ## or ##Pr(A) = Pr(B) = 1 ##.

To find ##Pr(X > 80 | X > 70)##, what does Bayes' theorem tell you?

In the current Wikipedia article on "Memorylessness" https://en.wikipedia.org/wiki/Memorylessness there is the interesting claim:

The only memoryless discrete probability distributions are the geometric distributions, which feature the number of independent Bernoulli trials needed to get one "success," with a fixed probability p of "success" on each trial. In other words those are the distributions of waiting time in a Bernoulli process.

- surprising (to me), if true.
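The claim can be spot-checked numerically. A sketch (parameters p, m, n, and ##\lambda## below are arbitrary choices for the check): the memorylessness identity ##P(X > m + n \mid X > m) = P(X > n)## holds exactly for a geometric distribution but fails for a Poisson distribution.

```python
import math

def geom_sf(n, p):
    """P(X > n) when X is the number of Bernoulli(p) trials needed for the first success."""
    return (1.0 - p) ** n

def poisson_sf(k, lam):
    """Survival function P(X > k) for X ~ Poisson(lam)."""
    return 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k + 1))

p, m, n = 0.3, 5, 7  # arbitrary parameters for the check

# Geometric: P(X > m+n | X > m) = (1-p)^(m+n) / (1-p)^m = (1-p)^n = P(X > n).
geom_lhs = geom_sf(m + n, p) / geom_sf(m, p)
geom_rhs = geom_sf(n, p)

# Poisson: the same identity does not hold.
lam = 4.0
pois_lhs = poisson_sf(m + n, lam) / poisson_sf(m, lam)
pois_rhs = poisson_sf(n, lam)
```

For the geometric distribution the two sides agree to floating-point precision; for the Poisson distribution they differ substantially, consistent with the Wikipedia claim.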
 
