Calculating the Joint PMF of Two Independent Poisson Random Variables

SUMMARY

The discussion focuses on calculating the joint probability mass function (PMF) of X and X+Y, where X and Y are independent Poisson random variables with parameters λ and μ respectively. Although X+Y depends on X, the joint PMF p_{X,X+Y}(k, n) can still be written down from the individual PMFs of X and Y, because the event {X = k, X + Y = n} is the same as {X = k, Y = n − k} and independence then lets the probability factor. Summing that product over k is exactly the convolution formula, which also shows that X + Y is distributed as Poisson(λ + μ).
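The summary notes that X + Y ~ Poisson(λ + μ). A minimal numerical sketch (not from the thread; parameter values are illustrative) checking that the convolution sum of two independent Poisson pmfs agrees with the Poisson(λ + μ) pmf:

```python
from math import exp, factorial

def pois_pmf(k, lam):
    """Poisson(lam) pmf at k: e^{-lam} lam^k / k!."""
    return exp(-lam) * lam**k / factorial(k)

lam, mu = 2.0, 3.0  # illustrative parameters
n = 4

# Marginal pmf of X+Y at n via the convolution sum over k = 0..n
conv = sum(pois_pmf(k, lam) * pois_pmf(n - k, mu) for k in range(n + 1))

# Direct Poisson(lam + mu) pmf at n -- should agree
direct = pois_pmf(n, lam + mu)
print(conv, direct)
```

The agreement (up to floating-point precision) is the binomial theorem in disguise: the convolution sum collapses to e^{−(λ+μ)}(λ+μ)^n / n!.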

PREREQUISITES
  • Understanding of Poisson distributions, specifically Poisson(λ) and Poisson(μ)
  • Familiarity with the concept of probability mass functions (PMFs)
  • Knowledge of the convolution theorem in probability theory
  • Basic skills in probability calculations and independence of random variables
NEXT STEPS
  • Study the convolution theorem in detail to understand its application in probability
  • Explore the properties of independent random variables and their PMFs
  • Learn about the derivation of joint distributions for independent random variables
  • Practice calculating joint PMFs using examples involving Poisson distributions
USEFUL FOR

Students and professionals in statistics, data science, and probability theory, particularly those working with Poisson processes and joint distributions.

chili237
X ~ Pois(λ) ⟹ p_X(k) = e^(−λ) λ^k / k!

Y ~ Pois(μ) ⟹ p_Y(k) = e^(−μ) μ^k / k!

Find p_{X,X+Y}(k, n) = P(X = k, X + Y = n).

...I know that X + Y ~ Pois(λ + μ), so its pmf is known.

As I understand it, the joint pmf of two independent random variables is the product of the two individual pmfs. However, since X + Y is dependent on X, I got really stuck trying to think about this one and how to set it up.

Any help would be great. Thanks :)
 
chili237 said:
Find p_{X,X+Y}(k, n) = P(X = k, X + Y = n) ... as X + Y is dependent on X, I got really stuck trying to think about how to set it up.

The distribution of a sum such as X + Y can in general be found through convolution. The requirement is that all random variables in the sum (here X and Y, but there could be more, e.g. X + Y + Z + W) be independent.

Do you know about the convolution theorem? If not, have you made any attempts at the problem? If so, could you please show them so we can help you?
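To make the hint above concrete: the key observation is that the event {X = k, X + Y = n} is identical to {X = k, Y = n − k}, so independence of X and Y lets the joint pmf factor. A hedged numerical sketch (illustrative parameter values, not from the thread):

```python
from math import exp, factorial

def pois_pmf(k, lam):
    """Poisson(lam) pmf at k."""
    return exp(-lam) * lam**k / factorial(k)

lam, mu = 2.0, 3.0  # illustrative parameters

def joint_pmf(k, n):
    """P(X = k, X + Y = n).

    {X = k, X + Y = n} = {X = k, Y = n - k}, and X, Y are
    independent, so the probability factors into p_X(k) * p_Y(n-k).
    The event is impossible unless 0 <= k <= n.
    """
    if k < 0 or k > n:
        return 0.0
    return pois_pmf(k, lam) * pois_pmf(n - k, mu)

# Sanity check: summing the joint pmf over k recovers the
# Poisson(lam + mu) pmf of X + Y at n (the convolution formula).
n = 5
total = sum(joint_pmf(k, n) for k in range(n + 1))
print(total, pois_pmf(n, lam + mu))
```

Dividing `joint_pmf(k, n)` by the Poisson(λ + μ) pmf at n also shows that, conditional on X + Y = n, X is Binomial(n, λ/(λ+μ)) — a standard consequence of this factorization.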
 
I'm completely new to probability, so I'm learning as I go. The convolution theorem isn't something I know or have in any of my materials, but I'll do some research and hopefully that'll point me in the right direction.
 
