How Do You Find the MLE of Lambda in a Sum of Two Poisson Distributions?

Summary:
To find the maximum likelihood estimator (MLE) of lambda (L) in this problem, the setup defines X ~ Poisson(nL) and Y ~ Poisson(mL), where m and n are natural numbers, together with S = aX + bY for real constants a and b. A key fact is that the sum of two independent Poisson variables (the case a = b = 1) is again Poisson, with parameter equal to the sum of the individual parameters, here (n+m)L; a short derivation of this is sketched below. The difficulty raised in the thread is identifying the probability mass function (pmf) of S, which the poster believes is needed to compute the MLE.
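As a hedged aside (my own illustration of the claim above, assuming X and Y are independent, which the thread does not state explicitly), the convolution argument for the case a = b = 1, so that S = X + Y, runs as follows:

$$P(S=s)=\sum_{k=0}^{s}P(X=k)\,P(Y=s-k)=\sum_{k=0}^{s}\frac{e^{-nL}(nL)^{k}}{k!}\cdot\frac{e^{-mL}(mL)^{s-k}}{(s-k)!}=\frac{e^{-(n+m)L}\bigl((n+m)L\bigr)^{s}}{s!},$$

where the last step uses the binomial theorem, so S ~ Poisson((n+m)L) in that special case.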
JGalway
First of all, I will use L to denote lambda, the parameter of the distribution.
X ~ Poisson(nL), n $\in\Bbb{N}$,
Y ~ Poisson(mL), m $\in\Bbb{N}$ with m $\ne$ n,
S = aX + bY, where a and b are real constants.
Given observations x and y, find the maximum likelihood estimator of L.

The problem is that I don't know the pmf of S, which as far as I know is needed to get the MLE.
Thanks in advance for any feedback.
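For what it's worth, here is a minimal sketch (my own illustration, not from the thread), assuming X and Y are independent and that the observed counts x and y enter the likelihood directly. Under that assumption the joint log-likelihood is $\ell(L) = -(n+m)L + (x+y)\ln L + \text{const}$, and setting its derivative to zero gives $\hat L = (x+y)/(n+m)$ without needing the pmf of S. The observation values below are hypothetical.

Code:
# Sketch: MLE of L from independent X ~ Poisson(nL), Y ~ Poisson(mL),
# given observed counts x and y. Not the thread's solution, just an illustration.
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

def mle_lambda(x, y, n, m):
    """Closed-form MLE under independence: L_hat = (x + y) / (n + m)."""
    return (x + y) / (n + m)

def mle_lambda_numeric(x, y, n, m):
    """Numerical check: maximize the joint log-likelihood over L > 0."""
    def neg_log_lik(L):
        # Joint log-likelihood of the two independent observations, negated.
        return -(poisson.logpmf(x, n * L) + poisson.logpmf(y, m * L))
    res = minimize_scalar(neg_log_lik, bounds=(1e-9, 1e6), method="bounded")
    return res.x

# Hypothetical observations and multipliers
x, y, n, m = 7, 12, 2, 3
print(mle_lambda(x, y, n, m))          # 3.8
print(mle_lambda_numeric(x, y, n, m))  # approximately 3.8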
 
