Approximation of the Skellam distribution by a Gaussian

Summary
The discussion focuses on approximating the Skellam distribution, which arises from the difference of two Poisson random variables, using a Gaussian distribution. The Skellam distribution has a mean of λ1 - λ2 and a variance of λ1 + λ2, making it complex to analyze. For large values of λ, a Gaussian approximation is effective, but challenges arise when λ is small. The conversation suggests using maximum likelihood estimation to find parameters for the approximating distribution, emphasizing the need for careful consideration in cases of small λ. Overall, while Gaussian may work in certain scenarios, alternative approaches may be necessary for more general cases.
sabbagh80
Hi, everybody

Let n_1 \sim \mathrm{Poisson}(\lambda_1) and n_2 \sim \mathrm{Poisson}(\lambda_2), and define n = n_1 - n_2. Then n has a Skellam distribution with mean \lambda_1 - \lambda_2 and variance \lambda_1 + \lambda_2, which is not easy to deal with.
I want to find \Pr(n \geq 0). Is it possible to find a good approximation to this probability using a Gaussian distribution? If a Gaussian is not a good candidate, which distribution could I use instead?
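For reference (not stated in the original post, but a standard result), the Skellam pmf has a closed form in terms of the modified Bessel function of the first kind:

```latex
% Skellam pmf for n = n_1 - n_2, with n_i ~ Poisson(lambda_i)
\Pr(n = k)
  = e^{-(\lambda_1+\lambda_2)}
    \left(\frac{\lambda_1}{\lambda_2}\right)^{k/2}
    I_{|k|}\!\left(2\sqrt{\lambda_1\lambda_2}\right),
  \qquad k \in \mathbb{Z}
```

so \Pr(n \geq 0) is a sum of Bessel-function terms, which is why a simpler approximation is attractive.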
 
If at least one of the lambdas is large, the Gaussian with the same mean and variance will be a good approximation.
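As a minimal sketch of this suggestion (the parameter values are illustrative, not from the thread): match the Skellam mean \lambda_1 - \lambda_2 and variance \lambda_1 + \lambda_2, and add a continuity correction of 0.5 since n is integer-valued.

```python
import math

def gaussian_p_nonneg(lam1, lam2):
    """Gaussian approximation to Pr(n >= 0) for n = n1 - n2,
    using the Skellam mean lam1 - lam2 and variance lam1 + lam2.
    The +0.5 is a continuity correction for the integer-valued n."""
    mu = lam1 - lam2
    sigma = math.sqrt(lam1 + lam2)
    # Pr(n >= 0) ~ Pr(N(mu, sigma^2) > -0.5) = Phi((mu + 0.5) / sigma)
    return 0.5 * (1.0 + math.erf((mu + 0.5) / (sigma * math.sqrt(2.0))))

print(gaussian_p_nonneg(20.0, 20.0))
```

For equal, large rates this returns a value just above 0.5, as symmetry of the Skellam distribution would predict.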
 
But that is not always the case. I want to deal with more general cases.
 
Yes, I know. But for small lambda I don't think there is any simpler approximation. Of course, you could in that case just truncate the distributions: if the means are small, the probability of large n is vanishingly small, so leaving those terms out introduces little inaccuracy.
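A sketch of this truncation idea (the cutoff `kmax` is a hypothetical choice, not specified in the thread): since the Poisson tails decay fast for small rates, \Pr(n_1 - n_2 \geq 0) can be computed directly from a truncated double sum over the joint pmf.

```python
import math

def poisson_pmf(k, lam):
    # numerically stable Poisson pmf via log-gamma
    return math.exp(-lam + k * math.log(lam) - math.lgamma(k + 1))

def skellam_p_nonneg(lam1, lam2, kmax=60):
    """Pr(n1 - n2 >= 0) by summing Poisson products over the grid
    0 <= n2 <= n1 <= kmax; the discarded tail beyond kmax is
    negligible for small-to-moderate rates."""
    total = 0.0
    for n1 in range(kmax + 1):
        p2_cum = sum(poisson_pmf(n2, lam2) for n2 in range(n1 + 1))
        total += poisson_pmf(n1, lam1) * p2_cum
    return total

print(skellam_p_nonneg(0.5, 0.5))
```

For \lambda_1 = \lambda_2 the answer must equal (1 + \Pr(n = 0))/2 by symmetry, which gives a handy sanity check on the truncation.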
 
To approximate one distribution with another use maximum likelihood, i.e. maximize
E[\log f(X;t)]
wrt the parameter vector t, where f is the pdf or pmf of the approximating distribution. E.g. solving for the normal distribution we get \mu=E[X] and \sigma^2=E[X^2]-E[X]^2.
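Spelled out for the normal case (my sketch of the step described above, not part of the original reply): with f the N(\mu, \sigma^2) density,

```latex
\mathbb{E}\!\left[\log f(X;\mu,\sigma^2)\right]
  = -\tfrac{1}{2}\log(2\pi\sigma^2)
    - \frac{\mathbb{E}\!\left[(X-\mu)^2\right]}{2\sigma^2},
\qquad
\frac{\partial}{\partial\mu} = 0 \;\Rightarrow\; \mu = \mathbb{E}[X],
\qquad
\frac{\partial}{\partial\sigma^2} = 0 \;\Rightarrow\;
\sigma^2 = \mathbb{E}\!\left[(X-\mu)^2\right] = \mathbb{E}[X^2]-\mathbb{E}[X]^2
```

i.e. the best normal approximation in this sense simply matches the first two moments of X.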
 
bpet said:
To approximate one distribution with another use maximum likelihood, i.e. maximize
E[\log(f(X;t))]
wrt the parameter vector t, where f is the pdf or pmf of the approximating distribution. E.g. solving for the normal distribution we get \mu=E[X] and \sigma^2=E[X^2]-E[X]^2.

Could you please explain that in more detail?
 