1. The problem statement, all variables and given/known data

Using moment generating functions, show that as n --> infinity and p --> 0 with np --> lambda, the binomial distribution with parameters n and p tends to the Poisson distribution with parameter lambda.

2. Relevant equations

We know the mgfs of Binomial and Poisson distributions are, respectively,

M(t) = [p*exp(t) + 1 - p]^n and M(t) = exp[lambda*(exp(t)-1)].
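As a quick numerical sanity check (not part of the proof, and the function names here are just my own throwaway labels), plugging p = lambda/n into the binomial mgf shows it creeping toward the Poisson mgf as n grows:

```python
import math

# Binomial mgf: M(t) = [p*exp(t) + 1 - p]^n
def binomial_mgf(t, n, p):
    return (p * math.exp(t) + 1 - p) ** n

# Poisson mgf: M(t) = exp[lambda*(exp(t) - 1)]
def poisson_mgf(t, lam):
    return math.exp(lam * (math.exp(t) - 1))

# With p = lam/n (so np = lam exactly), increase n and watch
# the binomial mgf approach the Poisson mgf at a fixed t.
lam, t = 2.0, 0.5
for n in [10, 100, 10000]:
    print(n, binomial_mgf(t, n, lam / n), poisson_mgf(t, lam))
```

For n = 10000 the two values already agree to several decimal places, which is at least consistent with the claim.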

Also, relationships between expected values, variance, mgfs :

E(X^2) - [E(X)]^2 = Var(X), for some random variable X

M'_X(0) = E(X), where the subscript denotes which random variable's mgf I'm referring to; more generally, the k-th derivative gives M^(k)_X(0) = E(X^k).

Theorem: if the mgf exists for t in an open interval containing zero, it uniquely determines the probability distribution.

Theorem: Let {Fn} be a sequence of cumulative distribution functions (CDFs; in our case, the binomial CDFs) with corresponding moment generating functions Mn. Let F be a CDF with mgf M (in our case, the Poisson CDF and mgf). If Mn(t) --> M(t) for all t in an open interval containing zero, then Fn(x) --> F(x) at all continuity points of F.
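To convince myself of what this theorem is promising, here's a small check (again just my own scratch code, not part of the argument) that the binomial CDF with p = lambda/n approaches the Poisson CDF pointwise, using only the standard pmf formulas:

```python
import math

# Binomial CDF: P(X <= x) = sum_{k=0}^{x} C(n,k) p^k (1-p)^(n-k)
def binomial_cdf(x, n, p):
    return sum(math.comb(n, k) * p**k * (1 - p) ** (n - k)
               for k in range(int(x) + 1))

# Poisson CDF: P(X <= x) = exp(-lam) * sum_{k=0}^{x} lam^k / k!
def poisson_cdf(x, lam):
    return math.exp(-lam) * sum(lam**k / math.factorial(k)
                                for k in range(int(x) + 1))

# With p = lam/n, the binomial CDF should converge to the Poisson CDF
# at every point x (the Poisson CDF's continuity points off the integers,
# and in fact here at the integers too).
lam = 3.0
for n in [20, 200, 2000]:
    print(n, binomial_cdf(4, n, lam / n), poisson_cdf(4, lam))
```

The gap shrinks roughly like lambda^2/n, which matches the usual rule of thumb for the Poisson approximation.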

3. The attempt at a solution

Ok, I had an idea, but I don't know if this is correct. Maybe someone can tell me?

Let {Xi} (with i = 1...infinity) be a sequence of binomial random variables, where Xi has parameters ni and pi. We know that for each i, the expected value and variance are given by

E(Xi) = ni*pi and Var(Xi) = ni*pi*(1 - pi). Now from the problem statement we have that ni*pi --> lambda and pi --> 0 while ni --> infinity (I'm not going to question it anymore, just going to use it).

Then it must be that E(Xi) --> lambda and Var(Xi) --> lambda as i --> infinity. But we also know that for a random variable P following a Poisson distribution with parameter lambda, E(P) = lambda and Var(P) = lambda. We also know that these probability distributions are uniquely determined by their moment generating functions on some open interval containing zero. Thus (letting primes denote derivatives with respect to t),

M'_{Xi}(0) = E(Xi) --> lambda = E(P) = M'_P(0) and similarly,

M''_{Xi}(0) = E(Xi^2) = Var(Xi) + [E(Xi)]^2 --> lambda + lambda^2 = Var(P) + [E(P)]^2 = E(P^2) = M''_P(0).

Now here is where I am really feeling sketchy -- since mgfs UNIQUELY determine probability distributions (I can quote a theorem on the homework), and we have mgfs that exist for all t in some open interval around 0, it must be the case that M_{Xi}(t) --> M_P(t).

That's all I really would have to show, since I have the second theorem I listed above. But I'm not sure if my last stretch is valid, or actually, if the whole thing is strong enough.
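Incidentally, the limit can also be taken directly on the binomial mgf itself, which would sidestep the moment-matching worry entirely (a sketch, writing p_n for the n-th success probability, with n*p_n --> lambda):

```latex
M_{X_n}(t) = \left[1 + p_n\,(e^t - 1)\right]^n
           = \left[1 + \frac{n p_n\,(e^t - 1)}{n}\right]^n
           \longrightarrow e^{\lambda(e^t - 1)}
```

using the standard limit (1 + a_n/n)^n --> e^a whenever a_n --> a (here a_n = n*p_n*(e^t - 1) --> lambda*(e^t - 1)). That gives Mn(t) --> M(t) for every t, which is exactly what the second theorem above requires.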

Any ideas would be SOO great. We just began lecturing on this chapter today and our final exam is Friday (summer terms are so short, they just pack it in...)...

Thanks.

# Homework Help: Help with probability/central limit thm.

**Physics Forums | Science Articles, Homework Help, Discussion**