
Help with probability/central limit thm.

  1. Jul 30, 2008 #1
    1. The problem statement, all variables and given/known data

    Using moment generating functions, show that as n --> infinity, p --> 0 and np --> lambda, the binomial distribution with parameters n and p tends to the Poisson distribution.

    2. Relevant equations

    We know the mgfs of Binomial and Poisson distributions are, respectively,
    M(t) = [p*exp(t) + 1 - p]^n and M(t) = exp[lambda*(exp(t)-1)].
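    As a quick numerical sanity check that these two formulas really do approach each other (a sketch; the values of lambda and t below are arbitrary picks of mine), set p = lambda/n and let n grow:

        import math

        lam, t = 2.0, 0.5  # illustrative values, chosen arbitrarily
        poisson_mgf = math.exp(lam * (math.exp(t) - 1.0))

        for n in (10, 100, 1000, 10000):
            p = lam / n  # keep np = lambda exactly at every n
            binom_mgf = (p * math.exp(t) + 1.0 - p) ** n
            print(n, binom_mgf, poisson_mgf)

        # the binomial column creeps up toward the Poisson value as n grows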

    Also, relationships between expected values, variance, mgfs :

    E(X^2) - [E(X)]^2 = Var(X), for any random variable X
    M'_X(0) = E(X), where the subscript denotes which random variable's mgf I'm referring to (a quick check of this for the binomial case follows the theorems below).

    Theorem: if the mgf exists for t in an open interval containing zero, it uniquely determines the probability distribution.
    Theorem: Let Fn be a sequence of cumulative distribution functions (in our case the Binomial CDFs) with corresponding moment generating functions Mn. Let F be a CDF with mgf M (in our case the Poisson CDF). If Mn(t) --> M(t) for all t in an open interval containing zero, then Fn(x) --> F(x) at all continuity points of F.
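    As a quick check of the M'_X(0) = E(X) relation against the binomial mgf above (a short derivation, using nothing beyond the formulas already listed):

        \begin{aligned}
        M(t)  &= \left(p e^{t} + 1 - p\right)^{n} \\
        M'(t) &= n p e^{t} \left(p e^{t} + 1 - p\right)^{n-1} \\
        M'(0) &= n p \,(p + 1 - p)^{n-1} = np = E(X).
        \end{aligned}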

    3. The attempt at a solution
    Ok, I had an idea, but I don't know if this is correct. Maybe someone can tell me?

    Let {Xi} (with i= 1...infinity) be a sequence of Binomial random variables with parameters n and p. We know that for each i, the expected value and variance are given by

    E(Xi) = np and Var(Xi) = np(1-p). Now from the problem statement we have that np --> lambda and p --> 0 while n --> infinity (I'm not going to question it anymore, just going to use it).

    Then it must be that, for each i, E(Xi) --> lambda and Var(Xi) --> lambda. But we also know that for a random variable P following a Poisson distribution, E(P) = lambda and Var(P) = lambda. We also know that these probability distributions are uniquely determined by their moment generating functions on some open interval containing zero. Thus (letting primes denote derivatives with respect to t),

    M'_Xi(0) = E(Xi) --> lambda = E(P) = M'_P(0) and similarly,

    M''_Xi(0) = E(Xi^2) = Var(Xi) + [E(Xi)]^2 --> lambda + lambda^2 = Var(P) + [E(P)]^2 = E(P^2) = M''_P(0).

    Now here is where I am really feeling sketchy -- since mgfs UNIQUELY determine probability distributions (I can quote a theorem on the homework), and we have mgfs that exist for all t in some open interval around 0, it must be the case that Mxi(t) --> Mp(t).

    That's all I really would have to show, since I have the second theorem I listed above. But I'm not sure if my last stretch is valid, or actually, if the whole thing is strong enough.

    Any ideas would be SOO great. We just began lecturing on this chapter today and our final exam is Friday (summer terms are so short, they just pack it in...)...

    Last edited: Jul 30, 2008
  3. Jul 30, 2008 #2


    (Science Advisor, Homework Helper)

    This has got to remind you of lim n->infinity (1+x/n)^n=e^x, right? BTW I think you should just say np=lambda, for simplicity.
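    For the record, the hint unpacks like this (a sketch, substituting np = lambda, i.e. p = lambda/n, as suggested):

        \begin{aligned}
        M_n(t) &= \left[p e^{t} + 1 - p\right]^{n}
                = \left[1 + \frac{\lambda\left(e^{t} - 1\right)}{n}\right]^{n} \\
               &\longrightarrow \exp\!\left[\lambda\left(e^{t} - 1\right)\right]
                \quad \text{as } n \to \infty,
        \end{aligned}

    which is exactly the Poisson mgf, applying the limit above with x = lambda*(e^t - 1).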
  4. Jul 31, 2008 #3
    Ah yes, it is sometimes too easy to forget these important limits. I suppose after some algebra it would be easy to end up with exactly this limit, and once I have the e^x form, we'd be in Poisson land. That's actually not so bad at all! Thanks!
  5. Aug 1, 2008 #4


    (Science Advisor, Homework Helper)

    So you got it?
  6. Aug 1, 2008 #5
    I think so!! I was actually looking at a very similar problem, where we can compute the MGF of a continuous random variable and take a Taylor series expansion of it; after some rearranging, the higher-order terms --> 0 as n --> infinity, which left me with (1 + lambda/n)^n, and as n --> infinity this --> e^lambda. I sometimes fail to realize I can use Taylor series... :)
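    For anyone reading along, that limit can also be confirmed symbolically (a sketch using sympy; the symbol names are my own):

        import sympy as sp

        t = sp.symbols('t', real=True)
        n = sp.symbols('n', positive=True)
        lam = sp.symbols('lambda', positive=True)

        # binomial mgf with p = lambda/n substituted in
        M_binom = (1 + lam * (sp.exp(t) - 1) / n) ** n

        # letting n -> oo should recover the Poisson mgf
        print(sp.limit(M_binom, n, sp.oo))  # expect exp(lambda*(exp(t) - 1))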
  7. Aug 1, 2008 #6
    It certainly seems much more rigorous to expand the MGF and then take limits than what I typed in the first post. I thought that was a bit sketchy and didn't quite do the job.