
Moments of the Poisson distribution

  1. Sep 2, 2012 #1

    fluidistic

    Gold Member

    I cannot seem to get the first moment of Poisson's distribution with parameter a: [itex]P(n_1)=\frac{a^{n_1}e^{-a}}{n_1!}[/itex] when using the characteristic function [itex]\phi _X (k)=\exp [a(e^{ik}-1)][/itex].
The definition of the first moment involving the characteristic function is [itex]<n_1>=\frac{1}{i} \frac{d \phi _X (k)}{dk} \big | _{k=0}[/itex].
    I get [itex]<n_1>=\frac{1}{i} a(e^{ik}-1)aie^{ik}e^{a(e^{ik}-1)} \big | _{k=0}=0[/itex] because of the factor [itex]e^{ik}-1\big | _{k=0}=1-1=0[/itex].
    However I should reach [itex]<n_1>=a[/itex].
I really do not see what I did wrong. I do know the characteristic function is correct, according to Wikipedia and MathWorld, and I also know the mean (first moment) should be [itex]a[/itex], yet I'm getting 0.
I've applied the chain rule twice and cannot spot any mistake, but obviously I'm making at least one somewhere. Any help is welcome!
     
  3. Sep 3, 2012 #2

    chiro

    Science Advisor

    Hey fluidistic.

You are better off using the moment generating function: differentiate it once and set the parameter t = 0 to get the first moment.

The definition of the MGF is [itex]M_X (t) = E[e^{tX}][/itex].
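For example, here is a small symbolic sketch of that approach (assuming sympy is available, and writing a for the Poisson parameter):

[code]
# A sketch with sympy: differentiate the Poisson MGF, M(t) = exp(a*(e^t - 1)),
# once and set t = 0 to read off the first moment.
import sympy as sp

t = sp.symbols('t')
a = sp.symbols('a', positive=True)
M = sp.exp(a * (sp.exp(t) - 1))          # MGF of a Poisson distribution with parameter a
first_moment = sp.diff(M, t).subs(t, 0)  # d/dt M(t) at t = 0
print(sp.simplify(first_moment))         # prints: a
[/code]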
     
  4. Sep 3, 2012 #3

    fluidistic

    Gold Member

    First of all, thanks for helping me.
Yeah, I know another way (probably the one you mention?) to get the result, namely [itex]a[/itex]:
[itex]<n_1>=\sum _{n_1=0} ^ \infty n_1 \frac{a^{n_1}}{n_1!} e^{-a}=e^{-a} \sum _{n_1=1}^{ \infty } \frac{a^{n_1}}{(n_1-1)!}=e^{-a} \sum _{n_1=1}^{\infty } a \cdot \frac{a^{n_1-1}}{(n_1-1)!}=e^{-a}ae^a=a[/itex]. I have no problem with this result since it is the right answer.
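(As a quick numerical check of this sum, a small sketch using only the standard library; a = 2.5 is an arbitrary choice:)

[code]
# Truncate the series for <n_1> at n = 100 and compare against a.
import math

a = 2.5
mean = sum(n * a**n * math.exp(-a) / math.factorial(n) for n in range(100))
print(mean)  # ~ 2.5, i.e. equal to a
[/code]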
However, what troubles me a lot is when I want to use the characteristic function. According to Reichl's book "A Modern Course in Statistical Physics", 2nd edition, page 189, it should be easy. Namely [itex]<n^r>=\frac{1}{i^r} \frac{d^r f(k)}{dk^r} \big | _{k=0}[/itex], where I think he meant [itex]f_X (k)[/itex] (as it is defined earlier and employed further on).
Using that formula I could get the first moment of the binomial distribution. I'm losing my mind trying to understand what's wrong with what I've done for the Poisson distribution. It's not that I don't know how to apply the formula; it's that it gives me 0 no matter what, instead of [itex]a[/itex].
     
  5. Sep 3, 2012 #4

    chiro

    Science Advisor

    The MGF and PDF are directly related to the characteristic function.

You might want to look at this relation (basically, an inverse-Fourier-transform-type integral of the MGF over the real line gives back the PDF), and see how the formula you stated above relates to the actual MGF formula for getting moments.

I think this will clear up a lot of problems for you, since you can calculate the MGF easily and you can see analytically how the characteristic function and the MGF are both related to the PDF.
     
  6. Sep 4, 2012 #5

    Mute

    Homework Helper

    It looks like you made a mistake applying the chain rule.
    $$\frac{d}{dk} \exp(f(k)) = \exp(f(k)) \frac{df(k)}{dk};$$
    with ##f(k) = a(\exp(ik)-1)## you get ##df/dk = ai \exp(ik)##. The -1 drops out, so the moment doesn't vanish, and you get ##\langle n_1 \rangle = (1/i)(ai) = a##, as expected.
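A quick numerical cross-check (a sketch using only Python's standard library; a = 2.5 is an arbitrary choice) confirms that the derivative at ##k=0## is ##ai##, not 0:

[code]
# Estimate phi'(0) for phi(k) = exp(a*(e^{ik} - 1)) with a central difference,
# then divide by i to get the first moment.
import cmath

a = 2.5

def phi(k):
    return cmath.exp(a * (cmath.exp(1j * k) - 1))  # characteristic function

h = 1e-6
deriv = (phi(h) - phi(-h)) / (2 * h)  # approximates phi'(0) = a*i
print(deriv / 1j)                     # ~ (2.5+0j), i.e. <n_1> = a, not 0
[/code]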

The inverse Fourier transform of the characteristic function gives the probability density function. Maybe an analogous formula holds for getting the pdf from the moment-generating function by manipulating the integral with contour methods (giving some sort of inverse Laplace transform, perhaps), but I haven't seen such a formula in use before.
     
    Last edited: Sep 4, 2012
  7. Sep 4, 2012 #6

    chiro

    Science Advisor

    It's a very simple relationship:

    http://en.wikipedia.org/wiki/Moment-generating_function#Other_properties
     
  8. Sep 4, 2012 #7

    Mute

    Homework Helper

    Yes, I am aware of the relationship between the moment-generating function and the characteristic function.

What I was saying was that the pdf is obtained directly by inverse Fourier transforming the characteristic function, not the moment-generating function. You can of course use the relation ##\varphi_X(t) = M_X(it)## to cosmetically rewrite the inversion formula in terms of the moment-generating function of imaginary argument, but if you're going to stop there you might as well just keep things in terms of the characteristic function.
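(For concreteness, a small symbolic sketch of that relation for the Poisson case, assuming sympy is available:)

[code]
# For Poisson(a), substituting t -> i*t in the MGF reproduces the
# characteristic function, i.e. phi_X(t) = M_X(i t).
import sympy as sp

t = sp.symbols('t', real=True)
a = sp.symbols('a', positive=True)
M = sp.exp(a * (sp.exp(t) - 1))                # MGF
phi = sp.exp(a * (sp.exp(sp.I * t) - 1))       # characteristic function
print(sp.simplify(M.subs(t, sp.I * t) - phi))  # prints: 0
[/code]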

What I was getting at was that if you perform a rotation of the variables in the complex plane such that ##M_X(it) \rightarrow M_X(t)##, then the inversion formula is no longer an inverse Fourier transform, but rather an inverse Laplace transform, or something like it (there are perhaps some subtleties there; I haven't thought it through carefully, but naively it looks like you'd get an inverse Laplace transform). I was commenting that I haven't seen anyone write down the pdf in terms of an inverse Laplace transform of the moment-generating function, perhaps because it's easier to just inverse Fourier transform the characteristic function.

My goal here was to be precise so that the OP doesn't try to inverse Fourier transform ##M_X(t)## rather than ##\varphi_X(t) = M_X(it)## and then wonder why he's not getting the right answer.
     
  9. Sep 4, 2012 #8
There is no pdf for the Poisson distribution. As a discrete distribution, it has no density; its CDF is not differentiable.
     
    Last edited: Sep 4, 2012
  10. Sep 4, 2012 #9

    Mute

    Homework Helper

Yes, in the spirit of being precise with terminology: of course there is no pdf, but the probability mass function is still given by the inverse Fourier transform.

For example, inverse Fourier transforming the characteristic function of the Poisson distribution,

    $$\int_{-\infty}^\infty \frac{dt}{2\pi} e^{\lambda(\exp(it)-1)} e^{-ikt},$$

    let's change variables to the complex variable ##z = \exp(it)##. The integral can be written as the following contour integral in the complex plane, which we can then evaluate via the residue theorem, but we are forced to take ##k## to be an integer (as expected):

    $$\frac{1}{2\pi i}\oint_\gamma \frac{dz}{z^{k+1}} e^{\lambda z} e^{-\lambda} = \frac{1}{k!}\frac{d^k}{dz^k}\left[e^{\lambda z}\right]_{z=0} e^{-\lambda} = \frac{\lambda^k}{k!}e^{-\lambda},$$
which is, of course, the probability mass function of the Poisson distribution. Here the contour ##\gamma## is a closed contour from ##-R## to ##R## along the real line, closed by a circular arc of radius ##R##, taking the ##R \rightarrow \infty## limit. The contribution from the arc will vanish in this limit, demonstrating equality between the contour integral and the original Fourier integral.

    At any rate, the discussion of inverse transforms of characteristic functions to get pdf's (or pmf's) was tangential to the OP's original question, and we should probably not lead the discussion too far off topic.
     
  11. Sep 4, 2012 #10

    fluidistic

    Gold Member

Thank you very, very much. I totally overlooked this even after rechecking ten times.
I feel great now.
Thank you guys for all the help and insights.
     
  12. Sep 4, 2012 #11

    Mute

    Homework Helper

    Sometimes no matter how much you stare at something your brain just can't notice the mistake you made! It happens to the best of us!

I know I said we shouldn't focus on this too much, but I should be more careful than I was in that post, so let me correct some errors here:

For discrete probability distributions, the characteristic function, defined as the expectation of ##\exp(ikt)## (with ##k## the random variable), can be viewed as a Fourier series:

$$\varphi_X(t) = \mathbb{E}[e^{ikt}] = \sum_{k=-\infty}^\infty P(k)e^{ikt},$$
    (where P(k) may be zero for some k).

    This means that P(k) can be found via

    $$P(k) = \int_{-\pi}^\pi \frac{dt}{2\pi} \varphi_X(t)e^{-ikt};$$
note the difference from my previous post. Here, the inversion is over a finite interval ##[-\pi,\pi)##, rather than the real line, due to the periodicity of the characteristic function in the discrete case (hence why it is a Fourier series of the probability mass function rather than a Fourier transform of the probability density function).

    As such, the example I gave in my previous post should actually read

    $$\int_{-\pi}^\pi \frac{dt}{2\pi}e^{\lambda(\exp(it)-1)} e^{-ikt} = \frac{1}{2\pi i}\oint_\gamma \frac{dz}{z^{k+1}} e^{\lambda z} e^{-\lambda} = \frac{1}{k!}\frac{d^k}{dz^k}\left[e^{\lambda z}\right]_{z=0} e^{-\lambda} = \frac{\lambda^k}{k!}e^{-\lambda},$$
    where the contour ##\gamma## is the full circle ##|z| = 1##, not the infinite semi-circle in the incorrect derivation.
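(To close the loop, a quick numerical sketch, using only the standard library, confirms this inversion formula; ##\lambda = 2.5## and ##k = 3## are arbitrary choices:)

[code]
# Integrate the Poisson characteristic function against exp(-ikt) over
# [-pi, pi] with a Riemann sum and compare against the pmf directly.
import cmath, math

lam, k = 2.5, 3
N = 100000
dt = 2 * math.pi / N
total = 0.0
for j in range(N):
    t = -math.pi + j * dt
    total += (cmath.exp(lam * (cmath.exp(1j * t) - 1)) * cmath.exp(-1j * k * t)).real
pmf = total * dt / (2 * math.pi)
print(pmf)                                          # ~ 0.2138
print(lam**k * math.exp(-lam) / math.factorial(k))  # ~ 0.2138 as well
[/code]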
     