Moments of the Poisson distribution

  • #1
fluidistic
Gold Member

Main Question or Discussion Point

I cannot seem to get the first moment of the Poisson distribution with parameter a: [itex]P(n_1)=\frac{a^{n_1}e^{-a}}{n_1!}[/itex] when using the characteristic function [itex]\phi _X (k)=\exp [a(e^{ik}-1)][/itex].
The definition of the first moment involving the characteristic function is [itex]<n_1>=\frac{1}{i} \frac{d \phi _X (k)}{dk} \big | _{k=0}[/itex].
I get [itex]<n_1>=\frac{1}{i} a(e^{ik}-1)aie^{ik}e^{a(e^{ik}-1)} \big | _{k=0}=0[/itex] because of the factor [itex]e^{ik}-1\big | _{k=0}=1-1=0[/itex].
However, I should get [itex]<n_1>=a[/itex].
I really do not see what I did wrong. I know the characteristic function is correct, according to Wikipedia and MathWorld, and I know the mean (first moment) should equal a, but I keep getting 0.
I've applied the chain rule twice and do not see any mistake, but obviously I'm making at least one somewhere. Any help is welcome!
 

Answers and Replies

  • #2
chiro
Science Advisor
Hey fluidistic.

You are better off using the moment generating function, then differentiating once and setting the parameter t = 0 to get the moment.

The definition of the MGF is M_X(t) = E[e^{tX}].
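For instance, here is a quick symbolic sketch of that recipe (using sympy; the symbol names are just illustrative): build the MGF from the Poisson pmf, differentiate once, and set t = 0.

[code]
# Build the Poisson MGF E[e^{tX}] from the pmf, differentiate once,
# and set t = 0 to read off the first moment (sympy sketch).
import sympy as sp

a, t = sp.symbols('a t', positive=True)
n = sp.symbols('n', integer=True, nonnegative=True)

pmf = a**n * sp.exp(-a) / sp.factorial(n)
mgf = sp.summation(sp.exp(t * n) * pmf, (n, 0, sp.oo))  # exp(a*(exp(t) - 1))

first_moment = sp.diff(mgf, t).subs(t, 0)
print(sp.simplify(first_moment))  # prints: a
[/code]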
 
  • #3
fluidistic
Gold Member
First of all, thanks for helping me.
chiro said:
Hey fluidistic.

You are better off using the moment generating function, then differentiating once and setting the parameter t = 0 to get the moment.

The definition of the MGF is M_X(t) = E[e^{tX}].
Yeah, I know another way (probably the one you mention?) to get the result, namely a.
[itex]<n_1>=\sum _{n_1=0} ^ \infty n_1 \frac{a^{n_1}}{n_1!} e^{-a}=e^{-a} \sum _{n_1=1}^{ \infty } \frac{a^{n_1}}{(n_1-1)!}=e^{-a} \sum _{n_1=1}^{\infty } a \cdot \frac{a^{n_1-1}}{(n_1-1)!}=e^{-a}ae^a=a[/itex]. I've no problem with this result since it is the right answer.
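Just as a numeric sanity check of that sum (a small Python sketch; the value of a is arbitrary and the series is truncated):

[code]
# Numerically sum n * P(n) for the Poisson pmf and compare with a.
import math

a = 2.7  # arbitrary test value for the parameter
mean = sum(n * a**n * math.exp(-a) / math.factorial(n) for n in range(100))
print(mean)  # ~2.7, i.e. equal to a up to truncation error
[/code]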
However, what troubles me a lot is when I want to use the characteristic function. According to Reichl's book "A Modern Course in Statistical Physics", 2nd edition, page 189, it should be easy. Namely
Reichl said:
Furthermore, if we know the characteristic function we can obtain moments by differentiating: [itex]<x^n>= \lim _{k\to 0} (-i)^n \frac{d^n f_X (x)}{dk^n}[/itex]
where I think he meant [itex]f_X (k)[/itex] (as it is defined earlier and used later).
Using that formula I could get the first moment of the binomial distribution. I'm losing my mind trying to understand what's wrong with what I've done for the Poisson distribution. It's not that I don't know how to apply the formula; it's that it gives me 0 no matter what, instead of a.
 
  • #4
chiro
Science Advisor
The MGF and PDF are directly related to the characteristic function.

You might want to look at this relation (basically, an inverse Fourier transform type integral of the MGF over the real line gives back the PDF), and see how the formula you stated above relates to the actual MGF formula for getting moments.

I think this will clear up a lot of problems for you, since you can calculate the MGF easily and you can see analytically how the characteristic function and the MGF are both related to the PDF.
 
  • #5
Mute
Homework Helper
fluidistic said:
I cannot seem to get the first moment of the Poisson distribution with parameter a: [itex]P(n_1)=\frac{a^{n_1}e^{-a}}{n_1!}[/itex] when using the characteristic function [itex]\phi _X (k)=\exp [a(e^{ik}-1)][/itex].
The definition of the first moment involving the characteristic function is [itex]<n_1>=\frac{1}{i} \frac{d \phi _X (k)}{dk} \big | _{k=0}[/itex].
I get [itex]<n_1>=\frac{1}{i} a(e^{ik}-1)aie^{ik}e^{a(e^{ik}-1)} \big | _{k=0}=0[/itex] because of the factor [itex]e^{ik}-1\big | _{k=0}=1-1=0[/itex].
However, I should get [itex]<n_1>=a[/itex].
I really do not see what I did wrong. I know the characteristic function is correct, according to Wikipedia and MathWorld, and I know the mean (first moment) should equal a, but I keep getting 0.
I've applied the chain rule twice and do not see any mistake, but obviously I'm making at least one somewhere. Any help is welcome!
It looks like you made a mistake applying the chain rule.
$$\frac{d}{dk} \exp(f(k)) = \exp(f(k)) \frac{df(k)}{dk};$$
with ##f(k) = a(\exp(ik)-1)## you get ##df/dk = ai \exp(ik)##. The -1 drops out, so the moment doesn't vanish, and you get ##\langle n_1 \rangle = (1/i)(ai) = a##, as expected.
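If it helps, here is a quick symbolic check of the corrected derivative (a sympy sketch; the symbol names are just for illustration):

[code]
# Differentiate the Poisson characteristic function once and evaluate
# (1/i) * d(phi)/dk at k = 0; the result should be the mean, a.
import sympy as sp

a = sp.symbols('a', positive=True)
k = sp.symbols('k', real=True)
phi = sp.exp(a * (sp.exp(sp.I * k) - 1))  # characteristic function

first_moment = (sp.diff(phi, k) / sp.I).subs(k, 0)
print(sp.simplify(first_moment))  # prints: a
[/code]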

chiro said:
The MGF and PDF are directly related to the characteristic function.

You might want to look at this relation (basically, an inverse Fourier transform type integral of the MGF over the real line gives back the PDF), and see how the formula you stated above relates to the actual MGF formula for getting moments.
The inverse Fourier transform of the characteristic function gives the probability density function. Maybe an analogous formula holds for getting the pdf from the moment generating function by manipulating the integral with contour methods (giving some sort of inverse Laplace transform, perhaps), but I haven't seen such a formula in use before.
 
  • #6
chiro
Science Advisor
Mute said:
The inverse Fourier transform of the characteristic function gives the probability density function. Maybe an analogous formula holds for getting the pdf from the moment generating function by manipulating the integral with contour methods (giving some sort of inverse Laplace transform, perhaps), but I haven't seen such a formula in use before.
It's a very simple relationship:

http://en.wikipedia.org/wiki/Moment-generating_function#Other_properties
 
  • #7
Mute
Homework Helper
Yes, I am aware of the relationship between the moment-generating function and the characteristic function.

What I was saying was that the pdf is obtained directly by inverse Fourier transforming the characteristic function, not the moment-generating function. You can of course use the relation ##\varphi_X(t) = M_X(it)## to cosmetically rewrite the inversion formula in terms of the moment-generating function of imaginary argument, but if you're going to stop there you might as well just keep things in terms of the characteristic function.

What I was getting at was that if you perform a rotation of the variables in the complex plane such that ##M_X(it) \rightarrow M_X(t)##, then the inversion formula is no longer an inverse Fourier transform, but rather an inverse Laplace transform, or something like it (there are perhaps some subtleties there; I haven't thought it through carefully, but naively it looks like you'd get an inverse Laplace transform). I was commenting that I haven't seen anyone write down the pdf in terms of an inverse Laplace transform of the moment-generating function, perhaps because it's easier to just inverse Fourier transform the characteristic function.

My goal here was to be precise so that the OP doesn't try to inverse Fourier transform ##M_X(t)## rather than ##\varphi_X(t) = M_X(it)## and then wonder why he's not getting the right answer.
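Concretely, for the Poisson case the relation ##\varphi_X(t) = M_X(it)## is easy to verify symbolically (a small sympy sketch):

[code]
# Verify phi_X(t) = M_X(i t) for the Poisson distribution.
import sympy as sp

lam = sp.symbols('lambda', positive=True)
t = sp.symbols('t', real=True)

mgf = sp.exp(lam * (sp.exp(t) - 1))         # M_X(t)
cf = sp.exp(lam * (sp.exp(sp.I * t) - 1))   # phi_X(t)

print(sp.simplify(mgf.subs(t, sp.I * t) - cf))  # prints: 0
[/code]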
 
  • #8
Mute said:
I was commenting that I haven't seen anyone write down the pdf in terms of an inverse Laplace transform of the moment-generating function, perhaps because it's easier to just inverse Fourier transform the characteristic function.

My goal here was to be precise so that the OP doesn't try to inverse Fourier transform ##M_X(t)## rather than ##\varphi_X(t) = M_X(it)## and then wonder why he's not getting the right answer.
There is no pdf for the Poisson distribution. As a discrete distribution, it is not differentiable.
 
  • #9
Mute
Homework Helper
There is no pdf for the Poisson distribution. As a discrete distribution, it is not differentiable.
Yes, in the spirit of being precise with terminology, of course there is no pdf, but the probability mass function is still given by the inverse Fourier transform.

For example, inverse Fourier transforming the characteristic function of the Poisson distribution,

$$\int_{-\infty}^\infty \frac{dt}{2\pi} e^{\lambda(\exp(it)-1)} e^{-ikt},$$

let's change variables to the complex variable ##z = \exp(it)##. The integral can be written as the following contour integral in the complex plane, which we can then evaluate via the residue theorem, but we are forced to take ##k## to be an integer (as expected):

$$\frac{1}{2\pi i}\oint_\gamma \frac{dz}{z^{k+1}} e^{\lambda z} e^{-\lambda} = \frac{1}{k!}\frac{d^k}{dz^k}\left[e^{\lambda z}\right]_{z=0} e^{-\lambda} = \frac{\lambda^k}{k!}e^{-\lambda},$$
which is, of course, the probability mass function of the Poisson distribution. Here the contour ##\gamma## is a closed contour from ##-R## to ##R## along the real line, closed by a circular arc of radius ##R##, taking the ##R \rightarrow \infty## limit. The contribution from the arc will vanish in this limit, demonstrating equality between the contour integral and the original Fourier integral.

At any rate, the discussion of inverse transforms of characteristic functions to get pdf's (or pmf's) was tangential to the OP's original question, and we should probably not lead the discussion too far off topic.
 
  • #10
fluidistic
Gold Member
Mute said:
It looks like you made a mistake applying the chain rule.
$$\frac{d}{dk} \exp(f(k)) = \exp(f(k)) \frac{df(k)}{dk};$$
with ##f(k) = a(\exp(ik)-1)## you get ##df/dk = ai \exp(ik)##. The -1 drops out, so the moment doesn't vanish, and you get ##\langle n_1 \rangle = (1/i)(ai) = a##, as expected.
Thank you very, very much. I totally overlooked this even after rechecking ten times.
I feel great now.
Thank you guys for all the help and insights.
 
  • #11
Mute
Homework Helper
fluidistic said:
Thank you very, very much. I totally overlooked this even after rechecking ten times.
I feel great now.
Thank you guys for all the help and insights.
Sometimes, no matter how much you stare at something, your brain just can't notice the mistake you made! It happens to the best of us!

Mute said:
Yes, in the spirit of being precise with terminology, of course there is no pdf, but the probability mass function is still given by the inverse Fourier transform.

For example, inverse Fourier transforming the characteristic function of the Poisson distribution,

$$\int_{-\infty}^\infty \frac{dt}{2\pi} e^{\lambda(\exp(it)-1)} e^{-ikt},$$

let's change variables to the complex variable ##z = \exp(it)##. The integral can be written as the following contour integral in the complex plane, which we can then evaluate via the residue theorem, but we are forced to take ##k## to be an integer (as expected):

$$\frac{1}{2\pi i}\oint_\gamma \frac{dz}{z^{k+1}} e^{\lambda z} e^{-\lambda} = \frac{1}{k!}\frac{d^k}{dz^k}\left[e^{\lambda z}\right]_{z=0} e^{-\lambda} = \frac{\lambda^k}{k!}e^{-\lambda},$$
which is, of course, the probability mass function of the Poisson distribution. Here the contour ##\gamma## is a closed contour from ##-R## to ##R## along the real line, closed by a circular arc of radius ##R##, taking the ##R \rightarrow \infty## limit. The contribution from the arc will vanish in this limit, demonstrating equality between the contour integral and the original Fourier integral.

At any rate, the discussion of inverse transforms of characteristic functions to get pdf's (or pmf's) was tangential to the OP's original question, and we should probably not lead the discussion too far off topic.
I know I said we shouldn't focus on this too much, but I should be more careful than I was in that post, and correct some errors here:

For discrete probability distributions, the characteristic function, defined as the expectation of ##\exp(ikt)## (with ##k## the random variable), can be viewed as a Fourier series:

$$\varphi_X(t) = \mathbb{E}[e^{ikt}] = \sum_{k=-\infty}^\infty P(k)e^{ikt},$$
(where P(k) may be zero for some k).

This means that P(k) can be found via

$$P(k) = \int_{-\pi}^\pi \frac{dt}{2\pi} \varphi_X(t)e^{-ikt};$$
note the difference from my previous post. Here, the inversion is over a finite interval ##[-\pi,\pi)##, rather than the real line, due to the periodicity of the characteristic function in the discrete case (hence why it is a Fourier series of the probability mass function rather than a Fourier transform of the probability density function).

As such, the example I gave in my previous post should actually read

$$\int_{-\pi}^\pi \frac{dt}{2\pi}e^{\lambda(\exp(it)-1)} e^{-ikt} = \frac{1}{2\pi i}\oint_\gamma \frac{dz}{z^{k+1}} e^{\lambda z} e^{-\lambda} = \frac{1}{k!}\frac{d^k}{dz^k}\left[e^{\lambda z}\right]_{z=0} e^{-\lambda} = \frac{\lambda^k}{k!}e^{-\lambda},$$
where the contour ##\gamma## is the full circle ##|z| = 1##, not the infinite semi-circle in the incorrect derivation.
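As a numeric sanity check of this corrected formula, here is a small sketch (using scipy's quad; lam and k below are arbitrary test values, and only the real part of the integrand is integrated since the imaginary part cancels by symmetry):

[code]
# Integrate the characteristic function over [-pi, pi] and compare
# with the Poisson pmf lambda^k exp(-lambda) / k!.
import cmath
import math
from scipy.integrate import quad

lam, k = 3.0, 4  # arbitrary test values

def integrand(t):
    # real part of phi_X(t) * exp(-i k t); the imaginary part cancels
    return (cmath.exp(lam * (cmath.exp(1j * t) - 1)) * cmath.exp(-1j * k * t)).real

pmf_from_cf = quad(integrand, -math.pi, math.pi)[0] / (2 * math.pi)
pmf_direct = lam**k * math.exp(-lam) / math.factorial(k)

print(pmf_from_cf, pmf_direct)  # both ~0.16803
[/code]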
 
