Moments of the Poisson distribution

In summary: The first moment of the Poisson distribution with parameter a can be obtained using the characteristic function of the distribution. However, a mistake was made when applying the chain rule, leading to a result of 0 instead of the expected value a. The inverse Fourier transform of the characteristic function recovers the probability distribution (the pmf in the discrete case), while an inverse Laplace transform of the moment-generating function could in principle do the same, but is not commonly used.
  • #1
fluidistic
Gold Member
I cannot seem to get the first moment of Poisson's distribution with parameter a: [itex]P(n_1)=\frac{a^{n_1}e^{-a}}{n_1!}[/itex] when using the characteristic function [itex]\phi _X (k)=\exp [a(e^{ik}-1)][/itex].
The definition of the first moment involving the characteristic function is [itex]<n_1>=\frac{1}{i} \frac{d \phi _X (k)}{dk} \big | _{k=0}[/itex].
I get [itex]<n_1>=\frac{1}{i} a(e^{ik}-1)aie^{ik}e^{a(e^{ik}-1)} \big | _{k=0}=0[/itex] because of the factor [itex]e^{ik}-1\big | _{k=0}=1-1=0[/itex].
However I should reach [itex]<n_1>=a[/itex].
I really do not see what I did wrong. I do know that the characteristic function is OK, according to Wikipedia and MathWorld. I also know that the mean, or first moment, should equal "a", but I'm getting 0.
I've applied the chain rule twice and do not see any mistake, but obviously I'm making at least one somewhere. Any help is welcome!
 
  • #2
Hey fluidistic.

You are better off using a moment generating function, then differentiating once and setting the parameter t = 0 to get the moment.

The definition of the MGF is MGF_X(t) = E[e^tX].
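
For the Poisson case this works out quickly. As a sketch (the series evaluation below is standard; the thread does not spell it out):

$$M_X(t) = E[e^{tX}] = \sum_{n=0}^\infty e^{tn}\frac{a^n e^{-a}}{n!} = e^{a(e^t - 1)}, \qquad M_X'(t) = a e^t\, M_X(t), \qquad M_X'(0) = a.$$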
 
  • #3
First of all, thanks for helping me.
chiro said:
Hey fluidistic.

You are better off using a moment generating function, then differentiating once and setting the parameter t = 0 to get the moment.

The definition of the MGF is MGF_X(t) = E[e^tX].
Yeah I know another way (probably the one you mention?) to get the result, namely "a".
[itex]<n_1>=\sum _{n_1=0} ^ \infty n_1 \frac{a^{n_1}}{n_1!} e^{-a}=e^{-a} \sum _{n_1=1}^{ \infty } \frac{a^{n_1}}{(n_1-1)!}=e^{-a} \sum _{n_1=1}^{\infty } a \cdot \frac{a^{n_1-1}}{(n_1-1)!}=e^{-a}ae^a=a[/itex]. I've no problem with this result since it is the right answer.
However what troubles me a lot is when I want to use the characteristic function. According to Reichl's book "A Modern Course in Statistical Physics", 2nd edition, page 189, it should be easy. Namely
Reichl said:
Furthermore, if we know the characteristic function we can obtain moments by differentiating: [itex]<x^n>= \lim _{k\to 0} (-i)^n \frac{d^n f_X (x)}{dk^n}[/itex]
where I think he meant [itex]f_X (k)[/itex] (as it is defined earlier and used later).
Using that given formula I could get the first moment of the binomial distribution. I'm losing my mind trying to understand what's wrong with what I've done for the Poisson distribution. It's not that I don't know how to apply the formula; it's that it gives me 0 no matter what, instead of "a".
 
  • #4
The MGF and PDF are directly related to the characteristic function.

You might want to look at this relation (basically, an inverse Fourier transform type integral on the MGF over the real line gives back the PDF), and see how the formula you have stated above relates to the actual MGF formula for getting moments.

I think this will clear up a lot of problems for you, since you can calculate the MGF easily and you can see analytically how the characteristic function and the MGF are both related to the PDF.
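
For reference, the inversion relation being alluded to here is presumably the standard one for the characteristic function (stated for a continuous distribution; the discrete case is addressed later in the thread):

$$f_X(x) = \frac{1}{2\pi}\int_{-\infty}^\infty \varphi_X(t)\, e^{-itx}\, dt.$$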
 
  • #5
fluidistic said:
I cannot seem to get the first moment of Poisson's distribution with parameter a: [itex]P(n_1)=\frac{a^{n_1}e^{-a}}{n_1!}[/itex] when using the characteristic function [itex]\phi _X (k)=\exp [a(e^{ik}-1)][/itex].
The definition of the first moment involving the characteristic function is [itex]<n_1>=\frac{1}{i} \frac{d \phi _X (k)}{dk} \big | _{k=0}[/itex].
I get [itex]<n_1>=\frac{1}{i} a(e^{ik}-1)aie^{ik}e^{a(e^{ik}-1)} \big | _{k=0}=0[/itex] because of the factor [itex]e^{ik}-1\big | _{k=0}=1-1=0[/itex].
However I should reach [itex]<n_1>=a[/itex].
I really do not see what I did wrong. I do know that the characteristic function is OK, according to Wikipedia and MathWorld. I also know that the mean, or first moment, should equal "a", but I'm getting 0.
I've applied the chain rule twice and do not see any mistake, but obviously I'm making at least one somewhere. Any help is welcome!

It looks like you made a mistake applying the chain rule.
$$\frac{d}{dk} \exp(f(k)) = \exp(f(k)) \frac{df(k)}{dk};$$
with ##f(k) = a(\exp(ik)-1)## you get ##df/dk = ai \exp(ik)##. The -1 drops out, so the moment doesn't vanish, and you get ##\langle n_1 \rangle = (1/i)(ai) = a##, as expected.
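
A quick way to double-check the corrected derivative (not part of the thread; a minimal sketch using sympy, assuming it is installed):

[code]
import sympy as sp

a = sp.symbols('a', positive=True)
k = sp.symbols('k', real=True)

# Characteristic function of the Poisson distribution with parameter a
phi = sp.exp(a * (sp.exp(sp.I * k) - 1))

# First moment: (1/i) * d(phi)/dk evaluated at k = 0
first_moment = (sp.diff(phi, k) / sp.I).subs(k, 0)
print(sp.simplify(first_moment))  # prints: a
[/code]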

chiro said:
The MGF and PDF are directly related to the characteristic function.

You might want to look at this relation (basically, an inverse Fourier transform type integral on the MGF over the real line gives back the PDF), and see how the formula you have stated above relates to the actual MGF formula for getting moments.

The inverse Fourier transform of the characteristic function gives the probability density function. Maybe an analogous formula holds for getting the pdf from the moment-generating function by manipulating the integral with contour methods (giving some sort of inverse Laplace transform, perhaps), but I haven't seen such a formula in use before.
 
  • #6
Mute said:
The inverse Fourier transform of the characteristic function gives the probability density function. Maybe an analogous formula holds for getting the pdf from the moment-generating function by manipulating the integral with contour methods (giving some sort of inverse Laplace transform, perhaps), but I haven't seen such a formula in use before.

It's a very simple relationship:

http://en.wikipedia.org/wiki/Moment-generating_function#Other_properties
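
(For readers not following the link: the property in question is the identity ##\varphi_X(t) = M_X(it)## relating the characteristic function to the moment-generating function, which Mute uses below.)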
 
  • #7
chiro said:
It's a very simple relationship:

http://en.wikipedia.org/wiki/Moment-generating_function#Other_properties
Yes, I am aware of the relationship between the moment-generating function and the characteristic function.

What I was saying was that the pdf is obtained directly by inverse Fourier transforming the characteristic function, not the moment-generating function. You can of course use the relation ##\varphi_X(t) = M_X(it)## to cosmetically rewrite the inversion formula in terms of the moment-generating function of imaginary argument, but if you're going to stop there you might as well just keep things in terms of the characteristic function.

What I was getting at was that if you perform a rotation of the variables in the complex plane such that ##M_X(it) \rightarrow M_X(t)##, then the inverse formula is no longer an inverse Fourier transform, but rather an inverse Laplace transform, or something like it (there are perhaps some subtleties there, I haven't thought through it carefully, but naively it looks like you'd get an inverse Laplace transform). I was commenting that I haven't seen anyone write down the pdf in terms of an inverse Laplace transform over the moment-generating function, perhaps because it's easier to just inverse Fourier transform the characteristic function.

My goal here was to be precise so that the OP doesn't try to inverse Fourier transform ##M_X(t)## rather than ##\varphi_X(t) = M_X(it)## and then wonder why he's not getting the right answer.
 
  • #8
Mute said:
I was commenting that I haven't seen anyone write down the pdf in terms of an inverse Laplace transform over the moment-generating function, perhaps because it's easier to just inverse Fourier transform the characteristic function.

My goal here was to be precise so that the OP doesn't try to inverse Fourier transform ##M_X(t)## rather than ##\varphi_X(t) = M_X(it)## and then wonder why he's not getting the right answer.

There is no pdf for the Poisson distribution. As a discrete distribution, its cumulative distribution function is not differentiable.
 
  • #9
SW VandeCarr said:
There is no pdf for the Poisson distribution. As a discrete distribution, its cumulative distribution function is not differentiable.

Yes, in the spirit of being precise with terminology, of course there is no pdf, but the probability mass function is still given by the inverse Fourier transform.

For example, inverse Fourier transforming the characteristic function of the Poisson distribution,

$$\int_{-\infty}^\infty \frac{dt}{2\pi} e^{\lambda(\exp(it)-1)} e^{-ikt},$$

let's change variables to the complex variable ##z = \exp(it)##. The integral can be written as the following contour integral in the complex plane, which we can then evaluate via the residue theorem, but we are forced to take ##k## to be an integer (as expected):

$$\frac{1}{2\pi i}\oint_\gamma \frac{dz}{z^{k+1}} e^{\lambda z} e^{-\lambda} = \frac{1}{k!}\frac{d^k}{dz^k}\left[e^{\lambda z}\right]_{z=0} e^{-\lambda} = \frac{\lambda^k}{k!}e^{-\lambda},$$
which is, of course, the probability mass function of the Poisson distribution. Here the contour ##\gamma## is a closed contour from ##-R## to ##R## along the real line, closed by a circular arc of radius ##R##, taking the ##R \rightarrow \infty## limit. The contribution from the arc will vanish in this limit, demonstrating equality between the contour integral and the original Fourier integral.

At any rate, the discussion of inverse transforms of characteristic functions to get pdf's (or pmf's) was tangential to the OP's original question, and we should probably not lead the discussion too far off topic.
 
  • #10
Mute said:
It looks like you made a mistake applying the chain rule.
$$\frac{d}{dk} \exp(f(k)) = \exp(f(k)) \frac{df(k)}{dk};$$
with ##f(k) = a(\exp(ik)-1)## you get ##df/dk = ai \exp(ik)##. The -1 drops out, so the moment doesn't vanish, and you get ##\langle n_1 \rangle = (1/i)(ai) = a##, as expected.
Thank you very, very much. I totally overlooked this even after rechecking for the tenth time.
I feel great now.
Thank you guys for all the help and insights.
 
  • #11
fluidistic said:
Thank you very, very much. I totally overlooked this even after rechecking for the tenth time.
I feel great now.
Thank you guys for all the help and insights.

Sometimes, no matter how much you stare at something, your brain just can't notice the mistake you made! It happens to the best of us!

Mute said:
Yes, in the spirit of being precise with terminology, of course there is no pdf, but the probability mass function is still given by the inverse Fourier transform.

For example, inverse Fourier transforming the characteristic function of the Poisson distribution,

$$\int_{-\infty}^\infty \frac{dt}{2\pi} e^{\lambda(\exp(it)-1)} e^{-ikt},$$

let's change variables to the complex variable ##z = \exp(it)##. The integral can be written as the following contour integral in the complex plane, which we can then evaluate via the residue theorem, but we are forced to take ##k## to be an integer (as expected):

$$\frac{1}{2\pi i}\oint_\gamma \frac{dz}{z^{k+1}} e^{\lambda z} e^{-\lambda} = \frac{1}{k!}\frac{d^k}{dz^k}\left[e^{\lambda z}\right]_{z=0} e^{-\lambda} = \frac{\lambda^k}{k!}e^{-\lambda},$$
which is, of course, the probability mass function of the Poisson distribution. Here the contour ##\gamma## is a closed contour from ##-R## to ##R## along the real line, closed by a circular arc of radius ##R##, taking the ##R \rightarrow \infty## limit. The contribution from the arc will vanish in this limit, demonstrating equality between the contour integral and the original Fourier integral.

At any rate, the discussion of inverse transforms of characteristic functions to get pdf's (or pmf's) was tangential to the OP's original question, and we should probably not lead the discussion too far off topic.

I know I said we shouldn't focus on this too much, but I should be more careful than I was in that post, and correct some errors here:

For discrete probability distributions, the characteristic function, defined as the expectation of ##\exp(itX)##, can be viewed as a Fourier series:

$$\varphi_X(t) = \mathbb{E}[e^{itX}] = \sum_{k=-\infty}^\infty P(k)e^{ikt},$$
(where P(k) may be zero for some k).

This means that P(k) can be found via

$$P(k) = \int_{-\pi}^\pi \frac{dt}{2\pi} \varphi_X(t)e^{-ikt};$$
note the difference from my previous post. Here, the inverse is over a finite interval ##[-\pi,\pi)##, rather than the real line, due to the periodicity of the characteristic function in the discrete case (hence why it is a Fourier series of the probability mass function rather than a Fourier transform of the probability density function).

As such, the example I gave in my previous post should actually read

$$\int_{-\pi}^\pi \frac{dt}{2\pi}e^{\lambda(\exp(it)-1)} e^{-ikt} = \frac{1}{2\pi i}\oint_\gamma \frac{dz}{z^{k+1}} e^{\lambda z} e^{-\lambda} = \frac{1}{k!}\frac{d^k}{dz^k}\left[e^{\lambda z}\right]_{z=0} e^{-\lambda} = \frac{\lambda^k}{k!}e^{-\lambda},$$
where the contour ##\gamma## is the full circle ##|z| = 1##, not the infinite semi-circle in the incorrect derivation.
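
As a numerical sanity check of this corrected inversion formula (a sketch, not from the thread; it assumes numpy and scipy are available, and the rate 2.5 is an arbitrary choice):

[code]
import numpy as np
from scipy.integrate import quad
from math import exp, factorial

lam = 2.5  # arbitrary example rate parameter

def pmf_via_inversion(k):
    """P(k) = (1/2pi) * integral over [-pi, pi] of phi(t) * exp(-i k t) dt."""
    # The imaginary part integrates to zero by symmetry, so keep only the real part.
    integrand = lambda t: (np.exp(lam * (np.exp(1j * t) - 1)) * np.exp(-1j * k * t)).real
    val, _ = quad(integrand, -np.pi, np.pi)
    return val / (2 * np.pi)

for k in range(5):
    direct = lam**k * exp(-lam) / factorial(k)
    print(k, round(pmf_via_inversion(k), 10), round(direct, 10))
[/code]

Both columns should agree, confirming that the finite-interval inversion recovers the Poisson probabilities.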
 

1. What is the Poisson distribution?

The Poisson distribution is a discrete probability distribution that is used to model the number of events that occur in a fixed time interval when the events occur independently and at a constant rate.
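
As a minimal illustration (assuming scipy is available; the rate value 3 is an arbitrary choice):

[code]
from scipy import stats

# Poisson distribution with rate parameter mu = 3 (expected events per interval)
poisson = stats.poisson(mu=3)

print(poisson.pmf(2))                 # probability of observing exactly 2 events
print(poisson.mean(), poisson.var())  # mean and variance both equal the rate
[/code]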

2. What is a moment in the context of the Poisson distribution?

In the context of the Poisson distribution, a moment is a statistical measure that quantifies the shape and location of the distribution. Moments can be used to calculate the mean, variance, and higher order moments of the Poisson distribution.

3. How many moments are there in the Poisson distribution?

The Poisson distribution has an infinite number of moments, but typically only the first two moments (mean and variance) are used to describe the distribution.

4. What is the relationship between moments and the shape of the Poisson distribution?

The moments of the Poisson distribution can be used to calculate the skewness and kurtosis, which describe the asymmetry and peakedness of the distribution, respectively. Higher order moments can also provide information about the shape of the distribution.
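
For reference (standard results, stated without derivation): the skewness of the Poisson distribution is ##\lambda^{-1/2}## and the excess kurtosis is ##\lambda^{-1}##, so both vanish as ##\lambda## grows, consistent with the distribution approaching a normal shape.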

5. How are moments of the Poisson distribution calculated?

The factorial moments of the Poisson distribution can be calculated using the formula ##E[X(X-1)\cdots(X-n+1)] = \lambda^n##, where ##\lambda## is the mean or rate parameter of the distribution. (The ordinary moments ##E[X^n]## are not simply ##\lambda^n##; they can be recovered from the factorial moments.) This formula is most commonly used to obtain the first two moments, and hence the mean and variance.
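
For instance, a short derivation of the variance from the first two factorial moments:

$$\operatorname{Var}(X) = E[X(X-1)] + E[X] - E[X]^2 = \lambda^2 + \lambda - \lambda^2 = \lambda.$$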
