Explanation for moment-generating function

1. Sep 1, 2007

Hello everyone, I have taken up reading a mathematical statistics book and have gotten stuck on the moment-generating function. I tried using Wikipedia for a simpler explanation, to no avail....
I noticed it is used a lot in finding the mean & variance for all types of distributions. Can someone explain the moment-generating function to me in layman's terms? I can't seem to connect the dots......

Thanks in advance for any help...Paolo

P.S. I had calculus courses about 20 years ago.....

2. Sep 1, 2007

ZioX

The nth moment about the origin is defined as

$$E[X^n]=\int_{-\infty}^{+\infty}x^nf(x)dx$$

The mean is of course $E[X]=\mu$. The variance is $E[(X-E[X])^2]$, which can be shown to be equal to $E[X^2]-E[X]^2$.
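For instance (my own sketch, not part of the book — assuming Python and the Exponential(1) density $f(x)=e^{-x}$ on $[0,\infty)$, whose mean and variance are both 1), you can approximate these moment integrals with a simple midpoint Riemann sum:

```python
import math

def moment(n, upper=50.0, steps=200_000):
    """Approximate E[X^n] = integral of x^n * e^(-x) dx over [0, upper].

    The tail beyond x = 50 is negligible for small n, so truncating there
    is safe for this illustration.
    """
    dx = upper / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * dx          # midpoint of the i-th subinterval
        total += (x ** n) * math.exp(-x) * dx
    return total

mean = moment(1)                    # E[X], should be close to 1
variance = moment(2) - mean ** 2    # E[X^2] - E[X]^2, should be close to 1
print(mean, variance)
```

The same pattern works for any density: plug in a different `f(x)` and integration range to check a distribution's tabulated mean and variance against the raw-moment definition.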

The moment-generating function itself is defined as $M_X(t)=E[e^{tX}]$. The main point is that if you take the nth derivative of the moment-generating function and evaluate it at zero, you get the nth moment about the origin. In symbols, $$E\left(X^n\right)=M_X^{(n)}(0)=\left.\frac{\mathrm{d}^n M_X(t)}{\mathrm{d}t^n}\right|_{t=0}$$
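You can see this numerically (again my own sketch, assuming Python). For $X \sim \text{Exponential}(1)$ the MGF is $M_X(t) = 1/(1-t)$ for $t < 1$, and differentiating it at $t=0$ with finite differences should recover the raw moments $E[X]=1$ and $E[X^2]=2$:

```python
def M(t):
    """MGF of the Exponential(1) distribution, valid for t < 1."""
    return 1.0 / (1.0 - t)

h = 1e-4
# Central-difference approximations to M'(0) and M''(0):
first = (M(h) - M(-h)) / (2 * h)             # ~ E[X]   = 1
second = (M(h) - 2 * M(0) + M(-h)) / h ** 2  # ~ E[X^2] = 2
print(first, second)
```

In practice you would differentiate the MGF symbolically rather than numerically, but the finite differences make the "derivative at zero gives the moment" claim concrete.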

The proof of the continuous case is given on the Wikipedia page (the notation on Wikipedia differs slightly: $m_i$ is the ith moment about the origin, so $m_i=E[X^i]$). The moment-generating function can be used to calculate the mean as $M'_X(0)$ and the variance as $M''_X(0)-M'_X(0)^2$. You can look at the various Wikipedia pages on particular distributions (like the Poisson distribution) and calculate the means and variances from the moment-generating function. If you're feeling adventurous you could calculate the moment-generating function from the definition, $E[e^{tX}]$.
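To make the Poisson suggestion concrete (a sketch of my own, assuming Python): the Poisson($\lambda$) MGF is $M_X(t)=e^{\lambda(e^t-1)}$, and both the mean and the variance equal $\lambda$. Differentiating the MGF numerically at $t=0$ reproduces that:

```python
import math

lam = 3.0  # the Poisson rate parameter; mean and variance should both be 3

def M(t):
    """MGF of the Poisson(lam) distribution: exp(lam * (e^t - 1))."""
    return math.exp(lam * (math.exp(t) - 1.0))

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)            # M'(0)  ~ E[X]
m2 = (M(h) - 2 * M(0) + M(-h)) / h ** 2  # M''(0) ~ E[X^2]
var = m2 - m1 ** 2                       # E[X^2] - E[X]^2
print(m1, var)
```

Both numbers come out near 3, matching the tabulated Poisson mean and variance, which is exactly the exercise suggested above.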