# Moment generating function

Expectation of x, i.e. $E(x) = \sum x \cdot p(x)$, where p(x) is the probability of x.
Now my book defines another function mgf(x), the moment generating function of x, as:
$$\text{mgf}(x) = E(e^{tx})$$

I don't understand why this function was defined. Basically we included $e^{tx}$ in our function because then, if we differentiate it and set t = 0, we get the E(x) formula back again. Why take such pain?

I can only see one use: when p(x) is such that mgf(x) forms a nice Taylor series of some function which can be easily differentiated. Other than that I see no use.
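The "differentiate and set t = 0" observation can be checked directly. Here is a minimal sympy sketch; the fair six-sided die is an illustrative example, not one from the thread:

```python
import sympy as sp

t = sp.symbols('t')

# MGF of a fair six-sided die: M(t) = E[e^{tX}] = (1/6) * sum_{x=1}^{6} e^{tx}
M = sp.Rational(1, 6) * sum(sp.exp(t * x) for x in range(1, 7))

# Differentiating once and evaluating at t = 0 recovers E[X]
mean = sp.diff(M, t).subs(t, 0)
print(mean)  # 7/2, i.e. the familiar 3.5
```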

Homework Helper
Simplest application: the moment generating function uniquely defines a distribution, so
* If you determine that two random variables have the same mgf, you know they have the same distribution
* Slightly different: suppose you know the general form of the mgf of (say) the normal distribution - how it depends on $\mu$ and $\sigma$. If you find that a new random variable has an mgf that follows the same pattern, you know the new variable is normally distributed, as well as the values of its mean and standard deviation.
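The pattern-matching idea in the second bullet can be sketched as follows. The mgf $e^{3t + 2t^2}$ is a made-up example: matching it against the normal form $e^{\mu t + \sigma^2 t^2/2}$ gives $\mu = 3$, $\sigma^2 = 4$, which the moment formulas confirm:

```python
import sympy as sp

t = sp.symbols('t')

# Hypothetical mgf of a new random variable: exp(3t + 2t^2).
# The normal mgf is exp(mu*t + sigma^2*t^2/2), so this matches
# the normal pattern with mu = 3 and sigma^2 = 4.
M = sp.exp(3 * t + 2 * t**2)

mu = sp.diff(M, t).subs(t, 0)                    # E[X] = M'(0)
second_moment = sp.diff(M, t, 2).subs(t, 0)      # E[X^2] = M''(0)
variance = second_moment - mu**2
print(mu, variance)  # 3 4
```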

Moment generating functions can also be used to determine the distribution of a sum of several independent random variables (assuming mgfs exist for every summand).
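A standard instance of the sum property, sketched in sympy (the Poisson case is my choice of example): for independent variables the mgf of the sum is the product of the mgfs, and the product of two Poisson mgfs is again a Poisson mgf.

```python
import sympy as sp

t, lam1, lam2 = sp.symbols('t lambda1 lambda2', positive=True)

# MGF of a Poisson(lambda) variable: exp(lambda * (e^t - 1))
def poisson_mgf(lam):
    return sp.exp(lam * (sp.exp(t) - 1))

# Independence: mgf of the sum = product of the individual mgfs
mgf_sum = poisson_mgf(lam1) * poisson_mgf(lam2)

# The product equals the Poisson(lambda1 + lambda2) mgf,
# so the sum of independent Poissons is again Poisson.
difference = sp.simplify(mgf_sum - poisson_mgf(lam1 + lam2))
print(difference)  # 0
```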

Finally, although not every distribution has an mgf, every distribution does have something known as a characteristic function, defined as
$$\phi_x(s) = E[e^{isx}]$$

(where $i^2 = -1$). These also uniquely define distributions, and although the underlying mathematical justifications are deeper, several of the conclusions that can be drawn from c.f.s result from manipulations similar to those done with moment generating functions. If you've seen them for mgfs, they won't be new with characteristic functions.
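One reason the characteristic function always exists is that $|e^{isx}| = 1$, so the expectation never blows up. A quick numerical sketch (sample size, seed, and the standard normal case are my own choices): the empirical average of $e^{isX}$ over simulated draws approximates the known normal c.f. $\phi(s) = e^{-s^2/2}$.

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.standard_normal(200_000)

# Empirical characteristic function: average e^{i s x} over the sample.
s = 1.0
empirical_cf = np.mean(np.exp(1j * s * samples))

# Analytic c.f. of a standard normal: phi(s) = exp(-s^2 / 2)
analytic_cf = np.exp(-s**2 / 2)
print(abs(empirical_cf - analytic_cf) < 0.01)  # True
```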

mathman
Think of it this way: the goal is to have a function from which you can generate moments. If you write the mgf (as defined) as a power series in t, the coefficient of $\frac{t^n}{n!}$ is $E[x^n]$.
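This power-series reading can be made concrete in sympy. The Exponential(1) distribution is my choice of example: its mgf is $\frac{1}{1-t}$ for $t < 1$, whose expansion $\sum_n t^n = \sum_n n!\,\frac{t^n}{n!}$ says $E[x^n] = n!$.

```python
import sympy as sp

t = sp.symbols('t')

# MGF of an Exponential(1) variable: M(t) = 1/(1 - t) for t < 1
M = 1 / (1 - t)

# Expand as a power series in t; the coefficient of t^n/n! is E[X^n],
# i.e. E[X^n] = n! * (coefficient of t^n).
series = sp.series(M, t, 0, 5).removeO()
moments = [sp.factorial(n) * series.coeff(t, n) for n in range(5)]
print(moments)  # [1, 1, 2, 6, 24] = 0!, 1!, 2!, 3!, 4!
```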