# Moment-generating functions

## Main Question or Discussion Point

Hi,

I've been learning about MGFs in college, and though the problems based on them are quite easy, I fail to understand what exactly MGFs are and why they are denoted by Mx(t). What are we doing, essentially, when we find the MGF of a distribution (wherein we multiply the required function by e^(tx) and integrate within certain limits)? And finally, what do the properties of MGFs (e.g. Max+b(t)) mean?

mathman
The essential point is that the series expansion of the exponential, when integrated term by term with respect to a distribution, gives a series involving the moments of the distribution.
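In symbols (exchanging the sum and the expectation, which is justified when the MGF exists on a neighbourhood of 0):

```latex
M_X(t) = \mathbb{E}\left[e^{tX}\right]
       = \mathbb{E}\left[\sum_{n=0}^{\infty} \frac{(tX)^n}{n!}\right]
       = \sum_{n=0}^{\infty} \frac{t^n}{n!}\,\mathbb{E}\left[X^n\right],
```

so E[X^n] appears as the coefficient of t^n/n!, which is exactly what differentiating n times and setting t = 0 picks out.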

Right, I get that. But what does the notation Mx(t) mean? And what does Max+b(t) mean? Usually when we deal with functions of x, we have the x in the brackets and not the t.

mathman
I am not familiar with the particular notation you are referring to. Perhaps you could give the precise definition.

chiro
> Right, I get that. But what does the notation Mx(t) mean? And what does Max+b(t) mean? Usually when we deal with functions of x, we have the x in the brackets and not the t.

Mx(t) is the MGF of the random variable X, viewed as a function of t.

Recall that the expectation of a random variable is just a number unless you supply a parameter independent of the RV (here, t). When you compute E[e^(tX)], you integrate (or, for a discrete distribution, sum) over the x-values, which sums x out and leaves a function of t. It's easier to see this when you notice that you integrate with respect to x, so x disappears and only t remains.

The same kind of thing happens when you find the conditional expectation for a bivariate distribution, like E[X|Y=y], where you integrate out X and get the expectation as a function of the particular realization y.

The other thing is that in the MGF, because t has nothing to do with the random variable X, you can do things like E[tX] = tE[X] and E[(t^2/2!)X^2] = (t^2/2!)E[X^2], and this is why, when you differentiate n times and set t = 0, you actually get the moments out of the MGF.
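To make this concrete, here is a small sympy sketch (my own illustrative example, not from the thread) for X ~ Exponential(1): multiplying the pdf by e^(tx) and integrating sums out x and leaves a function of t, and differentiating that function at t = 0 recovers the moments.

```python
import sympy as sp

t = sp.symbols('t', real=True)
x = sp.symbols('x', positive=True)

# pdf of X ~ Exponential(1) -- an illustrative choice
pdf = sp.exp(-x)

# Mx(t) = E[e^(tX)]: multiply the pdf by e^(tx) and integrate x out.
# conds='none' drops the convergence condition (the integral needs t < 1).
M = sp.integrate(sp.exp(t * x) * pdf, (x, 0, sp.oo), conds='none')
M = sp.simplify(M)  # 1/(1 - t): x is gone, only t remains

# Differentiate n times and set t = 0 to read off E[X^n]
moments = [sp.diff(M, t, n).subs(t, 0) for n in range(1, 5)]
print(moments)  # [1, 2, 6, 24], i.e. E[X^n] = n! for Exponential(1)
```

Note how the integration step is what turns a function of x into a function of t, which is why the t sits in the brackets of Mx(t).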

DrDu
Basically, the MGF is the Laplace transform of the pdf. It has several convenient properties, e.g. to calculate the moments you only have to take derivatives and not integrals (you have integrated already in calculating the MGF), which is usually simpler. In many cases you can alternatively work with the characteristic function, which is the Fourier transform of the pdf. In contrast to the characteristic function, however, the MGF is not always well defined.
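Spelled out, with f the pdf of X, the two transforms are:

```latex
M_X(t) = \mathbb{E}\left[e^{tX}\right] = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx,
\qquad
\varphi_X(t) = \mathbb{E}\left[e^{itX}\right] = \int_{-\infty}^{\infty} e^{itx} f(x)\,dx .
```

Since |e^(itx)| = 1, the characteristic function always exists, whereas e^(tx) can grow too fast for the first integral to converge (the Cauchy distribution is the standard example of a distribution with no MGF).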

mathman
Mx(t) is the MGF for X. Let Y = aX + b; then Max+b(t) is the MGF for Y.
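The standard property for this case follows in one line from pulling the constant factor out of the expectation:

```latex
M_{aX+b}(t) = \mathbb{E}\left[e^{t(aX+b)}\right]
            = e^{bt}\,\mathbb{E}\left[e^{(at)X}\right]
            = e^{bt}\,M_X(at).
```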

Ok, so as pointed out, if we expand e^(tx), differentiate successively, and put t = 0, we get higher-order expectations (E(X), E(X^2), etc.). But if we need to find the MGF of a function, say f(x) = sin(x) or even f(x) = x(x-1), then the expansion won't be the same as that of e^(tx) alone, and we might not be able to get the expectations by differentiating w.r.t. t and putting t = 0.

DrDu
> But if we need to find the MGF of a function, say f(x) = sin(x) or even f(x) = x(x-1)...

I don't know what ranges you have in mind, but usually these functions won't qualify as pdfs, as they are not positive, integrable, etc.

mathman
> I don't know what ranges you have in mind, but usually these functions won't qualify as pdfs, as they are not positive, integrable, etc.

I suspect he is asking about a function of the random variable, i.e. Y = sin(X) is itself a random variable.

Ya, that's right. I was asking about the case where Y = sin(X) is a random variable.
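For what it's worth, here is a small numerical sketch (my own example, with an assumed distribution for X) of the MGF of Y = sin(X). Nothing changes conceptually: you put sin(x) in the exponent and still integrate against the pdf of X, and derivatives at t = 0 still give the moments of Y.

```python
import numpy as np
from scipy.integrate import quad

# Assume X ~ Uniform(0, pi), so Y = sin(X); this choice is illustrative.
# M_Y(t) = E[e^(t sin(X))] = (1/pi) * integral of e^(t sin x) over [0, pi]
def mgf_Y(t):
    val, _ = quad(lambda x: np.exp(t * np.sin(x)) / np.pi, 0.0, np.pi)
    return val

# E[Y] from the MGF: numerically differentiate at t = 0 (central difference)
h = 1e-4
mean_Y = (mgf_Y(h) - mgf_Y(-h)) / (2 * h)

# Direct check: E[sin X] = (1/pi) * integral of sin x over [0, pi] = 2/pi
print(mean_Y, 2 / np.pi)  # both are approximately 0.6366
```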
