Moment Generating Function (proof of definition)


by Oxymoron
Tags: definition, function, generating, moment, proof
Oxymoron
#1
Mar24-11, 06:45 AM
1. The problem statement, all variables and given/known data
Prove that for a continuous random variable [tex]X[/tex] with probability density function [tex]f_X(x)[/tex], the Moment Generating Function, defined as

[tex]
M_X(t) := E[e^{tX}]
[/tex]

is

[tex]
M_X(t) = \int_{-\infty}^{\infty}e^{tx}f_X(x)dx
[/tex]
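
As a concrete sanity check (my own example, not part of the original problem): for an exponential density [tex]f_X(x) = \lambda e^{-\lambda x}[/tex] on [tex][0,\infty)[/tex], this integral gives

[tex]
M_X(t) = \int_0^{\infty}e^{tx}\lambda e^{-\lambda x}dx = \frac{\lambda}{\lambda - t}, \qquad t < \lambda,
[/tex]

which is the familiar closed form.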

2. Relevant equations

Above and

[tex]
E[X] = \int_{-\infty}^{\infty}xf_X(x)dx
[/tex]

3. The attempt at a solution

This expression is given in so many textbooks, yet the ones I have read all skip over the derivation. I want to be able to prove the formula above to myself.

Proof:
Write the exponential function as a Maclaurin series:

[tex]
M_X(t) = E[e^{tX}]
[/tex]

[tex]
= E[1+tX+\frac{t^2}{2!}X^2+\frac{t^3}{3!}X^3+...]
[/tex]

Since [tex]E[1] = 1[/tex] and each coefficient [tex]t^n/n![/tex] is a constant (the expectation of a constant is the constant itself, so constants factor out of the expectation), you get:

[tex]
= 1+tE[X]+\frac{t^2}{2!}E[X^2]+\frac{t^3}{3!}E[X^3]+...
[/tex]


...also using the linearity of E. Now, writing the series as a sum:

[tex]
=\sum_{n=0}^{\infty}\frac{t^n}{n!}E[X^n]
[/tex]

And extracting the exponential:

[tex]
=e^t\sum_{n=0}^{\infty}E[X^n]
[/tex]

Now I am stuck! I know that I am meant to use

[tex]
E[X] = \int_{-\infty}^{\infty}xf_X(x)dx
[/tex]

but I have [tex]E[X^n][/tex] and I also have [tex]e^t[/tex] and not [tex]e^{tx}[/tex].
HallsofIvy
#2
Mar24-11, 07:02 AM
There isn't really much to prove. For any continuous probability distribution with density function f(x), the Expected value of any function u(x) is defined to be
[tex]E(u(x))= \int_{-\infty}^\infty u(x)f(x)dx[/tex]

Replace [itex]u(x)[/itex] with [itex]e^{tx}[/itex] and you have it. It is true that the whole point of the "moment generating function" is that the coefficients of the powers of t in its power series expansion are the "moments" of the probability distribution, but that doesn't seem to me to be relevant to this question. I see no reason to write its Taylor series.
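
If you want a numerical sanity check as well, something along these lines (just a rough sketch using scipy, with a standard normal whose MGF is known to be [itex]e^{t^2/2}[/itex]) evaluates that integral directly:

[code]
# Rough numerical check (a sketch, not from any textbook): evaluate
# M_X(t) = integral of e^{t x} f_X(x) dx for a standard normal and
# compare with the known closed form exp(t^2 / 2).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def mgf_numeric(t, pdf, lower=-np.inf, upper=np.inf):
    # Integrate e^{t x} * f_X(x) over the support of X.
    value, _ = quad(lambda x: np.exp(t * x) * pdf(x), lower, upper)
    return value

for t in (0.0, 0.5, 1.0):
    # The two columns should agree to several decimal places.
    print(t, mgf_numeric(t, norm.pdf), np.exp(t ** 2 / 2))
[/code]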
Oxymoron
#3
Mar24-11, 07:17 AM
Good, okay that makes sense.

Then I suppose all I had to do was prove

[tex]
E(u(x))= \int_{-\infty}^\infty u(x)f(x)dx
[/tex]

and then substitute [tex]u(x)[/tex] with [tex]e^{tx}[/tex] as you said and I'm done.
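
That is, writing the substitution out explicitly,

[tex]
M_X(t) = E[e^{tX}] = \int_{-\infty}^{\infty}e^{tx}f_X(x)dx.
[/tex]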

But once again there is nothing to prove because it is a definition.
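
For completeness, I can also see how the series route I started would close, assuming [tex]E[X^n] = \int_{-\infty}^{\infty}x^nf_X(x)dx[/tex] for each [tex]n[/tex] and that the sum and integral may be interchanged (a sketch, not a rigorous argument):

[tex]
\sum_{n=0}^{\infty}\frac{t^n}{n!}E[X^n] = \sum_{n=0}^{\infty}\frac{t^n}{n!}\int_{-\infty}^{\infty}x^nf_X(x)dx = \int_{-\infty}^{\infty}\left(\sum_{n=0}^{\infty}\frac{(tx)^n}{n!}\right)f_X(x)dx = \int_{-\infty}^{\infty}e^{tx}f_X(x)dx.
[/tex]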

