Moment Generating Function (proof of definition)

SUMMARY

The Moment Generating Function (MGF) for a random variable X with a continuous probability distribution function f_X(x) is defined as M_X(t) = E[e^{tX}]. This can be expressed as M_X(t) = \int_{-\infty}^{\infty} e^{tx} f_X(x) dx. The proof involves using the Maclaurin series expansion of the exponential function and the linearity of expectation, ultimately leading to the conclusion that E[u(x)] = \int_{-\infty}^{\infty} u(x) f(x) dx, with u(x) replaced by e^{tx} to derive the MGF.

PREREQUISITES
  • Understanding of continuous probability distributions
  • Familiarity with the concept of expectation in probability theory
  • Knowledge of Maclaurin series expansion
  • Basic calculus, particularly integration techniques
NEXT STEPS
  • Study the properties of Moment Generating Functions in probability theory
  • Learn about the applications of MGFs in deriving moments of distributions
  • Explore the relationship between MGFs and characteristic functions
  • Review integration techniques for continuous functions in probability
USEFUL FOR

Students of statistics, mathematicians, and anyone interested in understanding the derivation and application of Moment Generating Functions in probability theory.

Oxymoron

Homework Statement


Prove that for a random variable [tex]X[/tex] with continuous probability density function [tex]f_X(x)[/tex], the Moment Generating Function, defined as

[tex] M_X(t) := E[e^{tX}][/tex]

is

[tex] M_X(t) = \int_{-\infty}^{\infty}e^{tx}f_X(x)dx [/tex]

Homework Equations



Above and

[tex] E[X] = \int_{-\infty}^{\infty}xf_X(x)dx[/tex]

The Attempt at a Solution



This expression is given in so many textbooks, and the ones that I have read all skip over this derivation. I want to be able to prove this identity to myself.

Proof:
Write the exponential function as a Maclaurin series:

[tex] M_X(t) = E[e^{tX}] [/tex]

[tex] = E[1+tX+\frac{t^2}{2!}X^2+\frac{t^3}{3!}X^3+...][/tex]

Since [tex]E[1] = 1[/tex] and [tex]E[t^n/n!]=t^n/n![/tex] (they are constants, and the expectation of a constant is itself), and using the linearity of [tex]E[/tex], you get:

[tex] = 1+tE[X]+\frac{t^2}{2!}E[X^2]+\frac{t^3}{3!}E[X^3]+\ldots[/tex]

Now, writing the series as a sum:

[tex] =\sum_{n=0}^{\infty}\frac{t^n}{n!}E[X^n][/tex]

And extracting the exponential:

[tex] =e^t\sum_{n=0}^{\infty}E[X^n][/tex]

Now I am stuck! I know that I am meant to use

[tex] E[X] = \int_{-\infty}^{\infty}xf_X(x)dx[/tex]

but I have [tex]E[X^n][/tex] and I also have [tex]e^t[/tex] and not [tex]e^{tx}[/tex].
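(As an aside, the series [tex]\sum_{n=0}^{\infty}\frac{t^n}{n!}E[X^n][/tex] can be checked numerically against a distribution whose MGF is known in closed form. A minimal Python sketch, not part of the thread, using the standard normal, whose MGF is [tex]e^{t^2/2}[/tex] and whose moments are [tex]E[X^n] = n!/(2^{n/2}(n/2)!)[/tex] for even [tex]n[/tex] and 0 for odd [tex]n[/tex]. Note that [tex]e^t[/tex] cannot simply be factored out of the sum, since the factor [tex]t^n/n![/tex] is attached to each [tex]E[X^n][/tex] term.)

```python
# Sanity check: the moment series  sum_n t^n/n! * E[X^n]
# should reproduce the known standard-normal MGF exp(t^2/2).
import math

def normal_moment(n):
    """n-th moment of the standard normal: 0 for odd n,
    n!/(2^(n/2) * (n/2)!) for even n."""
    if n % 2 == 1:
        return 0.0
    k = n // 2
    return math.factorial(n) / (2**k * math.factorial(k))

def mgf_series(t, terms=40):
    """Partial sum of sum_{n=0}^{terms-1} t^n/n! * E[X^n]."""
    return sum(t**n / math.factorial(n) * normal_moment(n)
               for n in range(terms))

t = 0.7
print(mgf_series(t), math.exp(t**2 / 2))  # the two values agree closely
```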
 
There isn't really much to prove. For any continuous probability distribution with density function f(x), the Expected value of any function u(x) is defined to be
[tex]E(u(X))= \int_{-\infty}^\infty u(x)f(x)dx[/tex]

Replace [tex]u(x)[/tex] with [tex]e^{tx}[/tex] and you have it. It is true that the whole point of the "moment generating function" is that the coefficients of the powers of t in a power series expansion are the "moments" of the probability distribution, but that doesn't seem to me to be relevant to this question. I see no reason to write its Taylor series.
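(That definition can be checked numerically too. A minimal dependency-free Python sketch, not from the thread: approximate [tex]\int e^{tx}f(x)dx[/tex] for the standard normal density by a midpoint Riemann sum and compare with the closed-form MGF [tex]e^{t^2/2}[/tex].)

```python
# Numerical check of E[u(X)] = integral of u(x) f(x) dx
# with u(x) = e^{tx} and f the standard normal density.
import math

def normal_pdf(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def mgf_numeric(t, lo=-20.0, hi=20.0, steps=200_000):
    """Midpoint-rule approximation of integral e^{tx} f(x) dx."""
    dx = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        x = lo + (i + 0.5) * dx
        total += math.exp(t * x) * normal_pdf(x)
    return total * dx

t = 0.5
print(mgf_numeric(t), math.exp(t**2 / 2))  # the two values agree closely
```

The integration window [-20, 20] is wide enough here because the Gaussian tails are negligible beyond a few standard deviations.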
 
Good, okay that makes sense.

Then I suppose all I had to do was prove

[tex] E(u(x))= \int_{-\infty}^\infty u(x)f(x)dx[/tex]

and then substitute [tex]u(x)[/tex] with [tex]e^{tx}[/tex] as you said and I'm done.

But once again there is nothing to prove because it is a definition.
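(One last sanity check on the definition [tex]M_X(t) = E[e^{tX}][/tex] itself: since it is an expectation, it can be estimated by averaging [tex]e^{tX_i}[/tex] over random draws. A minimal Python sketch, not from the thread, using standard normal samples.)

```python
# Monte Carlo estimate of M_X(t) = E[e^{tX}] for a standard
# normal X, compared against the closed form exp(t^2/2).
import math
import random

random.seed(0)  # reproducible draws

t = 0.5
n = 200_000
est = sum(math.exp(t * random.gauss(0.0, 1.0)) for _ in range(n)) / n
print(est, math.exp(t**2 / 2))
```

The sampling error shrinks like [tex]1/\sqrt{n}[/tex], so the two printed values agree to a couple of decimal places.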
 
