What is the purpose of Moment-generating functions?

  • Context: Undergrad
  • Thread starter: Urmi Roy
  • Tags: Functions

Discussion Overview

The discussion revolves around the purpose and properties of moment-generating functions (MGFs) in probability theory. Participants explore the mathematical formulation of MGFs, their notation, and their relationship to moments of random variables, as well as the implications of using MGFs for various functions.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant expresses confusion about the meaning of the notation M_X(t) and the implications of properties like M_{aX+b}(t).
  • Another participant explains that the series expansion of the exponential function, when integrated term by term, yields a series involving the moments of the distribution.
  • It is noted that M_X(t) represents the MGF for the random variable X as a function of t, and that integrating out the variable X results in a function of t.
  • A participant mentions that the MGF can be seen as the Laplace transform of the probability density function (pdf), highlighting its convenience for calculating moments through differentiation.
  • Concerns are raised about finding the MGF of functions like f(x) = sin(x) or f(x) = x(x-1), questioning their validity as pdfs and the ability to derive expectations from them.
  • Another participant clarifies that if f(x) = sin(x) is treated as a random variable, the MGF would be expressed as E(e^(sin(X)t)).

Areas of Agreement / Disagreement

Participants express differing views on the applicability of MGFs to certain functions and the interpretation of notation. There is no consensus on the validity of using MGFs for functions that do not qualify as probability density functions.

Contextual Notes

Participants note limitations regarding the conditions under which certain functions can be considered valid pdfs, as well as the assumptions necessary for the application of MGFs.

Urmi Roy
Hi,

I've been learning about MGFs in college, and though the problems based on them are quite easy, I don't fully understand what MGFs actually are, or why they are denoted M_X(t). What are we doing, essentially, when we find the MGF of a function (multiplying the required function by e^(tx) and integrating within certain limits)? And what do the properties of MGFs, e.g. M_{aX+b}(t), really mean?
 
The essential point is that the series expansion of the exponential, when integrated term by term with respect to a distribution, gives a series involving the moments of the distribution.
 
Right, I get that... but what does the notation M_X(t) mean? And what does M_{aX+b}(t) mean? Usually when we deal with functions of x, we have the x in the brackets, not the t.
 
I am not familiar with the particular notation you are referring to. Perhaps you could give the precise definition.
 
Urmi Roy said:
Right, I get that... but what does the notation M_X(t) mean? And what does M_{aX+b}(t) mean? Usually when we deal with functions of x, we have the x in the brackets, not the t.

M_X(t) is the MGF of the random variable X, expressed as a function of t.

Recall that the expectation of a random variable is a number unless you supply a parameter independent of the RV (in this case, t). In computing the MGF you integrate (or, for a discrete distribution, sum) over the x-values, which eliminates x and leaves a function of t. It's easier to see this once you notice that integrating with respect to x "sums out" that variable, leaving something that depends only on t.

The same kind of thing happens when you find, say, the conditional expectation of a bivariate distribution, E[X|Y=y]: you integrate out X and get the expectation as a function of the particular realization y.

Also, because t has nothing to do with the random variable X, you can do things like E[tX] = tE[X] and E[(t^2/2!)X^2] = (t^2/2!)E[X^2], and this is why, when you differentiate n times and set t = 0, you actually recover the n-th moment from the MGF.
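As a worked sketch (my own assumed example, not from the thread): for X ~ Exp(1) with pdf f(x) = e^(-x) on [0, ∞), the MGF works out to 1/(1 - t) for t < 1, the coefficient of t^n/n! in its Taylor series is E[X^n], and differentiating n times at t = 0 recovers E[X^n] = n!.

```python
import sympy as sp

# Assumed example: X ~ Exp(1), pdf f(x) = e^(-x) on [0, oo).
x, t = sp.symbols('x t')
pdf = sp.exp(-x)

# M_X(t) = E[e^(tX)] = integral of e^(tx) f(x) dx over the support.
# conds='none' tells sympy to assume the convergence condition (here t < 1).
mgf = sp.simplify(sp.integrate(sp.exp(t * x) * pdf, (x, 0, sp.oo), conds='none'))
# Mathematically, mgf equals 1/(1 - t) for t < 1.

# Its Taylor series collects the moments: the coefficient of t^n/n! is E[X^n].
print(sp.series(mgf, t, 0, 4))

# Differentiating n times and setting t = 0 reads off E[X^n];
# for Exp(1), E[X^n] = n!, e.g. the third moment is 3! = 6.
print(sp.diff(mgf, t, 3).subs(t, 0))
```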
 
Basically, the MGF is the Laplace transform of the pdf (evaluated at -t). It has several convenient properties, e.g. to calculate the moments you only have to take derivatives rather than integrals (the integration was already done in computing the MGF), which is usually simpler. In many cases you can alternatively work with the characteristic function, which is the Fourier transform of the pdf and which, unlike the MGF, is always well defined.
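To illustrate the Laplace-transform connection (a sketch with my own assumed example, again Exp(1)): for a pdf supported on [0, ∞), M_X(t) is just the Laplace transform of the pdf evaluated at s = -t.

```python
import sympy as sp

# Assumed example (not from the thread): pdf of Exp(1) on [0, oo).
x, s, t = sp.symbols('x s t')
pdf = sp.exp(-x)

# For a pdf on [0, oo): M_X(t) = integral e^(tx) f(x) dx = (Lf)(-t),
# where (Lf)(s) = integral e^(-sx) f(x) dx is the Laplace transform.
L, _, _ = sp.laplace_transform(pdf, x, s)  # Laplace transform of the pdf
mgf = L.subs(s, -t)                        # mathematically 1/(1 - t)
print(sp.simplify(mgf))
```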
 
M_X(t) is the MGF of X. If Y = aX + b, then M_{aX+b}(t) is the MGF of Y; explicitly, M_{aX+b}(t) = E[e^(t(aX+b))] = e^(bt) M_X(at).
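The identity M_{aX+b}(t) = e^(bt) M_X(at) can be checked symbolically; here is a sketch using my own assumed example X ~ Exp(1), whose MGF is 1/(1 - t):

```python
import sympy as sp

a, b = sp.symbols('a b', positive=True)
x, t = sp.symbols('x t')
pdf = sp.exp(-x)  # assumed example: X ~ Exp(1)

# Direct computation of M_{aX+b}(t) = E[e^(t(aX+b))].
# conds='none' assumes the convergence condition (here a*t < 1).
mgf_y = sp.integrate(sp.exp(t * (a * x + b)) * pdf, (x, 0, sp.oo), conds='none')

# The identity M_{aX+b}(t) = e^(bt) * M_X(at), with M_X(t) = 1/(1 - t):
identity = sp.exp(b * t) / (1 - a * t)

print(sp.simplify(mgf_y - identity))  # 0 if the identity holds
```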
 
Thanks for your replies!

OK, so as pointed out, if we expand e^(tx), differentiate successively, and put t = 0, we get the higher-order expectations (E(X), E(X^2), etc.). But if we need to find the MGF of a function, say f(x) = sin(x) or even f(x) = x(x-1), then the expansion won't be the same as that of e^(tx) alone, and we might not be able to get the expectations by differentiating with respect to t and putting t = 0...
 
Urmi Roy said:
But if we need to find the MGF of a function, say f(x) = sin(x) or even f(x) = x(x-1)...
I don't know what ranges you have in mind, but usually these functions won't qualify as pdfs, since they are not everywhere positive, integrable, etc.
 
DrDu said:
I don't know what ranges you have in mind, but usually these functions won't qualify as pdfs, since they are not everywhere positive, integrable, etc.

I suspect they are asking about a function of the random variable, i.e. Y = sin(X) is itself a random variable.
 
Yes, that's right... I was asking about the case where sin(X) is a random variable...
 
Urmi Roy said:
Yes, that's right... I was asking about the case where sin(X) is a random variable...

In that case the MGF is E(e^(t sin(X))).
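A quick numerical sketch (my own assumed example, not from the thread): take X uniform on [0, 2π], so Y = sin(X). The MGF E[e^(t sin(X))] can be estimated by Monte Carlo averaging, and at t = 0 every MGF equals 1, since E[e^0] = 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example: X ~ Uniform(0, 2*pi), Y = sin(X).
# The MGF of Y at a given t is E[e^(t*sin(X))], estimated by sample averaging.
def mgf_sin(t, n=200_000):
    x = rng.uniform(0.0, 2.0 * np.pi, size=n)
    return np.exp(t * np.sin(x)).mean()

print(mgf_sin(0.0))  # every MGF equals 1 at t = 0
print(mgf_sin(1.0))  # close to the exact value I_0(1) ~ 1.266 (modified Bessel)
```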
 
