Calculating PDF from MGF: Advice Needed


Discussion Overview

The discussion revolves around the process of calculating the probability density function (PDF) from a moment generating function (MGF). Participants explore various mathematical approaches, including the use of Fourier transforms and series expansions, while addressing the implications of boundedness in random variables.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant proposes using the inverse Fourier transform of the MGF to derive the PDF, noting the integral's limits due to the bounded nature of the random variable.
  • Another participant challenges the assumption that the boundedness of the random variable implies boundedness in the variable t, citing the Fourier transform of a uniform distribution as an example.
  • A different participant suggests that the boundedness of x allows for a simplification to a sum over discrete t's, relating this to the Nyquist sampling theorem and indicating potential issues with Fourier series of powers of t.
  • One participant mentions that knowing the MGF as a power series provides the moments of the distribution and suggests using maximum entropy distributions to estimate the PDF under certain constraints.
  • Participants express uncertainty about whether the distribution is traditional and discuss the relevance of the maximum entropy method for approximating the PDF.
  • Another participant introduces the idea of using the Levy inversion formula to derive the cumulative distribution function (CDF) directly from the characteristic function (CF).

Areas of Agreement / Disagreement

Participants generally agree on the mathematical approaches to derive the PDF from the MGF, but there is disagreement regarding the implications of boundedness and the applicability of certain methods. The discussion remains unresolved with multiple competing views presented.

Contextual Notes

Participants highlight limitations related to assumptions about boundedness and the behavior of Fourier transforms and series, as well as the potential challenges in applying maximum entropy methods to non-traditional distributions.

bombadil
My goal here is to at least approximately calculate the probability density function (PDF) given the moment generating function (MGF), M_X(t).

I have managed to calculate the exact form of the MGF as an infinite series in t. In principle, if I substitute t \to it and perform an inverse Fourier transform, I should be able to obtain the PDF, \rho(x), as in

$$\rho(x)=\frac{1}{2\pi}\int^{\infty}_{-\infty} e^{-ixt}\,M_X(it)\,dt$$

I have looked into simply using the first few terms in the series expansion of M_X(it) in the integrand of the above integral, but this hasn't yielded anything very significant. I should also mention that, since the random variable x is bounded above and below, the actual transform has finite limits:

$$\rho(x)=\frac{1}{2\pi}\int^{t_{\rm max}}_{t_{\rm min}} e^{-ixt}\,M_X(it)\,dt.$$

Any advice?
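[Editor's note: a minimal numerical sketch of the inversion integral, not from the thread. It only assumes the t-range can be truncated where M_X(it) has decayed to negligible size; the standard normal, with M_X(it) = exp(-t^2/2), is used purely as a test case whose PDF is known.]

```python
import numpy as np

def pdf_from_cf(cf, xs, t_max=40.0, n=4001):
    """Approximate rho(x) = (1/2pi) * Int e^{-ixt} M_X(it) dt at the points xs,
    truncating the integral to [-t_max, t_max] and summing on a uniform grid."""
    ts = np.linspace(-t_max, t_max, n)
    dt = ts[1] - ts[0]
    # One truncated integral per evaluation point x (the outer product vectorizes it).
    integrand = np.exp(-1j * np.outer(xs, ts)) * cf(ts)
    return np.real(integrand.sum(axis=1)) * dt / (2 * np.pi)

xs = np.linspace(-3.0, 3.0, 7)
rho = pdf_from_cf(lambda t: np.exp(-t**2 / 2), xs)
exact = np.exp(-xs**2 / 2) / np.sqrt(2 * np.pi)
print(np.max(np.abs(rho - exact)))  # tiny: the normal CF decays fast, so truncation costs little
```

For CFs that decay slowly in t, the truncation point t_max and grid spacing both need care; this is only a sanity check of the formula.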
 


Your last assumption is incorrect: a bound on the random variable doesn't imply a bound on t. You can see this for yourself by computing the Fourier transform of a random variable uniform between 0 and 1.
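[Editor's note: an illustration of this point, not from the thread. For X uniform on (0, 1) the characteristic function is M_X(it) = (e^{it} - 1)/(it), whose magnitude 2|sin(t/2)|/|t| decays only like 1/|t| and never becomes identically zero, so bounded x does not give bounded t.]

```python
import numpy as np

def cf_uniform01(t):
    """Characteristic function of a uniform(0, 1) random variable."""
    t = np.asarray(t, dtype=float)
    safe = np.where(t == 0, 1.0, t)  # dummy value; the t == 0 case is handled below
    return np.where(t == 0, 1.0, (np.exp(1j * safe) - 1) / (1j * safe))

for t in (10.0, 1e3, 1e6):
    print(t, abs(cf_uniform01(t)))  # |CF| shrinks like 1/t but has unbounded support
```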
 


As mathman says, the boundedness of x doesn't mean that t is bounded, but it does allow another simplification: you can replace the integral with a sum over discrete t's. (This is basically the Nyquist sampling theorem.) That is, your Fourier transform becomes a Fourier series. Does that help?

Probably not. I guess if you have a (power?) series in t, you're going to be wanting Fourier series of powers of t, which are not well-behaved.
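[Editor's note: a sketch of the Fourier-series idea above, not from the thread. If x is supported on [-L, L], the PDF's Fourier coefficients are samples of the characteristic function at t_k = \pi k / L, so \rho(x) \approx (1/2L) \sum_k M_X(it_k) e^{-it_k x}. It is checked here on the triangular density \rho(x) = 1 - |x| on [-1, 1], whose CF (\sin(t/2)/(t/2))^2 is known in closed form.]

```python
import numpy as np

def cf_triangular(t):
    """Characteristic function of the triangular density 1 - |x| on [-1, 1]."""
    t = np.asarray(t, dtype=float)
    half = np.where(t == 0, 1.0, t / 2)  # dummy at t = 0, handled below
    return np.where(t == 0, 1.0, (np.sin(half) / half) ** 2)

L, N = 1.0, 2000
tks = np.pi * np.arange(-N, N + 1) / L  # Nyquist-spaced sample points in t
xs = np.linspace(-0.9, 0.9, 19)
series = np.exp(-1j * np.outer(xs, tks)) * cf_triangular(tks)
rho = np.real(series.sum(axis=1)) / (2 * L)
exact = 1 - np.abs(xs)
print(np.max(np.abs(rho - exact)))  # small; set by the tail of the coefficient series
```

This works because the CF samples decay here (like 1/k^2); a truncated power series in t would not give usable coefficients, which is the concern raised above.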
 


If you have the MGF as a power series, then you are saying that you know the moments of your distribution. One approach to estimation would be to look for maximum entropy distributions with the right moments that meet your other requirements (e.g. have appropriate bounds).
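[Editor's note: a minimal sketch of the maximum entropy suggestion, not from the thread. Among densities on a fixed interval with prescribed moments, the maximum entropy density has the form \rho(x) \propto \exp(\sum_j \lambda_j x^j), and the multipliers solve a convex dual problem. The interval [-1, 1], the grid quadrature, and the target moments (those of the triangular density) are all assumptions made for illustration.]

```python
import numpy as np
from scipy.optimize import minimize

def integrate(y, xs):
    # simple trapezoidal rule on the grid xs
    return float(np.sum((y[1:] + y[:-1]) * np.diff(xs)) / 2)

def maxent_pdf(moments, a=-1.0, b=1.0, n_grid=2001):
    """Max-entropy density on [a, b] matching the given moments mu_1, mu_2, ..."""
    xs = np.linspace(a, b, n_grid)
    powers = np.vstack([xs ** (j + 1) for j in range(len(moments))])
    mu = np.asarray(moments)

    def dual(lam):
        # convex dual: log Z(lambda) - lambda . mu; its gradient is the moment mismatch
        return np.log(integrate(np.exp(lam @ powers), xs)) - lam @ mu

    lam = minimize(dual, np.zeros(len(moments)), method="BFGS").x
    unnorm = np.exp(lam @ powers)
    return xs, unnorm / integrate(unnorm, xs)

# Match mean 0 and second moment 1/6 (the moments of rho(x) = 1 - |x| on [-1, 1]).
xs, rho = maxent_pdf([0.0, 1.0 / 6.0])
print(integrate(rho * xs, xs), integrate(rho * xs**2, xs))
```

With only two moments this returns a truncated-Gaussian shape, not the triangular target; that is the expected behavior of the method, which gives the least-committal density consistent with the supplied information.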
 


Thanks mathman for that correction.

pmsrw3, thanks for pointing me to the maximum entropy method. I'm pretty sure I'm not dealing with any of the traditional distributions, but I'll look into it.
 


bombadil said:
pmsrw3, thanks for pointing me to the maximum entropy method. I'm pretty sure I'm not dealing with any of the traditional distributions, but I'll look into it.
That's OK. The point of the maximum entropy method is to give the best (by one criterion) estimate of the actual distribution based on whatever information you have. Since you said you wanted "to at least approximately calculate the PDF", it seemed a reasonable way to go.
 


It's also worth considering the Lévy inversion formula, which recovers the CDF directly from the characteristic function (CF).
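[Editor's note: a numerical sketch of the Lévy inversion formula, not from the thread: F(b) - F(a) = (1/2\pi) \lim_{T\to\infty} \int_{-T}^{T} \frac{e^{-ita} - e^{-itb}}{it}\,\varphi(t)\,dt. A midpoint grid avoids the removable singularity at t = 0, and the standard normal CF \varphi(t) = \exp(-t^2/2) is used purely as a test case.]

```python
import numpy as np

def levy_prob(cf, a, b, t_max=40.0, n=40000):
    """Approximate P(a <= X <= b) from the characteristic function cf."""
    dt = 2 * t_max / n
    ts = -t_max + (np.arange(n) + 0.5) * dt  # midpoints, so t = 0 is never hit
    integrand = (np.exp(-1j * ts * a) - np.exp(-1j * ts * b)) / (1j * ts) * cf(ts)
    return np.real(integrand.sum()) * dt / (2 * np.pi)

p = levy_prob(lambda t: np.exp(-t**2 / 2), -1.0, 1.0)
print(p)  # P(-1 <= Z <= 1) for a standard normal, about 0.6827
```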
 
