Calculating PDF from MGF: Advice Needed

AI Thread Summary
The discussion focuses on calculating the probability density function (PDF) from a moment generating function (MGF) using Fourier transforms. The original poster has derived the MGF as an infinite series but struggles to obtain significant results when substituting into the integral for the PDF. Participants clarify that while the random variable x is bounded, this does not imply that t is bounded, suggesting a shift to a Fourier series approach. They also recommend exploring the maximum entropy method to estimate the distribution based on known moments. Additionally, the Levy inversion formula is mentioned as a potential alternative for deriving the cumulative distribution function (CDF).
bombadil
My goal here is to at least approximately calculate the probability density function (PDF) given the moment generating function (MGF), M_X(t).

I have managed to calculate the exact form of the MGF as an infinite series in t. In principle, if I replace t with it (so that M_X(it) is the characteristic function) and perform an inverse Fourier transform, I should be able to obtain the PDF \rho(x), as in

\rho(x)=\frac{1}{2\pi}\int^{\infty}_{-\infty}{e^{-ixt}M_X(it)\,dt}

I have looked into simply using the first few terms in the series expansion of M_X(it) in the integrand of the above integral, but this hasn't yielded anything very significant. I should also mention that, since the random variable x is bounded above and below, the actual transform has finite limits:

\rho(x)=\frac{1}{2\pi}\int^{t_{\rm max}}_{t_{\rm min}}{e^{-ixt}M_X(it)\,dt}.

Any advice?
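For concreteness, here is a minimal numerical sketch of the inversion I'm attempting, tested on the standard normal as an assumed stand-in (its MGF exp(t^2/2) is known in closed form, so M_X(it) = exp(-t^2/2) and the exact PDF is available for comparison):

```python
import numpy as np

# Sketch: invert M_X(it) numerically to recover the PDF.
# Test case (an assumption for illustration): standard normal,
# where M_X(it) = exp(-t^2/2) and the exact PDF is known.

def pdf_from_cf(cf, x, t_max=40.0, n=8001):
    """rho(x) = (1/2pi) * int e^{-ixt} cf(t) dt, truncated to
    [-t_max, t_max] and approximated with the trapezoidal rule."""
    t = np.linspace(-t_max, t_max, n)
    dt = t[1] - t[0]
    vals = np.exp(-1j * np.outer(x, t)) * cf(t)
    integral = (np.sum(vals, axis=1) - 0.5 * (vals[:, 0] + vals[:, -1])) * dt
    return np.real(integral) / (2.0 * np.pi)

cf_normal = lambda t: np.exp(-t**2 / 2)          # M_X(it) for N(0, 1)
xs = np.array([0.0, 0.5, 1.0])
approx = pdf_from_cf(cf_normal, xs)
exact = np.exp(-xs**2 / 2) / np.sqrt(2.0 * np.pi)
print(np.max(np.abs(approx - exact)))            # truncation/quadrature error
```

This works well here because exp(-t^2/2) decays rapidly; the difficulty in my problem is that I only have M_X(it) as a truncated series.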
 


Your last assumption is incorrect. A bound on the random variable doesn't imply a bound on t. You can see this for yourself by working out the Fourier transform (characteristic function) of a random variable uniform between 0 and 1.
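A minimal sketch of that check, assuming the standard closed form for the uniform CF:

```python
import numpy as np

# The CF of X ~ Uniform(0, 1) is phi(t) = (e^{it} - 1) / (it), with
# |phi(t)| = |2 sin(t/2) / t|: it oscillates and decays only like 1/t,
# so boundedness of x does not give compact support in t.

def cf_uniform(t):
    t = np.asarray(t, dtype=float)
    t_safe = np.where(t == 0, 1.0, t)            # avoid 0/0 at t = 0
    phi = (np.exp(1j * t_safe) - 1.0) / (1j * t_safe)
    return np.where(t == 0, 1.0 + 0j, phi)

for t in (10.0, 100.0, 1000.0):
    print(t, abs(cf_uniform(t)))                 # nonzero however large t gets
```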
 


As mathman says, the boundedness of x doesn't mean that t is bounded, but it does allow another simplification: you can replace the integral with a sum over discrete t's. (This is basically the Nyquist sampling theorem.) That is, your Fourier transform becomes a Fourier series. Does that help?

Probably not, on second thought: if you have a (power?) series in t, you're going to want Fourier series of powers of t, which are not well-behaved.
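A minimal sketch of the Fourier-series route, assuming a Beta(2, 2) test density as a stand-in for the unknown distribution (its CF is obtained here by quadrature, standing in for evaluating the MGF series at imaginary argument):

```python
import numpy as np
from scipy.integrate import quad

# For a density on [0, 1], the Fourier coefficients are
# c_k = E[exp(-2*pi*i*k*X)] = phi(-2*pi*k): samples of the CF at
# discrete t, so no continuous inversion integral is needed.
# Beta(2, 2), density 6x(1-x), is an assumed test case.

density = lambda x: 6.0 * x * (1.0 - x)

def cf(t):
    """CF of the test density via oscillatory quadrature (a stand-in
    for evaluating the poster's MGF series at imaginary argument)."""
    re = quad(density, 0.0, 1.0, weight="cos", wvar=t)[0]
    im = quad(density, 0.0, 1.0, weight="sin", wvar=t)[0]
    return re + 1j * im

def fourier_series_density(x, K=40):
    # c_0 = 1 (the density integrates to 1); c_{-k} = conj(c_k) for real rho
    total = np.ones_like(x, dtype=float)
    for k in range(1, K + 1):
        c_k = np.conj(cf(2.0 * np.pi * k))       # c_k = phi(-2*pi*k)
        total += 2.0 * np.real(c_k * np.exp(2j * np.pi * k * x))
    return total

xs = np.linspace(0.05, 0.95, 7)
err = np.max(np.abs(fourier_series_density(xs) - density(xs)))
print(err)   # truncation error only, shrinking as K grows
```

The coefficients decay like 1/k^2 for this test density, so the truncation error falls off roughly like 1/K.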
 


If you have the MGF as a power series, then you are saying that you know the moments of your distribution. One approach to estimation would be to look for maximum entropy distributions with the right moments that meet your other requirements (e.g. have appropriate bounds).
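A minimal sketch of that approach, assuming Beta(2, 2) moments on [0, 1] as a stand-in target (an assumption for illustration; any known moment sequence would slot in the same way):

```python
import numpy as np
from scipy.optimize import minimize

# On [0, 1], the maximum entropy density matching the first m moments has
# the form rho(x) = exp(sum_n lam_n x^n) / Z, and the lam_n minimize the
# convex dual  log Z(lam) - lam . mu.  Target moments mu here are those of
# Beta(2, 2) (assumed test case): E[X^n] = 6 / ((n + 2)(n + 3)).

m = 4
nodes, wts = np.polynomial.legendre.leggauss(80)   # Gauss-Legendre on [-1, 1]
nodes, wts = (nodes + 1.0) / 2.0, wts / 2.0        # mapped to [0, 1]

mu = np.array([6.0 / ((n + 2) * (n + 3)) for n in range(1, m + 1)])
powers = np.vstack([nodes**n for n in range(1, m + 1)])   # shape (m, 80)

def dual(lam):
    # log partition function minus the linear moment term; its gradient is
    # (moments under rho_lam) - mu, so the minimizer matches the moments
    return np.log(np.sum(wts * np.exp(lam @ powers))) - lam @ mu

lam = minimize(dual, np.zeros(m), method="BFGS").x

weights = wts * np.exp(lam @ powers)
fitted = powers @ weights / np.sum(weights)        # moments of fitted density
print(np.max(np.abs(fitted - mu)))                 # near zero at convergence
```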
 


Thanks mathman for that correction.

pmsrw3, thanks for pointing me to the maximum entropy method. I'm pretty sure that I'm not dealing with any of the traditional distributions, but I'll look into it.
 


bombadil said:
pmsrw3, thanks for pointing me to the maximum entropy method. I'm pretty sure that I'm not dealing with any of the traditional distributions, but I'll look into it.
That's OK. The point of the maximum entropy method is to give the best (by one criterion) estimate of the actual distribution based on whatever information you have. Since you said you wanted "to at least approximately calculate the PDF", it seemed a reasonable way to go.
 


It's also worth considering the Lévy inversion formula, which gives the CDF directly from the characteristic function (CF).
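A minimal sketch of the Gil-Pelaez form of that inversion, assuming the standard normal CF as a test case:

```python
import numpy as np

# Gil-Pelaez form of the Levy inversion formula:
#   F(x) = 1/2 - (1/pi) * int_0^inf Im(e^{-itx} phi(t)) / t dt.
# Assumed test case: standard normal, phi(t) = exp(-t^2/2).

def cdf_from_cf(cf, x, t_max=40.0, n=20001):
    """Approximate the Gil-Pelaez integral with the trapezoidal rule on
    (0, t_max]; the integrand tends to -x as t -> 0, so nothing blows up."""
    t = np.linspace(1e-8, t_max, n)
    dt = t[1] - t[0]
    f = np.imag(np.exp(-1j * np.outer(x, t)) * cf(t)) / t
    integral = (np.sum(f, axis=1) - 0.5 * (f[:, 0] + f[:, -1])) * dt
    return 0.5 - integral / np.pi

cf_normal = lambda t: np.exp(-t**2 / 2)
xs = np.array([-1.0, 0.0, 1.0])
result = cdf_from_cf(cf_normal, xs)
print(result)   # should approximate Phi(-1), Phi(0), Phi(1)
```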
 