In summary, the thread discusses how to calculate, at least approximately, the probability density function (PDF) from the moment generating function (MGF) using an inverse Fourier transform. The original poster has obtained the exact MGF as an infinite series, but using only the first few terms of that series has not yielded useful results. They note that the random variable x is bounded; replies point out that this does not mean t is bounded. The thread also discusses using the maximum entropy method to estimate the PDF.
  • #1
bombadil
My goal here is to at least approximately calculate the probability density function (PDF) given the moment generating function (MGF), [itex]M_X(t)[/itex].

I have managed to calculate the exact form of the MGF as an infinite series in [itex]t[/itex]. In principle, if I replace [itex]t[/itex] with [itex]it[/itex] and perform an inverse Fourier transform I should be able to obtain the PDF, [itex]\rho(x)[/itex], as in

[tex]
\rho(x)=\frac{1}{2\pi}\int^{\infty}_{-\infty}{e^{-ixt}M_X(it)dt}
[/tex]

I have looked into simply using the first few terms in the series expansion of [itex]M_X(it)[/itex] in the integrand of the above integral, but this hasn't yielded anything very significant. I should also mention that, since the random variable [itex]x[/itex] is bounded above and below, the actual transform has finite limits:

[tex]
\rho(x)=\frac{1}{2\pi}\int^{t_{\rm max}}_{t_{\rm min}}{e^{-ixt}M_X(it)dt}.
[/tex]

Any advice?
 
  • #2


Your last assumption is incorrect. A bound for the random variable doesn't mean a bound for t. You can see for yourself by getting the Fourier transform for a random variable uniform between 0 and 1.
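Working that example out explicitly: for [itex]X[/itex] uniform on [itex](0,1)[/itex],

[tex]
M_X(it)=\int^{1}_{0}{e^{itx}dx}=\frac{e^{it}-1}{it},
[/tex]

which is nonzero for arbitrarily large [itex]|t|[/itex] and decays only like [itex]1/|t|[/itex], so the inversion integral genuinely runs over all of [itex]t[/itex].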
 
  • #3


As mathman says, the boundedness of x doesn't mean that t is bounded, but it does allow another simplification: you can replace the integral with a sum over discrete t's. (This is basically the Nyquist sampling theorem.) That is, your Fourier transform becomes a Fourier series. Does that help?

Probably not. I guess if you have a (power?) series in t, you're going to be wanting Fourier series of powers of t, which are not well-behaved.
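For concreteness, the series form referred to above: if the density is supported on an interval of length [itex]L=x_{\rm max}-x_{\rm min}[/itex], its Fourier coefficients are values of the characteristic function at the discrete points [itex]t_n=2\pi n/L[/itex], so for [itex]x[/itex] inside that interval

[tex]
\rho(x)=\frac{1}{L}\sum^{\infty}_{n=-\infty}{M_X\!\left(\frac{2\pi i n}{L}\right)e^{-2\pi i n x/L}}.
[/tex]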
 
  • #4


If you have the MGF as a power series, then you are saying that you know the moments of your distribution. One approach to estimation would be to look for maximum entropy distributions with the right moments that meet your other requirements (e.g. have appropriate bounds).
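As a rough numerical sketch of that idea (the interval and moment values below are placeholders, not from the thread), one can fit the exponential-family form [itex]\rho(x)\propto e^{\sum_k \lambda_k x^k}[/itex] that maximum entropy gives by minimizing its convex dual:

[code]
import numpy as np
from scipy.optimize import minimize

# Placeholder inputs: support [a, b] and the first few raw moments E[X^k].
# The numbers below are just the moments of Uniform(0, 1), used as a stand-in.
a, b = 0.0, 1.0
m = np.array([1.0/2, 1.0/3, 1.0/4])
K = len(m)

# Quadrature grid on [a, b].
x = np.linspace(a, b, 2001)
dx = x[1] - x[0]
xk = np.vstack([x**k for k in range(1, K + 1)])   # rows are x, x^2, ..., x^K

def dual(lam):
    # Convex dual of the max-entropy problem: log Z(lam) - lam . m,
    # where the candidate density is rho(x) proportional to exp(sum_k lam_k x^k).
    logw = lam @ xk
    Z = np.sum(np.exp(logw)) * dx
    return np.log(Z) - lam @ m

lam = minimize(dual, np.zeros(K), method="Nelder-Mead").x

# Normalized maximum-entropy density on [a, b] matching the given moments.
rho = np.exp(lam @ xk)
rho /= np.sum(rho) * dx
[/code]

At the minimum of the dual, the fitted density reproduces the prescribed moments, and the bounds on the random variable enter simply through the integration interval.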
 
  • #5


Thanks mathman for that correction.

pmsrw3, thanks for pointing me to the maximum entropy method. I'm pretty sure I'm not dealing with any of the traditional distributions, but I'll look into it.
 
  • #6


bombadil said:
pmsrw3, thanks for pointing me to the maximum entropy method. I'm pretty sure I'm not dealing with any of the traditional distributions, but I'll look into it.
That's OK. The point of the maximum entropy method is to give the best (by one criterion) estimate of the actual distribution based on whatever information you have. Since you said you wanted "to at least approximately calculate the PDF", it seemed a reasonable way to go.
 
  • #7


It's also worth considering the Lévy inversion formula, which gives the CDF directly from the characteristic function.
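In the notation above, with [itex]\varphi(t)=M_X(it)[/itex] and [itex]a<b[/itex] continuity points of the CDF [itex]F[/itex], that formula reads

[tex]
F(b)-F(a)=\lim_{T\to\infty}\frac{1}{2\pi}\int^{T}_{-T}{\frac{e^{-ita}-e^{-itb}}{it}\,M_X(it)\,dt}.
[/tex]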
 

What is the purpose of calculating PDF from MGF?

The purpose of calculating PDF (Probability Density Function) from MGF (Moment Generating Function) is to determine the probability distribution of a random variable. This can be useful in statistical analysis and modeling, as well as in understanding the characteristics of a particular data set.

How do I calculate PDF from MGF?

To calculate the PDF from the MGF, form the characteristic function by substituting [itex]it[/itex] for [itex]t[/itex], i.e. [itex]\varphi(t)=M_X(it)[/itex], and then apply the inverse Fourier transform, [itex]\rho(x)=\frac{1}{2\pi}\int^{\infty}_{-\infty}e^{-ixt}M_X(it)\,dt[/itex]. In practice it is often easier to recognize the MGF as that of a known distribution; otherwise the inversion integral can be evaluated numerically.
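A minimal numerical sketch of that inversion, using the standard normal purely as an illustrative example (its MGF is [itex]M_X(t)=e^{t^2/2}[/itex]):

[code]
import numpy as np
from scipy.integrate import quad

# Illustrative example: standard normal, whose MGF is M_X(t) = exp(t^2/2),
# so the characteristic function is phi(t) = M_X(it) = exp(-t^2/2).
def phi(t):
    return np.exp(-t**2 / 2)

def pdf(x):
    # Fourier inversion: rho(x) = (1/2pi) * integral of exp(-i t x) phi(t) dt.
    # phi is real and even here, so only the cosine part contributes.
    val, _ = quad(lambda t: np.cos(t * x) * phi(t), -np.inf, np.inf)
    return val / (2 * np.pi)

print(pdf(1.0))                           # numerical inversion at x = 1
print(np.exp(-0.5) / np.sqrt(2 * np.pi))  # exact N(0, 1) density at x = 1
[/code]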

What is the relationship between MGF and PDF?

The MGF and the PDF are closely related: the MGF is an integral transform of the PDF, [itex]M_X(t)=E[e^{tX}][/itex]. The MGF packages the moments of a random variable, while the PDF gives its probability distribution. Taking the inverse transform of the MGF (via the characteristic function [itex]M_X(it)[/itex]) recovers the PDF.
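In symbols, the forward and inverse relationships are

[tex]
M_X(t)=E\left[e^{tX}\right]=\int^{\infty}_{-\infty}{e^{tx}\rho(x)dx},
\qquad
\rho(x)=\frac{1}{2\pi}\int^{\infty}_{-\infty}{e^{-ixt}M_X(it)dt}.
[/tex]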

What are some common mistakes when calculating PDF from MGF?

One common mistake when calculating the PDF from the MGF is forgetting to pass to the characteristic function [itex]M_X(it)[/itex] before inverting. Another is assuming that a bounded random variable lets you truncate the inversion integral in [itex]t[/itex], which, as discussed in the thread, it does not. Finally, if only a truncated series (equivalently, finitely many moments) is available, the result is only an approximation whose accuracy needs to be checked.

What are some applications of calculating PDF from MGF?

Calculating PDF from MGF is commonly used in probability theory, statistics, and data analysis. It can be applied in various fields such as finance, engineering, and biology. Some specific applications include risk assessment, forecasting, and modeling of complex systems.
