Estimating a Probability Distribution

SUMMARY

The functional form of a probability distribution can be estimated from its first n moments, central moments, or cumulants. A standard method is to construct the polynomial $$\tilde{\varphi}_X(t) = 1+\sum_{k=1}^n \frac{\mu_k}{k!} (it)^k,$$ which approximates the characteristic function $$\varphi_X(t)$$. Applying the inverse Fourier transform, $$\rho_X(x) = \int_{-\infty}^\infty \frac{dt}{2\pi} e^{-itx}\varphi_X(t),$$ numerically to this approximation yields an estimate of the shape of the probability distribution. This approach does not identify the distribution as a member of a standard family, such as the Gaussian, but it does provide a visual representation of its form.

PREREQUISITES
  • Understanding of probability distributions and their properties
  • Familiarity with moments, central moments, and cumulants
  • Knowledge of characteristic functions in probability theory
  • Experience with numerical methods for inverse Fourier transforms
NEXT STEPS
  • Research the properties of characteristic functions in probability theory
  • Learn about numerical methods for performing inverse Fourier transforms
  • Explore polynomial approximations in statistical modeling
  • Study the implications of different moments on distribution shapes
USEFUL FOR

Statisticians, data scientists, and researchers involved in probability theory and statistical modeling who are looking to estimate and visualize probability distributions from moments.

thegreenlaser
This is hopefully a simple question...

Given the first n moments or central moments or cumulants (I don't care which) of a probability distribution, is there a standard procedure for estimating its functional form?

For example, I know that given the mean and variance of a distribution, it's fairly standard to assume that it's Gaussian. Is there a more general method?
 
Hm. Well, if you know the first ##n## moments, ##\mu_k##, you can construct a polynomial function

$$\tilde{\varphi}_X(t) = 1+\sum_{k=1}^n \frac{\mu_k}{k!} (it)^k,$$

where ##i## is the imaginary unit, which would be a polynomial approximation to the characteristic function ##\varphi_X(t)## of your distribution. The characteristic function of a random variable ##X## is defined as

$$\varphi_X(t) = \mathbb{E}[\exp(itX)];$$
if you know ##\varphi_X(t)##, you can find the distribution by inverse Fourier transform, i.e.,

$$\rho_X(x) = \int_{-\infty}^\infty \frac{dt}{2\pi} e^{-itx}\varphi_X(t).$$

In your case, you could numerically inverse Fourier transform your polynomial approximation ##\tilde{\varphi}_X(t)##, which would give you a numerical estimate of the shape of the probability distribution. It won't tell you if it's a standard distribution like the normal distribution or something, but you'll know what it looks like.
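As a rough sketch of that procedure in NumPy (the specific choices here are mine, not part of the method: I feed in the first 14 moments of a standard normal as a known test case, and the truncation point ##T## and grid sizes are illustrative):

```python
import math
import numpy as np

# Assumed inputs: the first n raw moments of the (unknown) distribution.
# As a test case, use the moments of a standard normal: odd moments
# vanish and mu_{2j} = (2j - 1)!!.
n = 14
mu = np.zeros(n + 1)
mu[0] = 1.0
for k in range(2, n + 1, 2):
    mu[k] = mu[k - 2] * (k - 1)

def phi_approx(t):
    """Degree-n polynomial approximation of the characteristic function."""
    return sum(mu[k] / math.factorial(k) * (1j * t) ** k for k in range(n + 1))

# The polynomial only tracks phi_X(t) near t = 0 (it blows up as |t| grows),
# so the inverse-transform integral must be truncated at a modest |t| <= T.
T = 2.0
t = np.linspace(-T, T, 2001)
dt = t[1] - t[0]
phi_t = phi_approx(t)

def inverse_ft(xi):
    """Trapezoid-rule estimate of (1/2pi) * integral of e^{-itx} phi(t) dt."""
    f = (np.exp(-1j * t * xi) * phi_t).real
    return dt * (f.sum() - 0.5 * (f[0] + f[-1])) / (2 * np.pi)

x = np.linspace(-4, 4, 201)
rho = np.array([inverse_ft(xi) for xi in x])
# rho now traces out the bell shape of the standard normal density;
# truncating the t-integral biases the values slightly low.
```

The truncation at ##T## is the price of the polynomial approximation: unlike the true characteristic function, the polynomial does not decay as ##|t| \to \infty##, so the integral only makes sense over the region where the approximation is still good. More moments let you push ##T## out further.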

(Since you mentioned probability distribution, I am assuming that your random variables are continuous, rather than discrete.)

The references in this part of the Wikipedia page on characteristic functions may be useful.
 
Thanks, that's exactly the kind of thing I was looking for!
 
