# Estimating a Probability Distribution

1. Aug 26, 2012

### thegreenlaser

This is hopefully a simple question...

Given the first n moments or central moments or cumulants (I don't care which) of a probability distribution, is there a standard procedure for estimating its functional form?

For example, I know that given the mean and variance of a distribution, it's fairly standard to assume that it's Gaussian. Is there a more general method?

2. Aug 26, 2012

### Mute

Hm. Well, if you know the first n moments, $\mu_k$, you can construct a polynomial function

$$\tilde{\varphi}_X(t) = 1+\sum_{k=1}^n \frac{\mu_k}{k!} (it)^k,$$

where $i$ is the imaginary unit, which would be a polynomial approximation to the characteristic function $\varphi_X(t)$ of your distribution. The characteristic function of a random variable $X$ is defined as

$$\varphi_X(t) = \mathbb{E}[\exp(itX)];$$
if you know $\varphi_X(t)$, you can find the distribution by an inverse Fourier transform, i.e.,

$$\rho_X(x) = \int_{-\infty}^\infty \frac{dt}{2\pi} e^{-itx}\varphi_X(t).$$
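For concreteness, here's a rough sketch (in Python) of how one might build that truncated series numerically. The moment values and the name `cf_approx` are just placeholders I made up for illustration, not anything standard:

```python
import math
import numpy as np

# Placeholder moments mu_1..mu_n; these happen to be the first four raw
# moments of a standard normal (0, 1, 0, 3), just so the sketch can be checked.
moments = [0.0, 1.0, 0.0, 3.0]

def cf_approx(t, moments):
    """Truncated-series estimate of the characteristic function:
    phi(t) ~ 1 + sum_{k=1}^{n} mu_k (i t)^k / k!"""
    t = np.asarray(t, dtype=float)
    phi = np.ones_like(t, dtype=complex)
    for k, mu_k in enumerate(moments, start=1):
        phi += mu_k * (1j * t) ** k / math.factorial(k)
    return phi
```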

In your case, you could numerically inverse Fourier transform your polynomial approximation $\tilde{\varphi}_X(t)$, which would give you a numerical estimate of the shape of the probability distribution. It won't tell you whether it's a standard distribution like the normal, but you'll know roughly what it looks like.
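Continuing that sketch, a crude numerical inversion might look like the following. Since the truncated polynomial only tracks $\varphi_X(t)$ near $t = 0$ and grows without bound for large $|t|$, the integral has to be cut off at a finite window; the window size and grids below are guesses, and the output is only a rough picture of the density.

```python
# The polynomial approximation is only trustworthy near t = 0, so cut the
# inverse transform off at a finite window [-T, T]; T is a judgment call.
T = 2.0
t = np.linspace(-T, T, 2001)
dt = t[1] - t[0]
x = np.linspace(-4.0, 4.0, 401)      # grid on which to estimate the density

phi = cf_approx(t, moments)

# rho(x) ~ (1/2 pi) * integral of exp(-i t x) phi(t) dt, done as a plain Riemann sum.
rho = (np.exp(-1j * np.outer(x, t)) * phi).sum(axis=1).real * dt / (2.0 * np.pi)

# 'rho' is now a rough numerical estimate of the density on the grid 'x';
# with the normal moments above it should at least peak around x = 0.
```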

(Since you mentioned a probability distribution, I am assuming that your random variable is continuous rather than discrete.)

The references in this part of the Wikipedia page on characteristic functions may be useful.

3. Aug 30, 2012

### thegreenlaser

Thanks, that's exactly the kind of thing I was looking for!