Estimating a Probability Distribution

  • #1

Main Question or Discussion Point

This is hopefully a simple question...

Given the first n moments or central moments or cumulants (I don't care which) of a probability distribution, is there a standard procedure for estimating its functional form?

For example, I know that given the mean and variance of a distribution, it's fairly standard to assume that it's Gaussian. Is there a more general method?
 

Answers and Replies

  • #2
Mute
Homework Helper
Hm. Well, if you know the first n moments, ##\mu_k##, you can construct a polynomial function

$$\tilde{\varphi}_X(t) = 1+\sum_{k=1}^n \frac{\mu_k}{k!} (it)^k,$$

where ##i## is the imaginary unit, which serves as a polynomial approximation to the characteristic function ##\varphi_X(t)## of your distribution. The characteristic function of a random variable ##X## is defined as

$$\varphi_X(t) = \mathbb{E}[\exp(itX)];$$
if you know ##\varphi_X(t)##, you can recover the density by an inverse Fourier transform, i.e.,

$$\rho_X(x) = \int_{-\infty}^\infty \frac{dt}{2\pi} e^{-itx}\varphi_X(t).$$

In your case, you could numerically inverse Fourier transform your polynomial approximation ##\tilde{\varphi}_X(t)##, which would give you a numerical estimate of the shape of the probability distribution. It won't tell you whether it's a standard distribution like the normal distribution, but you'll know what it looks like.
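A rough numerical sketch of that recipe, for what it's worth (the function names and the cutoff `t_max` are my own choices, not part of any standard API; note that the polynomial only tracks the true characteristic function near ##t = 0## and blows up for large ##|t|##, so the inversion integral has to be truncated by hand, and the resulting density estimate is correspondingly crude and low-resolution):

```python
import math
import numpy as np

def char_poly(moments, t):
    """Polynomial approximation to the characteristic function,
    built from the first n raw moments mu_1..mu_n."""
    phi = np.ones_like(t, dtype=complex)
    for k, mu in enumerate(moments, start=1):
        phi = phi + mu * (1j * t) ** k / math.factorial(k)
    return phi

def density_estimate(moments, x, t_max=1.0, n_t=2001):
    """Truncated inverse Fourier transform of the polynomial
    approximation:  rho(x) ~ (1/2pi) int_{-t_max}^{t_max} e^{-itx} phi(t) dt.
    t_max is an arbitrary cutoff: the polynomial does not decay, so the
    untruncated integral diverges; pick t_max where the highest-order
    moment term is still small."""
    t = np.linspace(-t_max, t_max, n_t)
    dt = t[1] - t[0]
    phi = char_poly(moments, t)
    # integrand has shape (len(x), n_t); crude Riemann-sum quadrature
    integrand = np.exp(-1j * np.outer(x, t)) * phi
    rho = integrand.sum(axis=1) * dt / (2 * np.pi)
    return rho.real

# First four raw moments of a standard normal: mu1=0, mu2=1, mu3=0, mu4=3.
moments = [0.0, 1.0, 0.0, 3.0]
x = np.linspace(-4.0, 4.0, 81)
rho = density_estimate(moments, x)  # rough, symmetric density profile
```

Since the odd moments vanish here, ##\tilde{\varphi}_X(t)## is real and even, and the recovered profile comes out symmetric about zero, as it should.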

(Since you mentioned a probability distribution, I am assuming that your random variables are continuous rather than discrete.)

The references in this part of the Wikipedia page on characteristic functions may be useful.
 
  • #3
Thanks, that's exactly the kind of thing I was looking for!
 
