# Estimating a Probability Distribution

In summary, the conversation discusses how to estimate the functional form of a probability distribution from its first n moments (or central moments or cumulants). The first n moments can be used to construct a polynomial approximation to the characteristic function of the distribution. Numerically inverse Fourier transforming this polynomial yields a numerical estimate of the distribution's shape. However, this method does not determine whether the distribution is a standard one such as the normal distribution.

#### thegreenlaser

This is hopefully a simple question...

Given the first n moments or central moments or cumulants (I don't care which) of a probability distribution, is there a standard procedure for estimating its functional form?

For example, I know that given the mean and variance of a distribution, it's fairly standard to assume that it's Gaussian. Is there a more general method?

Hm. Well, if you know the first n moments, ##\mu_k##, you can construct a polynomial function

$$\tilde{\varphi}_X(t) = 1+\sum_{k=1}^n \frac{\mu_k}{k!} (it)^k,$$

where ##i## is the imaginary unit, which would be a polynomial approximation to the characteristic function ##\varphi_X(t)## of your distribution. The characteristic function of a random variable ##X## is defined as

$$\varphi_X(t) = \mathbb{E}[\exp(itX)];$$
if you know ##\varphi_X(t)##, you can find the distribution by inverse Fourier transform, i.e.,

$$\rho_X(x) = \int_{-\infty}^\infty \frac{dt}{2\pi} e^{-itx}\varphi_X(t).$$
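The inversion formula above can be sketched numerically. As a hedged test case (not part of the original discussion), the sketch below inverts the exact characteristic function of a standard normal, ##\varphi(t) = e^{-t^2/2}##, and recovers the Gaussian density; the grid limits and spacing are arbitrary choices for this example.

```python
import numpy as np

# Test case: exact characteristic function of a standard normal,
# phi(t) = exp(-t^2 / 2). Grid limits and spacing are example choices;
# phi is negligibly small beyond |t| = 10.
t = np.linspace(-10.0, 10.0, 4001)
dt = t[1] - t[0]
phi = np.exp(-t**2 / 2)

def density(x):
    """rho(x) = (1 / 2 pi) * integral of exp(-i t x) * phi(t) dt,
    approximated by a simple Riemann sum over the grid."""
    integrand = np.exp(-1j * t * x) * phi
    return (integrand.sum() * dt).real / (2 * np.pi)

rho0 = density(0.0)  # should be close to 1 / sqrt(2 pi) ≈ 0.3989
```

For the standard normal this recovers the familiar bell curve; in practice ##\varphi(t)## would be replaced by whatever approximation to the characteristic function is available.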

In your case, you could numerically inverse Fourier transform your polynomial approximation ##\tilde{\varphi}_X(t)##, which would give you a numerical estimate of the shape of the probability distribution. It won't tell you if it's a standard distribution like the normal distribution or something, but you'll know what it looks like.
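As a quick sanity check on the polynomial approximation, the hypothetical sketch below uses the first four moments of a standard normal (##\mu_1 = 0, \mu_2 = 1, \mu_3 = 0, \mu_4 = 3##) and compares the truncated series against the exact characteristic function ##e^{-t^2/2}## at a small ##|t|##, where the truncation should be accurate; the choice of distribution and evaluation point are assumptions of this example.

```python
import math

# Hypothetical input: first four moments of the standard normal.
moments = [0.0, 1.0, 0.0, 3.0]

def char_poly(t, moments):
    """Truncated series 1 + sum_k mu_k / k! * (i t)^k,
    using as many terms as there are known moments."""
    total = 1.0 + 0.0j
    for k, mu_k in enumerate(moments, start=1):
        total += mu_k / math.factorial(k) * (1j * t) ** k
    return total

t = 0.3
approx = char_poly(t, moments)
exact = math.exp(-t**2 / 2)  # exact characteristic function of N(0, 1)
```

Note that the truncated polynomial grows without bound as ##|t|## increases, so any numerical inversion has to restrict the integration to a window where the approximation is still good.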

(Since you mentioned probability distribution, I am assuming that your random variables are continuous, rather than discrete.)

The references in this part of the Wikipedia page on characteristic functions may be useful.

Thanks, that's exactly the kind of thing I was looking for!

## 1. What is a probability distribution?

A probability distribution is a mathematical function that describes the likelihood of a certain outcome occurring in a random experiment or event. It assigns a probability to each possible outcome, and the sum of all probabilities is equal to 1.
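As a minimal illustration of the normalization property (a hypothetical example, not from the thread), a fair six-sided die assigns probability 1/6 to each face, and the probabilities over all outcomes sum to 1:

```python
# Distribution of a fair six-sided die: probability 1/6 per face.
die = {face: 1 / 6 for face in range(1, 7)}

# The probabilities over all possible outcomes sum to 1.
total = sum(die.values())
```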

## 2. Why is it important to estimate a probability distribution?

Estimating a probability distribution allows us to make predictions and analyze data in a meaningful way. It helps us understand the likelihood of different outcomes and make informed decisions based on the data.

## 3. What are some common methods for estimating a probability distribution?

Some common methods for estimating a probability distribution include maximum likelihood estimation, Bayesian estimation, and least squares estimation. These methods involve using data and statistical techniques to determine the parameters of the distribution that best fit the data.
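As a hedged sketch of maximum likelihood estimation, assuming the data come from a normal distribution: for ##N(\mu, \sigma^2)## the MLE of ##\mu## is the sample mean and the MLE of ##\sigma^2## is the (biased) sample variance. The true parameters and sample size below are arbitrary choices for illustration.

```python
import numpy as np

# Simulated data from a normal distribution with known parameters
# (mu = 2.0, sigma = 1.5); seed and sample size are example choices.
rng = np.random.default_rng(seed=0)
data = rng.normal(loc=2.0, scale=1.5, size=10_000)

mu_hat = data.mean()                         # MLE of the mean
sigma2_hat = ((data - mu_hat) ** 2).mean()   # MLE of the variance (biased)
```

With enough data, the estimates land close to the true parameters used to generate the sample.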

## 4. How do you interpret the results of an estimated probability distribution?

The results of an estimated probability distribution can be interpreted in terms of the likelihood of different outcomes occurring. For example, if the distribution is bell-shaped, outcomes near the center are more likely to occur than outcomes in the tails. Additionally, the shape of the distribution can provide insights into the variability of the data.

## 5. Can you estimate a probability distribution without data?

No, it is not possible to estimate a probability distribution without data. The purpose of estimating a probability distribution is to make predictions and analyze data, so we need data to determine the parameters of the distribution and evaluate its fit to the data.