# Gaussian Function

1. Oct 12, 2005

### mezarashi

In academics, you hear so much about the Gaussian function, whether it be in statistics, physics, or even social sciences!

The Gaussian function takes the general form of:

$$f(x) = A e^{-\frac{(x-b)^2}{c^2}}$$

Further yet, the antiderivative of this function is, up to scaling, the famous error function erf(x).
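
The connection to erf can be checked numerically. Here's a quick sketch (my own, not from the thread) that integrates $e^{-t^2}$ with the trapezoid rule and compares against $\frac{\sqrt{\pi}}{2}\,\mathrm{erf}(x)$, since by definition $\mathrm{erf}(x) = \frac{2}{\sqrt{\pi}}\int_0^x e^{-t^2}\,dt$:

```python
import math

# Trapezoid-rule approximation of the integral of e^{-t^2} from 0 to x.
# By definition erf(x) = (2/sqrt(pi)) * integral_0^x e^{-t^2} dt, so the
# integral should equal (sqrt(pi)/2) * erf(x).
def gauss_integral(x, n=100_000):
    h = x / n
    total = 0.5 * (1.0 + math.exp(-x * x))  # endpoint terms f(0), f(x)
    for i in range(1, n):
        t = i * h
        total += math.exp(-t * t)
    return total * h

x = 1.5
numeric = gauss_integral(x)
closed_form = 0.5 * math.sqrt(math.pi) * math.erf(x)
print(numeric, closed_form)  # the two agree to many decimal places
```

The function name `gauss_integral` and the step count are just illustrative choices; any standard quadrature routine would do.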

What I'd like to know is... what is the magic behind this equation? Why is it able to describe so many real-world phenomena? Can it be derived, or what was Mr. Gauss thinking when he came up with it?

Is there anything else I missed about the magic of this function?

2. Oct 12, 2005

### Tide

The Gaussian probability distribution is closely related to the binomial distribution, such as one finds in the case of a random walk. For example, in one dimension with a fixed step size, the distribution of the walker's position is the usual binomial distribution, and in the limit of a very large number of steps the Gaussian is an excellent approximation to it. When the step size is not fixed, as in diffusion, the distribution is Gaussian.

Many physical processes behave like a random walk including diffusion, heat transfer and so on.
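
A small simulation (my own sketch, with arbitrary sample sizes) makes this concrete: the endpoint of an n-step ±1 random walk is a shifted, scaled binomial variable, and for large n its distribution looks Gaussian with mean 0 and standard deviation √n:

```python
import math
import random

random.seed(0)

# One-dimensional random walk: n fixed-size steps of +1 or -1.
# The endpoint equals 2*Binomial(n, 1/2) - n, which for large n is well
# approximated by a Gaussian with mean 0 and standard deviation sqrt(n).
n_steps = 2_500
n_walks = 2_000
endpoints = [sum(random.choice((-1, 1)) for _ in range(n_steps))
             for _ in range(n_walks)]

mean = sum(endpoints) / n_walks
std = math.sqrt(sum((x - mean) ** 2 for x in endpoints) / n_walks)
print(mean, std)  # mean near 0, std near sqrt(2500) = 50
```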

3. Oct 12, 2005

### SpaceTiger

Staff Emeritus
Of course, there's also the central limit theorem, which says, roughly, that the sum of variables drawn from a distribution (almost any distribution) becomes Gaussian distributed as the number of variables drawn approaches infinity.

4. Oct 12, 2005

### HallsofIvy

Staff Emeritus
The Central Limit Theorem, that SpaceTiger mentions, is remarkable! In any application of mathematics, you have to make SOME assumptions about what kind of "mathematical model" applies. The Central Limit Theorem says that, in statistics, we really don't have to worry about that- the Gaussian distribution applies to just about everything!
If we have SOME probability distribution (the only requirement is that the mean, $\mu$, and standard deviation, $\sigma$, must be finite) and take n samples from that distribution, then the sum of the samples is approximately a Gaussian (normal) distribution with mean $n\mu$ and standard deviation $\sigma\sqrt{n}$, and the average of the samples is approximately a Gaussian distribution with mean $\mu$ and standard deviation $\frac{\sigma}{\sqrt{n}}$.
The "normal approximation to the binomial distribution" that Tide mentions is a special but very important example of that, but it applies very generally. If a researcher is looking at people's weights, he can think of each person's weight as a sum of the weights of various parts of the body, and surely they will all have the same distribution- almost automatically, he knows that people's weights must be, at least approximately, normally distributed.
Because just about everything can be thought of as the sum of many parts, it follows that almost everything must be, at least approximately, normally distributed!
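
Those two facts about the average are easy to check by simulation. This sketch (my own choice of distribution and sample sizes) averages n uniform samples many times and checks that the averages have mean $\mu$ and standard deviation $\frac{\sigma}{\sqrt{n}}$:

```python
import math
import random

random.seed(1)

# Average n samples from Uniform(0, 1), whose mean is 1/2 and whose
# standard deviation is 1/sqrt(12). By the central limit theorem the
# sample average is approximately Gaussian with mean 1/2 and standard
# deviation (1/sqrt(12))/sqrt(n).
n = 100
trials = 5_000
averages = [sum(random.random() for _ in range(n)) / n for _ in range(trials)]

mean = sum(averages) / trials
std = math.sqrt(sum((a - mean) ** 2 for a in averages) / trials)
print(mean, std)  # mean near 0.5, std near 1/sqrt(1200) ~ 0.0289
```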

5. Oct 12, 2005

### mezarashi

That is indeed a powerful statement from a theorem, and also a powerful distribution that can cover it all! It makes me ever more amazed at the mathematics we have derived to model our physical world.

/me bows to Gauss another 100 times.

6. Sep 24, 2008

### peterk007

Sadly, no. It says that when you repeatedly obtain independent samples from the same underlying distribution (iid), and if this underlying distribution has finite variance, then the sum/average of these samples approaches a Gaussian distribution in the limit.

There are more distributions with infinite variance around than you might imagine (e.g. Lévy flights), the condition of iid samples is a tough one, nobody tells you how many samples are enough, and it applies only to the sum or average of the samples. The individual samples are still distributed according to the original distribution.
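
The classic counterexample is the standard Cauchy distribution, which has no finite mean or variance: the average of n Cauchy samples is itself standard Cauchy, so averaging does not concentrate at all. A quick sketch (my own, using the inverse-CDF trick $\tan(\pi(u - \tfrac12))$ to draw Cauchy samples) compares the spread of single samples against averages of 100:

```python
import math
import random

random.seed(2)

# Standard Cauchy samples via the inverse CDF: tan(pi*(u - 1/2)) for
# u uniform on (0, 1). The average of n such samples is again standard
# Cauchy, so its interquartile range stays at 2 (quartiles at -1 and +1)
# no matter how large n is - the CLT's finite-variance condition fails.
def cauchy():
    return math.tan(math.pi * (random.random() - 0.5))

def iqr(xs):
    s = sorted(xs)
    return s[3 * len(s) // 4] - s[len(s) // 4]

trials = 5_000
singles = [cauchy() for _ in range(trials)]
averages = [sum(cauchy() for _ in range(100)) / 100 for _ in range(trials)]

print(iqr(singles), iqr(averages))  # both near 2: averaging does not help
```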

The central limit theorem is a wonderful piece of mathematics, but it is too often oversimplified and misused.