The Magic of the Gaussian Function

Summary: The Gaussian function is widely recognized in fields such as statistics, physics, and the social sciences for its ability to describe real-world phenomena; its antiderivative is closely related to the error function erf(x). The Central Limit Theorem asserts that the sum of independent samples from any distribution with finite variance approaches a Gaussian distribution, which explains its broad applicability: many processes, including random walks and diffusion, can be modeled with Gaussian distributions. The discussion emphasizes the impact of the Gaussian function and the Central Limit Theorem on mathematical modeling of the physical world.
mezarashi (Homework Helper)
In academics, you hear so much about the Gaussian function, whether it be in statistics, physics, or even social sciences!

The Gaussian function takes the general form of:

f(x) = A e^{-\frac{(x - b)^2}{c^2}}

Further, its antiderivative (suitably normalized) is the famous error function, erf(x) = \frac{2}{\sqrt{\pi}} \int_0^x e^{-t^2}\, dt.
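The relationship between the Gaussian and erf can be checked numerically. The sketch below (stdlib Python only; `erf_numeric` is a name invented here) integrates the normalized Gaussian with the trapezoid rule and compares it against `math.erf`:

```python
import math

def erf_numeric(x, n=10_000):
    """Approximate erf(x) = (2/sqrt(pi)) * integral from 0 to x of e^{-t^2} dt,
    using the trapezoid rule with n subintervals."""
    h = x / n
    # Endpoint terms carry weight 1/2 in the trapezoid rule.
    total = 0.5 * (1.0 + math.exp(-x * x))
    for i in range(1, n):
        t = i * h
        total += math.exp(-t * t)
    return (2.0 / math.sqrt(math.pi)) * h * total

for x in (0.5, 1.0, 2.0):
    print(x, erf_numeric(x), math.erf(x))
```

The two columns agree to many decimal places, confirming that erf is (up to the 2/√π factor) the antiderivative of e^{-t²}.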

What I'd like to know is: what is the magic behind this equation? Why is it able to describe so many real-world phenomena? Can it be derived, or what was Mr. Gauss thinking when he came up with it?

Is there anything else I missed about the magic of this function?
 
The Gaussian probability distribution is closely related to the binomial distribution, such as one finds in the case of a random walk. For example, in one dimension with a fixed step size, the distribution of the endpoint is the usual binomial distribution, and in the limit of a very large number of steps the Gaussian is an excellent approximation to it. When the step size is not fixed, as in diffusion, the distribution is Gaussian.

Many physical processes behave like a random walk including diffusion, heat transfer and so on.
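The binomial-to-Gaussian connection described above is easy to see in simulation. This sketch (my own illustration, not from the thread) runs many symmetric ±1 random walks and checks that the endpoints spread out like a Gaussian with standard deviation √N:

```python
import random
import math

random.seed(0)
N_STEPS, N_WALKS = 400, 10_000

# Endpoint of a symmetric +/-1 random walk. Its exact distribution is a
# (shifted) binomial; for many steps it approaches N(0, sqrt(N_STEPS)).
endpoints = [sum(random.choice((-1, 1)) for _ in range(N_STEPS))
             for _ in range(N_WALKS)]

sigma = math.sqrt(N_STEPS)  # Gaussian prediction for the standard deviation
within_1sigma = sum(abs(x) <= sigma for x in endpoints) / N_WALKS
print(within_1sigma)  # close to the Gaussian one-sigma value of about 0.68
```

The fraction of walks ending within one predicted standard deviation comes out near the familiar Gaussian value of roughly 68%.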
 
Of course, there's also the central limit theorem, which says basically that the sum of variables drawn from a distribution (almost any distribution) will be Gaussian distributed as the number of variables drawn approaches infinity.
 
The Central Limit Theorem that SpaceTiger mentions is remarkable! In any application of mathematics, you have to make SOME assumptions about what kind of "mathematical model" applies. The Central Limit Theorem says that, in statistics, we really don't have to worry about that: the Gaussian distribution applies to just about everything!
If we have SOME probability distribution (the only requirement is that the mean, \mu, and standard deviation, \sigma, must be finite) and take n samples from that distribution, then for large n the sum of the samples is approximately Gaussian (normal) with mean n\mu and standard deviation \sigma\sqrt{n}, and the average of the samples is approximately Gaussian with mean \mu and standard deviation \frac{\sigma}{\sqrt{n}}.
The "normal approximation to the binomial distribution" that Tide mentions is a special but very important example of this, and the idea applies very generally. If a researcher is looking at people's weights, he can think of each person's weight as a sum of the weights of various parts of the body, and if those contributions behave like samples from a common distribution, then almost automatically people's weights must be, at least approximately, normally distributed.
Because just about everything can be thought of as the sum of many parts, it follows that almost everything must be, at least approximately, normally distributed!
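The claim about sample averages can be verified directly. The sketch below (stdlib only; the sample size and trial count are arbitrary choices) averages n uniform(0, 1) draws, whose distribution has mean 1/2 and standard deviation 1/√12, and checks the predicted \mu and \sigma/\sqrt{n}:

```python
import random
import statistics
import math

random.seed(1)
n, trials = 48, 20_000

# Each entry is the mean of n uniform(0,1) draws. By the CLT it should be
# approximately Gaussian with mean 1/2 and sd (1/sqrt(12))/sqrt(n).
means = [statistics.fmean(random.random() for _ in range(n))
         for _ in range(trials)]

predicted_sd = (1 / math.sqrt(12)) / math.sqrt(n)
print(statistics.fmean(means), statistics.stdev(means), predicted_sd)
```

The observed mean and standard deviation of the sample averages match the CLT predictions closely, even though the underlying uniform distribution looks nothing like a Gaussian.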
 
HallsofIvy said:
The Central Limit Theory says that, in statistics, we really don't have to worry about that- the Gaussian distribution applies to just about everything!

That is indeed a powerful statement from a theorem, and a powerful distribution that can cover it all! It makes me ever more amazed at the mathematics we have derived to model our physical world.

/me bows to Gauss another 100 times.
 
SpaceTiger said:
Of course, there's also the central limit theorem, which says basically that the sum of variables drawn from a distribution (almost any distribution) will be Gaussian distributed as the number of variables drawn approaches infinity.

Sadly, no. It says that when you repeatedly obtain independent samples from the same underlying distribution (i.i.d.), and if that underlying distribution has finite variance, then the suitably normalized sum (or average) of these samples approaches a Gaussian distribution in the limit.

There are more distributions with infinite variance around than you might imagine (e.g. Lévy flights), the condition of i.i.d. samples is a tough one, nobody tells you how many samples are enough, and the theorem applies only to the average of the sample. The individual samples are still distributed according to the original distribution.

The central limit theorem is a wonderful piece of mathematics, but too often it is oversimplified and misused.
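The finite-variance caveat is not academic. The standard Cauchy distribution has infinite variance, and the average of n Cauchy samples is itself standard Cauchy, so averaging never concentrates. This sketch (my own illustration; Cauchy draws via the inverse CDF) shows that the fraction of sample means with |mean| > 1 stays near P(|Cauchy| > 1) = 1/2 no matter how large n gets:

```python
import random
import math

random.seed(2)

def cauchy():
    """One standard Cauchy draw via the inverse CDF; its variance is infinite."""
    return math.tan(math.pi * (random.random() - 0.5))

trials = 2_000
results = {}
for n in (10, 100, 1_000):
    means = [sum(cauchy() for _ in range(n)) / n for _ in range(trials)]
    # For a finite-variance distribution this fraction would shrink toward 0
    # as n grows; for the Cauchy it stays near P(|Cauchy| > 1) = 0.5.
    results[n] = sum(abs(m) > 1 for m in means) / trials
    print(n, results[n])
```

Compare this with the uniform-distribution experiment above: there the sample means tighten up like 1/√n, while here increasing n buys nothing.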
 
