Consider the Gaussian Distribution

In summary, the Gaussian distribution is a probability distribution commonly used to describe the cumulative effect of many small independent errors. To determine its normalization constant, a standard textbook trick evaluates the product of two normalization integrals in different variables and converts to polar coordinates, which eliminates any need for the error function: the two exponentials ##e^{-x^2}## and ##e^{-y^2}## combine into the single exponential ##e^{-(x^2+y^2)}##, whose integral over the plane is elementary in polar coordinates.
  • #1
Destroxia

Homework Statement



Consider the Gaussian Distribution

## p(x) = Ae^{-\lambda(x-a)^{2}} ##,

where ## A ##, ##a##, and ##\lambda## are constants. (Look up any integrals you need.)

(a) Determine ##A##

(I only need help with this (a))

Homework Equations



##\int_{-\infty}^{\infty} p(x)dx = 1##

##\langle x \rangle = \int_{-\infty}^{\infty} xp(x)dx##

##\sigma^{2} \equiv \langle (\Delta x)^{2} \rangle = \langle x^{2} \rangle - \langle x \rangle^{2} ##

## \int e^{-ax^{2}}\,dx = \frac{1}{2}\sqrt{\frac {\pi} {a}}\,##erf##(\sqrt{a}\,x)##

## \int e^{ax^{2}}\,dx = -\frac{i}{2}\sqrt{\frac {\pi} {a}}\,##erf##(i\sqrt{a}\,x)## (Where do the ##-i## and the square root come from?)

The Attempt at a Solution



## p(x) = Ae^{-\lambda(x-a)^{2}} ##

## 1 = A \int_{-\infty}^{\infty} e^{-\lambda(x-a)^{2}}dx##

## u = x-a, du=dx, u: -\infty ## to ## \infty ##

## 1 = A \int_{-\infty}^{\infty} e^{-\lambda u^{2}}du ##

## 1 = A\left[\frac{1}{2}\sqrt{\frac {\pi} {\lambda}}\,\mathrm{erf}(\sqrt{\lambda}\,u)\right]_{-\infty}^{\infty} ##

(stuck here... how to evaluate this...?)

Okay, so... I have a few questions here...

1) I know that ## \int e^{ax^{2}}\,dx ## does not evaluate in terms of elementary functions, only in terms of an error function... but why do the ##-i## and the ##\sqrt{\frac {\pi} {a}}## come into play, being multiplied by erf?

2) What exactly is a Gaussian distribution, and what is its significance?

3) How do I evaluate the erf function from ##-\infty## to ##\infty##? The book's solution shows it disappearing after they evaluate from ##-\infty## to ##\infty##... ?

4) The answer for ##A## in the book ends up being ## A = \sqrt{\frac {\lambda} {\pi}} ##. I see how they got that, but where did the error function go in the evaluation? Do we just consider it not there? Shouldn't it be ## A = \sqrt{\frac {\lambda} {\pi}}\, ##erf##(x)##?
 
  • #2
There is a textbook trick for determining the normalisation constant ##A##. It involves evaluating the product of two normalisation integrals, one in the variable ##x## and one in the variable ##y##, and a change to polar coordinates. There is no need to bother with the error function at any point. Maybe try to play with this hint a bit?

The Gaussian distribution is ubiquitous in probability, because in some precise sense (look up "central limit theorem" and "convergence in distribution" or "weak convergence" in a good probability textbook), the sum of ##n## independent and identically distributed random variables is approximately distributed according to a Gaussian, when ##n## is large. This explains, for example, why in the sciences certain deviations from a mean are often supposed to follow a Gaussian distribution, because they are assumed to be the cumulative effect of many (small) independent errors.
 
  • #3
RyanTAsher said:

Consider the Gaussian Distribution ## p(x) = Ae^{-\lambda(x-a)^{2}} ##, where ## A ##, ##a##, and ##\lambda## are constants. (a) Determine ##A## ... Shouldn't it be ## A = \sqrt{\frac {\lambda} {\pi}}\, ##erf##(x)##?

What is preventing you from looking up 'Gaussian integral' in Google? All your questions will be answered!
 
  • #4
Krylov said:
There is a textbook trick for determining the normalisation constant ##A##. It involves evaluating the product of two normalisation integrals, one in the variable ##x## and one in the variable ##y##, and a change to polar coordinates. There is no need to bother with the error function at any point. Maybe try to play with this hint a bit?

Ray Vickson said:
What is preventing you from looking up 'Gaussian integral' in Google? All your questions will be answered!

I did search this process, but I didn't really understand the normalization with two variables or the conversion to polar coordinates.

I wanted to see if I could evaluate it in terms of the erf function, but it all appears to cancel out. So, my follow-up question after searching the internet is: what exactly makes this normalization process equivalent? I don't understand how the exponent goes from ##-x^2## to ##-(x^2+y^2)##... I understand how ##e^{-x^2} \cdot e^{-y^2}## gets us there, but I don't understand why, or how, we decide to put these two functions together.
 
  • #5
RyanTAsher said:
I don't understand how you can get the exponential from ##-x^2## to ##-(x^2+y^2)##... I understand how ##e^{-x^2} \cdot e^{-y^2}## gets us there, but not why, or how, we decide to put these two functions together.

Just switch from rectangular coordinates ##(x,y)## to polar coordinates ##(r, \theta)##. That trick works because you are integrating over the whole 2-dimensional plane, so you don't run into any difficult "boundary" issues. It would not work for a finite integration such as ##I = \int_{x=a}^b e^{-x^2} \, dx##, because now ##I^2## would be a two-dimensional integration over a finite square in the plane, and switching to polar coordinates would give horrible boundary effects that are almost impossible to deal with.
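
For reference, the change of variables can be written out explicitly. With ##I \equiv \int_{-\infty}^{\infty} e^{-\lambda u^{2}}\,du## (so that ##A = 1/I##),

## I^{2} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{-\lambda (x^{2}+y^{2})}\,dx\,dy = \int_{0}^{2\pi}\int_{0}^{\infty} e^{-\lambda r^{2}}\,r\,dr\,d\theta = 2\pi \left[-\frac{e^{-\lambda r^{2}}}{2\lambda}\right]_{0}^{\infty} = \frac{\pi}{\lambda}, ##

so ##I = \sqrt{\pi/\lambda}## and ##A = \sqrt{\lambda/\pi}##, with no error function in sight. The factor ##r## from the polar Jacobian (##dx\,dy = r\,dr\,d\theta##) is exactly what makes the ##r##-integral elementary.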

The article http://www.math.uconn.edu/~kconrad/blurbs/analysis/gaussianintegral.pdf gives ten different proofs of the result, ranging from rather elementary to quite advanced.
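
A quick numerical sanity check of the result is also easy; this is a minimal sketch, with the parameter values ##\lambda = 2.5## and ##a = 1.0## chosen arbitrarily for illustration:

```python
import math

# Spot-check that A = sqrt(lambda/pi) normalizes p(x) = A*exp(-lam*(x-a)^2).
# lam and a are arbitrary illustrative values.
lam, a = 2.5, 1.0
A = math.sqrt(lam / math.pi)

# Midpoint rule on [a-10, a+10]; the tails beyond that range are negligible.
n = 200_000
lo, hi = a - 10.0, a + 10.0
h = (hi - lo) / n
total = h * sum(A * math.exp(-lam * (lo + (k + 0.5) * h - a) ** 2) for k in range(n))
print(total)  # close to 1
```

The printed value agrees with 1 to many decimal places, confirming the normalisation without ever touching erf.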
 

What is the Gaussian Distribution?

The Gaussian Distribution, also known as the normal distribution, is a probability distribution commonly used to model a continuous variable. It is characterized by a bell-shaped curve, with nearly all of the data (about 99.7%) falling within 3 standard deviations of the mean.

What are the key features of the Gaussian Distribution?

The key features of the Gaussian Distribution include the mean, which represents the average or central tendency of the data, and the standard deviation, which measures the spread or variability of the data. The distribution is symmetrical and follows the 68-95-99.7 rule, where 68% of the data falls within 1 standard deviation of the mean, 95% within 2 standard deviations, and 99.7% within 3 standard deviations.
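
The 68-95-99.7 numbers can be computed directly from the error function, since for a Gaussian ##P(|X - \mu| < k\sigma) = \operatorname{erf}(k/\sqrt{2})##. A small sketch using only Python's standard library:

```python
import math

# For a normal random variable, P(|X - mu| < k*sigma) = erf(k / sqrt(2)).
for k in (1, 2, 3):
    p = math.erf(k / math.sqrt(2))
    print(f"within {k} standard deviation(s): {100 * p:.2f}%")
```

This prints approximately 68.27%, 95.45%, and 99.73%, matching the rule.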

How is the Gaussian Distribution related to the Central Limit Theorem?

The Central Limit Theorem states that the sampling distribution of the mean of independent, identically distributed random variables is approximately normal, regardless of the underlying distribution. This means that as the sample size increases, the distribution of sample means approaches a Gaussian Distribution.

What are some real-world applications of the Gaussian Distribution?

The Gaussian Distribution is commonly used in various fields such as statistics, physics, and finance. It is used to model natural phenomena such as the height and weight of individuals, as well as in engineering to model noise and measurement errors. In finance, it is used to model stock prices and returns. It is also used in machine learning and data science for data analysis and prediction.

How is the Gaussian Distribution different from other probability distributions?

The Gaussian Distribution is unique in that it is the most commonly occurring distribution in nature and can be used to approximate many other distributions. It is symmetric about its mean, unlike skewed distributions such as the Poisson or Chi-Square, and it is fully determined by its mean and standard deviation. Additionally, it is continuous and can take on any value between negative infinity and positive infinity.
