# Consider the Gaussian Distribution...?

1. Sep 30, 2015

### RyanTAsher

1. The problem statement, all variables and given/known data

Consider the Gaussian Distribution

$p(x) = Ae^{-\lambda(x-a)^{2}}$,

where $A$, $a$, and $\lambda$ are constants. (Look up any integrals you need.)

(a) Determine $A$

(I only need help with this (a))

2. Relevant equations

$\int_{-\infty}^{\infty} p(x)dx = 1$

$\langle x \rangle = \int_{-\infty}^{\infty} xp(x)dx$

$\sigma^{2} \equiv \langle (\Delta x)^{2} \rangle = \langle x^{2} \rangle - \langle x \rangle^{2}$

$\int e^{-ax^{2}}\,dx = \frac{1}{2}\sqrt{\frac {\pi} {a}}\,$erf$(\sqrt{a}\,x)$

$\int e^{ax^{2}}\,dx = -\frac{i}{2}\sqrt{\frac {\pi} {a}}\,$erf$(i\sqrt{a}\,x)$ (Where do the $-i$ and the square root come from?)

3. The attempt at a solution

$p(x) = Ae^{-\lambda(x-a)^{2}}$

$1 = A \int_{-\infty}^{\infty} e^{-\lambda(x-a)^{2}}dx$

$u = x-a, du=dx, u: -\infty$ to $\infty$

$1 = A \int_{-\infty}^{\infty} e^{-\lambda u^{2}}du$

$1 = A\left[\frac{1}{2}\sqrt{\frac {\pi} {\lambda}}\,\text{erf}(\sqrt{\lambda}\,u)\right]_{-\infty}^{\infty}$

(stuck here... how to evaluate this...?)

Okay, so... I have a few questions here...

1) I know that $\int e^{ax^{2}}\,dx$ does not evaluate in terms of elementary functions, only in terms of the error function... but why do the $-i$ and the $\sqrt{\frac {\pi} {a}}$ come into play, multiplying erf?

2) What exactly is a Gaussian distribution, and what is its significance?

3) How do I evaluate the erf function at the limits $-\infty$ to $\infty$? The book's solution shows it disappearing after they evaluate from $-\infty$ to $\infty$... ?

4) The book's answer ends up being $A = \sqrt{\frac {\lambda} {\pi}}$, and I see how they got that, but where did the error function go in the evaluation? Do we just consider it not there? Shouldn't it be $A = \sqrt{\frac {\lambda} {\pi}}\,$erf$(x)$?
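
As a quick numerical sanity check on question 4 (a minimal Python sketch; the values of $\lambda$ and $a$ below are arbitrary choices, not part of the problem), the book's $A = \sqrt{\lambda/\pi}$ does normalise $p(x)$, and erf saturates at $\pm 1$ at the infinite limits, which is why it drops out:

```python
import math

# Arbitrary illustrative parameters; any lam > 0 and real a will do.
lam, a = 2.0, 1.5
A = math.sqrt(lam / math.pi)  # the book's normalisation constant

# Midpoint-rule approximation of the integral of p(x) = A exp(-lam (x - a)^2)
# over a range wide enough that the truncated tails are negligible.
n, lo, hi = 100_000, a - 15.0, a + 15.0
dx = (hi - lo) / n
total = sum(A * math.exp(-lam * (lo + (i + 0.5) * dx - a) ** 2)
            for i in range(n)) * dx
print(round(total, 6))  # -> 1.0

# erf approaches +/-1 at large arguments, so the bracket in the attempt
# above evaluates to a plain number, with no erf left over:
print(math.erf(10.0), math.erf(-10.0))  # -> 1.0 -1.0
```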

2. Sep 30, 2015

### Krylov

There is a textbook trick for determining the normalisation constant $A$. It involves evaluating the product of two normalisation integrals, one in the variable $x$ and one in the variable $y$, and a change to polar coordinates. There is no need to bother with the error function at any point. Maybe try to play with this hint a bit?
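
For reference, the trick sketched above works out as follows. Writing $I = \int_{-\infty}^{\infty} e^{-\lambda x^{2}}\,dx$,

$I^{2} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{-\lambda (x^{2}+y^{2})}\,dx\,dy = \int_{0}^{2\pi}\int_{0}^{\infty} e^{-\lambda r^{2}}\,r\,dr\,d\theta = 2\pi \cdot \frac{1}{2\lambda} = \frac{\pi}{\lambda}$,

so $I = \sqrt{\pi/\lambda}$, and normalisation gives $A = 1/I = \sqrt{\lambda/\pi}$.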

The Gaussian distribution is ubiquitous in probability, because in some precise sense (look up "central limit theorem" and "convergence in distribution" or "weak convergence" in a good probability textbook), the sum of $n$ independent and identically distributed random variables is approximately distributed according to a Gaussian, when $n$ is large. This explains, for example, why in the sciences certain deviations from a mean are often supposed to follow a Gaussian distribution, because they are assumed to be the cumulative effect of many (small) independent errors.

3. Sep 30, 2015

### Ray Vickson

What is preventing you from looking up 'Gaussian integral' in Google? All your questions will be answered!

4. Sep 30, 2015

### RyanTAsher

I did search for this process, but I didn't really understand the normalization with the two variables and the change to polar coordinates.

I wanted to see if I could possibly evaluate it in terms of the erf function, but it appears to all cancel out. So, my follow-up question after searching the internet is: what exactly makes this normalization process equivalent? I just don't understand how you can get the exponent from $-x^2$ to $-(x^2+y^2)$... I understand how $e^{-x^2} e^{-y^2}$ gets us there, but I don't understand why, or how, we decide to put these two functions together.

5. Sep 30, 2015

### Ray Vickson

Just switch from rectangular coordinates $(x,y)$ to polar coordinates $(r, \theta)$. That trick works because you are integrating over the whole 2-dimensional plane, so you don't run into any difficult "boundary" issues. It would not work for a finite integration such as $I = \int_{x=a}^b e^{-x^2} \, dx$, because now $I^2$ would be a two-dimensional integration over a finite square in the plane, and switching to polar coordinates would give horrible boundary effects that are almost impossible to deal with.
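
The factorisation that puzzled you can also be checked numerically (a small Python sketch; the grid size and range are arbitrary choices). Because $e^{-(x^2+y^2)} = e^{-x^2} e^{-y^2}$, the two-dimensional sum over a product grid is exactly the square of the one-dimensional sum, and both converge to $\pi$:

```python
import math

# Midpoint-rule grid; range and resolution are illustrative choices wide
# and fine enough that truncation and discretisation errors are tiny.
n, lo, hi = 400, -8.0, 8.0
dx = (hi - lo) / n
xs = [lo + (i + 0.5) * dx for i in range(n)]

# One-dimensional integral: approximates I = integral of e^{-x^2} dx.
one_d = sum(math.exp(-x * x) for x in xs) * dx

# Two-dimensional integral of e^{-(x^2+y^2)} over the same square:
# because the integrand separates, this equals one_d squared.
two_d = sum(math.exp(-(x * x + y * y)) for x in xs for y in xs) * dx * dx

print(round(one_d ** 2, 6), round(two_d, 6), round(math.pi, 6))
# -> 3.141593 3.141593 3.141593
```

This is why the trick multiplies two copies of the same integral: nothing new is introduced, $I^2$ is just rewritten as a plane integral whose polar-coordinate form is elementary.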

The article http://www.math.uconn.edu/~kconrad/blurbs/analysis/gaussianintegral.pdf gives ten different proofs of the result, ranging from rather elementary to quite advanced.

Last edited: Sep 30, 2015