Solving the Gaussian Integral for the Variance of the Gaussian Distribution

jaobyccdee
How do I show the variance of the Gaussian distribution using the probability density function? I don't know how to solve ∫ r^2 exp(-r^2/(2c^2)) dr.
 
Use integration by parts and a substitution. It's closely related to the integral of exp(-r^2).
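
A minimal sketch of that substitution, assuming the density 1/sqrt(2 Pi c^2) * exp(-r^2/(2c^2)) given later in the thread: setting x = r/(c sqrt(2)) turns the variance integral into the standard Gaussian form.

\[
\frac{1}{\sqrt{2\pi c^2}} \int_{-\infty}^{\infty} r^2 e^{-r^2/(2c^2)}\,dr
\;\overset{x = r/(c\sqrt{2})}{=}\;
\frac{1}{c\sqrt{2\pi}} \int_{-\infty}^{\infty} \left(2c^2 x^2\right) e^{-x^2}\, c\sqrt{2}\,dx
= \frac{2c^2}{\sqrt{\pi}} \int_{-\infty}^{\infty} x^2 e^{-x^2}\,dx.
\]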
 
I tried it. The probability density is 1/sqrt(2 Pi c^2) * exp(-r^2/(2c^2)). When I integrate it from -infinity to infinity, the exp(-r^2/(2c^2)) factor makes everything 0. But we are trying to prove that the variance is equal to c^2.
 
jaobyccdee said:
I tried it. The probability density is 1/sqrt(2 Pi c^2) * exp(-r^2/(2c^2)). When I integrate it from -infinity to infinity, the exp(-r^2/(2c^2)) factor makes everything 0. But we are trying to prove that the variance is equal to c^2.

Absolutely not: the integral of exp(-x^2) for x going from -infinity to +infinity is a finite, positive value (it is the area under the curve y = exp(-x^2)); furthermore, this integral can be found in many books and web pages. I will let you find it.
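
(For completeness, a sketch of the standard polar-coordinates computation of that value:

\[
\left( \int_{-\infty}^{\infty} e^{-x^2}\,dx \right)^{2}
= \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} e^{-(x^2+y^2)}\,dx\,dy
= \int_{0}^{2\pi}\!\int_{0}^{\infty} e^{-\rho^2}\,\rho\,d\rho\,d\theta
= \pi,
\]

so the integral itself equals sqrt(pi).)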

Anyway, you need to find an integral of the form \int_{-\infty}^{\infty} x^2 e^{-x^2}\,dx, which is obtained from yours by an appropriate change of variables. Integrate by parts, setting u = x and dv = x e^{-x^2} dx.
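
Carrying that integration by parts through, with v = -e^{-x^2}/2, is a short step (the boundary term vanishes because x e^{-x^2} -> 0 as |x| -> infinity):

\[
\int_{-\infty}^{\infty} x^2 e^{-x^2}\,dx
= \left[-\frac{x\,e^{-x^2}}{2}\right]_{-\infty}^{\infty}
+ \frac{1}{2}\int_{-\infty}^{\infty} e^{-x^2}\,dx
= \frac{\sqrt{\pi}}{2},
\]

and combined with the substitution sketched earlier in the thread, the variance is (2c^2/\sqrt{\pi}) * (\sqrt{\pi}/2) = c^2.

A quick numerical sanity check of that result (a sketch using scipy; c = 1.5 is an arbitrary test value, not from the thread):

import numpy as np
from scipy.integrate import quad

c = 1.5
# density from the thread: 1/sqrt(2*pi*c^2) * exp(-r^2/(2*c^2))
density = lambda r: np.exp(-r**2 / (2 * c**2)) / np.sqrt(2 * np.pi * c**2)
# variance = integral of r^2 * density(r) over the whole real line
variance, _ = quad(lambda r: r**2 * density(r), -np.inf, np.inf)
print(variance, c**2)  # both print 2.25 (up to numerical error)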

RGV
 
Thanks! :)
 