# Is there an analytical solution to the integral of a PDF?

## Main Question or Discussion Point

Hey all,

I have an engineering background with a pretty terrible grounding in statistics and just about all maths that I can't immediately visualize as some kind of dynamic system, so forgive me if this is an obvious question!

I am working through a set of derivations for a conditional probability problem and am a little stuck on a rigorous proof of the following...

$$\int_{\Re^{n}}\exp(-\frac{1}{2}A B A^T)dA = \frac{1}{|2\pi B|^{1/2}}$$

I can arrive at this solution by comparing the integral with that of the multivariate normal PDF, which by definition integrates to one:

$$\int_{\Re^{n}}\frac{1}{|2\pi \Sigma|^{1/2}}\exp(-\frac{1}{2}X \Sigma^{-1} X^T)dX = 1$$

(Substitute $$X = A$$ and $$\Sigma = B^{-1}$$, then multiply both sides by the normalizing constant outside the exponential.)
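As a quick sanity check of that normalization (my own illustration, not part of the original derivation), one can integrate a bivariate normal PDF numerically and confirm the result is one; the covariance matrix below is an arbitrary example:

```python
# Sanity check (my own illustration): the bivariate normal PDF integrates to ~1.
# Sigma below is an arbitrary covariance matrix, not from the derivation above.
import numpy as np

Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
Sigma_inv = np.linalg.inv(Sigma)
norm_const = 1.0 / np.sqrt(np.linalg.det(2 * np.pi * Sigma))

xs = np.linspace(-10, 10, 801)            # wide enough that the tails are negligible
dx = xs[1] - xs[0]
X, Y = np.meshgrid(xs, xs)
pts = np.stack([X, Y], axis=-1)
quad_form = np.einsum('...i,ij,...j->...', pts, Sigma_inv, pts)
pdf = norm_const * np.exp(-0.5 * quad_form)

total = pdf.sum() * dx * dx               # simple Riemann sum
print(total)                              # ~ 1.0
```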

Am I right in thinking that there is no analytical solution of this integral? It feels wrong to simply state it without proof, but if there is no solution how else can I present it?

Owen

EnumaElish
Homework Helper
Did you mean "integral" = |2 pi B|^(1/2), instead of |2 pi B|^(-1/2)?

I believe it should be ^(-1/2) as I'm substituting sigma with B^-1, and I'm fairly sure that 1/det(B^-1) = det(B) (although I can't remember the proof of this!). So the fraction inside the integral gets flipped on the LHS, making ^(-1/2) correct.
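The half-remembered identity does hold: since $$\det(XY) = \det(X)\det(Y)$$ and $$B B^{-1} = I$$, we get $$\det(B)\det(B^{-1}) = 1$$, i.e. $$\det(B^{-1}) = 1/\det(B)$$. A quick numerical illustration of my own (the matrix is an arbitrary example):

```python
# Quick check (my own illustration) of the identity det(B^{-1}) = 1/det(B),
# which is what flips the fraction. B is an arbitrary invertible example.
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
B = M @ M.T + 4 * np.eye(4)               # symmetric positive definite => invertible

lhs = np.linalg.det(np.linalg.inv(B))
rhs = 1.0 / np.linalg.det(B)
print(np.isclose(lhs, rhs))               # True
```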

I'm pretty happy with the first equation, its the fact that I've used the 2nd equation to construct it and I'd like to prove the second equation, if this is even possible!

The Gaussian integral: $$\int_{-\infty}^{\infty} e ^{-x^2}dx=\sqrt{\pi}$$

An analytic solution of this is given here.
http://en.wikipedia.org/wiki/Gaussian_integral

When normalized, it gives the core function for the Gaussian PDF where $$(x-\mu)^2/2\sigma^2$$ replaces $$x^2$$.
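As a numerical cross-check (my own illustration, not from the thread), quadrature over the real line reproduces $$\sqrt{\pi}$$:

```python
# Cross-check (my own illustration): numerically integrate exp(-x^2) over the
# real line and compare with sqrt(pi).
import math
from scipy.integrate import quad

value, _ = quad(lambda x: math.exp(-x * x), -math.inf, math.inf)
print(value, math.sqrt(math.pi))          # both ~ 1.7724538509
```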

EnumaElish
Homework Helper

> I believe it should be ^(-1/2) as I'm substituting sigma with B^-1, and I'm fairly sure that 1/det(B^-1) = det(B) (although I can't remember the proof of this!). So the fraction inside the integral gets flipped on the LHS, making ^(-1/2) correct.
Okay, but then "2 pi" should be a divisor, not a multiplier of B.

Cheers for the replies.

EnumaElish, you are correct. I wrote this post from memory without referring to my notes. The 2pi is indeed a divisor!

SW VandeCarr, that's exactly what I was after. Thanks for the nudge in the right direction!

You're welcome. This function and the closely related error function (erf) are about the most interesting in probability theory. They are non-elementary special functions.

Sorry. The Gaussian integral is: $$\int_{-\infty}^{\infty} e ^{-x^2}dx=\sqrt{\pi}$$.

The Gaussian integral can be evaluated by first squaring (to get a 2d integral) then changing to polar coordinates. To work out the nd version you'll need to convert to coordinates such that e.g. y'y = x'Bx (this assumes B is non-singular).
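Writing out the squaring-and-polar-coordinates steps sketched above (with $$I$$ denoting the integral):

$$I^2 = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{-(x^2+y^2)}\,dx\,dy = \int_{0}^{2\pi}\int_{0}^{\infty} e^{-r^2}\,r\,dr\,d\theta = 2\pi \cdot \tfrac{1}{2} = \pi,$$

so $$I = \sqrt{\pi}$$.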

> The Gaussian integral can be evaluated by first squaring (to get a 2d integral) then changing to polar coordinates. To work out the nd version you'll need to convert to coordinates such that e.g. y'y = x'Bx (this assumes B is non-singular).
Just want to second this point. If B can't be decomposed by Cholesky factorization, then the answer will not be so simple. Thankfully every positive-definite covariance matrix admits one. So that is step 1. Then once you've converted to independent y's, it's simply a product of Gaussian integrals; that is step 2.
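To illustrate the two steps numerically (an arbitrary 2×2 example of my own, not from the thread): brute-force quadrature of $$\int \exp(-\tfrac{1}{2}x^T B x)\,dx$$ matches the closed form $$(2\pi)^{n/2}/\det(B)^{1/2}$$ that the Cholesky change of variables produces:

```python
# Numerical illustration (my own, not from the thread) of both steps:
#   step 1: Cholesky, B = L L^T, so y = L^T x gives y^T y = x^T B x;
#   step 2: the integral collapses to (2*pi)^(n/2) / sqrt(det(B)).
# B below is an arbitrary symmetric positive definite example.
import numpy as np

B = np.array([[3.0, 1.0],
              [1.0, 2.0]])
L = np.linalg.cholesky(B)                 # exists because B is positive definite

# Step 1: the change of variables really does diagonalize the quadratic form
x = np.array([1.0, -2.0])
y = L.T @ x
print(np.isclose(x @ B @ x, y @ y))       # True

# Step 2: brute-force 2D quadrature vs the closed form
n = B.shape[0]
closed_form = (2 * np.pi) ** (n / 2) / np.sqrt(np.linalg.det(B))

xs = np.linspace(-10, 10, 801)
dx = xs[1] - xs[0]
X1, X2 = np.meshgrid(xs, xs)
pts = np.stack([X1, X2], axis=-1)
quad_form = np.einsum('...i,ij,...j->...', pts, B, pts)
numeric = np.exp(-0.5 * quad_form).sum() * dx * dx

print(numeric, closed_form)               # both ~ 2.80993
```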