# Bayesian evidence approximation

1. Dec 10, 2004

### Machiveli

I'm using the Laplace approximation (also known as MacKay's evidence framework) to estimate the posterior volume of a Bayesian model.
http://www.cis.hut.fi/Opinnot/T-61.182/slides/ch24_28.pdf

The standard procedure is as follows:

1) Find the (local) maximum of the posterior pdf, i.e. optimise the parameter values.
2) Evaluate the Hessian matrix (H) of the negative log posterior at that point, i.e. make a second-order Taylor series approximation to the logarithm of the posterior pdf.
3) If one were to fit a (multidimensional) Gaussian with mean given by the local maximum and covariance given by inverse(H), its normalising constant would be (2*pi)^(d/2)*sqrt(det(inverse(H))), where d is the number of parameters. This, times the posterior value at the maximum, is therefore the approximate value of the integral.
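The three steps above can be sketched numerically. This is only an illustration, not the exact recipe from the slides: the model, the `neg_log_posterior` function, and the finite-difference Hessian are my own hypothetical choices (NumPy and SciPy assumed). With a flat prior and Gaussian likelihood the log posterior is exactly quadratic, so here the Laplace answer is exact:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical model: linear regression with Gaussian noise and a flat prior,
# so the negative log posterior is quadratic in the weights w.
def neg_log_posterior(w, X, y, noise_var=1.0):
    resid = y - X @ w
    return 0.5 * np.sum(resid**2) / noise_var

def numerical_hessian(f, w, eps=1e-5):
    # Central-difference estimate of the Hessian of f at w.
    d = len(w)
    H = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            wpp = w.copy(); wpp[i] += eps; wpp[j] += eps
            wpm = w.copy(); wpm[i] += eps; wpm[j] -= eps
            wmp = w.copy(); wmp[i] -= eps; wmp[j] += eps
            wmm = w.copy(); wmm[i] -= eps; wmm[j] -= eps
            H[i, j] = (f(wpp) - f(wpm) - f(wmp) + f(wmm)) / (4 * eps**2)
    return H

def laplace_log_evidence(neg_log_post, w0):
    # Step 1: find the MAP point.
    w_map = minimize(neg_log_post, w0).x
    # Step 2: Hessian of the negative log posterior at the MAP point.
    H = numerical_hessian(neg_log_post, w_map)
    # Step 3: log of (peak value) * (2*pi)^(d/2) * sqrt(det(inverse(H))).
    d = len(w_map)
    sign, logdet = np.linalg.slogdet(H)
    assert sign > 0, "Hessian must be positive definite"
    return -neg_log_post(w_map) + 0.5 * d * np.log(2 * np.pi) - 0.5 * logdet

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
y = X @ np.array([1.0, -0.5]) + 0.1 * rng.normal(size=20)
f = lambda w: neg_log_posterior(w, X, y)
logZ = laplace_log_evidence(f, np.zeros(2))
```

Working in log space with `slogdet` rather than `det` avoids overflow/underflow when there are many parameters.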

Clearly this is only possible if the Hessian is non-singular and positive definite, i.e. we have found a local maximum rather than a saddle point. If it's a saddle point we just do ridge regression. However, in practical applications I've found that it fails to invert for one of two reasons:
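As a quick diagnostic before inverting, the positive-definiteness condition can be tested by attempting a Cholesky factorisation, which succeeds exactly when a symmetric matrix is positive definite. A minimal sketch (names hypothetical):

```python
import numpy as np

def is_positive_definite(H):
    # np.linalg.cholesky raises LinAlgError unless the symmetric
    # matrix H is positive definite.
    try:
        np.linalg.cholesky(H)
        return True
    except np.linalg.LinAlgError:
        return False

H_good = np.array([[2.0, 0.5], [0.5, 1.0]])     # local maximum: invertible
H_saddle = np.array([[1.0, 0.0], [0.0, -1.0]])  # saddle point: indefinite
H_singular = np.array([[1.0, 1.0], [1.0, 1.0]]) # collinear parameters: singular
```

The last two matrices correspond to the failure modes below: `H_saddle` has a negative-curvature direction, while `H_singular` has a zero-curvature direction.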

A) The maximum a posteriori parameter values lie on the boundary of the allowed parameter range.
B) Some parameters are redundant or collinear, so there is no curvature in the corresponding directions.

Now, are the following acceptable solutions?

In case A we only care about the integral within our boundary. We therefore pretend the posterior is symmetric about the boundary, approximate the pdf using the Gaussian as before, but this time take a definite integral over the allowed region only.
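In one dimension the mirror-about-the-boundary idea amounts to keeping half of the full Gaussian mass per active boundary constraint. A hypothetical 1-D sketch (this is exact only when the gradient along the constrained direction vanishes at the boundary, so the mirrored pdf really is symmetric there):

```python
import numpy as np
from scipy.integrate import quad

# The posterior is proportional to exp(-0.5*h*w^2), but only w >= 0 is
# allowed, so the peak sits exactly on the boundary w = 0. Mirroring the
# fitted Gaussian about the boundary and integrating over the allowed
# half-line keeps half of the full Gaussian volume.
def half_gaussian_volume(h):
    # Half of the full Gaussian normalising volume sqrt(2*pi/h).
    return 0.5 * np.sqrt(2.0 * np.pi / h)

h = 4.0  # curvature of the negative log posterior at the boundary
approx = half_gaussian_volume(h)

# Compare against direct numerical quadrature over the allowed region.
numeric, _ = quad(lambda w: np.exp(-0.5 * h * w**2), 0.0, np.inf)
```

With several simultaneously active boundary constraints the same argument would multiply in a factor of 1/2 per constraint.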

In case B we remove the dimensions corresponding to redundant parameters. (The first argument is that if the parameters are redundant they can take any value, so their integral is one. The second argument is that the evidence for a model should be independent of how it is written, i.e. y=w1*x should have identical evidence to y=(w1+w2)*x if flat priors are given to the parameters.)
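One way to implement that removal is via a pseudo-determinant: eigendecompose the Hessian, treat near-zero eigenvalues as the redundant (flat) directions contributing a factor of one, and keep only the rest. A sketch under the flat-prior assumption (all names and the tolerance are my own hypothetical choices), using the y=(w1+w2)*x example, whose Hessian is singular:

```python
import numpy as np

def laplace_log_volume_pseudo(H, tol=1e-8):
    # Eigenvalues of the symmetric Hessian; directions with ~zero curvature
    # are treated as redundant and dropped from the Gaussian volume.
    eigvals = np.linalg.eigvalsh(H)
    kept = eigvals[eigvals > tol]
    d_eff = len(kept)
    # log of (2*pi)^(d_eff/2) / sqrt(product of kept eigenvalues).
    return 0.5 * d_eff * np.log(2 * np.pi) - 0.5 * np.sum(np.log(kept))

# Hessian of 0.5*sum((y - (w1+w2)*x)^2): all four entries equal sum(x^2),
# so one eigenvalue is zero and the ordinary determinant vanishes.
x = np.linspace(-1.0, 1.0, 50)
H2 = np.sum(x**2) * np.array([[1.0, 1.0],
                              [1.0, 1.0]])
vol2 = laplace_log_volume_pseudo(H2)
```

Note that a plain `slogdet(H2)` would report a zero determinant here, whereas the pseudo-determinant version returns a finite log volume from the single well-constrained direction.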