Tractability of posterior distributions

  • Context: Graduate
  • Thread starter: pamparana
  • Tags: Distributions
SUMMARY

The discussion centers on the challenges of estimating posterior distributions, particularly when the prior is a multivariate Gaussian and the likelihood is expressed as a product of Gaussian likelihoods. Participants confirm that the posterior distribution P(θ|y) can indeed be Gaussian under these conditions, and that conditional and marginal distributions will also be Gaussian if the joint distribution is Gaussian. However, the complexity arises from the high dimensionality of θ and the difficulty in obtaining direct observations of θ from the data y, which complicates the optimization of the posterior parameters.

PREREQUISITES
  • Understanding of Bayesian statistics and posterior distributions
  • Familiarity with multivariate Gaussian distributions
  • Knowledge of likelihood functions and their properties
  • Experience with optimization techniques in high-dimensional spaces
NEXT STEPS
  • Study the properties of multivariate Gaussian distributions in depth
  • Explore Bayesian inference techniques for high-dimensional parameter spaces
  • Learn about optimization methods for estimating parameters of Gaussian distributions
  • Investigate the use of Markov Chain Monte Carlo (MCMC) methods for posterior estimation
USEFUL FOR

Statisticians, data scientists, and researchers involved in Bayesian modeling and inference, particularly those dealing with high-dimensional parameter estimation challenges.

pamparana
Hello,
I am trying to understand what makes estimating the posterior distribution such a hard problem.

So, imagine I need to estimate the posterior distribution over a set of parameters given the data y, i.e. the quantity P(\theta|y), where \theta is generally high-dimensional.

The prior over \theta is a multivariate Gaussian, i.e. P(\theta) = N(\theta; 0, \Sigma).

The likelihood, i.e. P(y|\theta), can be written as a product of Gaussian likelihoods.

Now, it seems to me that the posterior distribution will also be Gaussian. Is that correct?

Secondly, going through Bishop's book, it seems that the conditional posterior distributions and the marginal distributions will be Gaussian as well (assuming that the joint distribution over the parameters and data is Gaussian) and should have a closed form solution. If that is the case, why is this problem intractable?
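The partitioned-Gaussian formulas from Bishop's book (§2.3) are indeed closed-form: if a and b are jointly Gaussian, the marginal of a and the conditional of a given b are both Gaussian, with the conditional given by a Schur complement. A minimal sketch with NumPy (the 2+1 partition and the numbers are illustrative, not from the thread):

```python
import numpy as np

# Joint Gaussian over x = (a, b), with a 2-dimensional and b 1-dimensional
mu = np.array([0.0, 1.0, -1.0])
S = np.array([[2.0, 0.5, 0.3],
              [0.5, 1.0, 0.2],
              [0.3, 0.2, 1.5]])

mu_a, mu_b = mu[:2], mu[2:]
S_aa, S_ab = S[:2, :2], S[:2, 2:]
S_ba, S_bb = S[2:, :2], S[2:, 2:]

b_obs = np.array([0.5])  # observed value of b

# Conditional of a given b (Schur complement of S_bb in S)
cond_mean = mu_a + S_ab @ np.linalg.solve(S_bb, b_obs - mu_b)
cond_cov = S_aa - S_ab @ np.linalg.solve(S_bb, S_ba)

# Marginal of a is simply N(mu_a, S_aa): just drop the b block
print(cond_mean, cond_cov)
```

So the formulas themselves are cheap to write down; the practical difficulty appears when the blocks being inverted are large.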

If I need to find the parameters of this posterior distribution, can this not be set up as an optimisation problem where I estimate the mean and covariance of the posterior Gaussian? I am basically having trouble visualising why this problem is complicated.
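For the special case where the likelihood is Gaussian and linear in \theta (a model like y = H\theta + \varepsilon with \varepsilon ~ N(0, \sigma^2 I) — the linearity is an assumption beyond what the question states), no optimisation is needed at all: the posterior is Gaussian with a closed-form mean and covariance. A sketch, with H, \sigma^2, and the dimensions chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

d, n = 3, 50                      # parameter dimension, number of observations
H = rng.standard_normal((n, d))   # known forward/design matrix (assumed linear model)
Sigma = np.eye(d)                 # prior covariance: theta ~ N(0, Sigma)
sigma2 = 0.5                      # likelihood noise variance

theta_true = rng.standard_normal(d)
y = H @ theta_true + np.sqrt(sigma2) * rng.standard_normal(n)

# Conjugate update: posterior precision = prior precision + H^T H / sigma^2
post_prec = np.linalg.inv(Sigma) + H.T @ H / sigma2
post_cov = np.linalg.inv(post_prec)
post_mean = post_cov @ (H.T @ y / sigma2)   # prior mean is zero

print(post_mean)  # concentrates near theta_true as n grows
```

The catch is twofold: this closed form only exists because the likelihood is Gaussian *and linear* in \theta, and even then it requires forming and inverting a d-by-d matrix, an O(d^3) operation that becomes the bottleneck when \theta is high-dimensional.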
 
pamparana said:
If I need to find the parameters of this posterior distribution, can this not be set as an optimisation problem where I estimate the mean and covariance of the posterior Gaussian?

I don't understand how you will set up the problem. If we have a multivariate Gaussian, we can estimate its parameters from observations of the variates. If you have a Gaussian posterior distribution where the variables are \theta, are you assuming you have data that gives direct observations of \theta?
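The distinction the reply is drawing can be made concrete: if we could draw \theta directly, its mean and covariance would be estimated by simple sample averages, but in the question's setting we only ever see y ~ P(y|\theta), never \theta itself. A small sketch of the direct case (the dimensions, parameters, and sample size are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.3],
                  [0.3, 1.0]])

# If theta were directly observable, estimating the Gaussian would be trivial:
samples = rng.multivariate_normal(mu, Sigma, size=20_000)
mu_hat = samples.mean(axis=0)
Sigma_hat = np.cov(samples, rowvar=False)

# But with only indirect data y ~ P(y | theta), the posterior's mean and
# covariance must come from Bayes' rule, not from averaging samples of theta.
print(mu_hat, Sigma_hat)
```

This is why "just estimate the mean and covariance" is not a well-posed recipe on its own: the optimisation target has to be built from the model P(y|\theta)P(\theta), not from direct observations.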
 
