Understanding Bayesian Inference & Gaussian Distribution

  • Context: Graduate
  • Thread starter: sensitive
  • Tags: Bayesian
SUMMARY

This discussion focuses on Bayesian Inference and Gaussian Distribution, emphasizing the importance of understanding Bayesian formalism and the derivation of parameters such as mean (μ) and variance in Gaussian distributions. Participants highlight that Bayesian inference involves combining prior beliefs with actual observations to make predictions, specifically through the calculation of the posterior mean as a weighted average of prior beliefs and data averages. Recommended resources for further reading include books that delve into these concepts in detail.

PREREQUISITES
  • Understanding of Bayesian formalism
  • Knowledge of Gaussian distribution parameters (mean and variance)
  • Familiarity with posterior distribution concepts
  • Basic statistics and probability theory
NEXT STEPS
  • Explore "Bayesian Inference" techniques and methodologies
  • Study "Gaussian Distribution" and its applications in statistics
  • Learn about "Posterior Distribution" and its derivation
  • Read recommended literature on Bayesian statistics, focusing on practical examples
USEFUL FOR

Statisticians, data scientists, researchers, and anyone interested in advanced statistical methods and Bayesian analysis.

sensitive
I am reading a topic on Bayesian inference. I have read books by different authors, but they all present it the same way, and I cannot see how the terms are derived.

Could anyone briefly explain what is going on and what it is we are trying to find with this Bayesian approach? Bayesian inference combines belief with past data, so I am thinking that we are making a prediction, but I am not sure what sort of prediction. Is there an example of this?

I need help understanding 1) the Bayesian formalism, and
2) the mean of a Gaussian distribution: how the parameters mu and the variance are derived, and how the posterior distribution is derived as well.

By the way, could anyone suggest any recommended books for this topic? Thx
 
Kapur and Kesevan.
 
In practice, Bayesian inference comes down to weighting prior beliefs against actual observations: the posterior mean is a weighted average of the prior mean (e.g. [itex]\mu_0[/itex]) and the data average, with the weights determined by how confident you are in each.
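To make the weighted-average idea concrete, here is a minimal sketch of the conjugate Normal-Normal update for the mean of a Gaussian with known data variance. The variable names (mu0, sigma0_sq, sigma_sq) and the example numbers are my own illustration, not from the thread; the formula is the standard precision-weighted combination of prior mean and sample mean.

```python
def posterior_mean_var(mu0, sigma0_sq, sigma_sq, data):
    """Posterior over the mean of a Gaussian with known variance sigma_sq,
    given a Normal(mu0, sigma0_sq) prior on that mean.

    Precisions (inverse variances) add, and the posterior mean is the
    precision-weighted average of the prior mean and the sample mean.
    """
    n = len(data)
    xbar = sum(data) / n
    prec_post = 1.0 / sigma0_sq + n / sigma_sq          # posterior precision
    mu_post = (mu0 / sigma0_sq + n * xbar / sigma_sq) / prec_post
    return mu_post, 1.0 / prec_post

# Illustrative example: prior belief mu0 = 0 held weakly (sigma0_sq = 100),
# five observations around 5 with unit data variance.
data = [4.8, 5.1, 5.3, 4.9, 5.0]
mu_post, var_post = posterior_mean_var(0.0, 100.0, 1.0, data)
# With such a weak prior, mu_post sits almost exactly at the sample mean;
# a tighter prior (smaller sigma0_sq) would pull it back toward mu0.
```

As more data arrives, the data term n/sigma_sq dominates the prior term 1/sigma0_sq, so the posterior mean converges to the sample average, which is exactly the "weighting of prior beliefs vs. actual observations" described above.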
 
