How to incorporate evidence into parameters of a Bayesian network?


Discussion Overview

The discussion concerns how to incorporate observed evidence into the parameters of a Bayesian network, in particular how to update prior distributions as evidence accumulates. Participants explore inference methods and the implications of using posterior distributions as priors for subsequent observations.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant questions whether the posterior distribution after observing evidence can be used as the prior for the next observation, suggesting this aligns with Bayesian updating but expressing concern over potential complications.
  • Another participant mentions the necessity of creating conjugate priors during updates and notes that starting with uninformative priors complicates the process.
  • A participant expresses hesitation regarding the use of conjugacy, indicating their work may involve arbitrary distributions where conjugacy might not apply.
  • Adaptive inference in Bayesian networks is identified as a relevant term, with one participant sharing resources related to this concept.
  • A paper on adaptive Bayesian inference and a sum-product updating algorithm is referenced, although the participant admits to lacking experience in that specific area.

Areas of Agreement / Disagreement

Participants express varying levels of confidence regarding the use of posterior distributions as priors, and there is no consensus on the best approach for continuously updating parameters in Bayesian networks. The discussion remains unresolved with multiple competing views on the implications of conjugacy and adaptive inference.

Contextual Notes

Participants highlight potential limitations related to the use of conjugate priors and the challenges posed by arbitrary distributions, but these issues remain unresolved within the discussion.

sarikan
Greetings,
Maybe I'm getting a little bit confused, but I'm looking for resources which explain how to update parameters of a Bayesian network as a result of observations.

There are various inference methods, but unless I'm missing something here, these methods produce a posterior distribution based on evidence (a set of observations of some nodes).

This posterior distribution is specific to the evidence, but there must be a way of incorporating it into the network, so that the prior distributions for subsequent observations are modified.

Can I simply use the posterior distribution after observing evidence E as the prior for the next observation? This should correspond to a Bayesian update of the network, but I fear that there may be a catch here. Would this be the right way of continuously updating the parameters of the network as evidence is observed?
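For what it's worth, the equivalence being asked about can be sketched with a toy example (a Beta-Bernoulli model for a single parameter; nothing here is specific to any particular network or library): in the conjugate case, feeding the posterior back in as the prior for the next observation gives exactly the same result as one batch update on all the evidence.

```python
def update_beta(alpha, beta, observation):
    """One conjugate Bayesian update of a Beta(alpha, beta) prior on a
    Bernoulli parameter, given a single 0/1 observation."""
    if observation == 1:
        return alpha + 1.0, beta
    return alpha, beta + 1.0

# Start from a uniform Beta(1, 1) prior.
alpha, beta = 1.0, 1.0
evidence = [1, 0, 1, 1, 0, 1]

# Sequential updating: feed each posterior back in as the next prior.
for obs in evidence:
    alpha, beta = update_beta(alpha, beta, obs)

# Batch updating: one update using all the evidence at once.
alpha_batch = 1.0 + sum(evidence)
beta_batch = 1.0 + len(evidence) - sum(evidence)

# The two routes yield the same posterior.
assert (alpha, beta) == (alpha_batch, beta_batch)
```

The catch the poster worries about shows up when the data are not i.i.d. given the parameters, or when the posterior leaves the family the network's software can represent; within a conjugate family the sequential scheme is exact.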

Regards
 
sarikan said:
Greetings,


Can I simply use the posterior distribution after observing evidence E as the prior for the next observation? This should correspond to a Bayesian update of the network, but I fear that there may be a catch here. Would this be the right way of continuously updating the parameters of the network as evidence is observed?
Regards

This paper may be helpful. You generally need to use conjugate priors as you update. If you are starting with an uninformative prior, or with hyperparameters (of prior distributions), the process is more complicated.

http://cran.r-project.org/web/packages/LaplacesDemon/vignettes/BayesianInference.pdf
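To illustrate the conjugate updating mentioned above in the Bayesian-network setting (a minimal sketch; the category names are invented for the example): each row of a conditional probability table has a multinomial likelihood, whose conjugate prior is a Dirichlet, so a complete observation simply increments one pseudo-count.

```python
# Minimal sketch: Dirichlet (conjugate) updating of one CPT row in a
# discrete Bayesian network.  A Dirichlet(1, 1, 1) prior plays the
# "uninformative" role here; the categories are made up for the example.
counts = {"low": 1.0, "medium": 1.0, "high": 1.0}

def observe(counts, value):
    """Bayesian update: observing `value` for this node (under a fixed
    parent configuration) adds one to its Dirichlet pseudo-count."""
    counts[value] += 1.0

def predictive(counts):
    """Posterior predictive distribution for the next observation."""
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

for case in ["high", "high", "low", "high"]:
    observe(counts, case)

posterior = predictive(counts)  # "high" is now the most probable category
```

Because the posterior is again a Dirichlet, it can serve directly as the prior for the next observation, which is exactly the sequential scheme asked about in the opening post.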
 
Thanks. It appears to be a good overview paper; I'll read it in detail. Conjugacy is useful, though I approach it with some hesitation, since my work may end up involving arbitrary distributions for which conjugacy may not be possible.
I think I've found the right term for what I'm looking for, by the way: adaptive inference in Bayesian networks. For anyone else who may look for something similar in the future...

Regards
 
sarikan said:
Thanks. It appears to be a good overview paper; I'll read it in detail. Conjugacy is useful, though I approach it with some hesitation, since my work may end up involving arbitrary distributions for which conjugacy may not be possible.
I think I've found the right term for what I'm looking for, by the way: adaptive inference in Bayesian networks. For anyone else who may look for something similar in the future...

Regards

Here's a paper on adaptive Bayesian inference based on tree structured networks. It describes the sum-product updating algorithm for re-evaluating marginal distributions and other quantities and proposes a faster alternative. I can't say I have any experience in this particular area of Bayesian applications, but it looks interesting.

http://people.cs.uchicago.edu/~osumer/nips07.pdf
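To make the sum-product idea concrete (a toy two-node sketch with invented numbers, not the tree-structured algorithm from the paper): a marginal is computed by summing products of local factors, and adaptive-inference schemes aim to reuse these partial sums when the network or evidence changes, rather than recomputing everything from scratch.

```python
# Toy sum-product computation on a two-node chain A -> B.
p_a = {"t": 0.6, "f": 0.4}        # P(A)
p_b_given_a = {                   # P(B | A)
    "t": {"t": 0.9, "f": 0.1},
    "f": {"t": 0.2, "f": 0.8},
}

# Message from A to B: marginalize A out of the product of factors,
# i.e. P(b) = sum_a P(a) * P(b | a).
p_b = {b: sum(p_a[a] * p_b_given_a[a][b] for a in p_a) for b in ("t", "f")}
# p_b["t"] is approximately 0.62, p_b["f"] approximately 0.38
```

In a larger tree, each edge carries one such message in each direction, and updating a single CPT only invalidates the messages on the path from that node to wherever a marginal is queried.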
 
