Probability calculation with Bayesian Networks

SUMMARY

This discussion focuses on calculating the probability $P(S \mid W)$ in a Bayesian network, using the given probability tables for the events C, R, S, and W. The calculation applies the chain-rule factorization of the joint distribution and sums over the values of the unobserved variables C and R. With the network's conditional-independence assumptions, the joint probability $P(S=1, W=1)$ comes out to 0.2781. The discussion emphasizes the importance of understanding conditional probabilities and independence in Bayesian analysis.

PREREQUISITES
  • Understanding of Bayesian Networks and their components
  • Familiarity with conditional probability and independence
  • Knowledge of the chain rule in probability theory
  • Experience with graphical models and their applications
NEXT STEPS
  • Study the application of the chain rule in Bayesian Networks
  • Learn about the independence assumptions in graphical models
  • Explore the use of software tools for Bayesian inference, such as PyMC3 or Stan
  • Investigate advanced topics in Bayesian statistics, including Markov Chain Monte Carlo (MCMC) methods
USEFUL FOR

Data scientists, statisticians, and researchers involved in probabilistic modeling and Bayesian analysis will benefit from this discussion, particularly those looking to deepen their understanding of Bayesian Networks and their practical applications.

tmt1
Given this base data (taken from Graphical Models):

$P(C) = 0.5$
$P(\lnot C) = 0.5$

$P(R | C) = 0.8$
$P(R | \lnot C) = 0.2$
$P(\lnot R | C) = 0.2$
$P(\lnot R | \lnot C) = 0.8$

$P(S | C) = 0.1$
$P(S | \lnot C) = 0.5$
$P( \lnot S | \lnot C) = 0.5$
$P( \lnot S | C) = 0.9$

$P(W | \lnot S, \lnot R) = 0.0$
$P(W | S, \lnot R) = 0.9$
$P(W | \lnot S, R) = 0.9$
$P(W | S, R) = 0.99$
$P(\lnot W | \lnot S, \lnot R) = 1.0$
$P(\lnot W | S, \lnot R) = 0.1$
$P(\lnot W | \lnot S, R) = 0.1$
$P(\lnot W | S, R) = 0.01$

Now, I need to calculate $P(S | W)$, or $P(S = 1 | W = 1)$, which is equal to

$\frac{P(S = 1, W = 1)}{P(W = 1)}$

or

$\frac{\sum_{c, r} P(C = c, S = 1, R = r, W = 1)}{P(W = 1)}$

I'm not sure how to begin calculating this, but I think I have to use the chain rule.

I think we need to find all the combinations of c and r, which are these four:

$P(R | C) = 0.8$
$P(R | \lnot C) = 0.2$
$P(\lnot R | C) = 0.2$
$P(\lnot R | \lnot C) = 0.8$

So for the first case, $P(R | C) = 0.8$, and then $P(C) = 0.5$ and $P(R) = 0.8$, so would the first term of the sigma expression be

$P(C = 0.5, S = 1, R = 0.8, W = 1)$, and then I need to apply the chain rule to this term? How would this be calculated?
 
Hi tmt,

The article explains that, with S and R assumed conditionally independent given C, and with W assumed conditionally independent of C given S and R, the joint distribution factorizes as:
$$P(C,S,R,W)=P(C)\,P(S|C)\,P(R|C)\,P(W|S,R)$$
And that:
\begin{array}{lcl}
P(S=1,W=1)
&=& \sum_{c,r} P(C=c, S=1, R=r, W=1) \\
&=& P(C=0, S=1, R=0, W=1) + P(C=0, S=1, R=1, W=1) \\
&& + P(C=1, S=1, R=0, W=1) + P(C=1, S=1, R=1, W=1) \\
&=& P(\lnot C, S, \lnot R, W) + ... \\
&=& P(\lnot C)\,P(S|\lnot C)\,P(\lnot R|\lnot C)\,P(W|S,\lnot R) + ... \\
&=& 0.5 \cdot 0.5 \cdot 0.8 \cdot 0.9 + ... \\
&=& 0.2781
\end{array}
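
Carrying the same sum through for the remaining three terms, and then doing the analogous sum with $S = 0$ as well for the denominator, gives $P(W=1) = 0.6471$ and hence $P(S=1 \mid W=1) = 0.2781 / 0.6471 \approx 0.43$. If you want to check the arithmetic, here is a minimal Python sketch that just enumerates the factorization above; the variable names and dictionary layout are my own choice, not something from the article:

```python
# Brute-force enumeration over the four binary variables, using
# P(C, S, R, W) = P(C) * P(S|C) * P(R|C) * P(W|S,R).

P_C = {1: 0.5, 0: 0.5}                        # P(C = c)
P_S1_given_C = {1: 0.1, 0: 0.5}               # P(S = 1 | C = c)
P_R1_given_C = {1: 0.8, 0: 0.2}               # P(R = 1 | C = c)
P_W1_given_SR = {(0, 0): 0.0, (1, 0): 0.9,    # P(W = 1 | S = s, R = r)
                 (0, 1): 0.9, (1, 1): 0.99}

def joint(c, s, r, w):
    """P(C=c, S=s, R=r, W=w) via the chain-rule factorization."""
    p_s = P_S1_given_C[c] if s else 1 - P_S1_given_C[c]
    p_r = P_R1_given_C[c] if r else 1 - P_R1_given_C[c]
    p_w = P_W1_given_SR[(s, r)] if w else 1 - P_W1_given_SR[(s, r)]
    return P_C[c] * p_s * p_r * p_w

# Numerator: marginalize out the hidden variables c and r with S = 1, W = 1 fixed.
p_s1_w1 = sum(joint(c, 1, r, 1) for c in (0, 1) for r in (0, 1))

# Denominator: additionally sum over s.
p_w1 = sum(joint(c, s, r, 1) for c in (0, 1) for s in (0, 1) for r in (0, 1))

print(p_s1_w1)          # 0.2781  (up to floating-point rounding)
print(p_w1)             # 0.6471
print(p_s1_w1 / p_w1)   # ~0.4298
```

This is just the same sum written out in code; for a network this small, brute-force enumeration is fine, and for larger networks you would switch to a proper inference library.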
 
