MHB Probability calculation with Bayesian Networks

Summary: The discussion focuses on calculating the probability $P(S | W)$ using Bayesian networks, specifically through the formula $P(S = 1 | W = 1) = \frac{P(S = 1, W = 1)}{P(W = 1)}$. The joint probabilities are computed by applying the chain rule and summing over all assignments of the variables $C$ and $R$. Under the network's conditional-independence assumptions, each joint probability can be expressed as a product of the given conditional probabilities. A worked calculation shows that $P(S=1, W=1) = 0.2781$, demonstrating the methodical approach such Bayesian inference requires.
tmt1
Given this base data (taken from Graphical Models):

$P(C) = 0.5$
$P(\lnot C) = 0.5$

$P(R | C) = 0.8$
$P(R | \lnot C) = 0.2$
$P(\lnot R | C) = 0.2$
$P(\lnot R | \lnot C) = 0.8$

$P(S | C) = 0.1$
$P(S | \lnot C) = 0.5$
$P( \lnot S | \lnot C) = 0.5$
$P( \lnot S | C) = 0.9$

$P(W | \lnot S, \lnot R) = 0.0$
$P(W | S, \lnot R) = 0.9$
$P(W | \lnot S, R) = 0.9$
$P(W | S, R) = 0.99$
$P(\lnot W | \lnot S, \lnot R) = 1.0$
$P(\lnot W | S, \lnot R) = 0.1$
$P(\lnot W | \lnot S, R) = 0.1$
$P(\lnot W | S, R) = 0.01$

Now, I need to calculate $P(S | W)$, or $P(S = 1 | W = 1)$, which is equal to

$\frac{P(S = 1, W = 1)}{P(W = 1)}$

or

$\frac{\sum_{c, r} P(C = c, S = 1, R = r, W = 1)}{P(W = 1)}$

I'm not sure how to begin calculating this; I think I have to use the chain rule, though.

I think we need to find all the combinations of $c$ and $r$, which are these four:

$P(R | C) = 0.8$
$P(R | \lnot C) = 0.2$
$P(\lnot R | C) = 0.2$
$P(\lnot R | \lnot C) = 0.8$

So in the first case, $P(R | C) = 0.8$ and $P(C) = 0.5$, so would the first term of the sum be

$P(C = 0.5, S = 1, R = 0.8, W = 1)$? And then I need to apply the chain rule to this combination? How would this be calculated?
 
Hi tmt,

The article explains that, since $S$ and $R$ are conditionally independent given $C$, and $W$ is conditionally independent of $C$ given $S$ and $R$, the joint distribution factorizes as:
$$P(C,S,R,W)=P(C)\,P(S|C)\,P(R|C)\,P(W|S,R)$$
And that:
$$\begin{array}{lcl}
P(S=1,W=1)
&=& \sum_{c,r} P(C=c, S=1, R=r, W=1) \\
&=& P(C=0, S=1, R=0, W=1) + P(C=0, S=1, R=1, W=1) \\
&& +\, P(C=1, S=1, R=0, W=1) + P(C=1, S=1, R=1, W=1) \\
&=& P(\lnot C, S, \lnot R, W) + \ldots \\
&=& P(\lnot C)\,P(S|\lnot C)\,P(\lnot R|\lnot C)\,P(W|S,\lnot R) + \ldots \\
&=& 0.5 \cdot 0.5 \cdot 0.8 \cdot 0.9 + \ldots \\
&=& 0.2781
\end{array}$$
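The reply stops at the joint $P(S=1, W=1)$. The remaining step is not stated in the thread, but it follows from the same factorization: the denominator marginalizes over $S$ as well, where $0.369$ is the sum of the four $S=0$ terms computed the same way:

$$P(W=1) = P(S=1, W=1) + P(S=0, W=1) = 0.2781 + 0.369 = 0.6471$$

$$P(S=1 \mid W=1) = \frac{P(S=1, W=1)}{P(W=1)} = \frac{0.2781}{0.6471} \approx 0.4298$$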
 
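For readers who want to check the arithmetic, here is a minimal brute-force sketch in Python (not from the thread; the variable names and the dictionary-based CPT encoding are my own choices). It enumerates every assignment of the hidden variables, multiplies out the factorization above, and recovers both the joint and the posterior:

```python
# Brute-force inference in the sprinkler network:
# P(C, S, R, W) = P(C) * P(S|C) * P(R|C) * P(W|S,R)

# Conditional probability tables from the post, each giving
# the probability that the child variable equals 1.
P_C = {1: 0.5, 0: 0.5}                      # P(C=1) listed per dummy key for uniformity
P_S_GIVEN_C = {1: 0.1, 0: 0.5}              # P(S=1 | C=c)
P_R_GIVEN_C = {1: 0.8, 0: 0.2}              # P(R=1 | C=c)
P_W_GIVEN_SR = {(0, 0): 0.0, (1, 0): 0.9,   # P(W=1 | S=s, R=r)
                (0, 1): 0.9, (1, 1): 0.99}

def joint(c, s, r, w):
    """P(C=c, S=s, R=r, W=w) via the chain-rule factorization."""
    p = P_C[c]
    p *= P_S_GIVEN_C[c] if s == 1 else 1 - P_S_GIVEN_C[c]
    p *= P_R_GIVEN_C[c] if r == 1 else 1 - P_R_GIVEN_C[c]
    p *= P_W_GIVEN_SR[(s, r)] if w == 1 else 1 - P_W_GIVEN_SR[(s, r)]
    return p

# Marginalize out the hidden variables.
p_s1_w1 = sum(joint(c, 1, r, 1) for c in (0, 1) for r in (0, 1))
p_w1 = sum(joint(c, s, r, 1) for c in (0, 1) for s in (0, 1) for r in (0, 1))

print(round(p_s1_w1, 4))          # 0.2781
print(round(p_w1, 4))             # 0.6471
print(round(p_s1_w1 / p_w1, 4))   # 0.4298, i.e. P(S=1 | W=1)
```

Brute-force enumeration is exponential in the number of variables, which is harmless for four binary nodes; Bayesian-network libraries use variable elimination or message passing to avoid that cost on larger networks.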