MHB Probability calculation with Bayesian Networks

tmt1
Given this base data (taken from Graphical Models):

$P(C) = 0.5$
$P(\lnot C) = 0.5$

$P(R | C) = 0.8$
$P(R | \lnot C) = 0.2$
$P(\lnot R | C) = 0.2$
$P(\lnot R | \lnot C) = 0.8$

$P(S | C) = 0.1$
$P(S | \lnot C) = 0.5$
$P( \lnot S | \lnot C) = 0.5$
$P( \lnot S | C) = 0.9$

$P(W | \lnot S, \lnot R) = 0.0$
$P(W | S, \lnot R) = 0.9$
$P(W | \lnot S, R) = 0.9$
$P(W | S, R) = 0.99$
$P(\lnot W | \lnot S, \lnot R) = 1.0$
$P(\lnot W | S, \lnot R) = 0.1$
$P(\lnot W | \lnot S, R) = 0.1$
$P(\lnot W | S, R) = 0.01$

Now, I need to calculate $P(S | W)$, or $P(S = 1 | W = 1)$, which is equal to

$\frac{P(S = 1, W = 1)}{P(W = 1)}$

or

$\frac{\sum_{c, r}^{} P(C = c, S = 1, R = r, W = 1)}{ P(W = 1)}$

I'm not sure how to begin calculating this; I think I have to use the chain rule, though.

I think we need to find all the combinations of $c$ and $r$, which correspond to these four conditional probabilities:

$P(R | C) = 0.8$
$P(R | \lnot C) = 0.2$
$P(\lnot R | C) = 0.2$
$P(\lnot R | \lnot C) = 0.8$

So in the first case, $P(R | C) = 0.8$, with $P(C) = 0.5$ and $P(R) = 0.8$, so the first term of the sigma expression would be

$P(C = 0.5, S = 1, R = 0.8, W = 1)$,

and then I need to apply the chain rule to this combination? How would this be calculated?
 
Hi tmt,

The article explains that, with $S$ and $R$ conditionally independent given $C$, and with $W$ conditionally independent of $C$ given $S$ and $R$, the joint distribution factorizes as:
$$P(C,S,R,W)=P(C)\,P(S|C)\,P(R|C)\,P(W|S,R)$$
And that:
$$\begin{array}{lcl}
P(S=1,W=1)
&=& \sum_{c,r} P(C=c, S=1, R=r, W=1) \\
&=& P(C=0, S=1, R=0, W=1) + P(C=0, S=1, R=1, W=1) \\
&& + P(C=1, S=1, R=0, W=1) + P(C=1, S=1, R=1, W=1) \\
&=& P(\lnot C, S, \lnot R, W) + \ldots \\
&=& P(\lnot C)\,P(S|\lnot C)\,P(\lnot R|\lnot C)\,P(W|S,\lnot R) + \ldots \\
&=& 0.5 \cdot 0.5 \cdot 0.8 \cdot 0.9 + \ldots \\
&=& 0.2781
\end{array}$$
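
For anyone who wants to check the arithmetic, here is a minimal Python sketch (my own, not from the article or the post above) that brute-forces the joint distribution from the tables given in the question; it confirms the $0.2781$ for the numerator and also sums out $S$ to get the denominator $P(W=1)$, so the original query $P(S=1 | W=1)$ can be finished. The table names are just illustrative.

```python
# Brute-force enumeration of P(C, S, R, W) = P(C) P(S|C) P(R|C) P(W|S,R)
# using the conditional probability tables from the question (1 = true, 0 = false).
from itertools import product

p_c = {1: 0.5, 0: 0.5}                      # P(C = 1) and P(C = 0)
p_r_given_c = {1: 0.8, 0: 0.2}              # P(R = 1 | C = c)
p_s_given_c = {1: 0.1, 0: 0.5}              # P(S = 1 | C = c)
p_w_given_sr = {(0, 0): 0.0, (1, 0): 0.9,   # P(W = 1 | S = s, R = r)
                (0, 1): 0.9, (1, 1): 0.99}

def bernoulli(p_true, value):
    """Return p_true if value == 1, else 1 - p_true."""
    return p_true if value == 1 else 1.0 - p_true

def joint(c, s, r, w):
    """P(C=c, S=s, R=r, W=w) from the chain-rule factorization."""
    return (p_c[c]
            * bernoulli(p_s_given_c[c], s)
            * bernoulli(p_r_given_c[c], r)
            * bernoulli(p_w_given_sr[(s, r)], w))

# Numerator: sum over the hidden variables C and R.
p_s1_w1 = sum(joint(c, 1, r, 1) for c, r in product((0, 1), repeat=2))

# Denominator: also sum over S.
p_w1 = sum(joint(c, s, r, 1) for c, s, r in product((0, 1), repeat=3))

print(p_s1_w1)           # 0.2781
print(p_w1)              # 0.6471
print(p_s1_w1 / p_w1)    # P(S=1 | W=1) ≈ 0.4298
```

Running this gives $P(W=1) = 0.6471$, so $P(S=1 | W=1) = 0.2781 / 0.6471 \approx 0.43$.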
 