# Weird probability problem: "With what probability is it raining?"

1. Oct 6, 2013

### AxiomOfChoice

Came across this the other day looking at interview questions.

Suppose you want to determine whether it's raining in a distant city. You have three friends there who you can call and ask about this. Only thing is:
• Each friend will tell the truth with probability 2/3.
• Each friend will lie with probability 1/3.
• The event that Friend $i$ lies is independent of the event that Friend $j$ lies for $1 \leq i < j \leq 3$.
So, if you call all your friends, and they all tell you it's raining...what's the probability it's actually raining?

Here's a naive answer: The probability in question is just the probability that at least one of them is telling the truth, which by independence is $P(t_1 \cup t_2 \cup t_3) = 1 - P(\ell_1 \cap \ell_2 \cap \ell_3) = 1 - P(\ell_1)P(\ell_2)P(\ell_3) = 1 - (1/3)^3 = 26/27$.

But there is a conceivable objection to this: You don't need at least one of them to be telling the truth; you need them ALL to be telling the truth. The probability just computed includes, for instance, the event that Friend 1 is telling the truth (it's raining) while Friends 2 and 3 are lying (it's not raining), which is incoherent. So, in a sense, the sample space used in the calculation above is too big!

What you should compute instead, then, is $P(t_1 \cap t_2 \cap t_3) = P(t_1)P(t_2)P(t_3) = (2/3)^3 = 8/27$. But...that doesn't quite make sense either...where's the remaining $1 - 1/27 - 8/27 = 2/3$ of our probability measure living?
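Neither candidate answer survives a simulation. Here is a quick Monte Carlo sketch (assuming, purely for illustration, that it rains half the time in the distant city) that estimates the probability of rain conditioned on getting three "yes" answers:

```python
import random

def simulate(trials=200_000, p_rain=0.5, seed=1):
    """Estimate P(raining | all three friends say 'yes')."""
    rng = random.Random(seed)
    yes3 = rain_and_yes3 = 0
    for _ in range(trials):
        raining = rng.random() < p_rain
        # Each friend answers truthfully with probability 2/3,
        # independently of the weather and of each other.
        answers = [raining if rng.random() < 2/3 else not raining
                   for _ in range(3)]
        if all(answers):              # all three said "it's raining"
            yes3 += 1
            rain_and_yes3 += raining
    return rain_and_yes3 / yes3

print(simulate())  # close to 8/9 ≈ 0.889, not 26/27 or 8/27
```

The estimate lands near 8/9, which is the answer the conditional-probability argument below arrives at.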

2. Oct 6, 2013

### D H

Staff Emeritus

Obviously 8/27 is also incorrect.

The problem is that 8/27 and 1/27 are the probabilities of obtaining three "yes, it's raining" answers given that it is or is not raining, respectively. Well, you obtained three yes answers; that's now a given. The prior probability of getting three yes answers was 9/27 = 1/3 (that's 8/27 + 1/27). You want to know whether it's raining given those three yes answers. Using P(A|B) = P(A ∩ B)/P(B), the probability that it is raining given three yes answers is (8/27)/(9/27) = 8/9.
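The same arithmetic can be checked with exact fractions, making explicit the 50/50 prior that this argument implicitly assumes (with that prior, the 1/2 factors cancel, so the ratio (8/27)/(9/27) comes out the same):

```python
from fractions import Fraction

p_yes3_rain    = Fraction(2, 3) ** 3   # P(3 yes | raining)     = 8/27
p_yes3_no_rain = Fraction(1, 3) ** 3   # P(3 yes | not raining) = 1/27
prior          = Fraction(1, 2)        # implicit 50/50 prior

# Total probability of three "yes" answers, then Bayes' rule.
p_yes3 = p_yes3_rain * prior + p_yes3_no_rain * (1 - prior)
posterior = (p_yes3_rain * prior) / p_yes3

print(posterior)  # 8/9
```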

3. Oct 6, 2013

### Ibix

I like Bayes' Theorem for this:$$p(3\mathrm{yes}|\mathrm{rain})p(\mathrm{rain})=p(\mathrm{rain}|3\mathrm{yes})p(3\mathrm{yes})$$

The probability of three "yeses" given that it's actually raining is 8/27, as DH calculated.

The probability that it's raining is your estimate of how likely it is to be raining at any given moment. That's likely to be a higher number if your friends live in London than if they live in Karachi. Let's call it $p_r$.

The probability of three "yeses" is the total probability:$$p(3\mathrm{yes}) = p(3\mathrm{yes} | \mathrm{rain}) p(\mathrm{rain}) + p(3\mathrm{yes} | \mathrm{no\ rain}) p(\mathrm{no\ rain}) = \frac{8 p_r}{27} + \frac{1 - p_r}{27} = \frac{1 + 7 p_r}{27}$$

You can substitute all that back into my first equation to get the remaining term, the probability that it's raining given that you got three "yeses". It's just:$$p(\mathrm{rain}|3\mathrm{yes})=\frac{8p_r}{1+7p_r}$$
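The closed form is easy to tabulate. A small helper function (hypothetical, just to explore the formula) shows how the posterior moves with the prior $p_r$:

```python
from fractions import Fraction

def p_rain_given_3yes(p_r):
    """Posterior P(rain | three 'yes' answers) for prior P(rain) = p_r."""
    return 8 * p_r / (1 + 7 * p_r)

for p_r in (Fraction(1, 10), Fraction(1, 2), Fraction(9, 10)):
    print(p_r, "->", p_rain_given_3yes(p_r))
# e.g. a 1/10 prior gives 8/17, and a 1/2 prior gives 8/9
```

Three agreeing "yeses" pull even a skeptical prior up substantially, but they never make the prior irrelevant.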

Note that my beliefs about how wet the city typically is affect how much confidence I place in the three answers. Note also that if I believe it rains 50% of the time ($p_r = 1/2$), then I agree with D H: $8 \cdot \frac{1}{2} / (1 + 7 \cdot \frac{1}{2}) = 8/9$.

Last edited by a moderator: Oct 6, 2013
4. Oct 6, 2013

### D H

Staff Emeritus
Aside:

Ibix, I fixed your LaTeX equations. In the future, please use some spaces in your LaTeX input. Writing "real" LaTeX makes the source more readable, and spaces are just harmless human-readable noise when LaTeX/TeX is in math mode.

More importantly for this forum, the software that runs this forum doesn't like big honkin' words. You had an unbroken stream of 50+ characters, and that looked like a big honkin' word to the underlying software. Since the underlying software doesn't like big honkin' words, it inserts spaces, and it inevitably does so where LaTeX can't tolerate them.

5. Oct 6, 2013

### D H

Staff Emeritus
There's one big problem with Bayes' law for this: What if you don't have a clue regarding the prior probability? Bayes' law as is has a bit of a problem if there is no prior.

There is a very nice way to rewrite Bayes' law to account for this "I haven't the foggiest" prior probability. It's called an information filter. In a nutshell, an information filter is like a Kalman filter, except that an information filter uses an information matrix rather than a covariance matrix. "I don't know nuffin' about X" has a nice representation as an information matrix: it's the zero matrix.

If you take this formulation, then the first friend who says "yes" yields a 2/3 probability that it is raining. You take this first friend at face value (where face value includes the fact that this friend might be lying) because the prior information matrix is the zero matrix. The second friend who says "yes" raises the probability to 4/5, and the third friend who says "yes" raises the probability to 8/9.
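The sequence 2/3 → 4/5 → 8/9 falls out of applying Bayes' rule one "yes" at a time, starting from the noncommittal 1/2 (a sketch of the bootstrap described above, not actual information-filter code):

```python
from fractions import Fraction

p     = Fraction(1, 2)   # "no clue" starting point
truth = Fraction(2, 3)   # P(friend says "yes" | raining)
lie   = Fraction(1, 3)   # P(friend says "yes" | not raining)

history = []
for _ in range(3):       # three independent "yes" answers, one at a time
    p = (truth * p) / (truth * p + lie * (1 - p))
    history.append(p)

print(history)  # [Fraction(2, 3), Fraction(4, 5), Fraction(8, 9)]
```

Each update uses the previous posterior as the new prior, which is exactly why the first "yes" is taken at face value: the starting 1/2 carries no information.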

Alternatively, you can use an information filter formalism to bootstrap the process, and then use Bayes' law proper after the first friend says "yes".

6. Oct 6, 2013

### Ibix

Ah! I thought it was just MathJax having a bad day on my phone. Thanks for the fix and the explanation.

The information filter was interesting, too. Is the fact that it implies a 50/50 a priori probability (8/9 being what you get with $p_r = 0.5$) an artifact of the problem or of the filter?

7. Oct 6, 2013

### D H

Staff Emeritus
The complete lack of prior knowledge (which only makes sense in an information filter formalism) is *sometimes* equivalent to the principle of indifference. This happens to be one of those cases. I'm not a huge fan of the principle of indifference; it can get you into big trouble. An information filter formalism (to me) gives a much better mechanism for expressing complete lack of prior knowledge.

8. Oct 22, 2013

### BTP

There is a problem here: these are still the conditional probabilities P(3y|R) = 8/27 and P(3y|~R) = 1/27, so P(3y) ≠ P(3y|R) + P(3y|~R); rather, P(3y) = P(3y|R)P(R) + P(3y|~R)P(~R). One cannot avoid the fact that there is incomplete information; Ibix is correct that in the Bayesian approach an assumption has to be made for P(R).

Now one uses the maximum entropy principle in choosing P(R). Given that one knows nothing about P(R), one should choose P(R) = 1/2 to maximize the entropy of the probability model on {R, ~R}, entropy being a measure of uncertainty. The probability distribution chosen should reflect the amount of uncertainty in your knowledge of the situation.
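The entropy claim is easy to verify numerically: for a two-outcome distribution, the Shannon entropy $H(p) = -p\log_2 p - (1-p)\log_2(1-p)$ peaks at $p = 1/2$, as a simple grid search shows:

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a Bernoulli(p) distribution."""
    if p in (0, 1):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Search a grid of candidate values of P(R) for the entropy maximizer.
probs = [i / 100 for i in range(1, 100)]
best = max(probs, key=entropy)
print(best, entropy(best))  # 0.5 1.0
```

The maximizer is P(R) = 1/2, with one full bit of uncertainty, matching the "complete ignorance" prior used above.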

Last edited: Oct 22, 2013
9. Oct 23, 2013

### D H

Staff Emeritus
Not necessarily. You are assuming those are conditional probabilities. Look at them instead as marginal probabilities and there is no need for a prior P(R).

You do not need a prior if you use an information filter. Information filters provide an explicit mechanism for saying "I have no prior" (or, if you wish, "my prior is complete garbage"). It doesn't matter what you use for a prior if the information matrix is the zero matrix.

10. Oct 23, 2013

### BTP

Some clarifications:

Can it be viewed as a marginal distribution?

Let W = {R, ~R} (the weather), let X = {T, F}^3 (the call answers), and let P(X, W) be the joint distribution.

You cannot compute P(T^3, R) or P(T^3, ~R), because you know nothing about the joint distribution even under the assumption of independence. What you can compute, under independence, is the probability that all three friends are telling the truth, (2/3)^3 = 8/27, and the probability that all three are lying, (1/3)^3 = 1/27; those can be treated as genuine marginal probabilities, since a friend's truthfulness does not depend on the weather.

But the key point is that P(R|T^3) = P(R ∩ T^3)/P(T^3) cannot be computed, since P(R ∩ T^3) cannot be computed without a prior on R.
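One way to see this concretely: the joint probability P(R ∩ 3yes) = (8/27)·P(R) depends on the unknown P(R), and so does the posterior. Sweeping the prior over a few (hypothetical) values makes the dependence visible:

```python
from fractions import Fraction

def posterior(p_r):
    """P(R | three 'yes' answers) for a given prior P(R) = p_r."""
    joint_rain = Fraction(8, 27) * p_r        # P(R and 3 yes)
    joint_dry  = Fraction(1, 27) * (1 - p_r)  # P(~R and 3 yes)
    return joint_rain / (joint_rain + joint_dry)

for p_r in (Fraction(1, 100), Fraction(1, 2), Fraction(99, 100)):
    print(p_r, "->", posterior(p_r))
# 1/100 -> 8/107, 1/2 -> 8/9, 99/100 -> 792/793
```

No single number answers the question until P(R) is pinned down, whether by assumption, by maximum entropy, or by an information-filter-style declaration of ignorance.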
