Is P(A,B|C) = P(A|C) P(B|C), if P(A,B) = P(A)P(B)?

AI Thread Summary
The discussion centers on the relationship between joint and conditional probabilities, specifically whether P(A,B|C) = P(A|C)P(B|C) follows from P(A,B) = P(A)P(B). A counterexample with two independent coin flips shows that it does not: conditioning on an event C can create a dependence between A and B even when they are unconditionally independent. In short, independence does not imply conditional independence.
natski
As stated in my subject line, I know that P(A|B) = P(A) and P(B|A) = P(B), i.e. A and B are independent, so P(A,B) = P(A)P(B). I strongly suspect that this factorization still holds with a conditional added, but I can't find a way to prove it formally. Can anyone prove this in a couple of lines via Bayes' rule? This is not a homework question but part of my research, and I can't find the answer anywhere.
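In symbols, the question is whether unconditional independence implies conditional independence, i.e. whether for an arbitrary event ##C##
$$P(A,B) = P(A)\,P(B) \;\overset{?}{\Longrightarrow}\; P(A,B \mid C) = P(A \mid C)\,P(B \mid C).$$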

Thanks in advance to anyone who can help!
natski
 
No, this isn't true. Consider two fair coins flipped independently. Let A be the event that the first coin comes up heads, B the event that the second coin comes up heads, and C the event that at least one of the coins comes up heads. Then P(A) = P(B) = 1/2 and P(A,B) = P(A)P(B) = 1/4, but P(A|C) = P(B|C) = 2/3 and P(A,B|C) = 1/3 ≠ 4/9 = P(A|C)P(B|C).
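For completeness, these numbers follow directly from the definition of conditional probability; since ##A \subseteq C## and ##A \cap B \subseteq C## here, the intersections are immediate:
$$P(C) = 1 - \tfrac{1}{4} = \tfrac{3}{4}, \qquad P(A \mid C) = \frac{P(A \cap C)}{P(C)} = \frac{1/2}{3/4} = \frac{2}{3},$$
$$P(A,B \mid C) = \frac{P(A \cap B \cap C)}{P(C)} = \frac{1/4}{3/4} = \frac{1}{3} \neq \frac{4}{9} = P(A \mid C)\,P(B \mid C).$$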
 
Citan Uzuki said:
No, this isn't true. Consider two fair coins flipped independently...
Even more obvious is C = exactly one coin is a head. Then the condition C forces a complete dependence between A and B: given C, if A occurs then B cannot, so P(A,B|C) = 0 while P(A|C)P(B|C) = (1/2)(1/2) = 1/4.
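For readers who want to verify both counterexamples mechanically, here is a minimal brute-force sketch in Python (the helper functions prob and cond are my own, not from the thread) that enumerates the four equally likely outcomes:

```python
from itertools import product
from fractions import Fraction

# The four equally likely outcomes of two independent fair coin flips.
outcomes = list(product("HT", repeat=2))

def prob(event):
    # Probability of an event (a predicate on outcomes) under the uniform measure.
    return Fraction(sum(1 for w in outcomes if event(w)), len(outcomes))

def cond(event, given):
    # Conditional probability P(event | given).
    return prob(lambda w: event(w) and given(w)) / prob(given)

A = lambda w: w[0] == "H"            # first coin is heads
B = lambda w: w[1] == "H"            # second coin is heads
C1 = lambda w: "H" in w              # at least one head (Citan Uzuki's C)
C2 = lambda w: w.count("H") == 1     # exactly one head (the second reply's C)

# Unconditional independence holds: P(A,B) = P(A)P(B) = 1/4.
assert prob(lambda w: A(w) and B(w)) == prob(A) * prob(B)

# ...but conditional independence fails for both choices of C.
for name, C in (("C1", C1), ("C2", C2)):
    joint = cond(lambda w: A(w) and B(w), C)
    factored = cond(A, C) * cond(B, C)
    print(name, joint, factored)   # C1: 1/3 vs 4/9;  C2: 0 vs 1/4
```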
 