# Is P(A,B|C) = P(A|C) P(B|C), if P(A,B) = P(A)P(B)?

As stated in my subject line: I know that P(A|B) = P(A) and P(B|A) = P(B), i.e. A and B are independent, so P(A,B) = P(A)P(B). I strongly suspect that this also holds once a condition is added, but I can't find a way to prove it formally. Can anyone prove this in a couple of lines via Bayes' rule? This is not a homework question but part of my research, and I can't find the answer anywhere.

Thanks in advance to anyone who can help!
natski

No, this isn't true. Consider two fair coins flipped independently: let A be the event that the first coin comes up heads, B the event that the second coin comes up heads, and C the event that at least one of the coins comes up heads. Then P(A) = P(B) = 1/2 and P(A,B) = P(A)P(B) = 1/4, but P(A|C) = P(B|C) = 2/3 while P(A,B|C) = 1/3 $\neq$ 4/9 = P(A|C)P(B|C).
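The counterexample above can be checked by brute-force enumeration over the four equally likely outcomes; here is a minimal sketch in Python using exact rational arithmetic (the helper names `prob` and `cond` are mine, not from the thread):

```python
from fractions import Fraction
from itertools import product

# The four equally likely outcomes of two fair coin flips.
outcomes = list(product("HT", repeat=2))

def prob(event):
    # Probability of an event (a predicate on outcomes) under the uniform measure.
    return Fraction(sum(1 for w in outcomes if event(w)), len(outcomes))

def cond(event, given):
    # P(event | given) = P(event and given) / P(given)
    return prob(lambda w: event(w) and given(w)) / prob(given)

A = lambda w: w[0] == "H"   # first coin heads
B = lambda w: w[1] == "H"   # second coin heads
C = lambda w: "H" in w      # at least one head

# A and B are unconditionally independent...
assert prob(lambda w: A(w) and B(w)) == prob(A) * prob(B) == Fraction(1, 4)

# ...but not conditionally independent given C:
print(cond(A, C), cond(B, C))            # 2/3 2/3
print(cond(lambda w: A(w) and B(w), C))  # 1/3
print(cond(A, C) * cond(B, C))           # 4/9
```

Conditioning on C couples the two coins: knowing the first coin is tails forces the second to be heads, so the product rule fails under the condition even though it holds unconditionally.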
