Is P(A,B|C) = P(A|C) P(B|C), if P(A,B) = P(A)P(B)?

Summary: The discussion centers on whether P(A,B|C) = P(A|C)P(B|C) follows from P(A,B) = P(A)P(B). A counterexample with two independent coin flips shows that it does not: conditioning on an event C can introduce a dependence between A and B, so the proposed equality need not hold in general.
natski
As stated in my subject line, I know that P(A|B) = P(A) and P(B|A) = P(B), i.e. A and B are independent, so P(A,B) = P(A)P(B). I strongly suspect that the same factorization holds when a conditioning event C is added, but I can't find a way to formally prove it. Can anyone prove this in a couple of lines via Bayes' rule? This is not a homework question but part of my research, and I can't find the answer anywhere.
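Written out with the definition of conditional probability (assuming P(C) > 0), what I am asking is whether the independence P(A∩B) = P(A)P(B) alone is enough to force

$$P(A,B\mid C)=\frac{P(A\cap B\cap C)}{P(C)} \overset{?}{=} \frac{P(A\cap C)}{P(C)}\cdot\frac{P(B\cap C)}{P(C)}=P(A\mid C)\,P(B\mid C).$$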

Thanks in advance to anyone who can help!
natski
 
No, this isn't true. Consider two fair coins flipped independently: let A be the event that the first coin comes up heads, B the event that the second coin comes up heads, and C the event that at least one of the coins comes up heads. Then P(A) = P(B) = 1/2 and P(A,B) = P(A)P(B) = 1/4, but P(A|C) = P(B|C) = 2/3 and P(A,B|C) = 1/3 ≠ P(A|C)P(B|C) = 4/9.
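For anyone who wants to verify these numbers, here is a minimal brute-force sketch in Python that enumerates the four equally likely outcomes (the prob and cond helpers are just illustrative names, not from the thread):

Code:
from itertools import product
from fractions import Fraction

# The four equally likely outcomes of two independent fair coin flips.
outcomes = list(product("HT", repeat=2))

def prob(event):
    """P(event) under the uniform distribution on the outcomes."""
    return Fraction(sum(1 for w in outcomes if event(w)), len(outcomes))

def cond(event, given):
    """Conditional probability P(event | given)."""
    return prob(lambda w: event(w) and given(w)) / prob(given)

A = lambda w: w[0] == "H"   # first coin comes up heads
B = lambda w: w[1] == "H"   # second coin comes up heads
C = lambda w: "H" in w      # at least one coin comes up heads

# Unconditional independence: P(A,B) = P(A) P(B) = 1/4
assert prob(lambda w: A(w) and B(w)) == prob(A) * prob(B) == Fraction(1, 4)

# But conditioned on C the factorization fails:
print(cond(A, C), cond(B, C))             # 2/3 2/3
print(cond(lambda w: A(w) and B(w), C))   # 1/3
print(cond(A, C) * cond(B, C))            # 4/9

The probabilities come out as exact fractions, so the mismatch between 1/3 and 4/9 is not a rounding artifact.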
 
Citan Uzuki said:
No, this isn't true. Consider two fair coins flipped independently: let A be the event that the first coin comes up heads, B the event that the second coin comes up heads, and C the event that at least one of the coins comes up heads. Then P(A) = P(B) = 1/2 and P(A,B) = P(A)P(B) = 1/4, but P(A|C) = P(B|C) = 2/3 and P(A,B|C) = 1/3 ≠ P(A|C)P(B|C) = 4/9.
Even more obvious is C = "exactly one coin is a head". Then conditioning on C forces a complete dependence between A and B.
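For concreteness, a quick check of this variant (assuming the same two-fair-coin setup as above): with C = "exactly one coin is a head",

$$P(C)=\tfrac12,\qquad P(A\mid C)=\frac{P(A\cap C)}{P(C)}=\frac{1/4}{1/2}=\tfrac12=P(B\mid C),$$

$$P(A,B\mid C)=\frac{P(A\cap B\cap C)}{P(C)}=\frac{0}{1/2}=0\neq\tfrac14=P(A\mid C)\,P(B\mid C).$$

Given C, the second coin must show the opposite face of the first, so B is completely determined by A: the opposite extreme of independence.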
 