What is the importance of conditional probability in probability theory?

AI Thread Summary
Conditional probability is essential in probability theory as it establishes a framework for understanding how probabilities change when conditioned on specific events. The discussion highlights key properties that conditional probabilities must satisfy, including the properties of probability measures and the uniqueness of conditional probabilities. A proof by contradiction is presented to demonstrate that if a conditional probability satisfies these properties, it must equal the ratio of the intersection probability to the marginal probability. Participants express confusion over specific aspects of the proof, particularly regarding notation and the implications of the properties. Clarification is sought on the relationship between the conditional probabilities and the properties they must adhere to.
Mogarrr
I'd like some help understanding a proof from http://www.statlect.com/cndprb1.htm. Properties are introduced which a conditional probability ought to have:
1) Must satisfy properties of probability measures:
a) for any event E, 0≤P(E)≤1;
b) P(Ω)=1;
c) Sigma-additivity: let \{E_1, E_2, \ldots, E_n, \ldots\} be a sequence of events such that i ≠ j implies E_i and E_j are mutually exclusive; then P(\bigcup_{n=1}^∞ E_n) = \sum_{n=1}^∞ P(E_n).
2) P(I|I) = 1
3) If E \subseteq I and F \subseteq I, and P(I) is greater than 0, then \frac {P(E|I)}{P(F|I)} = \frac {P(E)}{P(F)}.
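As a quick sanity check on property 3 (my own toy example, not from the statlect page): take a fair die with I = \{2,4,6\}, E = \{2\}, F = \{4,6\}. Then \frac {P(E|I)}{P(F|I)} = \frac {1/3}{2/3} = \frac 12 and \frac {P(E)}{P(F)} = \frac {1/6}{2/6} = \frac 12, as the property requires.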

Then a proof is given for the proposition: Whenever P(I) is positive, P(E|I) satisfies the four properties above if and only if P(E|I) = \frac {P(E \cap I)}{P(I)}.
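Here is a minimal sketch (my own check, not part of the statlect material) that verifies the ratio formula numerically on a fair die, with E and I chosen arbitrarily:

```python
from fractions import Fraction

# Toy check of P(E|I) = P(E ∩ I) / P(I) on a fair six-sided die.
# The die, E, and I are illustrative choices, not from the proof.
omega = set(range(1, 7))   # sample space {1, ..., 6}
I = {2, 4, 6}              # conditioning event: "the roll is even"
E = {1, 2}                 # an arbitrary event of interest


def P(A):
    """Uniform probability measure on the die's sample space."""
    return Fraction(len(A), len(omega))


lhs = Fraction(len(E & I), len(I))  # restrict to I and renormalize
rhs = P(E & I) / P(I)               # the ratio formula from the proposition
print(lhs, rhs, lhs == rhs)         # prints: 1/3 1/3 True
```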

I'm having a hard time following the proof of the "only if" part. That is, if P(E|I) satisfies the four properties above, then P(E|I) = \frac {P(E \cap I)}{P(I)}.

Here's a quote:
Now we prove the 'only if' part. We prove it by contradiction. Suppose there exists another conditional probability \bar{P} that satisfies the four properties. Then there exists an event E such that:
\bar{P}(E|I) ≠ P(E|I)

It cannot be that E \subseteq I, otherwise we would have:
\frac {\bar{P}(E|I)}{\bar{P}(I|I)} = \frac {\bar{P}(E|I)}{1} ≠ \frac {P(E|I)}{1} = \frac {P(E \cap I)}{P(I)} = \frac {P(E)}{P(I)}

* which would be a contradiction, since if \bar{P} were a conditional probability, it would satisfy:
\frac {\bar{P}(E|I)}{\bar{P}(I|I)} = \frac {P(E)}{P(I)}

The proof by contradiction seems more like a proof of the uniqueness of a conditional probability.

Anyways, I'm not really seeing the statement *. How is it that \frac {\bar{P}(E|I)}{\bar{P}(I|I)} = \frac {P(E)}{P(I)}?
 
For "only if", you assume that ##\bar P(E|I)## satisfies the four properties. In particular, it satisfies the third one with F=I.
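Spelled out, for the case ##E \subseteq I## that the quoted step rules out: property 3 with ##F = I## gives ##\frac{\bar P(E|I)}{\bar P(I|I)} = \frac{P(E)}{P(I)}##, property 2 gives ##\bar P(I|I) = 1##, and ##E \subseteq I## means ##P(E) = P(E \cap I)##. Putting these together, ##\bar P(E|I) = \frac{P(E)}{P(I)} = \frac{P(E \cap I)}{P(I)} = P(E|I)##, contradicting ##\bar P(E|I) ≠ P(E|I)##.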
 
OH...

So Property 3 asserts that: if E \subseteq I and F \subseteq I, where P(F) is positive, then for any conditional probability A satisfying the properties, \frac {A(E|I)}{A(F|I)} = \frac {P(E)}{P(F)}?

BTW, the other property was this: If E \subseteq I^C then P(E|I)=0.
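(That property is consistent with the ratio formula too: if E \subseteq I^C then E \cap I = \emptyset, so \frac {P(E \cap I)}{P(I)} = 0.)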
 
Oh, I thought the RHS would have bars as well; they are hard to see. Maybe a typo there and it should be ##\bar P##.
 
mfb said:
Oh, I thought the RHS would have bars as well; they are hard to see. Maybe a typo there and it should be ##\bar P##.

Yes, I agree, but...

If \frac {\bar{P}(E|I)}{\bar{P}(I|I)} = \frac {\bar{P}(E)}{\bar{P}(I)}, then there is no contradiction.

So I'm looking for a justification for the statement *.

I'm fairly certain that I quoted the author correctly, but if there is doubt, you can always visit statlect.com. The notes on Conditional probability are in the Fundamentals of probability theory section.
 
mfb said:
Oh, I thought the RHS would have bars as well; they are hard to see. Maybe a typo there and it should be ##\bar P##.


I think the notation for probabilities and conditional probabilities is not the same, and therefore there is no typo: in property 3 the right-hand side \frac {P(E)}{P(F)} refers to the given unconditional measure P, while \bar{P}(\cdot|I) is only the candidate conditional probability.
 