What is the importance of conditional probability in probability theory?


Discussion Overview

The discussion centers on the importance of conditional probability in probability theory, specifically examining a proof related to the properties of conditional probabilities and their uniqueness. Participants explore the implications of these properties and engage in clarifying the proof's details.

Discussion Character

  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant seeks clarification on a proof regarding conditional probabilities, specifically the "only if" part of the proposition that relates the properties of conditional probabilities to their mathematical formulation.
  • Another participant points out that if a conditional probability satisfies the properties, it must also satisfy the third property with a specific choice of events.
  • A participant questions the interpretation of a statement regarding the relationship between two conditional probabilities and expresses confusion about the notation used in the proof.
  • There is a suggestion that a potential typo exists in the notation, but another participant argues that the notation for probabilities and conditional probabilities is distinct and not erroneous.

Areas of Agreement / Disagreement

Participants express uncertainty regarding the proof's details and the notation used. There is no consensus on the interpretation of the statement in question, and multiple viewpoints on the clarity of the proof and its notation are present.

Contextual Notes

The discussion highlights potential ambiguities in notation and the assumptions underlying the properties of conditional probabilities. Participants are navigating the complexities of the proof without resolving all uncertainties.

Mogarrr
I'd like some help understanding a proof from http://www.statlect.com/cndprb1.htm. Properties are introduced which a conditional probability ought to have:
1) It must satisfy the properties of a probability measure:
a) for any event ##E##, ##0 \le P(E) \le 1##;
b) ##P(\Omega) = 1##;
c) sigma-additivity: let ##\{E_1, E_2, \ldots, E_n, \ldots\}## be a sequence of events such that ##E_i## and ##E_j## are mutually exclusive whenever ##i \ne j##; then ##P\left(\bigcup_{n=1}^\infty E_n\right) = \sum_{n=1}^\infty P(E_n)##.
2) ##P(I|I) = 1##.
3) If ##E \subseteq I## and ##F \subseteq I##, and ##P(I)## is greater than 0, then ##\frac{P(E|I)}{P(F|I)} = \frac{P(E)}{P(F)}##.

Then a proof is given for the proposition: whenever ##P(I)## is positive, ##P(E|I)## satisfies the four properties above if and only if ##P(E|I) = \frac{P(E \cap I)}{P(I)}##.
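As a quick sanity check of the "if" direction, here is a small finite example (a fair die; the events and names are illustrative, not from the thread) verifying that the ratio formula satisfies property 2, property 3, and the extra property mentioned later in the thread:

```python
from fractions import Fraction

# Finite sample space: a fair six-sided die (illustrative example).
omega = set(range(1, 7))
p = {w: Fraction(1, 6) for w in omega}  # uniform probability measure

def prob(event):
    """Unconditional probability of an event (a subset of omega)."""
    return sum(p[w] for w in event)

def cond(event, given):
    """Candidate conditional probability P(E|I) = P(E ∩ I) / P(I)."""
    return prob(event & given) / prob(given)

I = {2, 4, 6}   # conditioning event: "even"
E = {2, 4}      # E is a subset of I
F = {6}         # F is a subset of I

# Property 2: P(I|I) = 1.
assert cond(I, I) == 1

# Property 3: for E, F subsets of I, the ratio of conditional
# probabilities equals the ratio of unconditional probabilities.
assert cond(E, I) / cond(F, I) == prob(E) / prob(F)

# Extra property mentioned later in the thread:
# if E is a subset of the complement of I, then P(E|I) = 0.
assert cond({1, 3}, I) == 0
```

Exact rational arithmetic via `Fraction` keeps the equality checks honest; with floats the ratio test could fail by rounding.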

I'm having a hard time following the proof of the "only if" part. That is: if ##P(E|I)## satisfies the four properties above, then ##P(E|I) = \frac{P(E \cap I)}{P(I)}##.

Here's a quote:
Now we prove the 'only if' part. We prove it by contradiction. Suppose there exists another conditional probability ##\bar{P}## that satisfies the four properties. Then there exists an event ##E## such that:
##\bar{P}(E|I) \ne P(E|I)##

It cannot be that ##E \subseteq I##, otherwise we would have:
##\frac{\bar{P}(E|I)}{\bar{P}(I|I)} = \frac{\bar{P}(E|I)}{1} \ne \frac{P(E|I)}{1} = \frac{P(E \cap I)}{P(I)} = \frac{P(E)}{P(I)}##

*which would be a contradiction, since if ##\bar{P}## were a conditional probability, it would satisfy:
##\frac{\bar{P}(E|I)}{\bar{P}(I|I)} = \frac{P(E)}{P(I)}##

The proof by contradiction seems more like a proof of the uniqueness of a conditional probability.

Anyway, I'm not really seeing the starred statement. How is it that ##\frac{\bar{P}(E|I)}{\bar{P}(I|I)} = \frac{P(E)}{P(I)}##?
 
For "only if", you assume that ##\bar P(E|I)## satisfies the four properties. In particular, it satisfies the third one with ##F = I##.
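In symbols, that step reads (a sketch using only properties 2 and 3, with the unconditional measure ##P## on the right-hand side of property 3):

```latex
\frac{\bar P(E \mid I)}{\bar P(I \mid I)} = \frac{P(E)}{P(I)}
  \quad \text{(property 3 applied to } \bar P \text{ with } F = I\text{)},
\qquad
\bar P(I \mid I) = 1 \quad \text{(property 2)},
\qquad
\text{hence } \bar P(E \mid I) = \frac{P(E)}{P(I)} \text{ for every } E \subseteq I.
```

Note that the right-hand side of property 3 is the unconditional ##P##, with no bar, which is why the starred statement has no bars on the right.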
 
OH...

So property 3 asserts that if ##E \subseteq I## and ##F \subseteq I##, where ##P(F)## is positive, then for any conditional probability ##A##, ##\frac{A(E|I)}{A(F|I)} = \frac{P(E)}{P(F)}##?

BTW, the other property was this: if ##E \subseteq I^C## then ##P(E|I) = 0##.
 
Oh, I thought the RHS would have bars as well, they are hard to see. Maybe a typo there and it should be ##\bar P##.
 
mfb said:
Oh, I thought the RHS would have bars as well, they are hard to see. Maybe a typo there and it should be ##\bar P##.

Yes, I agree, but...

If ##\frac{\bar{P}(E|I)}{\bar{P}(I|I)} = \frac{\bar{P}(E)}{\bar{P}(I)}##, then there is no contradiction.

So I'm looking for a justification for the statement *.

I'm fairly certain that I quoted the author correctly, but if there is doubt, you can always visit statlect.com. The notes on Conditional probability are in the Fundamentals of probability theory section.
 
mfb said:
Oh, I thought the RHS would have bars as well, they are hard to see. Maybe a typo there and it should be ##\bar P##.


I think the notation for unconditional probabilities and conditional probabilities is not the same, and therefore there is no typo: the right-hand side of property 3 is the unconditional ##P##, which both ##P(\cdot|I)## and ##\bar{P}(\cdot|I)## must match.
 
