What is the importance of conditional probability in probability theory?

In summary: The conversation discusses the properties that a conditional probability should satisfy, and a proof of the proposition that any conditional probability satisfying these properties must equal the probability of the intersection of the event and the condition divided by the probability of the condition. The conversation also examines the proof by contradiction and a step in it that is not clear.
  • #1
Mogarrr
I'd like some help understanding a proof, from http://www.statlect.com/cndprb1.htm. Properties are introduced, which a conditional probability ought to have:
1) Must satisfy properties of probability measures:
a) for any event E, 0≤P(E)≤1;
b) P(Ω)=1;
c) Sigma-additivity: Let [itex]\{E_1, E_2, \dots, E_n, \dots\}[/itex] be a sequence of events, where [itex]i \neq j[/itex] implies [itex]E_i[/itex] and [itex]E_j[/itex] are mutually exclusive; then [itex]P\left(\bigcup_{n=1}^∞ E_n\right) = \sum_{n=1}^∞ P(E_n)[/itex].
2) P(I|I) = 1
3) If [itex]E \subseteq I[/itex] and [itex]F \subseteq I[/itex], and P(F) is greater than 0, then [itex]\frac {P(E|I)}{P(F|I)} = \frac {P(E)}{P(F)}[/itex].
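
(For reference, the familiar formula [itex]P(E|I) = \frac {P(E \cap I)}{P(I)}[/itex] does satisfy property 3: if [itex]E \subseteq I[/itex] and [itex]F \subseteq I[/itex] with P(F) > 0, then [itex]P(E \cap I) = P(E)[/itex] and [itex]P(F \cap I) = P(F)[/itex], so [itex]\frac {P(E|I)}{P(F|I)} = \frac {P(E \cap I)/P(I)}{P(F \cap I)/P(I)} = \frac {P(E)}{P(F)}[/itex].)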

Then a proof is given for the proposition: Whenever P(I) is positive, P(E|I) satisfies the four properties above if and only if P(E|I) = [itex]\frac {P(E \cap I)}{P(I)}[/itex].

I'm having a hard time following the proof of the "only if" part. That is, if P(E|I) satisfies the four properties above, then P(E|I) = [itex]\frac {P(E \cap I)}{P(I)}[/itex].

Here's a quote:
Now we prove the 'only if' part. We prove it by contradiction. Suppose there exists another conditional probability [itex]\bar{P}[/itex] that satisfies the four properties. Then there exists an event E such that:
[itex]\bar{P}(E|I) ≠ P(E|I)[/itex]

It cannot be that [itex]E \subseteq I[/itex], otherwise we would have:
[itex]\frac {\bar{P}(E|I)}{\bar{P}(I|I)} = \frac {\bar{P}(E|I)}1 ≠ \frac {P(E|I)}1 = \frac {P(E \cap I)}{P(I)} = \frac {P(E)}{P(I)} [/itex]

*which would be a contradiction, since if [itex]\bar{P}[/itex] were a conditional probability, it would satisfy:
[itex]\frac {\bar{P}(E|I)}{\bar{P}(I|I)} = \frac {P(E)}{P(I)}[/itex]​

The proof by contradiction seems more like a proof of the uniqueness of a conditional probability.

Anyway, I'm not really seeing the statement *. How is it that [itex]\frac {\bar{P}(E|I)}{\bar{P}(I|I)} = \frac {P(E)}{P(I)}[/itex]?
 
  • #2
For "only if", you assume that ##\bar P(E|I)## satisfies the four properties. In particular, it satisfies the third one with F=I.
 
  • #3
OH...

So Property 3 asserts that: If [itex]E \subseteq I[/itex] and [itex]F \subseteq I[/itex], where P(F) is positive, then for any conditional probability A satisfying these properties, [itex]\frac {A(E|I)}{A(F|I)} = \frac {P(E)}{P(F)}[/itex]?

BTW, the other property was this: If [itex]E \subseteq I^C[/itex] then P(E|I)=0.
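
(Not part of the quoted proof, but this also shows how the four properties determine P(E|I) for an arbitrary event E, not just for [itex]E \subseteq I[/itex]: write [itex]E = (E \cap I) \cup (E \cap I^C)[/itex], a union of mutually exclusive events. By sigma-additivity, [itex]P(E|I) = P(E \cap I|I) + P(E \cap I^C|I)[/itex]; the second term is 0 by the property above, and the first equals [itex]\frac {P(E \cap I)}{P(I)}[/itex] because [itex]E \cap I \subseteq I[/itex].)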
 
  • #4
Oh, I thought the RHS would have bars as well, they are hard to see. Maybe a typo there and it should be ##\bar P##.
 
  • #5
mfb said:
Oh, I thought the RHS would have bars as well, they are hard to see. Maybe a typo there and it should be ##\bar P##.

Yes, I agree, but...

If [itex] \frac {\bar{P}(E|I)}{\bar{P}(I|I)} = \frac {\bar{P}(E)}{\bar{P}(I)} [/itex], then there is no contradiction.

So I'm looking for a justification for the statement *.

I'm fairly certain that I quoted the author correctly, but if there is doubt, you can always visit statlect.com. The notes on Conditional probability are in the Fundamentals of probability theory section.
 
  • #6
mfb said:
Oh, I thought the RHS would have bars as well, they are hard to see. Maybe a typo there and it should be ##\bar P##.


I think the notation distinguishes unconditional probabilities from conditional probabilities, so there is no typo.
 

1. What is conditional probability?

Conditional probability is the likelihood of an event occurring given that another event has already occurred. It takes into account the relationship between two events and how the probability of one event may change based on the occurrence of the other event.

2. How is conditional probability calculated?

The formula for conditional probability is P(A|B) = P(A ∩ B)/P(B), where P(A|B) represents the probability of event A occurring given that event B has occurred, P(A ∩ B) represents the probability of both events A and B occurring, and P(B) represents the probability of event B occurring.
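
As a concrete illustration, here is a minimal sketch that checks this formula by enumerating outcomes for two fair dice; the particular events A and B are illustrative choices, not taken from the discussion above.

[code=python]
from fractions import Fraction
from itertools import product

# Sample space: all 36 ordered outcomes of rolling two fair six-sided dice.
omega = list(product(range(1, 7), repeat=2))

def prob(event):
    """Probability of an event (a set of outcomes) under the uniform measure on omega."""
    return Fraction(len(event), len(omega))

A = {w for w in omega if w[0] + w[1] == 8}  # A: the two dice sum to 8
B = {w for w in omega if w[0] == 6}         # B: the first die shows 6

# Conditional probability P(A|B) = P(A ∩ B) / P(B)
p_A_given_B = prob(A & B) / prob(B)
print(p_A_given_B)  # 1/6: given a first roll of 6, only (6, 2) gives a sum of 8
[/code]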

3. What is the difference between conditional probability and joint probability?

Conditional probability takes into account the relationship between two events, while joint probability is the probability that the two events occur together. The conditional probability P(A|B) is computed from the joint probability P(A ∩ B) by renormalizing it with the marginal probability P(B) of the conditioning event.
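
For example, with two fair dice and the illustrative events from the sketch above (A: the sum is 8; B: the first die shows 6), the joint probability is P(A ∩ B) = 1/36, while the conditional probability is P(A|B) = (1/36)/(1/6) = 1/6.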

4. How is conditional probability used in real life?

Conditional probability is used in various fields, such as finance, medicine, and weather forecasting, to make predictions and assess risk. For example, in medicine, conditional probability can be used to determine the likelihood of a patient having a certain disease given their symptoms and medical history.
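
As a minimal sketch of that kind of calculation, the snippet below applies Bayes' rule; the prevalence, sensitivity, and false-positive rate are made-up illustrative numbers, not figures from the discussion above.

[code=python]
# Made-up illustrative numbers for a diagnostic-test example.
prevalence = 0.01       # P(disease)
sensitivity = 0.95      # P(positive test | disease)
false_positive = 0.05   # P(positive test | no disease)

# Total probability of a positive test (law of total probability).
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)

# Bayes' rule: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
p_disease_given_positive = sensitivity * prevalence / p_positive
print(round(p_disease_given_positive, 3))  # roughly 0.161
[/code]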

5. What are the limitations of conditional probability?

One limitation of conditional probability is that it is only defined when the conditioning event has positive probability: P(A|B) is undefined if P(B) = 0. In practice it can also be difficult to estimate the joint and marginal probabilities that the formula requires, and misreading the direction of conditioning (P(A|B) is generally not equal to P(B|A)) or treating dependent events as if they were independent can lead to inaccurate predictions. Finally, in some situations it may simply be difficult to determine the probability of one event given the occurrence of another with any accuracy.
