Is the Probability Axiom Valid for Mutually Exclusive Events?

Mathman23
Hi

I have a probability axiom here which I'm not sure I have understood correctly.

Let B_1 \ldots B_n be independent events

Then P(B_1 \cup \ldots \cup B_n) = 1, which is the same as

P(B_1) + P(B_2) + \ldots + P(B_n) = 1

I would like to show that this is only valid if there exists a k with 1 \leq k \leq n such that

P(B_k) = 1.

Proof:

If 1 \leq k \leq n, then P(E) = 1 (where E is the probability space).

Thereby it follows that P(B_1 \cup \ldots \cup B_n) = P(B_1) + P(B_2) + \ldots + P(B_n) = 1

This can be written as \sum_{n=1}^{k} P(B_{n+1}) = P(B_{k}) = 1

Am I on the right track here?

Best Regards
Fred
 
I'm very confused by what you've written.


So I will just state some facts:

P(B_1 \cup \ldots \cup B_n) = P(B_1) + P(B_2) + \ldots + P(B_n)

is true when the B_i are disjoint -- this equation is usually false when they are independent.


If E is the event consisting of all possible outcomes, then P(E) = 1.

In fact, P(A) cannot be greater than 1 for any event A.
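
For a concrete illustration, take one roll of a fair die. The events B_1 = \{1, 2\} and B_2 = \{3, 4\} are disjoint, and indeed P(B_1 \cup B_2) = 1/3 + 1/3 = 2/3. By contrast, A = \{1, 2\} and B = \{2, 4, 6\} are independent, since P(A \cap B) = P(\{2\}) = 1/6 = P(A)P(B), yet P(A \cup B) = P(\{1, 2, 4, 6\}) = 2/3, which is not P(A) + P(B) = 1/3 + 1/2 = 5/6.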
 
Hi Hurkyl, and thank you for your answer.

I have looked at it again and concluded that the statement to prove should have been:

Let B_1 \ldots B_n be independent events. Show that

P(B_1 \cup \ldots \cup B_n) = 1 if and only if there exists a number k, 1 \leq k \leq n, such that P(B_k) = 1.


Proof:

If 1 \leq k \leq n, then P(E) = 1 (where E is the probability space).

Thereby it follows that P(B_1 \cup \ldots \cup B_n) = P(B_1) + P(B_2) + \ldots + P(B_n) = 1
Am I on the right path here now?
 
Your theorem is false. "Heads" and "Tails" are independent events, neither P("Heads") nor P("Tails") is 1, but P("Heads" \cup "Tails") = 1
 
No, heads and tails are not independent. P(Heads and Tails) is certainly unequal to P(Heads)*P(Tails).
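
With a fair coin, for instance, P(Heads and Tails) = 0, whereas P(Heads)P(Tails) = (1/2)(1/2) = 1/4; the two events are mutually exclusive rather than independent.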
 
Hello, can I change my original theorem to make it true?

If yes, how?

Sincerely Fred

Hurkyl said:
No, heads and tails are not independent. P(Heads and Tails) is certainly unequal to P(Heads)*P(Tails).
 
Okay, I guess I confused "independent" with "mutually exclusive".

If you have two independent events B, B', then:

P(B\cup B')

= P((B\cap B'^C) \sqcup (B\cap B') \sqcup (B' \cap B^C))

= P(B\cap B'^C) + P(B\cap B') + P(B' \cap B^C)

= P(B)P(B'^C) + P(B)P(B') + P(B')P(B^C)

= P(B)(1 - P(B')) + P(B)P(B') + P(B')(1 - P(B))

= 1 - [P(B) - 1][P(B') - 1]

If P(B\cup B') = 1, then [P(B)-1][P(B') - 1] = 0, so either P(B) = 1 or P(B') = 1.
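
The same computation should extend to n events, assuming the B_i are mutually independent (so that their complements are independent as well):

P(B_1 \cup \ldots \cup B_n) = 1 - P(B_1^C \cap \ldots \cap B_n^C) = 1 - \prod_{i=1}^{n} \left(1 - P(B_i)\right)

so the union has probability 1 exactly when some factor 1 - P(B_k) vanishes, i.e. when P(B_k) = 1 for some k.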
 
Mathman23 said:
Hi

I have a probability axiom here which I'm not sure I have understood correctly.

Let B_1 \ldots B_n be independent events

Then P(B_1 \cup \ldots \cup B_n) = 1, which is the same as

P(B_1) + P(B_2) + \ldots + P(B_n) = 1
If B_1, B_2, \ldots, B_n are mutually exclusive, then
P(B_1 \cup B_2 \cup \ldots \cup B_n) = P(B_1) + P(B_2) + \ldots + P(B_n)
follows from additivity, since "mutually exclusive" means the events are disjoint.
If, in addition, they are exhaustive (their union is the entire sample space), then
P(B_1 \cup B_2 \cup \ldots \cup B_n) = 1

I would like to show that this is only valid if there exists a k with 1 \leq k \leq n such that

P(B_k) = 1.

Proof:

If 1 \leq k \leq n, then P(E) = 1 (where E is the probability space).

Thereby it follows that P(B_1 \cup \ldots \cup B_n) = P(B_1) + P(B_2) + \ldots + P(B_n) = 1

This can be written as \sum_{n=1}^{k} P(B_{n+1}) = P(B_{k}) = 1
? How does that follow from the above? If P(B_k) = 1, then it follows that P(B_i) = 0 for any i not equal to k, and so \sum_{n=1}^{k} P(B_{n+1}) = P(B_{k}) = 1 follows trivially. But you are claiming the converse: that if the sum of the probabilities is 1, then the probability for each i except one is 0 (and that one is 1) -- and that is not, in general, true.
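
For example, one roll of a fair die with B_i = \{i\} for i = 1, \ldots, 6 gives mutually exclusive, exhaustive events whose probabilities sum to 1, yet no single P(B_i) equals 1.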

Am I on the right track here?

Best Regards
Fred
 