# Probability Axiom Question

1. Apr 30, 2006

### Mathman23

Hi

I have a probability statement here that I'm not sure I have understood correctly.

Let $$B_1 \ldots B_n$$ be independent events

Then $$P(B_1 \cup \ldots \cup B_n) = 1$$, which is the same as

$$P(B_1) + P(B_2) + \ldots + P(B_n) = 1$$

I would like to show that this is valid only if there exists a $$k$$ with $$1 \leq k \leq n$$ such that $$P(B_k) = 1$$.

Proof:

If $$1 \leq k \leq n$$, then $$P(E) = 1$$ (where $$E$$ is the probability space).

Thereby it follows that $$P(B_1 \cup \ldots \cup B_n) = P(B_1) + P(B_2) + \ldots + P(B_n) = 1$$

This can be written as $$\sum_{n=1}^{k} P(B_{n+1}) = P(B_{k}) = 1$$

Am I on the right track here?

Best Regards
Fred

Last edited: Apr 30, 2006
2. Apr 30, 2006

### Hurkyl

Staff Emeritus
I'm very confused by what you've written.

So I will just state some facts:

$$P(B_1 \cup \ldots \cup B_n) = P(B_1) + P(B_2) + \ldots + P(B_n)$$

is true when the $B_i$ are disjoint -- this equation is usually false when they are independent.

If E is the event consisting of all possible outcomes, then P(E) = 1.

In fact, P(A) cannot be greater than 1 for any event A.
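A quick sanity check of Hurkyl's point, a minimal sketch using a hypothetical example that is not from the thread: two independent fair-coin events on the sample space of two tosses, where the naive sum of probabilities overshoots the probability of the union.

```python
from fractions import Fraction

# Hypothetical example (not from the thread): two tosses of a fair coin.
# B1 = "first toss is heads", B2 = "second toss is heads".
outcomes = [(a, b) for a in "HT" for b in "HT"]   # 4 equally likely outcomes
p = Fraction(1, 4)                                # each outcome has prob 1/4

B1 = {o for o in outcomes if o[0] == "H"}
B2 = {o for o in outcomes if o[1] == "H"}

P = lambda event: p * len(event)                  # uniform probability measure

# Independence holds: P(B1 ∩ B2) = P(B1) P(B2)
assert P(B1 & B2) == P(B1) * P(B2)

print(P(B1 | B2))        # probability of the union: 3/4
print(P(B1) + P(B2))     # naive sum: 1, which overshoots the union
```

Here $$P(B_1 \cup B_2) = 3/4$$ while $$P(B_1) + P(B_2) = 1$$, so the additivity formula fails exactly because the events are independent rather than disjoint.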

3. Apr 30, 2006

### Mathman23

I have looked at it again and have come to the conclusion that the statement should have been:

Let $$B_1, \ldots, B_n$$ be independent events. Show that

$$P(B_1 \cup \ldots \cup B_n) = 1$$ if and only if there exists a number $$k$$, $$1 \leq k \leq n$$, such that $$P(B_k) = 1$$.

Proof:

If $$1 \leq k \leq n$$, then $$P(E) = 1$$ (where $$E$$ is the probability space).

Thereby it follows that $$P(B_1 \cup \ldots \cup B_n) = P(B_1) + P(B_2) + \ldots + P(B_n) = 1$$
Am I on the right path here now?

4. Apr 30, 2006

### AKG

5. Apr 30, 2006

### Hurkyl

Staff Emeritus

6. Apr 30, 2006

### Mathman23

Hello, can I change my original theorem to make it true?

If Yes, how?

Sincerely Fred

7. Apr 30, 2006

### AKG

Okay, I guess I confused "independent" with "mutually exclusive".

If you have two independent events B, B', then:

$$P(B\cup B')$$

$$= P((B\cap B'^C) \sqcup (B\cap B') \sqcup (B' \cap B^C))$$

$$= P(B\cap B'^C) + P(B\cap B') + P(B' \cap B^C)$$

$$= P(B)P(B'^C) + P(B)P(B') + P(B')P(B^C)$$

$$= P(B)(1 - P(B')) + P(B)P(B') + P(B')(1 - P(B))$$

$$= P(B) + P(B') - P(B)P(B')$$

$$= 1 - [P(B) - 1][P(B') - 1]$$

If $$P(B\cup B') = 1$$, then $$[P(B) - 1][P(B') - 1] = 0$$, so either $$P(B) = 1$$ or $$P(B') = 1$$.
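AKG's factored form can be checked numerically. The sketch below (my own check, not from the thread) verifies on a grid of probabilities that for independent events the inclusion-exclusion value $$P(B) + P(B') - P(B)P(B')$$ agrees with $$1 - [P(B) - 1][P(B') - 1]$$, and that the union has probability 1 exactly when one factor's probability is 1.

```python
import itertools

# Check AKG's identity for independent events B, B' with P(B) = p, P(B') = q:
# P(B ∪ B') = p + q - pq  (inclusion-exclusion, with P(B ∩ B') = pq)
#           = 1 - (p - 1)(q - 1)  (the factored form from the post)
for p, q in itertools.product([i / 10 for i in range(11)], repeat=2):
    union = p + q - p * q              # inclusion-exclusion
    factored = 1 - (p - 1) * (q - 1)   # AKG's factored form
    assert abs(union - factored) < 1e-12

    # The conclusion: the union has probability 1 iff p = 1 or q = 1
    assert (abs(union - 1) < 1e-12) == (p == 1 or q == 1)

print("identity verified on the grid")
```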

8. Apr 30, 2006

### HallsofIvy

Staff Emeritus
If $B_1, B_2, \ldots, B_n$ are mutually exclusive then
$P(B_1 \cup B_2 \cup \ldots \cup B_n) = P(B_1) + P(B_2) + \ldots + P(B_n)$
follows from the definition of "mutually exclusive".
If, in addition, they exhaust the entire sample space, then
$P(B_1 \cup B_2 \cup \ldots \cup B_n) = 1$

How does that follow from the above? If $P(B_k) = 1$ then it follows that $P(B_i) = 0$ for any $i$ not equal to $k$, and so
$\sum_{n=1}^{k} P(B_{n+1}) = P(B_{k}) = 1$
follows trivially. But you are claiming the converse: that if the sum of the probabilities is 1, then one of them equals 1 and the rest are 0, and that is not, in general, true.