Mathman23
Hi
I have a probability axiom here which I'm not sure I have understood correctly.
Let B_1, \ldots, B_n be independent events.
Then P(B_1 \cup \ldots \cup B_n) = 1, which is the same as
P(B_1) + P(B_2) + \ldots + P(B_n) = 1.
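To check that I am reading the statement correctly, here is a small case I made up myself (B_1 = E and B_2 = \emptyset are just my own toy choice):

P(B_1 \cup B_2) = P(E) = 1 \quad \text{and} \quad P(B_1) + P(B_2) = 1 + 0 = 1

so both sides agree for this choice.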
I would like to show that this is only valid if there exists a k with 1 \leq k \leq n such that
P(B_k) = 1.
Proof:
If 1 \leq k \leq n, then P(E) = 1 (where E is the whole sample space).
It then follows that P(B_1 \cup \ldots \cup B_n) = P(B_1) + P(B_2) + \ldots + P(B_n) = 1.
This can be written as \sum_{i=1}^{n} P(B_i) = P(B_k) = 1.
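With the same toy choice as above (B_1 = E, B_2 = \emptyset, so n = 2 and k = 1), this reads

\sum_{i=1}^{2} P(B_i) = P(B_1) + P(B_2) = 1 + 0 = P(B_1) = 1

which is the situation I am trying to show is the only possible one.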
Am I on the right track here?
Best Regards
Fred