# Basic Set Theory

1. Jul 12, 2006

### unztopable

I'm having some trouble understanding what the prof taught me today. I am just going to break it down and perhaps we can discuss:

a. the sum of the probabilities of collectively exhaustive events must equal 1.
I know that if a set of events is both collectively exhaustive and mutually exclusive, it covers the entire sample space and its probabilities sum to 1. But if the events are just collectively exhaustive, couldn't they overlap one another, making the sum not equal to 1?
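To make the overlap worry concrete, here is a quick sketch with a fair die (my own example, not from the lecture): the events {1,2,3,4} and {3,4,5,6} cover the whole space but overlap, so their probabilities sum to more than 1.

```python
from fractions import Fraction

# Sample space: one roll of a fair die.
space = {1, 2, 3, 4, 5, 6}
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

def P(event):
    """Probability of an event under the uniform distribution on the die."""
    return Fraction(len(event), len(space))

# A and B together cover the space (collectively exhaustive)...
print(A | B == space)   # True
# ...but they overlap, so the probabilities sum to more than 1.
print(P(A) + P(B))      # 4/3
```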

b. if A and B are mutually exclusive, then A(complement) and B(complement) are mutually exclusive.
I think this is not always true, because if A doesn't intersect B, then the complements of A and B should generally intersect. Or I might be wrong about this.

c. If A and B are independent, then A(complement) and B(complement) are also independent.

I really didn't get this one.

I hope somebody will be able to help me out with one at least if not all.

2. Jul 12, 2006

### mattmns

You are correct for b.; that is not a true statement. Take a simple counterexample. Let U = {1,2,3}, A={1}, B={2}, so A and B are mutually exclusive, but A' = {2,3} and B' = {1,3} which are not mutually exclusive since they both contain 3.
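The counterexample above can be checked directly with Python sets (a sketch of the same sets, nothing new):

```python
U = {1, 2, 3}
A = {1}
B = {2}

A_comp = U - A   # {2, 3}
B_comp = U - B   # {1, 3}

print(A & B)             # set() -- A and B are mutually exclusive
print(A_comp & B_comp)   # {3}   -- the complements are not
```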

3. Jul 13, 2006

### 0rthodontist

In a. you're basically right, but in a case like this you should actually give a counterexample--an example where, as you say, the events overlap. Also, regarding the "sum" of the events: it's more usual to talk about the sum of the _probabilities_ of the events. Events aren't even necessarily numbers.

For the third one, a little intuition could help you if you are familiar with the _idea_ of two events being independent. It means that "knowing something about whether one event holds tells you nothing about how likely it is that the other event holds." Of course it would be easy to infer whether an event happened based on whether its complement happened, so you would expect that c. is a true statement. To prove it, start by writing down the definition of statistical independence for two events A and B, which is
$$P(A)P(B) = P(A \cap B)$$
You want to show that

$$P(A^c)P(B^c) = P(A^c \cap B^c)$$
(where the c's denote complementation)
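One way to fill in the steps, using De Morgan's law ($A^c \cap B^c = (A \cup B)^c$), the complement rule, and inclusion-exclusion:

$$P(A^c \cap B^c) = P\left((A \cup B)^c\right) = 1 - P(A \cup B)$$
$$= 1 - P(A) - P(B) + P(A \cap B) = 1 - P(A) - P(B) + P(A)P(B)$$
$$= \left(1 - P(A)\right)\left(1 - P(B)\right) = P(A^c)P(B^c)$$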

Last edited: Jul 13, 2006
4. Jul 13, 2006

### BoTemp

A really simple example for c. would be flipping two coins. An H on coin 1 (event A) and an H on coin 2 (event B) are independent (theoretically anyway). The complement of A, a T on coin 1, and the complement of B, a T on coin 2, are also independent.

This is a pretty trivial example, though; you'll notice that A and Acomp are each independent of both B and Bcomp.
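The coin example can be checked exactly by enumerating the four equally likely outcomes (a small sketch, using fractions to keep the arithmetic exact):

```python
from fractions import Fraction
from itertools import product

# Sample space: all outcomes of flipping two fair coins.
space = list(product("HT", repeat=2))   # [('H','H'), ('H','T'), ('T','H'), ('T','T')]

def P(event):
    """Probability of an event (a predicate on outcomes) under the uniform distribution."""
    return Fraction(sum(event(o) for o in space), len(space))

A      = lambda o: o[0] == "H"   # H on coin 1
B      = lambda o: o[1] == "H"   # H on coin 2
A_comp = lambda o: o[0] == "T"   # T on coin 1
B_comp = lambda o: o[1] == "T"   # T on coin 2

def both(e, f):
    return lambda o: e(o) and f(o)

# Independence means P(E and F) == P(E) * P(F).
print(P(both(A, B)) == P(A) * P(B))                           # True
print(P(both(A_comp, B_comp)) == P(A_comp) * P(B_comp))       # True
```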