What is the difference between disjoint and independent events, and how do the two affect calculations involving them?
Disjoint events are mutually exclusive, which is a strong form of statistical dependence: if you know event A occurred, you know that B definitely did not occur, and vice versa. This means [tex]P(A\cap B) = 0[/tex]. For events to be independent, on the other hand, knowing whether one event occurred gives you no information about whether the other occurred, which is generally expressed as [tex]P(A\cap B) = P(A) P(B)[/tex].
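The two definitions above can be checked numerically on a small finite sample space. Here is a minimal sketch using a single roll of a fair die (a hypothetical example chosen for illustration, not one from the thread itself):

```python
from fractions import Fraction

# Sample space: one roll of a fair six-sided die.
omega = {1, 2, 3, 4, 5, 6}

def prob(event):
    """Probability of an event (a subset of omega) under the uniform measure."""
    return Fraction(len(event), len(omega))

# Disjoint events: "roll is even" and "roll is odd" cannot both occur.
even = {2, 4, 6}
odd = {1, 3, 5}
assert prob(even & odd) == 0                        # P(A ∩ B) = 0

# Independent events: "roll is even" and "roll is at most 4".
# P(A ∩ B) = P({2, 4}) = 1/3 and P(A)·P(B) = (1/2)·(2/3) = 1/3.
le4 = {1, 2, 3, 4}
assert prob(even & le4) == prob(even) * prob(le4)   # P(A ∩ B) = P(A)P(B)

# The disjoint pair is NOT independent: 0 ≠ (1/2)·(1/2).
assert prob(even & odd) != prob(even) * prob(odd)
```

Note that the disjoint pair fails the independence test, which is exactly the "strong dependence" point made above.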
Ok, what about the union, i.e. what about [tex]P(A \cup B)[/tex] if the events are disjoint or independent? Are we allowed to sum the probabilities in both cases?
No, that only works for disjoint events. In general [tex]P(A\cup B) = P(A) + P(B) - P(A\cap B)[/tex], so for disjoint events the third term is zero and the sum is exact.
Well, tossing a fair coin leads to a series of events that are both independent and disjoint. I wouldn't say that fair coin tosses are in any way dependent on each other. In the usual sense, statistically independent events are not either/or outcomes. So for the tossing of a fair coin, the probability of H or T is exactly 1, and the third term is zero, not P = 1/4. However, as you say, if P(A) and P(B) are the probabilities of random independent events which are not mutually exclusive, then the probability of A or B is P(A) + P(B) - P(A)P(B); that is, the sum of the probabilities, less the probability of A and B. I was just concerned that your description of disjoint events as representing a strong form of dependence might be confusing to some.
Which events are meant to be both independent and disjoint in this case? The only way I can see that two events can ever be both independent and disjoint is if one of them has probability zero. The main events for the coin tosses are:
- the two events representing the results of each individual coin toss, which are disjoint, but not independent (since [itex]P(H \cap T) = 0 \ne P(H) P(T)[/itex]);
- the results of subsequent coin tosses, which are independent, but not disjoint, if they are modelled as part of the same sigma algebra (the event "first toss is H" doesn't exclude "second toss is H" or "second toss is T").
Or did I miss any events you considered? Of course, H and T don't have the kind of dependence you normally have with random variables, but I can't see anything wrong with calling them dependent. All I meant by strong dependence (which maybe isn't a very well-defined concept as I used it) was that for disjoint events the probability of one event conditional on the other is zero, rather than just changing the original probability a bit, as most types of dependence do.
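The two bullet points above can be made concrete by enumerating the sample space of two fair coin tosses. A minimal sketch, with event names H1, T1, H2 chosen here for illustration:

```python
from fractions import Fraction
from itertools import product

# Sample space for two tosses of a fair coin: {('H','H'), ('H','T'), ...}
omega = set(product("HT", repeat=2))

def prob(event):
    return Fraction(len(event), len(omega))

# Heads vs tails on the FIRST toss: disjoint, but not independent,
# since P(H ∩ T) = 0 while P(H)·P(T) = 1/4.
H1 = {w for w in omega if w[0] == "H"}
T1 = {w for w in omega if w[0] == "T"}
assert prob(H1 & T1) == 0
assert prob(H1) * prob(T1) == Fraction(1, 4)   # ≠ 0, so not independent

# Heads on DIFFERENT tosses: independent, but not disjoint.
H2 = {w for w in omega if w[1] == "H"}
assert prob(H1 & H2) == prob(H1) * prob(H2)    # independence holds
assert prob(H1 & H2) != 0                      # both can occur: ('H','H')
```

This matches the point above: within one toss the outcomes are disjoint and hence dependent, while across tosses the events are independent and overlap.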
Ahhh, I see. So if events are disjoint, they cannot both occur, and the intersection is the empty set; if they are independent, the probability of the intersection is equal to the product of their probabilities. So when we have events and we say the probability of their union is equal to the sum of their probabilities, does this mean they are independent or disjoint?