Mogarrr said:
I'm trying a few elementary counting problems, and a few are proving very difficult (for me). I have the answers and explanations, which I understand, so that's not the problem. I don't want to memorize answers. The problem is systematically analyzing these problems. My intuition is almost always wrong with counting, so I don't want to rely on any intuitive explanations (adding to my frustration, most explanations use the intuitive approach). Here are some problems I found troubling:
My telephone rings 12 times each week, the calls being randomly distributed among the 7 days. What is the probability that I get at least one call each day?
If n balls are placed at random into n cells, find the probability that exactly one cell remains empty.
A closet contains n pairs of shoes. If 2r shoes are chosen at random (2r<n), what is the probability that there will be no matching pair in the sample?
Given problems like these, how do you systematically work through them?
For the phone problem: as you say, the event ##D =## {at least one call each day} has complement ##E =## {at least one day with 0 calls}, whose probability is (in this case) easier to determine.
Let ##E_i =## {0 calls on day i} for ##i = 1,2, \ldots, 7##. Then
E = \bigcup_{i=1}^7 E_i,
whose probability can be obtained from the inclusion-exclusion principle; see
http://en.wikipedia.org/wiki/Inclusion–exclusion_principle for the general idea and
http://en.wikipedia.org/wiki/Inclusion–exclusion_principle#In_probability for its specific application to probability.
I'll look at a simpler case of 3 days and 6 calls, with ##E_i =## {0 calls on day i}, ##i=1,2,3## and ##E = E_1 \cup E_2 \cup E_3##. Inclusion-exclusion says that
P(E) = S_1 - S_2 + S_3,
where
S_1 = \sum_i P(E_i) = P(E_1) + P(E_2) + P(E_3),
S_2 = \sum_{i < j} P(E_i E_j) = P(E_1 E_2) + P(E_1 E_3) + P(E_2 E_3),
S_3 = P(E_1 E_2 E_3).
Here, I use the notation ##AB## instead of ##A \cap B## and ##ABC## instead of ##A \cap B \cap C##.
Note that ##P(E_1) = (2/3)^6## because for each call there is a 2/3 chance it does not fall on Day 1, and this must happen for all 6 calls. We also have ##P(E_2) = P(E_3) = (2/3)^6## (essentially by symmetry of the days) so ##S_1 = 3 P(E_1) = 3\,(2/3)^6##. Next, ##P(E_1 E_2) = (1/3)^6## because for each call there is a 1/3 chance it does not fall in Days 1 or 2, and that must happen for all 6 calls. By symmetry, all ##P(E_i E_j)## have the same value for any pairs ##(i,j)## of distinct days, so ##S_2 = N_2 P(E_1 E_2)##, where ##N_2## is the number of distinct pairs:
N_2 = {3 \choose 2} = \frac{3\times 2}{2!} = 3.
Thus, ##S_2 = 3\,(1/3)^6##.
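If you want to verify the individual terms rather than take the symmetry arguments on faith, you can enumerate all ##3^6 = 729## equally likely assignments of the 6 calls to the 3 days. A minimal brute-force sketch in Python (exact arithmetic via `Fraction`; the variable names are mine):

```python
from fractions import Fraction
from itertools import product

# All 3^6 = 729 equally likely ways to assign 6 calls to days 1, 2, 3.
assignments = list(product([1, 2, 3], repeat=6))
total = len(assignments)  # 729

# P(E_1): day 1 receives no calls.
p_e1 = Fraction(sum(1 for a in assignments if 1 not in a), total)

# P(E_1 E_2): neither day 1 nor day 2 receives a call.
p_e1e2 = Fraction(sum(1 for a in assignments if 1 not in a and 2 not in a),
                  total)

print(p_e1)    # 64/729, i.e. (2/3)^6
print(p_e1e2)  # 1/729, i.e. (1/3)^6
```

This confirms the per-call reasoning above: each call independently avoids day 1 with probability 2/3, and avoids days 1 and 2 with probability 1/3.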
Finally, ##S_3 = P(E_1 E_2 E_3) = 0##: all six calls must land on some day, so the event that all three days receive 0 calls is impossible.
Therefore, we have that
P(E) = 3\,(2/3)^6 - 3\,(1/3)^6 = \frac{189}{729} = \frac{7}{27}
is the probability of at least one day with 0 calls, and hence ##P(D) = 1 - P(E) = 20/27## is the probability of at least one call each day.
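As a check on the whole inclusion-exclusion computation, you can also count the bad assignments directly. A short sketch (again brute force over the 729 outcomes of the simpler case):

```python
from fractions import Fraction
from itertools import product

# Simpler case worked above: 3 days, 6 calls. Count assignments in
# which at least one day receives 0 calls.
days, calls = 3, 6
hits = sum(1 for a in product(range(days), repeat=calls)
           if len(set(a)) < days)  # some day is missing from the tuple
p_e = Fraction(hits, days ** calls)
print(p_e)  # 7/27
```

Enumeration like this is a good habit when your intuition is unreliable: it is only feasible for small cases, but it catches sign and counting errors in the inclusion-exclusion sum before you scale up.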
Note: while I have used the inclusion-exclusion principle for probabilities, it also applies to straight counting (that is, to determining cardinalities of event sets), so you can apply similar considerations to your problem if you insist on sticking with counting methods.
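To connect back to the original 12-call, 7-day problem, the same sum generalizes: for ##d## days and ##k## calls, ##P(\text{some day empty}) = \sum_{i=1}^{d-1} (-1)^{i+1} \binom{d}{i} \left(\frac{d-i}{d}\right)^k##. A hedged sketch of this formula in Python (the function name `p_some_day_empty` is mine, not standard):

```python
from fractions import Fraction
from math import comb

def p_some_day_empty(days: int, calls: int) -> Fraction:
    """P(at least one day gets 0 calls) by inclusion-exclusion:
    sum over i = 1..days-1 of (-1)^(i+1) * C(days, i) * ((days-i)/days)^calls.
    (The i = days term vanishes: all days cannot be empty when calls >= 1.)
    """
    return sum(((-1) ** (i + 1)) * comb(days, i)
               * Fraction(days - i, days) ** calls
               for i in range(1, days))

# Check against the 3-day, 6-call case worked above:
print(p_some_day_empty(3, 6))  # 7/27

# Original problem: 12 calls over 7 days; the answer is the complement.
print(1 - p_some_day_empty(7, 12))
```

Using `Fraction` keeps everything exact, so the small-case answer can be compared directly against the hand computation before trusting the 7-day result.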