Probability of All Urns Having At Least One Ball | Simple Urn Problem

  • Thread starter JolleJ
In summary: the thread asks for the probability that every one of k urns contains at least one ball after M balls are placed uniformly at random. The discussion covers a recursion for the probability of having exactly i empty urns after n balls, inclusion-exclusion style formulas for the complement, and a combinatorial counting argument.
  • #1
JolleJ
You have k urns and M (M>k) balls.

Each ball is placed in a random urn (with uniform distribution).

What is the probability that every urn has at least one ball in it after the M balls are placed?
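For a quick numerical sanity check, here is a minimal Monte Carlo sketch that estimates this probability by direct simulation (my own illustration, not part of the thread; the function name and trial count are arbitrary):

[code]
import random

def estimate_all_urns_filled(k, M, trials=100_000):
    """Estimate P(every one of k urns gets at least one ball after M uniform placements)."""
    hits = 0
    for _ in range(trials):
        occupied = {random.randrange(k) for _ in range(M)}  # urns that received a ball
        if len(occupied) == k:
            hits += 1
    return hits / trials

print(estimate_all_urns_filled(k=10, M=20))  # e.g. 10 urns, 20 balls
[/code]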
 
  • #2
What is the probability that a given urn will receive no balls?
 
  • #3
M Quack said:
What is the probability that a given urn will receive no balls?

That's not really the right approach because the urns are not independent and the OP is asking a question not about a given urn but about ALL the urns.
 
  • #4
Let [itex]p_i(n)[/itex] be the probability of having i empty urns at time n. Then

[tex]p_i(n+1) = p_i(n)\,\frac{k-i}{k} + p_{i+1}(n)\,\frac{i+1}{k}[/tex]

The initial condition is

[tex]p_i(1) = \delta_{i,k-1}[/tex]

Seems like you have to solve this recurrence.
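
As an illustration only (a sketch of my own, not part of the post), the recurrence is easy to iterate numerically; p[i] below holds [itex]p_i(n)[/itex]:

[code]
def empty_urn_distribution(k, M):
    """Iterate p_i(n+1) = p_i(n)(k-i)/k + p_{i+1}(n)(i+1)/k from p_i(1) = delta_{i,k-1}."""
    p = [0.0] * (k + 1)      # p[i] = probability of exactly i empty urns
    p[k - 1] = 1.0           # after the first ball, exactly k-1 urns are empty
    for _ in range(M - 1):   # place the remaining M-1 balls
        p = [p[i] * (k - i) / k + (p[i + 1] * (i + 1) / k if i < k else 0.0)
             for i in range(k + 1)]
    return p

print(empty_urn_distribution(10, 20)[0])  # p_0(M): probability that no urn is empty
[/code]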
 
Last edited:
  • #5
EDIT: Err this was wrong
 
Last edited:
  • #6
Maybe this will work better

[tex]p_0(M) = 1 - p_1(M) - p_2(M) - \dots - p_{k-1}(M)[/tex]
[tex]= 1 - k^{-M}\left[\binom{k}{1}(k-1)^M + \binom{k}{2}(k-2)^M + \dots + \binom{k}{k-1}\,1^M\right][/tex]
 
Last edited:
  • #7
I know, I know, that was not supposed to be the complete answer. But you can build up from there. Following my route I get

[itex] \sum\limits_{n=1}^{k-1} (-1)^{n+1} \left(1-\frac{n}{k}\right)^M \left(\begin{array}{c}k\\n\end{array}\right)[/itex]

Kai, what do you get?
 
  • #8
I got

[tex] 1 - \sum_{i=1}^{k-1} \binom{k}{i} \left(\frac{k-i}{k}\right)^M[/tex]
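
This expression is straightforward to evaluate numerically; here is a minimal sketch of my own for doing so (post #10 below notes that it goes negative for M = 100, k = 50, which is easy to verify this way):

[code]
from math import comb

def kai_formula(k, M):
    """Evaluate 1 - sum_{i=1}^{k-1} C(k, i) * ((k - i)/k)^M, as written above."""
    return 1 - sum(comb(k, i) * ((k - i) / k) ** M for i in range(1, k))

print(kai_formula(50, 100))  # the values mentioned in post #10
[/code]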
 
  • #9
M Quack said:
I know, I know, that was not supposed to be the complete answer. But you can build up from there. Following my route I get

[itex] \sum\limits_{n=1}^{k-1} (-1)^{n+1} \left(1-\frac{n}{k}\right)^M \left(\begin{array}{c}k\\n\end{array}\right)[/itex]

Kai, what do you get?

Applying your formula to just the case k = 3, we get

[itex]3(2/3)^M - 3(1/3)^M \approx 3 \cdot 2^M / 3^M[/itex]

But the above doesn't go to 1. The probability that every urn has a ball must clearly go to 1 if you keep adding more and more balls.
 
Last edited:
  • #10
I've been thinking somewhat along the lines of what you're doing...
@Kai: Your suggestion gives negative results for M = 100, k = 50...
 
Last edited:
  • #11
Hmm, yeah, I see my problem. I'm trying to condition on there being a specific number of empty urns, but I'm calculating the probability of having at least that many.
 
  • #12
The event whose probability is required is the complement of the event:
After M independent, equally probable placements into k urns, at least one urn (meaning one, or two, ..., or k of them) is empty.

What is the probability [itex]p_l[/itex] of exactly l urns remaining empty? (How many M-letter words can be made with k distinct letters, and how many with k - l letters, if letters can repeat and their order matters?)
 
  • #13
Dickfore said:
The event whose probability is required is the complement of the event:
After M independent, equally probable placements into k urns, at least one urn (meaning one, or two, ..., or k of them) is empty.
Yes, but I am not having an easier time calculating this.

Dickfore said:
What is the probability [itex]p_l[/itex] of exactly l urns remaining empty? (How many M-letter words can be made with k distinct letters, and how many with k - l letters, if letters can repeat and their order matters?)
Knowing this would obviously solve my problem, but it is an even harder question; harder in the sense that what I need (l = 0) is a subproblem of it.
 
  • #14
  • #15
Label the urns from 1 to k. Then, each placement of the balls is equivalent to an M-letter word with the "letters" 1, 2, ..., k. Each of these [itex]k^M[/itex] words is equally probable.

Let [itex]x(k, M)[/itex] be the number of M letter words in which each of the k letters of the alphabet occurs at least once.

We can divide this set into two disjoint subsets: one in which the letter 1 occurs exactly once and one in which it occurs more than once. The first subset has x(k - 1, M - 1) elements, the second x(k, M - 1). Therefore, we have the recursion:
[tex]
x(k, M) = x(k - 1, M - 1) + x(k, M - 1)
[/tex]

Obviously, [itex]x(k, M) = 0, \ k > M[/itex].

From here, we deduce that:
[tex]
x(k, k) = x(k - 1, k - 1) = \mathrm{const}.
[/tex]
This constant is obviously equal to 1, since there is only one way of filling k urns with k balls (by placing exactly one ball in each urn).

Then, we have:
[tex]
x(k, k + 1) = x(k - 1, k) + 1 \Rightarrow x(k, k + 1) = x(1, 2) + k - 1, \ k \ge 1
[/tex]
Obviously, [itex]x(1, M) = 1, M \ge 1[/itex]. This boundary condition is consistent with the above recursion relation if and only if [itex]x(0, M) = 0, M \ge 0[/itex].

To reduce the recursion w.r.t. k, we perform a discrete Laplace transform w.r.t. k:
[tex]
X_M(s) \equiv \sum_{k = 0}^{\infty}{x(k, M) e^{-k \, s}}
[/tex]
Multiply the recursion by [itex]e^{-(k - 1)s}[/itex] and sum w.r.t. k from 1 to ∞, to get:
[tex]
e^{s} \, \left[ X_M(s) - x(0, M) \right] = X_{M - 1}(s) + e^{s} \, \left[ X_{M - 1}(s) - x(0, M - 1) \right], \ M > 1
[/tex]

[tex]
X_{M + 1}(s) = (1 + e^{-s}) X_{M}(s), \ M \ge 1
[/tex]
This recursion w.r.t. M is a geometric sequence with an easy solution:
[tex]
X_M(s) = X_1(s) \, (1 + e^{-s})^{M - 1}, \ M \ge 1
[/tex]

[tex]
X_1(s) = \sum_{k = 0}^{\infty}{x(k, 1) \, e^{-k s}} = x(0, 1) + x(1, 1) e^{-s} = e^{-s}
[/tex]

Then, take [itex]z \equiv e^{s}[/itex], and we have:
[tex]
\tilde{X}_M(z) = X_M(s) = \frac{1}{z} \, \left(1 + \frac{1}{z} \right)^{M - 1}, \ M \ge 1
[/tex]

Then, if you expand this binomial as a Laurent series in powers of 1/z, you can identify the coefficients as [itex]x(k, M)[/itex]. We have:
[tex]
\tilde{X}_M(z) = \frac{1}{z} \, \sum_{k = 0}^{M - 1}{\left( \begin{array}{c} M - 1 \\ k \end{array}\right) \, \frac{1}{z^k}} = \sum_{k = 1}^{M}{\left( \begin{array}{c} M - 1 \\ k - 1 \end{array}\right) \, \frac{1}{z^k}} \Rightarrow x(0, M) = 0, x(k, M) = \left( \begin{array}{c} M - 1 \\ k - 1 \end{array}\right), 1 \le k \le M, x(k, M) = 0, k > M
[/tex]

So, I think the answer is:
[tex]
P(k, M) = \left( \begin{array}{c} M - 1 \\ k - 1 \end{array}\right) \, \frac{1}{k^M}
[/tex]
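
A small sketch (mine, not Dickfore's) that iterates the recursion with the stated boundary conditions and compares it against the closed form [itex]\binom{M-1}{k-1}[/itex] obtained from the generating function:

[code]
from functools import lru_cache
from math import comb

@lru_cache(maxsize=None)
def x(k, M):
    """x(k, M) from the recursion x(k, M) = x(k-1, M-1) + x(k, M-1) and the boundary conditions above."""
    if k == 0:
        return 0
    if k == 1:
        return 1
    if k > M:
        return 0
    return x(k - 1, M - 1) + x(k, M - 1)

# Check against the coefficient read off from the series expansion
for M in range(1, 9):
    for k in range(1, M + 1):
        assert x(k, M) == comb(M - 1, k - 1)
print("recursion matches binomial(M-1, k-1) on the tested range")
[/code]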
 
  • #16
Dickfore said:
Obviously, [itex]x(k, M) = 0, \ k > M[/itex].

The OP defined M>k, but I prefer n=M, n>k as more standard notation for the binomial distribution.

Then we have n events, each with probability 1/k. The binomial distribution gives a nonzero probability of zero events for any value of n when p = 1/k. An event is a ball landing in a given urn. The mean (expected) value would be n/k, assuming all balls go into an urn.

EDIT: The probability that every urn has at least one ball should be 1 - P(x=0). So for 20 balls and 10 urns, P(x=0) = (0.9)^20 ≈ 0.1216 and 1 - 0.1216 = 0.8784.
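
As a one-line check (my own sketch), the figures quoted here follow from P(a given urn stays empty) = (1 - 1/k)^n:

[code]
n, k = 20, 10
p_zero = (1 - 1 / k) ** n        # 0.9^20 ≈ 0.1216
print(p_zero, 1 - p_zero)        # ≈ 0.1216 and ≈ 0.8784, as quoted above
[/code]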
 
Last edited:
  • #17
This is just a standard exercise from enumerative combinatorics.

Label the urns 1 through k. How many ways can we place the M balls in the k urns so that each urn gets a ball? Line up the balls and urns in a straight line, with lower-labeled urns to the left of higher-labeled urns and with the leftmost object being a ball, the rightmost object being an urn, and no two urns adjacent. Put all balls to the left of urn 1 into urn 1, then put all remaining balls to the left of urn 2 into urn 2, and repeat until each ball is in an urn. There are M-1 choose k-1 ways to place the urns, since k-1 of the M-1 distinct spots between balls must be chosen for urns 1 through k-1, with urn k going on the very end.

Now how many total ways are there to put the balls in the urns? We repeat the process above, except we take away the requirements that the leftmost object be a ball (so that urn 1 can be empty) and that urns cannot be adjacent (so that for j>1 urn j can be made empty by putting it directly after urn j-1). We still need urn k to go last since otherwise we would not have placed every ball into an urn. There are M+k-1 choose k-1 ways to place the urns now, since urn k must be placed last, and before it are M+k-1 spots on which to place objects, exactly k-1 of which must be occupied by urns.

Thus the probability each urn contains a ball is (M-1 choose k-1)/(M+k-1 choose k-1).
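
For completeness, a short sketch of my own for evaluating this closed form:

[code]
from math import comb

def jcvd_probability(k, M):
    """(M-1 choose k-1) / (M+k-1 choose k-1), the expression derived above."""
    return comb(M - 1, k - 1) / comb(M + k - 1, k - 1)

print(jcvd_probability(k=3, M=5))  # small example: 3 urns, 5 balls
[/code]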
 
  • #18
Thanks JCVD, that explanation is very clear.

kai_sikorski said:
Applying your formula to just the case k = 3, we get

[itex]3(2/3)^M - 3(1/3)^M \approx 3 \cdot 2^M / 3^M[/itex]

But the above doesn't go to 1. The probability that every urn has a ball must clearly go to 1 if you keep adding more and more balls.

Well spotted. I was trying to calculate the opposite probability, that there is at least one empty urn. But even that I got wrong.
 
Last edited:
  • #19
M Quack said:
Thanks JCVD, that explanation is very clear.
Well spotted. I was trying to calculate the opposite probability, that there is at least one empty urn. But even that I got wrong.

I got P = 0.8784 for n (= M) = 20, k = 10 using the binomial distribution. Does anyone want to check JCVD's solution against this for these values?
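
One way to run that check (a sketch of my own) is to evaluate JCVD's closed form alongside the binomial-based figure and a direct Monte Carlo estimate for the same values:

[code]
import random
from math import comb

k, n = 10, 20
jcvd = comb(n - 1, k - 1) / comb(n + k - 1, k - 1)   # closed form from post #17
binomial_based = 1 - (1 - 1 / k) ** n                # figure from post #16
trials = 200_000
mc = sum(len({random.randrange(k) for _ in range(n)}) == k
         for _ in range(trials)) / trials            # simulation estimate
print(jcvd, binomial_based, mc)
[/code]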
 
Last edited:
  • #20
Thank you very much JCVD ! What a great way to do it.
 
  • #21
kai_sikorski said:
Applying your formula to just the case k = 3, we get

[itex]3(2/3)^M - 3(1/3)^M \approx 3 \cdot 2^M / 3^M[/itex]

But the above doesn't go to 1. The probability that every urn has a ball must clearly go to 1 if you keep adding more and more balls.

Why? Is there an upper limit on the number of balls that an urn can hold? If so, what is it (other than n itself)? In principle, there is no mathematical reason that the probability should equal 1 for any finite number of balls, although the probability can get arbitrarily close to one. This seems to be a fairly straightforward problem involving the binomial distribution. All one has to do is show that there is a nonzero probability that exactly one urn may be empty after any finite n. Then the probability that all urns have at least one ball is 1 - p(x=0). It's not necessary to calculate the probability that at least one urn is empty after n balls. It should be clear that for n balls and k urns, the expected value for the number of balls in each urn is n/k, which is the expectation of the binomial distribution when p = 1/k.
 
Last edited:
  • #22
Pick any one urn; it is empty at the start.

For each throw there is a 1/k chance that a ball will end up in this urn. So with each additional throw, the chance that the urn remains empty becomes smaller; it is [itex]\left(\frac{k-1}{k}\right)^M[/itex] after M throws.

For an infinite number of throws that probability goes to zero. If for each individual urn the probability of remaining empty goes to zero, then the probability of finding at least one empty urn in the set also goes to zero.

In other words, the probability that there is at least one ball in each urn increases with M, and in the limit M -> infinity it will be 100%.

Kai performed a simple sanity check on my formula and noted that it goes the wrong way. Therefore it has to be wrong. Well spotted, as I said.
 
  • #23
M Quack said:
Pick any one urn; it is empty at the start.

For each throw there is a 1/k chance that a ball will end up in this urn. So with each additional throw, the chance that the urn remains empty becomes smaller; it is [itex]\left(\frac{k-1}{k}\right)^M[/itex] after M throws.

For an infinite number of throws that probability goes to zero. If for each individual urn the probability of remaining empty goes to zero, then the probability of finding at least one empty urn in the set also goes to zero.

In other words, the probability that there is at least one ball in each urn increases with M, and in the limit M -> infinity it will be 100%.

Kai performed a simple sanity check on my formula and noted that it goes the wrong way. Therefore it has to be wrong. Well spotted, as I said.

Your formula may have been incorrect, but your initial intuition, that one should find the probability that exactly one urn will be empty for any n, was a correct approach. Note that "at least one" is a higher probability than "exactly one". The former would underestimate the probability that every urn gets at least one ball when computing 1 - p(x=0). And of course, we are dealing with a finite mathematics problem here.
 
Last edited:

1. What is the probability of all urns having at least one ball in a simple urn problem?

The probability that all urns have at least one ball can be calculated by dividing the number of ball placements in which every urn receives at least one ball by the total number of equally likely placements. This can be written as P(at least one ball in every urn) = number of favorable outcomes / total number of outcomes.

2. How is the probability affected by the number of urns in a simple urn problem?

For a fixed number of balls, increasing the number of urns makes it harder to cover every urn, because the balls are spread over more possibilities. As a result, the probability that all urns have at least one ball decreases as the number of urns increases.

3. What is the significance of the simple urn problem in probability theory?

The simple urn problem is significant in probability theory because it is a basic example that helps to illustrate important concepts such as independent events and the multiplication rule. It also serves as a starting point for more complex probability problems.

4. Can the probability of all urns having at least one ball be greater than 1?

No, the probability of all urns having at least one ball cannot be greater than 1. This is because the number of favorable outcomes can never exceed the total number of possible outcomes, so their ratio is at most 1.

5. How can the simple urn problem be applied in real-life situations?

The simple urn problem can be applied in real-life situations such as medical research, where it can be used to calculate the probability of all participants in a study receiving a certain treatment. It can also be used in market research to determine the probability of all target customers receiving a promotional offer.
