Challenge 12b: Flipping Coins

  • #1
Office_Shredder
A man has a single coin in front of him that lands on heads with probability p and tails with probability 1-p. He begins flipping coins, starting with his single coin. If a coin lands on heads, it is removed from the game and n new coins are placed on his stack of coins; if it lands on tails, it is removed from the game and no new coins are introduced.

What is the condition on n and p for the man to have a non-zero probability of playing forever?

For an easier challenge check out Challenge 12a.

This is a harder challenge than it may appear at first glance; points may be awarded for a mildly non-rigorous solution (and bonus points for a fully rigorous solution).
 

Answers and Replies

  • #2
D H
This is purely empirical, but it appears the condition is ##np>1##.
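
For instance, a quick Monte Carlo sketch along these lines (with an arbitrary cap on the number of flips standing in for "forever"; the parameter choices and function names are illustrative only):

[code]
import random

def survives(n, p, max_flips=10_000):
    # Play one game; treat surviving max_flips flips as "playing forever".
    stack = 1
    for _ in range(max_flips):
        if stack == 0:
            return False
        stack -= 1                   # the flipped coin leaves the game
        if random.random() < p:      # heads: n new coins join the stack
            stack += n
    return True

def survival_rate(n, p, games=1_000):
    return sum(survives(n, p) for _ in range(games)) / games

# np > 1 (n=2, p=0.6) gives a clearly positive rate;
# np <= 1 (n=2, p <= 0.5) gives a rate near zero.
for p in (0.4, 0.5, 0.6):
    print(f"n=2, p={p}: {survival_rate(2, p):.3f}")
[/code]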
 
  • #3
pbuk
Let n = 1 and p > 0 (clearly if n = 0 the game lasts exactly one round). If the time for one round is t then after time T he will have played T / t rounds. He will still be playing if and only if he has flipped heads every time, which has probability ## p^{T/t} > 0 ##. So the man may still be playing after any finite time T (i.e. for ever) with non-zero probability for any n ≥ 1 and p > 0.

But I'm not sure that's the question you meant to ask - I think you want the man still to be playing after an infinite number of rounds.

Can I suggest for 12c you allow 1 minute for the first round, 30s for the second, then 15s etc. What are the conditions on n and p that give a non-zero probability that the man has any coins left when the game is finished after 2 minutes?
 
  • #4
jbriggs444
This is purely empirical, but it appears the condition is ##np>1##.

It seems clear that ##np \geq 1## is a necessary condition to have a non-zero probability of playing forever.

Consider the situation as a drunkard's walk. At each step he has a probability p of taking n-1 steps to the right/positive and a probability 1-p of taking one step to the left/negative.

After some number of steps, this will result in a distribution of possible positions with an easily calculable mean and standard deviation. This distribution is roughly normal. If the expectation of each trial is negative, then the expectation of the distribution becomes increasingly negative; this is linear in the number of trials. But the standard deviation increases only as the square root of the number of trials. It would then follow that the tail of the distribution that is positive represents a probability that becomes zero in the limit.

It follows that the expectation of each trial must be non-negative to preserve a finite probability of playing forever.

This is all ignoring the fact that a stack with negative coins is impossible. Or, equivalently, that there is a cliff to the left. That fact can only act to reduce the probability that the drunkard stays on the good side of the cliff. So it remains necessary that the expectation of each trial be non-negative.

The expectation of each trial is ##(n-1)p - (1-p) = np - 1##. This is positive when ##np > 1##.

I am unable to make much progress on demonstrating that this is a sufficient condition, however.
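
For concreteness, here is a numeric sketch of the tail estimate in the argument above — a normal approximation that, as noted, ignores the cliff at 0 (which can only lower the probability further); the parameter values are illustrative:

[code]
from math import erf, sqrt

def p_positive(n, p, k):
    # Normal approximation to P(position > 0 after k steps), with no
    # absorbing barrier at 0.
    mu = n * p - 1                            # mean step: (n-1)p - (1-p)
    var = (n - 1)**2 * p + (1 - p) - mu**2    # variance of a single step
    mean = 1 + k * mu                         # start one step to the right
    sd = sqrt(k * var)
    return 0.5 * (1 + erf(mean / (sd * sqrt(2))))  # P(N(mean, sd) > 0)

# With np < 1 the positive tail vanishes as k grows:
for k in (100, 10_000, 1_000_000):
    print(k, p_positive(2, 0.45, k))
[/code]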
 
  • #5
PeroK
Let n = 1 and p > 0 (clearly if n = 0 the game lasts exactly one round). If the time for one round is t then after time T he will have played T / t rounds. He will still be playing if and only if he has flipped heads every time, which has probability ## p^{T/t} > 0 ##. So the man may still be playing after any finite time T (i.e. for ever) with non-zero probability for any n ≥ 1 and p > 0.

But I'm not sure that's the question you meant to ask - I think you want the man still to be playing after an infinite number of rounds.

Can I suggest for 12c you allow 1 minute for the first round, 30s for the second, then 15s etc. What are the conditions on n and p that give a non-zero probability that the man has any coins left when the game is finished after 2 minutes?

I suspect we need the limit, as n goes to infinity, of the probability that the game lasts n rounds to be greater than 0.
 
  • #6
D H
Consider the situation as a drunkard's walk. At each step he has a probability p of taking n-1 steps to the right/positive and a probability 1-p of taking one step to the left/negative. After some number of steps, this will result in a distribution of possible positions with an easily calculable mean and standard deviation.
Easily calculable if you ignore the cliff at 0. Taking that cliff into account makes the probability distribution much harder. This is more of a gambler's ruin problem than a drunkard's walk.
 
  • #7
pbuk
Easily calculable if you ignore the cliff at 0. Taking that cliff into account makes the probability distribution much harder. This is more of a gambler's ruin problem than a drunkard's walk.

Yes indeed, and the solution to gambler's ruin is well known - the probability of eventual ruin converges almost surely to 1. However the solution does not say that ruin will happen in a finite time with probability 1 (or alternatively that ruin may not take forever).

The problem here lies in the subtle differences between convergence in probability, almost sure convergence and the kind of limits we are used to dealing with in analysis. When we translate the words of a problem to a statement of probability theory so that we can prove or disprove that statement we need to be very sure that the translation is correct.
 
  • #8
D H
Oh my. This exercise cost me a few hours of my life that I'd like back.

I suspect we need the limit, as n goes to infinity, of the probability that the game lasts n rounds to be greater than 0.
I can't quite parse that. What we need is that the probability of going bust (playing and losing the last coin in the stack) is less than 1.

Given a value of n, the only rolls on which the player will "go bust" are roll #1, roll #n+1, roll #2n+1, …, roll #kn+1, …, where k is a non-negative integer. Denote ##P_n(k)## as the probability of going bust on roll #kn+1. Then the condition Office_Shredder is looking for is the relation between n and p that makes [itex]\sum_{k=0}^{\infty} P_n(k) < 1[/itex]. Easy!

Going bust on roll #kn+1 means getting k wins and (n-1)k+1 losses without having gone bust on some previous roll. There are lots of ways to obtain this sum of wins and losses, each with probability ##p^k(1-p)^{(n-1)k+1}##. One has to be careful in counting the possible paths. For example, the only way to go bust on roll #3 with n=2 is to win on the first roll and lose on the next two. The sequences lose-win-lose and lose-lose-win don't count because these sequences mean the player already busted out on roll #1. Without derivation, the probability of going bust on roll #kn+1 is
[tex]P_n(k) = \frac 1 {(n-1)k+1} \binom{nk}k p^k (1-p)^{(n-1)k+1}[/tex]
Summing these individual probabilities to yield the total probability of going bust after an infinite number of rolls yields
[tex]P_n = \sum_{k=0}^{\infty} \frac 1 {(n-1)k+1} \binom{nk}k p^k (1-p)^{(n-1)k+1}[/tex]
Once again without derivation, this sum is exactly one if [itex]np\le 1[/itex] and is less than one if [itex]np>1[/itex].

The condition to have a non-zero probability of playing forever is [itex]np>1[/itex].
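
One can check this numerically by truncating the series — a sketch, assuming 0 < p < 1, with the terms computed in log space because the binomial coefficients quickly overflow floating point:

[code]
from math import lgamma, log, exp

def p_bust(n, p, terms=5000):
    # Partial sum of sum_k C(nk,k)/((n-1)k+1) * p^k * (1-p)^((n-1)k+1).
    total = 0.0
    for k in range(terms):
        m = (n - 1) * k + 1
        log_term = (lgamma(n * k + 1) - lgamma(k + 1) - lgamma(m)
                    - log(m) + k * log(p) + m * log(1 - p))
        total += exp(log_term)
    return total

# Approaches 1 for np <= 1 and stays below 1 for np > 1
# (convergence is slow right at np = 1):
for p in (0.4, 0.5, 0.6):
    print(f"n=2, p={p}: bust ≈ {p_bust(2, p):.4f}")
[/code]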
 
  • #9
pbuk
Summing these individual probabilities to yield the total probability of going bust after an infinite number of rolls...

If the player goes bust after an infinite number of rolls (taking an equal amount of time), how long has he played for?
 
  • #10
D H
Yes indeed, and the solution to this is well known - the probability of eventual ruin converges almost surely to 1. However the solution does not say that ruin will happen in a finite time with probability 1.
The challenge says absolutely nothing about time. It asks for the probability of playing forever to be non-zero. If the probability of ruin is one (i.e. ruin is almost sure), then the probability of playing forever is zero. In other words, it is not non-zero.
 
  • #11
pbuk
The challenge says absolutely nothing about time. It asks for the probability of playing forever to be non-zero. If the probability of ruin is one (i.e. ruin is almost sure), then the probability of playing forever is zero. In other words, it is not non-zero.

How can you say that "playing forever" says nothing about time? Surely it means "playing for longer than any finite amount of time"?

However if you want to, you could interpret "playing forever with probability greater than zero" to mean "playing for more rounds than any finite number of rounds with probability greater than zero". What (I argue) you CANNOT do is equate "playing forever with probability greater than zero" with "play eventually terminating with probability almost surely converging to a value less than 1" - this is a much stronger criterion.
 
  • #12
PeroK
An anchovy taking on a big fish!
 
  • #13
Office_Shredder
The intended question is: if ##P_k## is the probability of busting on the kth flip, find
[tex] \sum_{k\geq 1} P_k [/tex] and determine whether this number is equal to 1 or not.

I don't think I understand your subtlety MrAnchovy. I thought this was equivalent to your suggestion for a 12c problem.

DH, that was definitely not how I expected the problem to be solved. How do you calculate that sum?
 
  • #14
Let's cover the boring cases first:
With n=0 or p=0, the man will always get rid of coins and stop with probability 1.
With n=1 and p=1 the man will always stay at one coin and play forever.
With n=1 and p<1, the man cannot gain coins and will eventually lose his coin. He stops with probability 1.
With p=1 and n>0, the man cannot get rid of coins and will play forever.

For the general case, consider n>1 and 0<p<1:
Let r(k) be the probability of running out of coins with an initial stack of k. Then ##r(k)=\left(r(1)\right)^k##, as the sub-game started by each coin on the stack must separately "run out of coins", and these sub-games are independent of each other.

Clearly r(1) stays the same if we consider the result after one round. Therefore,
$$r(1)=(1-p)+pr(n)=(1-p)+pr(1)^n$$
Define R:=r(1):
$$0=pR^n-R+(1-p)$$
To find solutions, let
$$f(R)=pR^n-R+(1-p)$$
Clearly R=1 is a root of f, and ##f(0)=1-p > 0##.
$$f''(R)=pn(n-1)R^{n-2} > 0 \quad \text{for } R>0$$
Since f is strictly convex for R>0, it cannot have more than two positive roots, and we have already found one at R=1. For the second root to lie between 0 and 1, we need ##f'(1)=np-1 > 0##.

Therefore, to have a non-zero probability of playing forever (R<1), the condition is np>1.
This agrees with the idea that the expectation value of the game should be positive, as we have to "invest" 1 coin per round.


By the way: in addition to answering "is the probability non-zero?", this formula allows one to calculate the probability itself.
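
For instance, a small fixed-point sketch of this calculation — iterating ##R \leftarrow pR^n + (1-p)## from R=0, which increases monotonically to the smallest non-negative root:

[code]
def ruin_probability(n, p, iters=100_000, tol=1e-12):
    # Smallest non-negative root of R = p*R^n + (1-p), by fixed-point
    # iteration from R = 0 (the iterates increase to the smallest root).
    R = 0.0
    for _ in range(iters):
        R_next = p * R**n + (1 - p)
        if abs(R_next - R) < tol:
            break
        R = R_next
    return R

# R < 1 exactly when np > 1; the probability of playing forever is 1 - R.
for n, p in [(2, 0.4), (2, 0.6), (3, 0.5)]:
    R = ruin_probability(n, p)
    print(f"n={n}, p={p}: ruin ≈ {R:.4f}, forever ≈ {1 - R:.4f}")
[/code]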
 
  • #15
D H
Let's cover the boring cases first:
With n=0 or p=1, the man will always get rid of coins and stop with probability 1 … the condition is n(1-p)>1.
You have the sense of p backwards. p is the probability of getting heads, which is what the player wants. p=1 means the player always wins, so he definitely plays forever. After fixing this, your condition is the same as mine: np>1.
 
  • #16
D H
The intended question is: if ##P_k## is the probability of busting on the kth flip, find
[tex] \sum_{k\geq 1} P_k [/tex] and determine whether this number is equal to 1 or not.
That's exactly what I calculated. I noticed that ##P_k## is identically zero unless ##k=nr+1## for some non-negative integer r, and calculated ##P_{nr+1}## (which I called ##P_n(r)##).


DH, that was definitely not how I expected the problem to be solved. How do you calculate that sum?
You can't. What I wrote was nonsense (now corrected). I had a LaTeX typo in my expressions. Here are the corrected expressions:

The probability of going bust on roll #kn+1 is
[tex]P_n(k) = \frac 1 {(n-1)k+1} \binom{nk}k p^k (1-p)^{(n-1)k+1}[/tex]
Summing these individual probabilities to yield the total probability of going bust after an infinite number of rolls yields
[tex]P_n = \sum_{k=0}^{\infty} \frac 1 {(n-1)k+1} \binom{nk}k p^k (1-p)^{(n-1)k+1}[/tex]
 
  • #17
You have the sense of p backwards. p is the probability of getting heads, which is what the player wants. p=1 means the player always wins, so he definitely plays forever. After fixing this, your condition is the same as mine: np>1.
Oh, you are right. I misread heads/tails.
 
  • #18
pbuk
I don't think I understand your subtlety MrAnchovy. I thought this was equivalent to your suggestion for a 12c problem.

Put simply, my subtlety is that although the game may eventually terminate with probability 1, you may have to wait for more than any finite number of rounds (i.e. forever) for that to happen.

Restating the problem as "What is the condition on n and p for the man to have a non-zero probability of not eventually losing all his coins?" avoids this problem but introduces an awkward double negative; perhaps "What is the condition on n and p for the man to have a probability less than one of eventually losing all his coins?" is better.

Edit: or (to match 12a) "Once he begins flipping, what is the condition on n and p such that the probability that he ever runs out of coins is less than one?"
 
  • #19
D H
Put simply, my subtlety is that although the game may eventually terminate with probability 1, you may have to wait for more than any finite number of rounds (i.e. forever) for that to happen.
You are making a mountain out of a molehill, an infinitesimally small molehill at that. Office_Shredder's ##\sum_{k=1}^{\infty} P_k=1## is shorthand for ##\forall \epsilon >0 \, \exists N \in I : \forall n>N \,\,|1 - \sum_{k=1}^n P_k|<\epsilon##. Pick a positive epsilon. No matter how close to zero you pick your epsilon, you will find a finite N such that the partial sums are within this epsilon of one. If this sum is one, the player almost surely will not play forever. The probability of an almost sure event is one. Not "almost one", one, period. The point of this challenge is to find the relationship between Office_Shredder's n and p that marks the boundary between this series having a sum of one versus a sum less than one.
 
  • #20
Office_Shredder
Clearly r(1) stays the same if we consider the result after one round. Therefore,
$$r(1)=(1-p)+pr(n)=(1-p)+pr(1)^n$$
Define R:=r(1):
$$0=pR^n-R+(1-p)$$
To find solutions, let
$$f(R)=pR^n-R+(1-p)$$
Clearly R=1 is a root of f, and ##f(0)=1-p > 0##.
$$f''(R)=pn(n-1)R^{n-2} > 0 \quad \text{for } R>0$$
Since f is strictly convex for R>0, it cannot have more than two positive roots, and we have already found one at R=1. For the second root to lie between 0 and 1, we need ##f'(1)=np-1 > 0##.

As you have observed, R=1 is a root, so you need to check that this is not the probability of playing forever (which is the tricky part I mention in the opening post).

MrAnchovy, are you suggesting that he might run out of coins on flip [itex]\omega+1[/itex] (with omega the order type of the natural numbers) or later? If you want to attack the problem where he takes more than [itex] \omega[/itex] flips then go for it; it wasn't what I intended but is a valid enough interpretation that I am interested in seeing what comes up for a proof.
 
  • #21
D H
The cases n=0 and n=1 are a bit pathological. In the case that n=0, this is easily dealt with. This is the "heads, you lose; tails I win" case. Regardless of the value of p, the player will lose, guaranteed, on the first flip.

In the case that n=1, there are only two outcomes: Playing forever with a stack of size 1, or almost surely losing in a finite number of flips. The former happens if p is identically equal to one. The latter happens for all other p (i.e., 0≤p<1). Note that this case changes the constraint from np>1 to np≥1. This is the only case where the strictly less than changes to less than or equal.


[tex]P_n = \sum_{k=0}^{\infty} \frac 1 {(n-1)k+1} \binom{nk}k p^k (1-p)^{(n-1)k+1}[/tex]
DH, that was definitely not how I expected the problem to be solved. How do you calculate that sum?
Wolfram Alpha can evaluate this sum for small n, with a little help getting rid of the superfluous nonsense WA is wont to add to expressions.
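
For the record, with n=2 the coefficients are Catalan numbers, and the sum appears to collapse to min(1, (1-p)/p) — a sketch comparing partial sums against that closed form (my algebra, so worth double-checking):

[code]
def p_bust_n2(p, terms=20_000):
    # n = 2 series: sum_k C_k * (p(1-p))^k * (1-p), with C_k the kth
    # Catalan number, advanced via C_{k+1} = C_k * 2(2k+1)/(k+2).
    term = 1 - p                 # k = 0 term
    total = term
    x = p * (1 - p)
    for k in range(terms):
        term *= x * 2 * (2 * k + 1) / (k + 2)
        total += term
    return total

# Convergence is slow near p = 1/2, where both expressions equal 1.
for p in (0.3, 0.5, 0.7):
    print(f"p={p}: series ≈ {p_bust_n2(p):.4f},"
          f" min(1,(1-p)/p) = {min(1, (1 - p) / p):.4f}")
[/code]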
 
  • #22
As you have observed, R=1 is a root, so you need to check that this is not the probability of playing forever (which is the tricky part I mention in the opening post).
R is the probability of getting ruined, not the probability of playing forever. R=1 as a solution to the equation is a mathematical artifact (if I get ruined with every number of coins, the equation is still satisfied); the correct root has ##f' \leq 0##.
 
  • #23
pbuk
You are making a mountain out of a molehill, an infinitesimally small molehill at that.

I agree that the molehill is infinitesimally small; however, the challenge simply asks for a non-zero probability, not a probability that is greater than some finite ## \epsilon ##.

Office_Shredder's ##\sum_{k=1}^{\infty} P_k=1## is shorthand for ##\forall \epsilon >0 \, \exists N \in I : \forall n>N \,\,|1 - \sum_{k=1}^n P_k|<\epsilon##. Pick a positive epsilon. No matter how close to zero you pick your epsilon, you will find a finite N such that the partial sums are within this epsilon of one.

Well in order to be rigorous it would be nice to first justify the use of a summation by showing that the sample space is at most countably infinite.

If this sum is one, the player almost surely will not play forever.

I agree that if this sum is one, the player almost surely (i.e. with probability 1) will run out of coins. However I do not agree that this implies that the player will, with probability one, play for a finite number of rounds. Indeed I demonstrated in post #3 that there is a non-zero probability that the player will still be playing after any finite number of rounds.

The point of this challenge is to find the relationship between Office_Shredder's n and p that marks the boundary between this series having a sum of one versus a sum less than one.

I agree that this was the intention of the challenge, as was confirmed by Office_Shredder in post #13; however, that is not how the challenge was originally posed.

MrAnchovy, are you suggesting that he might run out of coins on flip [itex]\omega+1[/itex] (with omega the order type of the natural numbers) or later? If you want to attack the problem where he takes more than [itex] \omega[/itex] flips then go for it.

This is not what I am suggesting. There is a bijection between flips and the natural numbers, so there are at most ## \omega ## flips. What I am suggesting is that if each flip takes a fixed amount of time then it will take forever to perform ## \omega ## flips.

You see, by summing over a (countably) infinite number of terms you are by definition including the possibility that there will be a countably infinite number of flips, and there is a bijection between this number and the number of flips you can perform in "forever".

it wasn't what I intended but is a valid enough interpretation that I am interested in seeing what comes up for a proof.

Let ## n = 1 ## and ## p > 0 ##. The probability that he is still playing after k rounds is ## p^k > 0 ##. So for any finite k there is a non-zero probability that he is still playing.

However there is a problem with this argument: if ## n=1 ## he must roll heads on every throw. There is therefore only one pattern among the uncountably infinite number of possibilities that will lead to him playing forever, and the probability of this is clearly zero.

I think I will modify my answer to ## n > 1 ## and ## p > 0 ##. There are now uncountably many ways to keep playing for ever, and the probability that he is still playing after k rounds is ## \geq p^k + k p^{k-1}(1-p) + \binom k 2 p^{k-2}(1-p)^2 + \cdots > 0 ##.
 
  • #24
D H
Well in order to be rigorous it would be nice to first justify the use of a summation by showing that the sample space is at most countably infinite.
Seriously? Office_Shredder's ##P_k## is the probability of busting out on the kth flip. It is indexed by a positive integer, the coin flip number. You either flip the coin or you don't. There are no halfsies on coin flips, let alone an irrational number of coin flips. Of course it's countable.


I agree that if this sum is one, the player almost surely (i.e. with probability 1) will run out of coins. However I do not agree that this implies that the player will, with probability one, play for a finite number of rounds.
I'll modify the rules of the game a bit to make it easier to analyze. What I want is a way to make sense of continuing to flip coins after the player has busted out.

Denote ##s_k## as the size of the player's stack after the kth coin flip, with an initial value of ##s_0=1##. After the kth coin flip, the stack size remains unchanged if ##s_k=0##. If ##s_k## is positive, the stack size increases by n-1 if the outcome of the kth flip was heads but decreases by one if the outcome was tails. Thus ##s_{k+1}=0## if ##s_k=0##, regardless of the outcome of the flip. If ##s_k>0##, then ##s_{k+1}=s_k+n-1## on heads and ##s_{k+1}=s_k-1## on tails.

Now for some events. Denote ##b_k## as the set of all coin toss sequences of length k that result in going bust on exactly the kth flip. Denote ##B_k## as the set of all coin toss sequences of length k that result in going bust before the kth flip. Finally, denote ##A_k## as the set of all coin toss sequences of length k that result in ##s_k>0##.

Clearly, ##B_k##, ##b_k##, and ##A_k## are mutually exclusive events that collectively exhaust the sample space ##\Omega_k##, the set of all possible coin toss sequences of length k. In other words, ##P(A_k)=1-(P(B_k)+P(b_k))##. If ##P(B_k)+P(b_k) \to 1## as ##k \to \infty## then ##P(A_k)## must become vanishingly small as ##k \to \infty##.

It should be equally clear that ##b_{k_1}## and ##b_{k_2}## are also mutually exclusive events for all ##k_1 \neq k_2##. Thus ##P(B_k)=\sum_{r<k}P(b_r)##. Note that Office_Shredder's ##P_k## is just ##P(b_k)##, and also note that his sum is valid because of the mutual exclusivity of going bust on flip number ##k_1## versus going bust on some other flip number ##k_2##.


I agree that this was the intention of the challenge, as was confirmed by Office_Shredder in post #13; however, that is not how the challenge was originally posed.
Yes, it is. You just read stuff into it that wasn't there.


I think I will modify my answer to ## n > 1 ## and ## p > 0 ##. There are now uncountably many ways to keep playing for ever, and the probability that he is still playing after k rounds is ## \geq p^k + k p^{k-1}(1-p) + \binom k 2 p^{k-2}(1-p)^2 + \cdots > 0 ##.
There are two things wrong here:
1. There are only a countable number of ways to keep playing forever.
2. Your probability is wrong.

I'll look at item #2 first. You are clearly using a binomial expansion. That is not correct. Look at what your expression says for the probability that the player is still playing after one round. Your expression evaluates to one. It evaluates to one for all k! You are ignoring the fact that the player stops playing after going bust (or, in my alteration of the game, the player idly watches meaningless coin flips after going bust).

Regarding the countability of the sample space, first off, it doesn't matter if the coin toss sequences that form my sets ##b_k##, ##B_k##, and ##A_k## become uncountable in the limit ##k \to \infty##. The sets themselves are mutually exclusive, and the set of all ##\{b_k, B_k, A_k\}## is countable. Secondly, there are only a countable number of infinitely long coin toss sequences that lead to the player playing forever. The sample space in the original problem is countable.

I'll use an ordered string comprising the symbols h and t to label a sequence of coin tosses of length k. For example, the string t designates tails (and hence going bust) on the very first toss, while h designates getting heads (and hence continuing the game). Note that the only string that starts with t is the string t itself. The string th is not a valid string, in any game with reward n. The string terminates when the player goes bust. In a valid string S, the number of ts in every initial proper substring s of S is at most n-1 times the number of hs in s. (An initial proper substring s of a string S is a substring that starts at the first character of S and ends before S ends.) What this means is that you can't use a diagonalization argument to prove uncountability. If the kth character of the kth string is an h, flipping that h to a t might mean game over. You have to keep that h if you want to continue your diagonalization argument, and now you can't guarantee that your generated string will be different from every string in my enumerated list.
 
  • #25
Office_Shredder
R is the probability of getting ruined, not the probability of playing forever. R=1 as a solution to the equation is a mathematical artifact (if I get ruined with every number of coins, the equation is still satisfied); the correct root has ##f' \leq 0##.

I'm not sure I understand mathematically why you can claim that R=1 is not the correct solution.
 
