What is the Probability of Running Out of Coins in a Flipping Game?

Office_Shredder
A man begins flipping coins according to the following rules: He starts with a single coin, which he flips. Every time he flips heads, he removes that coin from the game and adds two new coins to his stack of coins to flip; every time he flips tails, he simply removes the coin from the game.

For example, if he flips heads on his first flip, he now has two coins. He flips both; if he gets two tails, he has run out of coins. On the other hand, if he flips two heads, he will have four coins to flip.

Once he begins flipping, what is the probability that he ever runs out of coins?

For a harder challenge, check out Challenge 12b.
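To get a feel for the answer before the solutions below, here is a quick Monte Carlo sketch in Python (my own illustration, not part of the challenge). Since a game can run very long, each game is capped at a fixed number of flips, so the estimate slightly undercounts the true probability; the function name and the cap are arbitrary choices.

```python
import random

def runs_out(max_flips=10_000):
    """Play one capped game with a fair coin; return True if the stack
    empties. Heads swaps a coin for two new ones (net +1 coin); tails
    removes a coin (net -1)."""
    coins = 1
    for _ in range(max_flips):
        if coins == 0:
            return True
        coins += 1 if random.random() < 0.5 else -1
    return coins == 0

trials = 100_000
print(sum(runs_out() for _ in range(trials)) / trials)  # about 0.99, creeping toward 1 as the cap grows
```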
 
So heads means +1, and tails means -1?
 
That's how I take it, Borek -- except presumably it's game over once he runs out of coins. If that's the case, the probability is 1 that he runs out of coins eventually.

The person will only run out of coins on odd-numbered flips, so it helps to look at just those flips. The probability that the person is still in the game (hasn't run out of coins) after flip ##2n+1## is ##\binom{2n+1}{n+1}\,2^{-(2n+1)}##. This goes to zero as ##n \to \infty##; in particular, it is asymptotic to ##1/\sqrt{n\pi}##.
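For anyone who wants to see the numbers, here is a minimal check of that closed form against its asymptotic rate (my own sketch; it uses only the formula quoted above):

```python
from math import comb, sqrt, pi

# Survival probability after flip 2n+1 versus the asymptotic 1/sqrt(n*pi)
for n in (10, 100, 1000, 10000):
    exact = comb(2 * n + 1, n + 1) / 2 ** (2 * n + 1)
    print(n, exact, 1 / sqrt(n * pi))
```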
 
Here's an alternative solution.

Let ##P(n)## be the probability of losing if you start with ##n## coins. Let's start with 2 coins and toss both.

##P(2) = \frac14 + \frac12 P(2) + \frac14 P(4)##

And, in general:

##P(2n) = \frac14 P(2n-2) + \frac12 P(2n) + \frac14 P(2n+2)##

Hence:

##P(2n+2) = 2P(2n) - P(2n-2)##

By induction:

##P(2n) = nP(2) - (n-1)##

Hence:

##P(2) = \frac1n P(2n) + \frac{n-1}{n}##

Since ##0 \le P(2n) \le 1##, letting ##n \to \infty## gives:

##P(2) = 1##

So, the probability of losing eventually if you start with 2 coins is 1. And, therefore, you're bound to lose if you start with 1 coin: on the first flip you either lose immediately (tails) or reach 2 coins (heads), from which losing is certain.
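As a numerical sanity check on this argument (my own sketch, not part of the solution), a small dynamic program computes the exact probability of going broke within a finite flip budget; the value for 2 starting coins creeps toward ##P(2) = 1## as the budget grows. Names and budget values are arbitrary.

```python
def ruin_within(start_coins, budget):
    """Exact probability of emptying the stack within `budget` flips of a
    fair coin, where heads is net +1 coin and tails is net -1."""
    size = start_coins + budget + 1   # the stack grows by at most 1 per flip
    prob = [0.0] * size               # prob[c] = ruin probability with c coins
    prob[0] = 1.0                     # with 0 coins you have already lost
    for _ in range(budget):
        nxt = prob[:]
        for c in range(1, size - 1):
            nxt[c] = 0.5 * prob[c - 1] + 0.5 * prob[c + 1]
        prob = nxt
    return prob[start_coins]

for budget in (10, 100, 1000):
    print(budget, ruin_within(2, budget))   # increases toward 1
```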
 
Nice inductive solution, PeroK!
 
In fact, as a corollary, starting from:

##P(2n) = nP(2) - (n-1)##

and substituting ##P(2) = 1##:

##P(2n) = n - (n-1) = 1##

So, it doesn't matter how many coins you start with: you're bound to lose eventually! Which, I guess, is intuitive if you know enough about probability.
 
PeroK said:
So, it doesn't matter how many coins you start with, you're bound to lose eventually!
Exactly. This is the gambler's ruin problem. The house presumably has an infinite supply of pennies in this challenge.
 
Just for completeness, using the brilliant observation by mfb on the harder problem that:

##P(2) = P(1)^2##

(Losing with two coins is equivalent to losing two independent games with one coin each.)

Then:

##P(1) = \frac12 + \frac12 P(2) = \frac12 + \frac12 P(1)^2##

##\therefore P(1)^2 - 2P(1) + 1 = 0 \;\therefore\; (P(1) - 1)^2 = 0 \;\therefore\; P(1) = 1##
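A numerical illustration of the same fixed point (my own sketch): iterating ##q \mapsto \frac12 + \frac12 q^2## from ##q_0 = 0## gives the probability that a one-coin game has died out within ##n## generations, and squaring it gives the two-coin figure, consistent with ##P(2) = P(1)^2##; both columns creep (slowly) toward 1.

```python
# q = P(a one-coin game is dead within n generations); q*q is the
# two-coin figure, since the two sub-games are independent
q = 0.0
for n in range(1, 10_001):
    q = 0.5 + 0.5 * q * q
    if n in (10, 100, 1000, 10_000):
        print(n, q, q * q)
```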
 
If the coins in play always have ##P(H) = 1##, the game never ends, so the answer is not unique unless a sequence of probabilities for the coins in play is specified. For a fixed ##P(H) = p## with ##0 < p < 1##, this is a famous college problem, as others have indicated. Of course, ##p = 0## and ##p = 1## are trivial cases.
 
I note that if ##P(H) = 0.6## for all coins, the game never ends with probability ##1 - (1-0.6)/0.6 = 1 - 2/3 = 1/3##.
In fact, ##P(H) \le 0.5## guarantees the game eventually ends; otherwise, there is a positive chance of a never-ending game.
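These numbers are easy to reproduce (a sketch of my own, not from the post): the probability that the game ever ends is the smallest non-negative root of ##q = (1-p) + p\,q^2##, which fixed-point iteration from ##q = 0## finds, and which matches the closed form ##\min(1, (1-p)/p)##.

```python
def ends_probability(p, iters=100_000):
    """Smallest non-negative root of q = (1-p) + p*q**2, found by
    fixed-point iteration (convergence is slow near p = 0.5)."""
    q = 0.0
    for _ in range(iters):
        q = (1 - p) + p * q * q
    return q

for p in (0.4, 0.5, 0.6, 0.75):
    print(p, round(ends_probability(p), 4), min(1.0, (1 - p) / p))
```

For ##p = 0.6## this gives ##2/3##, so the game never ends with probability ##1/3##, as above.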
 
Let ##p## be the probability that he eventually runs out of coins. There are two ways for this to happen. He can get tails on the first flip--this happens with probability ##\frac{1}{2}##. Alternatively with probability ##\frac{1}{2}## he can get heads on the first flip. If he gets heads on the first flip he essentially plays the same game twice, and the probability of running out of coins in *both* of these games is ##p^2##. So we have

##p = \frac{1}{2} + \frac{1}{2}p^2##

Rearranging gives ##(p-1)^2 = 0##, so the only solution to this equation is ##p=1##.
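The algebra can also be double-checked symbolically; a minimal sketch, assuming SymPy is available:

```python
import sympy as sp

p = sp.symbols('p')
# p = 1/2 + p**2/2 rearranges to (p - 1)**2 = 0: a double root at p = 1
print(sp.solve(sp.Eq(p, sp.Rational(1, 2) + p**2 / 2), p))  # [1]
```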
 