# Challenge 12a: Flipping Coins

1. Dec 27, 2013

### Office_Shredder

Staff Emeritus
A man begins flipping coins according to the following rules: he starts with a single coin, which he flips. Every time he flips heads, he removes that coin from the game and adds two new coins to his stack of coins to flip; every time he flips tails, he simply removes the coin from the game.

For example, if he flips heads on his first flip, he now has two coins. He flips both: if he gets two tails, he has run out of coins; on the other hand, if he flips two heads, he now has four coins to flip.

Once he begins flipping, what is the probability that he ever runs out of coins?
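(A quick way to get a feel for the answer is a Monte Carlo sketch in Python; the function name and the flip cap are my own. Since a run can in principle go on forever, each game is truncated at a fixed number of flips.)

```python
import random

def runs_out(max_flips=10_000, rng=random):
    """Play one game: start with 1 coin; each flip of a fair coin
    replaces the flipped coin with two on heads (net +1) or just
    removes it on tails (net -1). Return True if the coin count
    hits 0 within max_flips flips."""
    coins = 1
    for _ in range(max_flips):
        coins += 1 if rng.random() < 0.5 else -1
        if coins == 0:
            return True
    return False

trials = 20_000
estimate = sum(runs_out() for _ in range(trials)) / trials
print(f"estimated probability of running out: {estimate:.3f}")  # close to 1
```
The estimate slightly undercounts ruin because of the flip cap; the surviving runs are exactly the ones the discussion below shows become vanishingly rare.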

For a harder challenge check out Challenge 12b.

2. Dec 27, 2013

### Staff: Mentor

So heads means +1, and tails means −1?

3. Dec 27, 2013

### D H

Staff Emeritus
That's how I take it, Borek -- except presumably it's game over once he runs out of coins. If that's the case, the probability is 1 that he runs out of coins eventually.

The person can only run out of coins on odd-numbered flips, so it helps to look at just those flips. The probability that the person is still in the game (hasn't run out of coins) after flip $2n+1$ is $\binom{2n+1}{n+1}\,2^{-(2n+1)}$. This goes to zero as $n \to \infty$; in particular, it asymptotically approaches $1/\sqrt{n\pi}$.
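D H's expression can be cross-checked exactly (a Python sketch, names mine): track the distribution of the coin count as a $\pm 1$ random walk absorbed at 0, and compare the mass still in play after $2n+1$ flips with $\binom{2n+1}{n+1}2^{-(2n+1)}$ and with the asymptotic $1/\sqrt{n\pi}$.

```python
from fractions import Fraction
from math import comb, pi, sqrt

def survival_exact(flips):
    """Exact probability the player still holds coins after `flips`
    flips: evolve the coin-count distribution, dropping any mass
    that reaches 0 coins (the game stops there)."""
    dist = {1: Fraction(1)}                      # start with one coin
    for _ in range(flips):
        new = {}
        for c, prob in dist.items():
            for step in (+1, -1):                # heads: +1, tails: -1
                if c + step > 0:
                    new[c + step] = new.get(c + step, Fraction(0)) + prob / 2
        dist = new
    return sum(dist.values())

for n in (1, 5, 50):
    flips = 2 * n + 1
    formula = Fraction(comb(flips, n + 1), 2 ** flips)
    assert survival_exact(flips) == formula      # matches D H's expression
    print(n, float(formula), 1 / sqrt(n * pi))   # last two columns converge
```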

4. Dec 27, 2013

### PeroK

Here's an alternative solution. Let $P(2n)$ denote the probability of eventually running out of coins when starting with $2n$ coins. Flipping two coins gives two tails with probability $\frac14$ (two coins removed), a head and a tail with probability $\frac12$ (count unchanged), or two heads with probability $\frac14$ (two coins added). Starting with two coins:

$P(2) = \frac14 + \frac12 P(2) + \frac14 P(4)$

And, in general:

$P(2n) = \frac14 P(2n-2) + \frac12 P(2n) + \frac14 P(2n+2)$

Hence:

$P(2n+2) = 2P(2n) - P(2n-2)$

By induction, using $P(0) = 1$ (with no coins, he has already run out):

$P(2n) = nP(2) - (n-1)$

Hence:

$P(2) = \frac1n P(2n) + \frac{n-1}{n}$

Letting n → ∞, and noting that $0 \le P(2n) \le 1$, gives:

$P(2) = 1$

So, the probability of losing eventually if you start with 2 coins is 1. And therefore you're bound to lose if you start with 1 coin, too: the first flip either ends the game outright or leaves you with 2 coins.
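For completeness, the induction behind $P(2n) = nP(2) - (n-1)$ can be written out (a sketch, taking $P(0) = 1$ since with no coins the game is already over):

```latex
\begin{align*}
% base cases P(0) = 1, P(2); assume P(2k) = kP(2) - (k-1) for all k \le n; then
P(2n+2) &= 2P(2n) - P(2n-2) \\
        &= 2\bigl[nP(2) - (n-1)\bigr] - \bigl[(n-1)P(2) - (n-2)\bigr] \\
        &= (n+1)P(2) - n.
\end{align*}
```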

Last edited: Dec 27, 2013
5. Dec 27, 2013

### Office_Shredder

Staff Emeritus
Nice inductive solution, PeroK!

6. Dec 27, 2013

### PeroK

In fact, as a corollary, starting from:

$P(2n) = nP(2) - (n-1)$

substituting $P(2) = 1$ gives:

$P(2n) = n - (n-1) = 1$

So, it doesn't matter how many coins you start with, you're bound to lose eventually! Which, I guess, is intuitive if you know enough about probability.

7. Dec 27, 2013

### D H

Staff Emeritus
Exactly. This is the gambler's ruin problem. The house presumably has an infinite supply of pennies in this challenge.

8. Dec 28, 2013

### PeroK

Just for completeness, using the brilliant observation by mfb on the harder problem that:

$P(2) = P(1)^2$

(Losing with two coins is equivalent to losing two independent games with one coin each.)

Then:

$P(1) = \frac12 + \frac12 P(2) = \frac12 + \frac12 P(1)^2$

$∴ \ P(1)^2 - 2P(1) + 1 = 0 \ ∴ \ (P(1) - 1)^2 = 0 \ ∴ \ P(1) = 1$

9. Jan 1, 2014

### ssd

If every coin in play has P(H) = 1, then the game never ends, so the answer is not unique unless a sequence of probabilities for the coins in play is specified. For fixed P(H) = p with 0 < p < 1, the problem is a famous college problem, as indicated by others. Of course, p = 0 and p = 1 are trivial cases.

Last edited: Jan 1, 2014
10. Jan 1, 2014

### ssd

I note that if P(H) = 0.6 for all coins, then the game never ends with probability 1 − (1 − 0.6)/0.6 = 1 − 2/3 = 1/3.
In fact, P(H) ≤ 0.5 implies a definite end of the game; otherwise, there is a positive chance of a never-ending game.
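ssd's number can be checked numerically. Generalising the equation $P(1) = \frac12 + \frac12 P(1)^2$ above to a biased coin, the ruin probability $q$ satisfies $q = (1-p) + p\,q^2$, and iterating that map from $q = 0$ converges to its smallest root. A Python sketch (names mine):

```python
def extinction_prob(p_heads, iters=500):
    """Iterate q <- (1 - p) + p * q**2 starting from q = 0; the
    iterates increase to the smallest fixed point, which is the
    ruin probability of the branching process."""
    q = 0.0
    for _ in range(iters):
        q = (1 - p_heads) + p_heads * q * q
    return q

print(extinction_prob(0.6))  # about 2/3: the game never ends with probability 1/3
print(extinction_prob(0.4))  # p <= 1/2: ruin is certain
print(extinction_prob(0.5))  # also 1 in the limit, but convergence is slow here
```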

Last edited: Jan 1, 2014
11. Jan 3, 2014

### The_Duck

Let $p$ be the probability that he eventually runs out of coins. There are two ways for this to happen. He can get tails on the first flip--this happens with probability $\frac{1}{2}$. Alternatively with probability $\frac{1}{2}$ he can get heads on the first flip. If he gets heads on the first flip he essentially plays the same game twice, and the probability of running out of coins in *both* of these games is $p^2$. So we have

$p = \frac{1}{2} + \frac{1}{2}p^2$

The solution to this equation is $p=1$.
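As a final sanity check, the equation rearranges to $\frac12 p^2 - p + \frac12 = 0$, whose discriminant is zero, so $p = 1$ is a double root and the only solution. A trivial sketch:

```python
# coefficients of (1/2)*p**2 - p + 1/2 = 0, i.e. p = 1/2 + (1/2)*p**2
a, b, c = 0.5, -1.0, 0.5
discriminant = b * b - 4 * a * c
assert discriminant == 0.0      # zero discriminant: a double root
root = -b / (2 * a)
print(root)                     # -> 1.0
```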