What is the Probability of Running Out of Coins in a Flipping Game?

  • Context: Graduate
  • Thread starter: Office_Shredder
  • Tags: Challenge

Discussion Overview

The discussion concerns the probability of running out of coins in a coin-flipping game in which each heads replaces the flipped coin with two new ones (a net gain of one coin) and each tails simply removes the flipped coin. Participants explore several mathematical approaches, including recursion and branching-process arguments, to determine the likelihood of exhausting the coin supply under different starting conditions and different probabilities of heads.

Discussion Character

  • Exploratory
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant describes the game mechanics: flipping heads replaces a coin with two new ones while flipping tails removes it, prompting the question of the probability of running out of coins.
  • Another participant observes that the player can only run out of coins on odd-numbered flips and argues that the probability of ruin is 1, since the survival probability after flip 2n+1 tends to zero.
  • A different approach defines a function P(n), the probability of losing when starting with n coins, via a recurrence; letting n approach infinity yields P(2) = 1.
  • Several participants agree that, for a fair coin, the probability of eventually losing is 1 regardless of the starting number of coins, relating this to the gambler's ruin problem.
  • One participant notes a corollary: the probability of losing is independent of the initial number of coins, reinforcing that the player is bound to lose eventually.
  • Another participant considers varying the probability of heads: if it is at most 0.5 the game ends with probability 1, while a probability above 0.5 leaves a positive chance that the game continues forever.
  • Finally, conditioning on the outcome of the first flip yields the equation p = 1/2 + p²/2, whose only solution is p = 1.

Areas of Agreement / Disagreement

Participants generally agree that, for a fair coin, the probability of running out of coins is 1 regardless of the starting number of coins. The main point of contention concerns biased coins: with a probability of heads greater than 0.5, some argue, the game has a positive chance of never ending.

Contextual Notes

The discussion includes various assumptions about the probabilities of heads and tails, and how these affect the game's dynamics. There are unresolved mathematical steps in the recursive formulations presented, and the implications of different starting conditions and probabilities remain open to interpretation.

Office_Shredder
A man begins flipping coins according to the following rules. He starts with a single coin, which he flips. Every time he flips heads, he removes that coin from the game and adds two new coins to his stack of coins to flip; every time he flips tails, he simply removes the coin from the game.

For example, if he flips a heads on his first flip, he now has two coins. He flips both, if he gets two tails then he has run out of coins. On the other hand if he flipped two heads he would now have four coins to flip.

Once he begins flipping, what is the probability that he ever runs out of coins?

For a harder challenge check out Challenge 12b.
 
So heads means +1, and tail means -1?
 
That's how I take it, Borek -- except presumably it's game over once he runs out of coins. If that's the case, the probability is 1 that he runs out of coins eventually.

The person can only run out of coins on odd-numbered flips, so it helps to look at just those. The probability that the person is still in the game (hasn't run out of coins) after flip ##2n+1## is ##\binom{2n+1}{n+1}\,2^{-(2n+1)}##. This goes to zero as ##n \to \infty##; in particular, it is asymptotically ##1/\sqrt{n\pi}##.
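Not part of the original thread, but the closed form above is easy to check numerically. A minimal sketch (the function name `survival` is mine) comparing it against the stated asymptotic:

```python
import math

# Probability of still being in the game after flip 2n+1, per the closed
# form above: C(2n+1, n+1) / 2^(2n+1).
def survival(n):
    return math.comb(2 * n + 1, n + 1) / 2 ** (2 * n + 1)

for n in (1, 10, 100, 10000):
    # Closed form vs the asymptotic 1/sqrt(n*pi); they agree for large n.
    print(n, survival(n), 1 / math.sqrt(n * math.pi))
```

For ##n = 1## this gives ##3/8##, which matches a direct count: the player survives flip 1 with probability 1/2 and then avoids the TT outcome with probability 3/4.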
 
Here's an alternative solution.

Let ##P(n)## be the probability of losing if you start with ##n## coins. Let's start with 2 and toss both:

##P(2) = \frac14 + \frac12 P(2) + \frac14 P(4)##

And, in general:

##P(2n) = \frac14 P(2n-2) + \frac12 P(2n) + \frac14 P(2n+2)##

Hence:

##P(2n+2) = 2P(2n) - P(2n-2)##

So the differences ##P(2n+2) - P(2n)## are constant, and since ##P(0) = 1## we get, by induction:

##P(2n) = nP(2) - (n-1)##

Hence:

##P(2) = \frac1n P(2n) + \frac{n-1}{n}##

Since ##P(2n) \le 1##, letting ##n \to \infty## gives:

##P(2) = 1##

So, the probability of losing eventually if you start with 2 coins is 1. And, therefore, you're bound to lose if you start with 1 coin.
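A quick Monte Carlo check (not from the thread; the flip caps, trial count, and seed are my choices) agrees. Since tosses are independent, the stack can be tracked as a single count, +1 for heads and -1 for tails, and the fraction of runs ruined within a growing flip budget climbs toward 1:

```python
import random

# Net effect per flip: heads replaces one coin with two (+1), tails removes
# one (-1); ruin means the coin count hits zero.  Ruin is certain, but the
# waiting time is heavy-tailed, so we cap the number of flips and watch the
# ruined fraction climb toward 1 as the cap grows.
def ruined_within(max_flips, rng):
    coins = 1
    for _ in range(max_flips):
        coins += 1 if rng.random() < 0.5 else -1
        if coins == 0:
            return True
    return False

rng = random.Random(0)
trials = 20000
fracs = {cap: sum(ruined_within(cap, rng) for _ in range(trials)) / trials
         for cap in (10, 100, 1000)}
print(fracs)  # ruined fraction rises toward 1 as the cap grows
```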
 
Nice inductive solution, PeroK!
 
In fact, as a corollary, starting from:

##P(2n) = nP(2) - (n-1)##

and substituting ##P(2) = 1##:

##P(2n) = n - (n-1) = 1##

So, it doesn't matter how many coins you start with: you're bound to lose eventually! Which, I guess, is intuitive if you know enough about probability.
 
PeroK said:
So, it doesn't matter how many coins you start with, you're bound to lose eventually!
Exactly. This is the gambler's ruin problem. The house presumably has an infinite supply of pennies in this challenge.
 
Just for completeness, using the brilliant observation by mfb on the harder problem that:

##P(2) = P(1)^2##

(Losing with two coins is equivalent to losing two independent games with one coin each.)

Then:

##P(1) = \frac12 + \frac12 P(2) = \frac12 + \frac12 P(1)^2##

##\therefore \ P(1)^2 - 2P(1) + 1 = 0 \ \therefore \ (P(1) - 1)^2 = 0 \ \therefore \ P(1) = 1##
 
If the coins in play have ##P(H) = 1##, the game never ends, so the answer is not unique unless the probability of heads for each coin is specified. For a fixed ##P(H) = p## with ##0 < p < 1##, this is a famous college problem, as others have indicated. Of course, ##p = 0## and ##p = 1## are trivial cases.
 
I note that if ##P(H) = 0.6## for all coins, the game never ends with probability ##1 - (1-0.6)/0.6 = 1 - 2/3 = 1/3##.
In fact, ##P(H) \le 0.5## implies the game definitely ends; otherwise there is a positive chance of a never-ending game.
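This is the extinction probability of a branching process, and it can be checked by simulation. A sketch (the flip cap, trial count, and seed are my choices, not from the thread) for ##P(H) = 0.6##, where the claimed ruin probability is ##(1-p)/p = 2/3##:

```python
import random

# Biased version of the game: heads with probability p is +1 coin net,
# tails is -1.  For p = 0.6 the claimed ruin probability is 2/3, so about
# a third of runs should survive a long flip budget.  Runs that survive
# drift upward, so almost all ruins happen early and a finite cap suffices.
def ruined_within(p, max_flips, rng):
    coins = 1
    for _ in range(max_flips):
        coins += 1 if rng.random() < p else -1
        if coins == 0:
            return True
    return False

rng = random.Random(1)
trials = 5000
frac = sum(ruined_within(0.6, 2000, rng) for _ in range(trials)) / trials
print(frac)  # should be close to 2/3
```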
 
Let ##p## be the probability that he eventually runs out of coins. There are two ways for this to happen. With probability ##\frac{1}{2}## he gets tails on the first flip and is immediately out of coins. Alternatively, with probability ##\frac{1}{2}## he gets heads on the first flip; he then essentially plays the same game twice, and the probability of running out of coins in *both* of these games is ##p^2##. So we have

##p = \frac{1}{2} + \frac{1}{2}p^2##

The solution to this equation is ##p=1##.
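Not from the thread, but the same equation can be read as a fixed point: iterating ##p \leftarrow \frac12 + \frac12 p^2## from ##p = 0## gives the probability of ruin within ##k## generations of the doubling tree, and it creeps up toward the fixed point ##p = 1##, slowly, because 1 is a double root of the quadratic:

```python
# Each iteration conditions on the first flip: tails (prob 1/2) is immediate
# ruin; heads (prob 1/2) requires ruining two independent sub-games, each
# within one fewer generation.  Hence p_{k+1} = 1/2 + p_k^2 / 2, starting
# from p_0 = 0 (no ruin possible in zero generations).
p = 0.0
for k in range(1, 1001):
    p = 0.5 + 0.5 * p * p
    if k in (10, 100, 1000):
        print(k, p)  # approaches 1, roughly like 1 - 2/k
```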
 
