Challenge 12a: Flipping Coins

  1. Dec 27, 2013 #1

    Office_Shredder

    Staff Emeritus
    Science Advisor
    Gold Member

    A man begins flipping coins according to the following rules: he starts with a single coin, which he flips. Every time he flips heads, he removes that coin from the game and puts out two more coins to add to his stack of coins to flip; every time he flips tails, he simply removes the coin from the game.

    For example, if he flips heads on his first flip, he now has two coins. He flips both; if he gets two tails, he has run out of coins. On the other hand, if he flips two heads, he now has four coins to flip.

    Once he begins flipping, what is the probability that he ever runs out of coins?

    For a harder challenge check out Challenge 12b.
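    The game is easy to simulate. Here is a quick Monte Carlo sketch (the function name and the flip cap are my own additions -- without a cap, a run that never dies would loop forever):

```python
import random

def runs_out(max_flips=2001, rng=random):
    """Play one game; return True if the stack empties within max_flips flips."""
    coins = 1
    for _ in range(max_flips):
        coins -= 1                 # flip one coin and remove it from the game
        if rng.random() < 0.5:     # heads: put out two more coins
            coins += 2
        if coins == 0:             # stack is empty -- he has run out
            return True
    return False                   # still alive when we gave up

random.seed(1)
trials = 2000
estimate = sum(runs_out() for _ in range(trials)) / trials
print(estimate)   # close to, but slightly under, 1 because of the flip cap
```

    The estimate creeps toward 1 as the cap grows, which is consistent with the answers in the thread.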
     
  3. Dec 27, 2013 #2

    Borek


    Staff: Mentor

    So heads means +1, and tails means -1?
     
  4. Dec 27, 2013 #3

    D H

    Staff Emeritus
    Science Advisor

    That's how I take it, Borek -- except presumably it's game over once he runs out of coins. If that's the case, the probability is 1 that he runs out of coins eventually.

    The person can only run out of coins on an odd-numbered flip, so it helps to look at just those flips. The probability that the person is still in the game (hasn't run out of coins) after flip [itex]2n+1[/itex] is [itex]\binom{2n+1}{n+1}\,2^{-(2n+1)}[/itex]. This goes to zero as n→∞; more precisely, it is asymptotic to [itex]1/\sqrt{n\pi}[/itex].
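    That closed form is easy to check numerically (a sketch; the function name is mine). Evaluating [itex]\binom{2n+1}{n+1}\,2^{-(2n+1)}[/itex] exactly and comparing it with [itex]1/\sqrt{n\pi}[/itex]:

```python
from fractions import Fraction
from math import comb, pi, sqrt

def survival(n):
    """P(still in the game after flip 2n+1) = C(2n+1, n+1) / 2^(2n+1)."""
    return float(Fraction(comb(2*n + 1, n + 1), 2 ** (2*n + 1)))

print(survival(1))   # 0.375: survives 3 flips unless T, or H then TT
for n in (10, 100, 1000):
    print(n, survival(n), 1 / sqrt(n * pi))   # the two columns converge
```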
     
  5. Dec 27, 2013 #4

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    Here's an alternative solution.

    Let P(n) be the probability of losing if you start with n coins. Let's start with 2 and toss both.

    [itex]P(2) = \frac14 + \frac12 P(2) + \frac14 P(4)[/itex]

    And, in general:

    [itex]P(2n) = \frac14 P(2n-2) + \frac12 P(2n) + \frac14 P(2n+2)[/itex]

    Hence:

    [itex]P(2n+2) = 2P(2n) - P(2n-2)[/itex]

    By induction:

    [itex]P(2n) = nP(2) - (n-1)[/itex]

    Hence:

    [itex]P(2) = \frac1n P(2n) + \frac{n-1}{n}[/itex]

    Letting n → ∞ gives:

    [itex]P(2) = 1[/itex]

    So, the probability of losing eventually if you start with 2 coins is 1. And, therefore, you're bound to lose if you start with 1 coin.
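    The induction can be spot-checked numerically (a sketch; the helper name is mine). Writing [itex]P(2k) = a_k P(2) + b_k[/itex] and pushing the recurrence forward from P(0) = 1 (no coins means you have already lost) should reproduce [itex]a_n = n[/itex] and [itex]b_n = -(n-1)[/itex]:

```python
def affine_coeffs(n):
    """Return (a, b) with P(2n) = a*P(2) + b, from the recurrence
    P(2k+2) = 2*P(2k) - P(2k-2) with P(0) = 1."""
    a, b = [0, 1], [1, 0]   # k = 0: P(0) = 0*P(2) + 1;  k = 1: P(2) = 1*P(2) + 0
    for k in range(1, n):
        a.append(2 * a[k] - a[k - 1])
        b.append(2 * b[k] - b[k - 1])
    return a[n], b[n]

for n in (1, 2, 5, 10):
    print(n, affine_coeffs(n))   # (n, -(n-1)) in each case
```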
     
    Last edited: Dec 27, 2013
  6. Dec 27, 2013 #5

    Office_Shredder

    Staff Emeritus
    Science Advisor
    Gold Member

    Nice inductive solution, PeroK!
     
  7. Dec 27, 2013 #6

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    In fact, as a corollary, starting from:

    [itex]P(2n) = nP(2) - (n-1)[/itex]

    and substituting P(2) = 1:

    [itex]P(2n) = n - (n-1) = 1[/itex]

    So, it doesn't matter how many coins you start with, you're bound to lose eventually! Which, I guess, is intuitive if you know enough about probability.
     
  8. Dec 27, 2013 #7

    D H

    Staff Emeritus
    Science Advisor

    Exactly. This is the gambler's ruin problem. The house presumably has an infinite supply of pennies in this challenge.
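    The finite-bankroll version can be worked out numerically (a sketch; the function name and the stopping `ceiling` are my own). With absorbing barriers at 0 and N, the fair-game ruin probability starting from one coin is 1 - 1/N, which tends to 1 as the house's supply N grows:

```python
def ruin_prob(start, ceiling, sweeps=20000):
    """Probability of hitting 0 before `ceiling` in the fair +1/-1 walk,
    by fixed-point iteration on p[k] = (p[k-1] + p[k+1]) / 2."""
    p = [0.0] * (ceiling + 1)
    p[0] = 1.0                              # at 0 coins you are already ruined
    for _ in range(sweeps):
        for k in range(1, ceiling):
            p[k] = 0.5 * (p[k - 1] + p[k + 1])
    return p[start]

for ceiling in (5, 10, 20):
    print(ceiling, ruin_prob(1, ceiling))   # equals 1 - 1/ceiling up to iteration error
```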
     
  9. Dec 28, 2013 #8

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    Just for completeness, using the brilliant observation by mfb on the harder problem that:

    [itex]P(2) = P(1)^2[/itex]

    (Losing with two coins is equivalent to losing two independent games with one coin each.)

    Then:

    [itex]P(1) = \frac12 + \frac12 P(2) = \frac12 + \frac12 P(1)^2[/itex]

    [itex]∴ \ P(1)^2 - 2P(1) + 1 = 0 \ ∴ \ (P(1) - 1)^2 = 0 \ ∴ \ P(1) = 1[/itex]
     
  10. Jan 1, 2014 #9

    ssd


    If the coins in play all have P(H)=1 then the game never ends. So the answer is not unique unless a sequence of head probabilities for the coins in play is specified. For P(H)=p fixed with 0<p<1, the problem is a famous college problem, as others have indicated. Of course p=0 and p=1 are trivial cases.
     
    Last edited: Jan 1, 2014
  11. Jan 1, 2014 #10

    ssd


    I note that if every coin has P(H)=0.6, the game never ends with probability 1 - (1-0.6)/0.6 = 1 - 2/3 = 1/3.
    In fact, P(H) ≤ 0.5 implies the game ends with probability 1. Otherwise, there is a positive chance of a never-ending game.
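    Those figures can be checked by generalizing PeroK's equation from post #8: with P(H) = p, the probability q that the game ever ends satisfies [itex]q = (1-p) + pq^2[/itex], and iterating from q = 0 converges to the smallest root in [0, 1] (a sketch; the function name is mine):

```python
def extinction_prob(p, iters=10000):
    """Probability the game ever ends when each coin shows heads with
    probability p: smallest root of q = (1 - p) + p*q**2 in [0, 1]."""
    q = 0.0
    for _ in range(iters):
        q = (1 - p) + p * q * q
    return q

print(extinction_prob(0.6))   # about 2/3, so P(never ends) is about 1/3
print(extinction_prob(0.4))   # about 1: with P(H) <= 0.5 the game surely ends
```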
     
    Last edited: Jan 1, 2014
  12. Jan 3, 2014 #11
    Let ##p## be the probability that he eventually runs out of coins. There are two ways for this to happen. He can get tails on the first flip--this happens with probability ##\frac{1}{2}##. Alternatively with probability ##\frac{1}{2}## he can get heads on the first flip. If he gets heads on the first flip he essentially plays the same game twice, and the probability of running out of coins in *both* of these games is ##p^2##. So we have

    ##p = \frac{1}{2} + \frac{1}{2}p^2##

    The solution to this equation is ##p=1##.
     