Challenge 12b: Flipping Coins

  1. Dec 27, 2013 #1

    Office_Shredder

    Staff Emeritus
    Science Advisor
    Gold Member

    A man has a single coin in front of him that lands on heads with probability p and tails with probability 1-p. He begins flipping, starting with this single coin. If a coin lands on heads, it is removed from the game and n new coins are placed on his stack of coins; if it lands on tails, it is removed from the game and no new coins are introduced.

    What is the condition on n and p for the man to have a non-zero probability of playing forever?

    For an easier challenge check out Challenge 12a.

    This is a harder challenge than it may appear at first glance; points may be awarded for a mildly non-rigorous solution (and bonus points for a fully rigorous solution).
     
    Last edited: Dec 27, 2013
  3. Dec 27, 2013 #2

    D H

    Staff Emeritus
    Science Advisor

    This is purely empirical, but it appears the condition is ##np>1##.
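    Here is a minimal Monte Carlo sketch of the kind of experiment behind that observation (Python; a flip cap stands in for "forever", and the cap, trial count and (n, p) pairs are arbitrary choices, not part of the challenge):

    Code (Python):
    import random

    def survives(n, p, max_flips=2000):
        # Play one game; treat reaching max_flips with coins left as "playing forever".
        coins = 1
        for _ in range(max_flips):
            if coins == 0:
                return False
            coins -= 1                  # the flipped coin is removed either way
            if random.random() < p:     # heads: n new coins go on the stack
                coins += n
        return True

    for n, p in [(2, 0.4), (2, 0.5), (2, 0.6), (3, 0.3), (3, 0.4)]:
        trials = 1000
        est = sum(survives(n, p) for _ in range(trials)) / trials
        print(f"n={n}, p={p}: np={n * p:.2f}, estimated survival probability = {est:.3f}")

    The estimate is clearly non-zero only for the pairs with np > 1.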
     
  4. Dec 27, 2013 #3
    Let n = 1 and p > 0 (clearly if n = 0 the game lasts exactly one round). If the time for one round is t, then after time T he will have played T / t rounds. He will still be playing at time T if and only if he has flipped heads every time, which happens with probability ## p^{T/t} > 0 ##. So the man may still be playing after any finite time T (i.e. forever) with non-zero probability, for any n ≥ 1 and p > 0.

    But I'm not sure that's the question you meant to ask - I think you want the man still to be playing after an infinite number of rounds.

    Can I suggest for 12c you allow 1 minute for the first round, 30s for the second, then 15s etc. What are the conditions on n and p that give a non-zero probability that the man has any coins left when the game is finished after 2 minutes?
     
    Last edited: Dec 27, 2013
  5. Dec 27, 2013 #4

    jbriggs444

    Science Advisor

    It seems clear that ##np \ge 1## is a necessary condition to have a non-zero probability of playing forever.

    Consider the situation as a drunkard's walk. At each step he has a probability p of taking n-1 steps to the right/positive and a probability 1-p of taking one step to the left/negative.

    After some number of steps, this will result in a distribution of possible positions with an easily calculable mean and standard deviation. This distribution is roughly normal. If the expectation of each trial is negative, then the expectation of the distribution becomes increasingly negative; this is linear in the number of trials. But the standard deviation increases only as the square root of the number of trials. It would then follow that the tail of the distribution that is positive represents a probability that becomes zero in the limit.

    It follows that the expectation of each trial must be non-negative to preserve a non-zero probability of playing forever.

    This is all ignoring the fact that a stack with a negative number of coins is impossible, or, equivalently, that there is a cliff to the left. That fact can only reduce the probability that the drunkard stays on the good side of the cliff. So it remains necessary that the expectation of each trial be non-negative.

    The expectation of each trial is (n-1)*p - (1-p) = np - 1. This is non-negative exactly when np ≥ 1.

    I am unable to make much progress on demonstrating that this is a sufficient condition, however.
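    Here is a quick numerical check of the limiting argument above, assuming the normal approximation and ignoring the cliff at 0 (Python; the particular n, p and step counts are only illustrative):

    Code (Python):
    from math import erf, sqrt

    def tail_prob(n, p, t):
        # Normal approximation to P(walk position > 0 after t steps), cliff ignored.
        mu = n * p - 1                              # mean step: (n-1)*p - (1-p)
        var = (n - 1) ** 2 * p + (1 - p) - mu ** 2  # variance of a single step
        z = -t * mu / sqrt(t * var)                 # standardized distance to 0
        return 0.5 * (1.0 - erf(z / sqrt(2.0)))     # P(N(t*mu, t*var) > 0)

    for t in (10, 100, 1000, 10000):
        print(t, tail_prob(2, 0.45, t))  # np = 0.9 < 1: the positive tail -> 0

    With np < 1 the positive tail indeed shrinks to zero as the number of steps grows, exactly as argued.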
     
  6. Dec 27, 2013 #5

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    I suspect we need the limit, as N goes to infinity, of the probability that the game lasts N rounds to be greater than 0.
     
  7. Dec 27, 2013 #6

    D H

    Staff Emeritus
    Science Advisor

    Easily calculable if you ignore the cliff at 0. Taking that cliff into account makes the probability distribution much harder to compute. This is more of a gambler's ruin problem than a drunkard's walk.
     
  8. Dec 27, 2013 #7
    Yes indeed, and the solution to the gambler's ruin problem is well known: the probability of eventual ruin converges almost surely to 1. However, the solution does not say that ruin will happen in a finite time with probability 1 (or, alternatively, it does not say that ruin cannot take forever).

    The problem here lies in the subtle differences between convergence in probability, almost sure convergence and the kind of limits we are used to dealing with in analysis. When we translate the words of a problem to a statement of probability theory so that we can prove or disprove that statement we need to be very sure that the translation is correct.
     
    Last edited: Dec 27, 2013
  9. Dec 27, 2013 #8

    D H

    Staff Emeritus
    Science Advisor

    Oh my. This exercise cost me a few hours of my life that I'd like back.

    I can't quite parse that. What we need is that the probability of going bust (playing and losing the last coin in the stack) is less than 1.

    Given a value of n, the only rolls on which the player will "go bust" are roll #1, roll #n+1, roll #2n+1, …, roll #kn+1, … where k is a non-negative integer. Denote by [itex]P_n(k)[/itex] the probability of going bust on roll #kn+1. Then the condition Office_Shredder is looking for is the relation between n and p that makes [itex]\sum_{k=0}^{\infty} P_n(k) < 1[/itex]. Easy!

    Going bust on roll #kn+1 means getting k wins and (n-1)k+1 losses without having gone bust on some previous roll. There are lots of ways to obtain this mix of wins and losses, each with probability [itex]p^k(1-p)^{(n-1)k+1}[/itex]. One has to be careful in counting the possible paths. For example, the only way to go bust on roll #3 with n=2 is to win on the first roll and lose on the next two. The sequences lose-win-lose and lose-lose-win don't count, because those sequences mean the player already busted out on roll #1. Without derivation, the probability of going bust on roll #kn+1 is
    [tex]P_n(k) = \frac 1 {(n-1)k+1} \binom{nk}k p^k (1-p)^{(n-1)k+1}[/tex]
    Summing these individual probabilities to yield the total probability of going bust after an infinite number of rolls yields
    [tex]P_n = \sum_{k=0}^{\infty} \frac 1 {(n-1)k+1} \binom{nk}k p^k (1-p)^{(n-1)k+1}[/tex]
    Once again without derivation, this sum is exactly one if [itex]np\le 1[/itex] and is less than one if [itex]np>1[/itex].

    The condition to have a non-zero probability of playing forever is [itex]np>1[/itex].
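    A numerical check of this sum, assuming the formula above (Python): the binomial coefficient overflows a double almost immediately, so each term is computed in log space; the cutoff kmax is an arbitrary choice, and near np = 1 convergence is slow because the terms fall off only polynomially.

    Code (Python):
    from math import exp, lgamma, log

    def bust_prob(n, p, kmax=5000):
        # Partial sum of P_n = sum_k binom(nk, k) p^k (1-p)^((n-1)k+1) / ((n-1)k+1),
        # valid for 0 < p < 1, with each term computed in log space.
        total = 0.0
        for k in range(kmax + 1):
            log_binom = lgamma(n * k + 1) - lgamma(k + 1) - lgamma((n - 1) * k + 1)
            log_term = (log_binom + k * log(p)
                        + ((n - 1) * k + 1) * log(1 - p) - log((n - 1) * k + 1))
            total += exp(log_term)
        return total

    for p in (0.4, 0.5, 0.6):
        print(f"n=2, p={p}: P_2 ≈ {bust_prob(2, p):.6f}")  # expect ≈1, ≈1, ≈2/3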
     
    Last edited: Dec 27, 2013
  10. Dec 27, 2013 #9
    If the player goes bust after an infinite number of rolls (each roll taking an equal amount of time), how long has he played for?
     
  11. Dec 27, 2013 #10

    D H

    Staff Emeritus
    Science Advisor

    The challenge says absolutely nothing about time. It asks for the probability of playing forever to be non-zero. If ruin is almost sure (probability one), then the probability of playing forever is zero. In other words, it is not non-zero.
     
  12. Dec 27, 2013 #11
    How can you say that "playing forever" says nothing about time? Surely it means "playing for longer than any finite amount of time"?

    However, if you want to, you could interpret "playing forever with probability greater than zero" to mean "playing for more rounds than any finite number of rounds with probability greater than zero". However, what (I argue) you CANNOT do is equate "playing forever with probability greater than zero" with "play eventually terminating with probability almost surely converging to a value less than 1" - this is a much stronger criterion.
     
  13. Dec 27, 2013 #12

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    An anchovy taking on a big fish!
     
  14. Dec 27, 2013 #13

    Office_Shredder

    Staff Emeritus
    Science Advisor
    Gold Member

    The intended question is this: if [itex]P_k[/itex] is the probability of busting on the kth flip, find
    [tex] \sum_{k\geq 1} P_k [/tex] and determine whether this number is equal to 1 or not.

    I don't think I understand your subtlety, MrAnchovy. I thought this was equivalent to your suggestion for a 12c problem.

    DH, that was definitely not how I expected the problem to be solved. How do you calculate that sum?
     
    Last edited: Dec 27, 2013
  15. Dec 27, 2013 #14

    mfb

    2016 Award

    Staff: Mentor

    Let's cover the boring cases first:
    With n=0 or p=0, the man will always get rid of coins and stop with probability 1.
    With n=1 and p=1, the man will always stay at one coin and play forever.
    With n=1 and p<1, the man cannot gain coins and will eventually lose his coin. He stops with probability 1.
    With p=1 and n>0, the man cannot get rid of coins and will play forever.

    For the general case, consider n>1 and 0<p<1:
    Let r(k) be the probability of running out of coins with an initial stack of k. Then ##r(k)=\left(r(1)\right)^k##, as "run out of coins" has to happen for each coin on the stack, and this happens independently for each coin.

    Clearly r(1) stays the same if we consider the result after one round. Therefore,
    $$r(1)=(1-p)+pr(n)=(1-p)+pr(1)^n$$
    Define R:=r(1):
    $$0=pR^n-R+(1-p)$$
    To find solutions, let
    $$f(R)=pR^n-R+(1-p)$$
    Clearly R=1 is a root of f and f(0)=1-p > 0.
    ##f''(R)=pn(n-1)R^{n-2} > 0## for R>0, so f is strictly convex there and cannot have more than two positive roots. We have already found one; to get a second root between 0 and 1 we need ##f'(1)=np-1 > 0##.

    Therefore, to have a non-zero probability of playing forever (R<1), the condition is np>1.
    This agrees with the idea that the expectation value of the game should be positive, as we have to "invest" 1 coin per round.


    By the way: in addition to answering "is the probability non-zero?", this formula allows one to calculate the probability itself.
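    As a sketch of that last remark: the run-out probability is the smallest non-negative root of R = pR^n + (1-p), and iterating that map from R = 0 converges monotonically to it (the standard trick for branching-process extinction probabilities; the iteration count below is an arbitrary choice, and convergence is slow near np = 1):

    Code (Python):
    def run_out_prob(n, p, iters=100000):
        # Smallest non-negative fixed point of R = p*R**n + (1-p),
        # reached by monotone iteration starting from R = 0.
        R = 0.0
        for _ in range(iters):
            R = p * R ** n + (1 - p)
        return R

    for n, p in [(2, 0.6), (3, 0.4), (2, 0.5)]:
        R = run_out_prob(n, p)
        print(f"n={n}, p={p}: run-out prob ≈ {R:.6f}, play-forever prob ≈ {1 - R:.6f}")

    For n=2, p=0.6 this gives 2/3 and 1/3; for np ≤ 1 it returns (approximately) 1 and 0, consistent with np > 1 being the condition.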
     
    Last edited: Dec 27, 2013
  16. Dec 27, 2013 #15

    D H

    Staff Emeritus
    Science Advisor

    You have the sense of p backwards. p is the probability of getting heads, which is what the player wants. p=1 means the player always wins, so he definitely plays forever. After fixing this, your condition is the same as mine: np>1.
     
  17. Dec 27, 2013 #16

    D H

    Staff Emeritus
    Science Advisor

    That's exactly what I calculated. I noticed that P_k is identically zero unless k = rn+1 for some non-negative integer r, and I calculated those non-zero terms (which I denoted [itex]P_n(k)[/itex]).


    You can't. What I wrote was nonsense (now corrected). I had a LaTeX typo in my expressions. Here are the corrected expressions:

    The probability of going bust on roll #kn+1 is
    [tex]P_n(k) = \frac 1 {(n-1)k+1} \binom{nk}k p^k (1-p)^{(n-1)k+1}[/tex]
    Summing these individual probabilities to yield the total probability of going bust after an infinite number of rolls yields
    [tex]P_n = \sum_{k=0}^{\infty} \frac 1 {(n-1)k+1} \binom{nk}k p^k (1-p)^{(n-1)k+1}[/tex]
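    For small n and k the coefficient can be checked by brute force: enumerate every sequence of kn+1 flips and keep those that first empty the stack on the final flip (Python; the enumeration is exponential in kn, so only tiny cases are feasible).

    Code (Python):
    from itertools import product
    from math import comb

    def bust_exact(n, p, k):
        # The closed form above.
        return comb(n * k, k) * p ** k * (1 - p) ** ((n - 1) * k + 1) / ((n - 1) * k + 1)

    def bust_brute(n, p, k):
        # Sum the probabilities of all flip sequences of length k*n+1 that
        # first hit an empty stack on their final flip.
        length = k * n + 1
        total = 0.0
        for seq in product((True, False), repeat=length):  # True = heads
            coins, ok = 1, True
            for i, heads in enumerate(seq):
                coins += (n - 1) if heads else -1
                if coins == 0 and i < length - 1:
                    ok = False  # busted before the final flip
                    break
            if ok and coins == 0:
                h = sum(seq)    # number of heads (necessarily k here)
                total += p ** h * (1 - p) ** (length - h)
        return total

    for n, k in [(2, 1), (2, 2), (2, 3), (3, 1), (3, 2)]:
        print(n, k, bust_exact(n, 0.55, k), bust_brute(n, 0.55, k))

    The two values agree for these small cases.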
     
  18. Dec 27, 2013 #17

    mfb

    2016 Award

    Staff: Mentor

    Oh, you are right. I misread heads/tails.
     
  19. Dec 27, 2013 #18
    Put simply, my subtlety is that although the game may eventually terminate with probability 1, you may have to wait for more than any finite number of rounds (i.e. forever) for that to happen.

    Restating the problem as "What is the condition on n and p for the man to have a non-zero probability of not eventually losing all his coins?" avoids this problem but introduces an awkward double negative; perhaps "What is the condition on n and p for the man to have a probability less than one of eventually losing all his coins?" is better.

    Edit: or (to match 12a) "Once he begins flipping, what is the condition on n and p such that the probability that he ever runs out of coins is less than one?"
     
  20. Dec 27, 2013 #19

    D H

    Staff Emeritus
    Science Advisor

    You are making a mountain out of a molehill, an infinitesimally small molehill at that. Office_Shredder's ##\sum_{k=1}^{\infty} P_k=1## is shorthand for ##\forall \epsilon >0 \, \exists N \in \mathbb{N} : \forall n>N \,\,|1 - \sum_{k=1}^n P_k|<\epsilon##. Pick a positive epsilon. No matter how close to zero you pick your epsilon, you will find a finite N such that the partial sums are within this epsilon of one. If this sum is one, the player almost surely will not play forever. The probability of an almost sure event is one. Not "almost one": one, period. The point of this challenge is to find the relationship between Office_Shredder's n and p that marks the boundary between this series having a sum of one and a sum less than one.
     
  21. Dec 27, 2013 #20

    Office_Shredder

    Staff Emeritus
    Science Advisor
    Gold Member

    As you have observed, R=1 is a root, so you need to check that this root is not the actual probability of running out of coins (which is the tricky part I mention in the opening post).

    MrAnchovy, are you suggesting that he might run out of coins on flip [itex]\omega+1[/itex] (with [itex]\omega[/itex] the order type of the natural numbers) or later? If you want to attack the problem where he takes more than [itex]\omega[/itex] flips, then go for it; it wasn't what I intended, but it is a valid enough interpretation that I am interested in seeing what comes up for a proof.
     