
I Gambler's dilemma: should you stop at small wins?

  1. Nov 4, 2018 #1
    Suppose the chance of winning and the chance of losing a game are both 0.5. You have $A at first, and the bet per game is $1. You stop playing when either you lose all your money (bad outcome) or when you reach $B, where B>A (good outcome). Then the chance of having the good outcome is ##\frac{A}{B}##.
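The ##\frac{A}{B}## result is easy to check numerically. A minimal Monte Carlo sketch (the function names here are illustrative, not from the thread):

```python
import random

random.seed(1)  # fix the seed so the estimate is reproducible

def reach_goal(a, b):
    """Simulate one session: start with $a, bet $1 on a fair coin flip,
    and stop at $0 (ruin) or $b (goal). Return True if the goal is reached."""
    money = a
    while 0 < money < b:
        money += 1 if random.random() < 0.5 else -1
    return money == b

def estimate(a, b, trials=20_000):
    """Monte Carlo estimate of the probability of reaching $b before ruin."""
    return sum(reach_goal(a, b) for _ in range(trials)) / trials

# With A = 3 and B = 10 the theoretical success probability is A/B = 0.3.
print(estimate(3, 10))
```

The estimate should land close to 0.3, matching the gambler's-ruin formula above.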

    $B is the amount of money you aim to reach before you stop playing. Clearly, it doesn't make sense to aim for more than $2A, because then ##\frac{A}{B} < \frac{1}{2}## and you are more likely to lose all your money than to reach the goal. So the best strategy is to stop at small wins.

    However, once you reach $B, the probabilities get updated, just like the probability of getting another head from a coin toss is still 0.5 after getting 9 heads in a row. So you continue playing, telling yourself you would stop at small wins. But every time you reach your goal, the probabilities get updated, and you would always continue playing.

    And if you always continue playing, you will eventually be aiming beyond $2A, which we already showed is not a wise choice.
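The limiting case of this argument can be made concrete: a gambler who never locks in a win is effectively raising the goal ##B## without bound, and the success probability ##\frac{A}{B}## shrinks toward zero. A short sketch (`success_prob` is just a name for the gambler's-ruin formula):

```python
def success_prob(a, b):
    """Gambler's-ruin formula for a fair $1-bet game: the probability
    of reaching $b before hitting $0, starting from $a."""
    return a / b

# Raising the goal without bound drives the success probability to 0,
# i.e. the gambler who never stops is eventually ruined.
a = 100
for b in (200, 1_000, 10_000, 100_000):
    print(f"goal ${b}: P(success) = {success_prob(a, b):.4f}")
```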

    So should you stop at small wins or not?
     
    Last edited: Nov 4, 2018
  3. Nov 4, 2018 #2

    mathman

    Science Advisor

    If you reach $B and continue, then you need to set yourself a new stopping point somewhere below $B as well; otherwise you will eventually lose it all. (Of course, if your opponent has only a finite amount of money, you might break the bank instead.)
     
  4. Nov 5, 2018 #3
    Suppose you start with an infinite, or at least very large, amount of money, and you always bet $1, as you said.

    This is like tossing a fair coin: you win $1 on heads and lose $1 on tails. Over a large number of throws the fraction of heads settles toward 50%, so in the end you expect to have ##a## (the money you started with) plus ##x## (your wins) minus ##x## (your losses), which is ##a## again: nothing gained.
     
  5. Nov 5, 2018 #4
    The confusing point is that at the beginning it is not favourable to aim for more than twice our starting amount, yet once we reach our aim, it seems we can keep playing and thereby exceed twice our starting amount.

    The exact chance of winning is not essential to the dilemma. Suppose we change the chance of winning to 0.6 and the chance of losing to 0.4. Then we would set a different goal, but the same question remains: once we reach that goal, should we continue playing?
     
  6. Nov 5, 2018 #5
    Same with Russian Roulette. Should you continue playing?
     
  7. Nov 5, 2018 #6
    Sorry I don't get your point.
     
  8. Nov 5, 2018 #7

    mathman

    Science Advisor

    The point is if you keep on playing when you reach a goal, the question becomes who goes broke first, you or the bank?
     
  9. Nov 5, 2018 #8

    FactChecker

    Science Advisor
    Gold Member
    2017 Award

    Another thing to consider is whether your opponent also has the option of stopping when he is ahead. He may not allow you to keep playing in the hope of eventually getting ahead; in effect, he can make the game one in which you go bankrupt immediately. I think this forces you to adopt a policy of quitting immediately when you get ahead. Anything else gives him an advantage: he can play until he gets ahead and then quit immediately, and unless you do the same, his expected winnings are positive.
     
    Last edited: Nov 9, 2018
  10. Nov 9, 2018 #9

    BWV


    What does 'should' mean? The only reason to gamble with a zero (or negative) expected value is entertainment. In an economic sense, the Kelly criterion determines how much you should bet:
    (From the Wikipedia entry)

    For simple bets with two outcomes, one involving losing the entire amount bet, and the other involving winning the bet amount multiplied by the payoff odds, the Kelly bet is:

    ##f^* = \frac{bp - q}{b} = p - \frac{q}{b}##
    where:

    • ##f^*## is the fraction of the current bankroll to wager, i.e. how much to bet;
    • ##b## is the net odds received on the wager ("##b## to 1"); that is, you could win $b (on top of getting back your $1 wagered) for a $1 bet;
    • ##p## is the probability of winning;
    • ##q## is the probability of losing, which is ##1 - p##.
    As an example, if a gamble has a 60% chance of winning (p = 0.60, q = 0.40), and the gambler receives 1-to-1 odds on a winning bet (b = 1), then the gambler should bet 20% of the bankroll at each opportunity (f* = 0.20), in order to maximize the long-run growth rate of the bankroll.
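The Wikipedia example can be reproduced in a few lines (a sketch; `kelly_fraction` is my name for the formula quoted above):

```python
def kelly_fraction(p, b):
    """Kelly bet fraction f* = (b*p - q)/b for a b-to-1 wager
    that is won with probability p."""
    q = 1 - p
    return (b * p - q) / b

# p = 0.60 with even odds (b = 1): bet 20% of the bankroll.
print(round(kelly_fraction(0.60, 1), 4))  # 0.2

# For the fair game in the original post (p = 0.5, b = 1),
# Kelly says the optimal bet is zero: don't play.
print(round(kelly_fraction(0.50, 1), 4))  # 0.0
```

Note the second call: applied to the original poster's fair game, the criterion recommends not betting at all, which is consistent with the zero-expected-value argument earlier in the thread.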
     
  11. Nov 9, 2018 #10
    Suppose the bet size is constant. What is the best winning strategy then?

    If your bet size is always 20% of the bankroll, then you would never go broke, and the game would never end.
     
  12. Nov 10, 2018 #11

    FactChecker

    Science Advisor
    Gold Member
    2017 Award

    That is true if you are playing against a house that cannot end the game. If you are playing against another player who can end the game, it is not.
     
  13. Nov 10, 2018 #12

    BWV


    The Kelly criterion does not guarantee you won't go broke. There is always a minimum bet, either because it is set by the house or because money is not infinitely divisible. In the example above, betting a fraction ##f^*## of your bankroll with a minimum bet ##m##, you are effectively broke after ##n## consecutive losses once

    ##(1-f^*)^n \cdot \text{bankroll} < m.##
     