Gambler's dilemma: should you stop at small wins?

AI Thread Summary
The discussion centers on the gambler's dilemma regarding whether to stop at small wins or continue playing after reaching a target amount, B. It highlights that while the probability of winning remains constant, the risk of losing all money increases if one continues beyond twice the initial amount, A. The conversation also touches on the importance of setting a stopping point to avoid eventual losses, especially when competing against an opponent who can also choose to stop. The Kelly Criterion is mentioned as a strategy for determining optimal bet sizes to maximize bankroll growth, but it does not guarantee avoiding bankruptcy. Ultimately, the consensus suggests that having a clear quitting strategy is essential in gambling scenarios.
Happiness
Suppose the chance of winning and the chance of losing a game are both 0.5. You have $A at first, and the bet per game is $1. You stop playing when either you lose all your money (bad outcome) or when you reach $B, where B>A (good outcome). Then the chance of having the good outcome is ##\frac{A}{B}##.
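This ##\frac{A}{B}## result is the classic gambler's ruin probability, and it is easy to check numerically. The sketch below is not from the thread; the function name `ruin_prob` is made up for illustration.

```python
import random

def ruin_prob(a, b, trials=50_000):
    """Estimate the chance of reaching $b before going broke,
    starting from $a and betting $1 per round on a fair coin."""
    successes = 0
    for _ in range(trials):
        money = a
        while 0 < money < b:
            money += 1 if random.random() < 0.5 else -1
        if money == b:
            successes += 1
    return successes / trials

print(ruin_prob(5, 10))  # close to the predicted A/B = 0.5
```

The estimate converges to ##\frac{A}{B}## as the number of trials grows.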

$B is the amount of money you aim to get before you stop playing. Clearly, it doesn't make sense to aim for more than $2A, because then ##\frac{A}{B}<\frac{1}{2}##: losing all your money becomes more likely than reaching the goal. So the best strategy is to stop at small wins.

However, once you reach $B, the probabilities get updated, just like the probability of getting another head from a coin toss is still 0.5 after getting 9 heads in a row. So you continue playing, telling yourself you would stop at small wins. But every time you reach your goal, the probabilities get updated, and you would always continue playing.

And if you always continue playing, you will eventually aim beyond $2A, which, as shown above, is not a wise choice.
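One way to make this concrete: suppose that each time you reach a goal you set a new, higher one, ##B_1 < B_2 < \dots##. The survival probabilities multiply and telescope to ##A/B_n##, which shrinks toward 0. A small sketch (the particular goal sequence is just an illustration):

```python
from fractions import Fraction

A = 10
goals = [15, 20, 30, 50, 100, 1000]  # successive targets, chosen for illustration

survive = Fraction(1)
money = A
for B in goals:
    survive *= Fraction(money, B)  # chance of reaching B before going broke
    money = B

# The product telescopes to A / (final goal)
print(survive)  # 1/100
```

Letting the goal sequence grow without bound drives the survival probability to zero, which is exactly "you will eventually lose it all".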

So should you stop at small wins or not?
 
If you reach $B and continue, then you need to set yourself a new stopping point at some level below $B. Otherwise you will eventually lose it all. (Of course, if your opponent has a finite amount of money, you may break them first.)
 
Suppose you start with an infinite amount of money, or at least a very large amount, and, as you said, you always bet $1.

This is just like tossing a fair coin: you win $1 on heads and lose $1 on tails. Over a large number of tosses the frequency of heads settles to 50%, so in the end you have "a", the money you started with, plus "x", your wins, minus "x", your losses: that is "a", and nothing gained.
 
The point that is confusing is that at the beginning, it is not favourable to have an aim of more than twice our starting amount, but once we get to our aim, it seems that we can continue playing and hence exceed twice our starting amount.

The chance of winning is not important. Suppose we change the chance of winning to 0.6 and the chance of losing to 0.4. Then we would have a different goal. But once we reach our goal, should we continue playing?
 
Happiness said:
The point that is confusing is that at the beginning, it is not favourable to have an aim of more than twice our starting amount, but once we get to our aim, it seems that we can continue playing and hence exceed twice our starting amount.

The chance of winning is not important. Suppose we change the chance of winning to 0.6 and the chance of losing to 0.4. Then we would have a different goal. But once we reach our goal, should we continue playing?
Same with Russian Roulette. Should you continue playing?
 
Rada Demorn said:
Same with Russian Roulette. Should you continue playing?

Sorry I don't get your point.
 
The point is if you keep on playing when you reach a goal, the question becomes who goes broke first, you or the bank?
 
Another thing to consider is whether your opponent also has the option of stopping when he is ahead. He may not allow you to keep playing in the hopes of eventually getting ahead. That allows him to make the game as though you go bankrupt immediately. I think that it forces you to have a policy of quitting immediately when you get ahead. Anything else gives him an advantage. He can play till he gets ahead and quit immediately. Unless you do the same, his expected winnings are positive.
 
What does 'should' mean? The only reason to gamble with a zero (or less) expected value is entertainment. In an economic sense, the Kelly Criterion determines how much you should bet:
(From the Wikipedia entry)

For simple bets with two outcomes, one involving losing the entire amount bet, and the other involving winning the bet amount multiplied by the payoff odds, the Kelly bet is:

##f^* = \frac{bp - q}{b} = p - \frac{q}{b}##

where:

  • f* is the fraction of the current bankroll to wager, i.e. how much to bet;
  • b is the net odds received on the wager ("b to 1"); that is, you win $b (on top of getting back your $1 wagered) for a $1 bet;
  • p is the probability of winning;
  • q is the probability of losing, which is 1 − p.
As an example, if a gamble has a 60% chance of winning (p = 0.60, q = 0.40), and the gambler receives 1-to-1 odds on a winning bet (b = 1), then the gambler should bet 20% of the bankroll at each opportunity (f* = 0.20), in order to maximize the long-run growth rate of the bankroll.
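For the two-outcome case above, the formula is a one-liner; the function name here is my own:

```python
def kelly_fraction(b, p):
    """Kelly fraction f* = (b*p - q) / b for 'b to 1' payoff odds
    and win probability p, where q = 1 - p."""
    q = 1.0 - p
    return (b * p - q) / b

print(round(kelly_fraction(1, 0.60), 6))  # 0.2 of the bankroll
```

Note that for a fair even-money game (p = 0.5, b = 1) the Kelly fraction is 0: the criterion says not to bet at all.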
 
Suppose the bet size is constant. What is the best winning strategy?

If your bet size is always 20% of the bankroll, then you would never go broke, and the game would never end.
 
Happiness said:
If your bet size is always 20% of the bankroll, then you would never go broke, and the game would never end.
That is true only if you are playing against the house, which cannot end the game. If you are playing against another player who can end the game, it is not.
 
The Kelly Criterion does not guarantee you won't go broke. There is always a minimum bet m, either because it is set by the house or because money is not infinitely divisible. In the example above, betting the fraction f* of your bankroll each time, you go broke after n consecutive losses once

##(1-f^*)^n \times \text{bankroll} < m##.
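Solving that inequality for n gives the length of the losing streak that busts you. A sketch, using the f* = 0.20 example with an assumed $1000 bankroll and $1 minimum bet (both figures are mine, for illustration):

```python
import math

def losses_to_bust(f, bankroll, min_bet):
    """Smallest n of consecutive losses with (1 - f)**n * bankroll < min_bet,
    i.e. the first integer n exceeding log(min_bet / bankroll) / log(1 - f)."""
    return math.ceil(math.log(min_bet / bankroll) / math.log(1 - f))

print(losses_to_bust(0.20, 1000, 1))  # 31
```

So even at the Kelly fraction, 31 losses in a row (probability 0.4^31 per streak in the p = 0.6 example, but inevitable over infinite play) leaves you unable to place the minimum bet.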
 