Gambler's dilemma: should you stop at small wins?

  • Context: Undergrad
  • Thread starter: Happiness
SUMMARY

The discussion centers on the Gambler's Dilemma, specifically the strategy of stopping at small wins versus continuing to play after reaching a target amount, B, which should not exceed 2A, where A is the initial bankroll. Participants emphasize that while the probability of winning remains constant, the decision to continue playing after reaching B can lead to significant losses. The Kelly Criterion is introduced as a method for determining optimal bet sizes, suggesting that a gambler should wager a fraction of their bankroll based on the odds and probabilities of winning. Ultimately, the consensus is that setting a stopping point below B is crucial to avoid losing all funds.

PREREQUISITES
  • Understanding of basic probability concepts, particularly in gambling scenarios.
  • Familiarity with the Kelly Criterion for optimal betting strategies.
  • Knowledge of bankroll management in gambling contexts.
  • Basic grasp of game theory as it applies to competitive gambling.
NEXT STEPS
  • Research the Kelly Criterion in-depth to understand its application in various betting scenarios.
  • Explore advanced probability theory related to gambling outcomes and risk assessment.
  • Study bankroll management techniques to optimize gambling strategies and minimize losses.
  • Analyze case studies of gambling strategies that incorporate stopping rules and their effectiveness.
USEFUL FOR

This discussion is beneficial for gamblers, game theorists, and financial analysts interested in risk management and decision-making strategies in uncertain environments.

Happiness
Suppose the chance of winning and the chance of losing a game are both 0.5. You have $A at first, and the bet per game is $1. You stop playing when either you lose all your money (bad outcome) or when you reach $B, where B>A (good outcome). Then the chance of having the good outcome is ##\frac{A}{B}##.

$B is the amount of money you aim to get before you stop playing. Clearly, it doesn't make sense to aim for more than $2A, because then ##\frac{A}{B}<\frac{1}{2}## and you are more likely to lose all your money than to reach your goal. So the best strategy is to stop at small wins.
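As a quick check of the ##\frac{A}{B}## figure and the $2A cut-off, here is a minimal Monte Carlo sketch in Python (the function name and trial count are my own illustrative choices):

```python
# Monte Carlo check (illustration only): estimate the chance of reaching $B
# before going broke, starting from $A with fair $1 bets.
import random

def prob_reach_goal(A, B, trials=20_000):
    wins = 0
    for _ in range(trials):
        money = A
        while 0 < money < B:
            money += 1 if random.random() < 0.5 else -1
        if money == B:
            wins += 1
    return wins / trials

print(prob_reach_goal(10, 20))  # about 0.50, matching A/B = 10/20
print(prob_reach_goal(10, 40))  # about 0.25: aiming beyond $2A, you lose more often than you win
```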

However, once you reach $B, the probabilities get updated, just like the probability of getting another head from a coin toss is still 0.5 after getting 9 heads in a row. So you continue playing, telling yourself you would stop at small wins. But every time you reach your goal, the probabilities get updated, and you would always continue playing.

And if you always continue playing, it means you will go beyond $2A. But we already showed earlier that this is not a wise choice.

So should you stop at small wins or not?
 
If you reach B and continue, then you need to set yourself a stopping point somewhere below B. Otherwise you will eventually lose it all. (Of course, if your opponent has only a finite amount of money, you might break the bank first.)
 
Suppose you start with an infinite amount of money, or at least a very large amount. You said you always bet $1.

This is similar to tossing a fair coin: you win $1 for heads and lose $1 for tails. Over a large number of throws the observed frequency settles to 50%, so in the end you have ##a## (the money you started with) plus ##x## (your wins) minus ##x## (your losses), which is just ##a## again: nothing gained.
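A short simulation sketch of that averaging argument (parameter values are illustrative, and the starting amount is chosen large enough that the bankroll cannot hit zero within the run):

```python
# Sketch of the averaging argument: the average final bankroll after many
# fair $1 bets stays at the starting amount, since each bet has expected value zero.
import random

def average_final_bankroll(a=1000, n_bets=500, trials=20_000):
    total = 0
    for _ in range(trials):
        money = a
        for _ in range(n_bets):
            money += 1 if random.random() < 0.5 else -1
        total += money
    return total / trials

print(average_final_bankroll())  # close to 1000: wins and losses cancel on average
```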
 
The point that is confusing is that at the beginning, it is not favourable to have an aim of more than twice our starting amount, but once we get to our aim, it seems that we can continue playing and hence exceed twice our starting amount.

The chance of winning is not important. Suppose we change the chance of winning to 0.6 and the chance of losing to 0.4. Then we would have a different goal. But once we reach our goal, should we continue playing?
 
Happiness said:
The point that is confusing is that at the beginning, it is not favourable to have an aim of more than twice our starting amount, but once we get to our aim, it seems that we can continue playing and hence exceed twice our starting amount.

The chance of winning is not important. Suppose we change the chance of winning to 0.6 and the chance of losing to 0.4. Then we would have a different goal. But once we reach our goal, should we continue playing?
Same with Russian Roulette. Should you continue playing?
 
Rada Demorn said:
Same with Russian Roulette. Should you continue playing?

Sorry I don't get your point.
 
The point is if you keep on playing when you reach a goal, the question becomes who goes broke first, you or the bank?
 
Another thing to consider is whether your opponent also has the option of stopping when he is ahead. He may not allow you to keep playing in the hope of eventually getting ahead. In effect, that lets him end the game as though you had gone bankrupt immediately. I think it forces you to have a policy of quitting immediately when you get ahead. Anything else gives him an advantage: he can play until he gets ahead and then quit immediately, and unless you do the same, his expected winnings are positive.
 
What does 'should' mean? The only reason to gamble with a zero (or negative) expected value is entertainment. In an economic sense, the Kelly Criterion determines how much you should bet:
(From the Wikipedia entry)

For simple bets with two outcomes, one involving losing the entire amount bet, and the other involving winning the bet amount multiplied by the payoff odds, the Kelly bet is:

$$f^* = \frac{bp - q}{b} = p - \frac{q}{b}$$

where:

  • ##f^*## is the fraction of the current bankroll to wager, i.e. how much to bet;
  • ##b## is the net odds received on the wager ("b to 1"); that is, you could win $b (on top of getting back your $1 wagered) for a $1 bet;
  • ##p## is the probability of winning;
  • ##q## is the probability of losing, which is 1 − p.
As an example, if a gamble has a 60% chance of winning (p = 0.60, q = 0.40), and the gambler receives 1-to-1 odds on a winning bet (b = 1), then the gambler should bet 20% of the bankroll at each opportunity (f* = 0.20), in order to maximize the long-run growth rate of the bankroll.
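For concreteness, the quoted formula in a few lines of Python (a sketch; the function name is my own):

```python
# The quoted Kelly formula: f* = (b*p - q) / b, with q = 1 - p.
def kelly_fraction(p, b):
    """Fraction of the bankroll to wager, given win probability p and net odds b-to-1."""
    q = 1 - p
    return (b * p - q) / b

print(kelly_fraction(p=0.60, b=1))  # ≈ 0.2, i.e. bet 20% of the bankroll
```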
 
Suppose the bet size is constant. What is the best winning strategy?

If your bet size is always 20% of the bankroll, then you would never go broke, and the game would never end.
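A toy sketch of that claim, assuming money is infinitely divisible so a 20% stake can always be placed (the parameter values are my own, using the p = 0.6 game from the Kelly example):

```python
# Illustration: betting a fixed 20% fraction of the bankroll, the bankroll
# shrinks after losses but never actually reaches zero.
import random

def fractional_betting(bankroll=100.0, fraction=0.20, p=0.60, n_bets=10_000):
    for _ in range(n_bets):
        stake = fraction * bankroll
        bankroll += stake if random.random() < p else -stake
    return bankroll

print(fractional_betting())  # always > 0; with p = 0.6 this typically grows enormously
```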
 
Happiness said:
If your bet size is always 20% of the bankroll, then you would never go broke, and the game would never end.
That is if you are playing against the house that can not end the game. If you are playing against another player who can end the game, that is not true.
 
The Kelly Criterion does not guarantee you won't go broke. There is always a minimum bet, either because it is set by the house or because money is not infinitely divisible. In the example above, betting the fraction ##f^*## of your bankroll with a minimum bet ##m##, you are effectively broke after ##n## consecutive losses, where

##(1-f^*)^n \cdot \text{bankroll} < m.##
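A quick sketch of that inequality, solved for the smallest such ##n## (the bankroll, fraction, and minimum bet below are assumptions for illustration):

```python
# Smallest n with (1 - f)**n * bankroll < m: how many consecutive losses at a
# fixed betting fraction f shrink the bankroll below the minimum bet m.
import math

def losses_until_broke(bankroll, f, m):
    return math.ceil(math.log(m / bankroll) / math.log(1 - f))

print(losses_until_broke(bankroll=1000, f=0.20, m=1))  # 31 consecutive losses
```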
 
