Gambler's dilemma: should you stop at small wins?

Thread starter: Happiness
Suppose the chance of winning and the chance of losing a game are both 0.5. You have $A at first, and the bet per game is $1. You stop playing when either you lose all your money (bad outcome) or when you reach $B, where B>A (good outcome). Then the chance of having the good outcome is ##\frac{A}{B}##.
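The ##\frac{A}{B}## result (the classic gambler's ruin probability) is easy to check by simulation. A minimal sketch in Python; the function name and parameters are my own:

```python
import random

def prob_good_outcome(A, B, trials=100_000, p=0.5):
    """Estimate the chance of reaching $B before going broke,
    starting from $A and betting $1 per game with win probability p."""
    successes = 0
    for _ in range(trials):
        money = A
        while 0 < money < B:
            money += 1 if random.random() < p else -1
        if money == B:
            successes += 1
    return successes / trials

# Theory predicts A/B; e.g. A=5, B=10 should give roughly 0.5.
print(prob_good_outcome(5, 10))
```

For B = 2A the estimate hovers around 0.5, consistent with the claim that aiming higher than $2A makes the bad outcome the more likely one.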

$B is the amount of money you aim to reach before you stop playing. Clearly, it doesn't make sense to aim for more than $2A, because then the chance of the good outcome, ##\frac{A}{B}##, drops below ##\frac{1}{2}##, and you are more likely to lose all your money than to reach your goal. So the best strategy is to stop at small wins.

However, once you reach $B, the probabilities get updated, just as the probability of getting another head from a coin toss is still 0.5 after getting 9 heads in a row. So you continue playing, telling yourself you will stop at small wins. But every time you reach your goal, the probabilities get updated again, and you always continue playing.

And if you always continue playing, it means you will eventually go beyond $2A. But we showed earlier that this is not a wise choice.
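One way to make the contradiction concrete (my own framing, not from the thread): each attempt to double your bankroll is a fresh gamble you win with probability ##\frac{A}{2A} = \frac{1}{2}##, so the chance of surviving k such rounds in a row shrinks geometrically:

```python
# Each attempt to double the bankroll succeeds with probability 1/2,
# so "always continue playing" survives k doublings with probability (1/2)**k.
for k in range(1, 6):
    print(k, 0.5 ** k)
```

A player who never stops will therefore, with probability 1, eventually hit a failed doubling and lose everything.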

So should you stop at small wins or not?
 

mathman

If you reach B and continue, then you need to set yourself a stopping point somewhere below B. Otherwise you will eventually lose it all. (Of course, if your opponent has only a finite amount of money, you might break the bank first.)
 
Suppose you start with an infinite amount of money, or at least a very large amount, and you always bet $1, as you said.

This is like tossing a fair coin: you gain $1 for heads and lose $1 for tails. For a large number of tosses the fraction of heads settles near 50%, so in the end you expect to have ##a##, the money you started with, plus ##x## in wins, minus ##x## in losses: ##a## again, i.e. nothing gained.
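The zero-expectation argument can be written out as a one-line calculation (the numbers are illustrative, not from the thread):

```python
# Expected change per $1 bet on a fair coin: 0.5*(+1) + 0.5*(-1) = 0.
p_win, p_lose = 0.5, 0.5
expected_change_per_game = p_win * 1 + p_lose * (-1)

a = 100        # starting money (illustrative)
n_games = 1000
expected_final = a + n_games * expected_change_per_game
print(expected_final)  # 100.0: on average you end with what you started
```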
 
Happiness
The confusing point is that at the beginning it is not favourable to aim for more than twice our starting amount, but once we reach our aim, it seems we can continue playing and hence exceed twice our starting amount.

The exact chance of winning is not the important thing. Suppose we change the chance of winning to 0.6 and the chance of losing to 0.4. Then we would have a different goal, but once we reach our goal, should we continue playing?
 
Happiness said: "... once we reach our goal, should we continue playing?"
Same with Russian Roulette. Should you continue playing?
 

mathman

The point is, if you keep on playing every time you reach a goal, the question becomes: who goes broke first, you or the bank?
 

FactChecker

Another thing to consider is whether your opponent also has the option of stopping when he is ahead. He may not allow you to keep playing in the hope of eventually getting ahead, which lets him end the game at a point where you are behind, as though you had gone bankrupt. I think that forces you to adopt the policy of quitting immediately when you get ahead; anything else gives him an advantage. He can play until he gets ahead and then quit immediately, and unless you do the same, his expected winnings are positive.
 

BWV

What does 'should' mean? The only reason to gamble with a zero (or negative) expected value is entertainment. In an economic sense, the Kelly criterion determines how much you should bet (from the Wikipedia entry):

For simple bets with two outcomes, one involving losing the entire amount bet, and the other involving winning the bet amount multiplied by the payoff odds, the Kelly bet is:

##f^* = \frac{bp - q}{b} = p - \frac{q}{b}##

where:

  • f * is the fraction of the current bankroll to wager, i.e. how much to bet;
  • b is the net odds received on the wager ("b to 1"); that is, you could win $b (on top of getting back your $1 wagered) for a $1 bet
  • p is the probability of winning;
  • q is the probability of losing, which is 1 − p.
As an example, if a gamble has a 60% chance of winning (p = 0.60, q = 0.40), and the gambler receives 1-to-1 odds on a winning bet (b = 1), then the gambler should bet 20% of the bankroll at each opportunity (f* = 0.20), in order to maximize the long-run growth rate of the bankroll.
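The quoted formula and example can be checked directly. A small sketch (the function name is my own):

```python
def kelly_fraction(p, b):
    """Kelly fraction f* = (b*p - q) / b for a bet that wins $b per $1
    wagered with probability p and loses the $1 with probability q = 1 - p."""
    q = 1 - p
    return (b * p - q) / b

print(kelly_fraction(0.60, 1))  # ≈ 0.2, matching the example above
print(kelly_fraction(0.50, 1))  # 0.0: for a fair game, Kelly says bet nothing
```

Note that for the fair game discussed in this thread (p = 0.5, b = 1) the Kelly fraction is zero, which is BWV's point: economically, the optimal bet is no bet at all.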
 
Happiness
Suppose the bet size is constant. What is the best winning strategy?

If your bet size is always 20% of the bankroll, then you would never go broke, and the game would never end.
 

FactChecker

Happiness said: "If your bet size is always 20% of the bankroll, then you would never go broke, and the game would never end."
That is if you are playing against the house that can not end the game. If you are playing against another player who can end the game, that is not true.
 

BWV

The Kelly criterion does not guarantee you won't go broke. There is always a minimum bet, either because the house sets one or because money is not infinitely divisible. In the example above, betting the fraction ##f^*## of your bankroll with a minimum bet ##m##, you go broke after ##n## consecutive losses once

##(1-f^*)^n \times \text{bankroll} < m.##
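That broke condition can be turned into a quick calculation of how many straight losses ruin takes (the function name and numbers are my own):

```python
def losses_to_go_broke(bankroll, f, m):
    """Smallest n with (1 - f)**n * bankroll < m: after n consecutive
    losses, betting the fraction f each time, the bankroll falls below
    the minimum bet m and the player can no longer play."""
    n = 0
    money = bankroll
    while money >= m:
        money *= 1 - f
        n += 1
    return n

print(losses_to_go_broke(1000, 0.20, 1))  # 31 straight losses, starting from $1000
```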
 
