One 50% bet is worse than fifty 1% bets?

  • Thread starter: iDimension

Summary
The discussion centers on two betting strategies with a £500 stake: a single 50% bet versus fifty 1% bets. The second option, spreading bets, is favored for providing multiple chances to win, even though both options have the same expected value of zero. Psychological factors play a role, as many prefer the allure of a big win from a single bet rather than the slower, smaller wins from multiple bets. However, critics note that the 1% bets increase the likelihood of losing everything, while the 50% bet allows for a quicker resolution. Ultimately, the conversation highlights the tension between risk management and the desire for immediate gratification in gambling strategies.
  • #31
mfb said:
The point that was unclear in the last few posts is the amount of money you get if you win - the $1000.
Yes, but you have a higher probability to not win, as shown before.

Yeh, I was just clearing up any confusion caused by my previous posts. So if you ran a computer simulation 10 million times, the 1% bets would end up at about 40% win and 60% lose, whereas the 50/50 bets would end up at 50% win and 50% lose?
 
  • #32
The 1% bets (assuming we play all 50 and have a chance to win $1000 each time) would end up:
- 60.5% to not win
- 30.6% win once
- 7.6% win twice
- 1.2% win three times
- 0.1% win four times
negligible chance to win more often

Your expectation value is then given by
-500 + 1000*(0.605*0 + 0.306*1 + 0.076*2 + 0.012*3 + 0.001*4) = 0 (the rounded numbers here would give -2, but that is a rounding error)

For the 50%-bet, there are just two options:
50% chance to not win
50% chance to win
Expectation value:
-500 + 1000*(0.5*0 + 0.5*1) = 0

If you play with the 1%-bets and stop as soon as you win:
1% chance to win with the first attempt
0.99*1% chance to win with the second attempt (0.99 accounts for the 1% probability that we stopped before)
0.99*.99*1% chance to win with the third attempt
...
0.99^49 * 1% chance to win with the last attempt
0.99^50 ≈ 60.5% chance to not win at all
The expectation value is zero again:
0.01*990 + 0.99*0.01*980 + ... + 0.99^49*0.01*500 + 0.99^50*(-500) = 0
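These figures can be checked directly (a minimal sketch using the binomial distribution; it assumes all 50 bets are played, with a $1000 payout per win, as above):

```python
from math import comb

def p_wins(k, n=50, p=0.01):
    """Probability of exactly k wins in n independent bets with win chance p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

for k in range(5):
    print(f"{k} wins: {p_wins(k):.1%}")

# Stake $500 up front; each win pays out $1000
ev = -500 + 1000 * sum(k * p_wins(k) for k in range(51))
print(f"expected net: ${ev:.2f}")
```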
 
  • #33
OK so just one last question.

If 50% bets are better than 1% bets, are 90% bets better than 50% bets?

Betting 90% ($900) to win $100 each time.
 
  • #34
iDimension said:
OK so just one last question.
If 50% bets are better than 1% bets, are 90% bets better than 50% bets?
Betting 90% ($900) to win $100 each time.
A 90% bet has a 90% chance to win. Winning yields a net profit of 10% of the pot.
A 90% bet has a 10% chance to lose. Losing yields a net loss of the player's outlay, i.e. 90% of the pot.

The expected value is 0.90 * $100 - 0.10 * $900 = $0

No matter how much or how little you bet, your expected net on a single play is $0.00. No matter what strategy or pattern of bets is employed, the expected net on any finite sequence of plays is also $0.00.
 
  • #35
What do you mean by expected net? Do you mean that when it all balances out over a large number of games, the money you spent might be $20 million, the amount you win is $10 million and the amount you lose is $10 million, meaning you've made 0% profit?
 
  • #36
iDimension said:
What do you mean by expected net? Do you mean that when it all balances out over a large number of games, the money you spent might be $20 million, the amount you win is $10 million and the amount you lose is $10 million, meaning you've made 0% profit?
By expected net, I mean the "expected value" of your net winnings -- (i.e. winnings minus losses). "Expected value" is a term used in probability and statistics when you have a probability distribution for a set of numeric outcomes. You multiply the probability of each outcome by the numeric value of that outcome and add up those products. The total is the expected value. It is also called the mean of the distribution.

Yes, over a sufficiently large number of bets, it is overwhelmingly likely that the total earnings will be approximately equal to the expected value per bet multiplied by the number of bets. This is called the "law of large numbers".
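The law of large numbers can be seen in a quick seeded simulation (a sketch using the 90% bet from earlier in the thread: a $900 stake winning $100 with probability 0.9):

```python
import random

random.seed(0)  # reproducible run
N = 100_000
# Each play: win $100 with probability 0.9, otherwise lose the $900 stake
net = sum(100 if random.random() < 0.9 else -900 for _ in range(N))
print(f"average net per bet over {N} plays: ${net / N:.2f}")
```

With the per-bet standard deviation at $300, the average over 100,000 plays should land within a dollar or so of the expected value of $0.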
 
  • #37
iDimension said:
OK so just one last question.

If 50% bets are better than 1% bets, are 90% bets better than 50% bets?

Betting 90% ($900) to win $100 each time.

You have to define what "better" means.
 
  • #38
Hornbein said:
You have to define what "better" means to you.
 
  • #39
iDimension said:
If 50% bets are better than 1% bets.
It is not, and I'm getting tired of repeating this.
 
  • #40
I'm confused about the initial set-up, and I think other people may be too. The one bet of $500, paying $500 50% of the time, is clear enough. Net expectation $250, right?
It's the 50 small bets of $10 I'm unsure about. What are the odds there? 100:1 or 1:1? I think people are confusing the fraction of the pot that you're betting with the odds on the small bets, which aren't stated explicitly. Either case is easy to analyse:
If the $10 bet is 100:1 against, each bet has a net expectation of 10 cents. Fifty of these makes a grand total expectation of $5.00. Not so good.
If the $10 bets have 1:1 odds, paying $10, the expectation of each bet is $5. Fifty of these makes your net expectation $250 - the same as with the single big bet, as it should be with 50:50 bets. BUT WITH THIS DIFFERENCE: with the 50 small bets you are virtually guaranteed to get a significant amount of money! $250 on average, but the chance of getting $0 or $500 is about a quadrillion to one against. So you can take an (almost) guaranteed win of a few hundred, or take a 50-50 chance at $500. This will depend on your own situation. Me, I'd take the found money.
 
  • #41
Based on what other people have said, it looks like your expected value numbers are wrong.

If you put $500 into the pot, you have a 50% chance of winning $500 and a 50% chance of losing $500, giving an expected value of $0.00, not $250.00.

If you put $10 into the pot, you have a 1% chance of winning $990 and a 99% chance of losing $10. Again, you can multiply this out and get an expected value of $0.00

In the long run, neither strategy is dominant. Both strategies lead to what everybody was thinking from the beginning- you end up with however much money you started with.

Your chance of winning is equal to x/1000 where x is how much money you put in.

How much money you win is equal to 1000 - x

So positive expected value is always (x/1000) * (1000 - x) which is (x - x^2/1000)

But you also have a chance to lose all the money you put in the pot. This is equal to (1000 - x)/1000, and you lose x. This gives an expected value of -x + x^2/1000.

Total expected value is the two added together: (x - x^2/1000) + (-x + x^2/1000) = 0

No matter how much money you put into the pot and no matter how many bets you make, the expected value is always 0.

Here is a way to help understand this.

Imagine there are 10 people with $100. They put all their money into a pot and one guy leaves with $1000

The same 10 people decide to put another $100 into the pot. Another person leaves with $1000. People have spent $2000 and $2000 has left the pot.

If this process continues forever, it's obvious that however much money goes into the pot is how much money leaves the pot.

Imagine that this process is repeated 10 times. All 10 people have won the pot once and have $1000. But they realize that that is exactly how much money all of them have put into the pot- 10 bets of $100 10 times, or $10,000.

There is no way anybody comes out of this system with any sort of advantage, and there is no way more money will ever leave the pot than what went into the pot.

In order to believe that there is a dominant strategy, you have to believe that money comes from nowhere. The pot does not make any money, it only redistributes the money that people put in in an even manner.
 
  • #42
xman720 said:
...

Yeah xman, I agree with this and I can see why this is now the case. But what other people have said here is that with the 1% bets, your chance of losing is greater than your chance of winning. Thus if you played a large number of games, your wins would be fewer than your losses, which is clearly worse than 50/50 bets.

I don't know why some people are struggling to understand what I'm saying; maybe I'm not great at explaining things. The key factor here is we don't have infinite money. You have just $500... when it's gone, it's gone, that's it.

Let's just use a $100 pot.

Method One: You put your full $50 into the pot, giving you one 50% chance to win the other $50. Whether you win or lose, you only ever play one game.

Method Two: You split your $50 into $1 bets.
Game 1: You bet $1, giving you a 1% chance to win the $100 pot. If you win, you walk away with $149 and never play again. If you lose, you play again.
Game 2: You bet another $1, giving you a 1% chance to win $99. If you win, you walk away with $148 and never play again. If you lose, you play again.
...
...
...
Game 50: You finally win with your last dollar. You walk away with the $100 pot, which is the same amount that Method One could have won, but you could have won more money with Method Two, making it the "better" option.

But then I see people here saying that the 1% bets actually give you a 60% chance to lose and a 40% chance to win, which is why I thought that the one 50% bet is the "better" option.
 
  • #43
Apologies, but I don't understand the original question. (I have taught probability at Stanford, so the theory is no problem.)

What is the role of the "other people" who "make up the other £500"?

And where it says "The second option is to spread your bets, and instead you decide to make fifty £10 bets giving you a 1% chance to win, but you get fifty tries," the fifty £10 bets give you a 1% chance to win what, exactly?

And are you saying the 50 bets each give you a 1% chance of winning [whatever], or do they give you a 1% chance of winning [whatever] when taken together?

Is there any chance that someone can express the question clearly and unambiguously? In my fairly extensive experience, probability questions are often asked unclearly and ambiguously, and that is the main thing that makes them hard.
 
  • #44
iDimension said:
Game 50: You finally win with your last dollar. You walk away with the $100 pot bringing you to $100 which is the same amount that Method One could have won but you could have won more money with method two making it the "better" option.

But then I see people here saying that the 1% bets actually give you a 60% chance to lose and a 40% chance to win which is why I thought that the 1(50%) bet is the "better" option
Okay, one last time, and I think this thread went in circles for way too long so I'm out after this post: those two effects exactly cancel each other. Both options have exactly the same expectation value of zero.
 
  • #45
I think I can help clear up the confusion. The game is defined as follows:

On each play the gambler can choose to make exactly 1 bet from a choice of 2:
  • Bet50: pay $50 for a 50% chance of a payout of $100
  • Bet1: pay $1 for a 1% chance of a payout of $100
The Expected Values for each bet are easily calculated as zero (remembering to subtract the cost of the bet):
  • E(Bet50) = $100 x 0.5 - $50 = $50 - $50 = $0
  • E(Bet1) = $100 x 0.01 - $1 = $1 - $1 = $0
The outcome of Method One is clear: you have a 50% chance of walking away with $100 and a 50% chance of walking away with nothing.

The outcome of Method Two needs a bit of work:
  • 1% chance of winning the first play and walking away with $149
  • 99% x 1% chance of losing the first play, winning the second and walking away with $148
  • 99% x 99% x 1% chance of losing the first two plays...
  • ...
  • 99%^49 x 1% chance of losing 49 plays, winning the 50th and walking away with $100
  • 99%^50 ≈ 60.5% chance of losing all 50 plays and walking away with nothing
So although there is a chance with Method Two to walk away with up to $49 more than with Method One, you are more likely to walk away with nothing.

There is also Method Three where you play 50 times whether you win or not. The chances of losing 50 times and walking away with nothing are again approximately 60.5%, but now you have a chance to win even more - you could even walk away with $5,000! (compare the probability of this with the number of sub-atomic particles in the observable universe).
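The Method Two outcome distribution can be tabulated in a few lines (a sketch assuming, as above, a $50 bankroll, $1 bets, a $100 pot, and stopping at the first win):

```python
# Walk-away amounts for Method Two: bet $1 at a time, stop at the first win
dist = {}
p_alive = 1.0                      # probability we are still playing
for game in range(1, 51):
    walk_away = 100 + (50 - game)  # the $100 pot plus the unspent bankroll
    dist[walk_away] = p_alive * 0.01
    p_alive *= 0.99
dist[0] = p_alive                  # lost all 50 games

print(f"walk away with nothing: {dist[0]:.1%}")
print(f"win on game 1 ($149):   {dist[149]:.2%}")
```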
 
  • #46
MrAnchovy said:
The outcome of Method One is clear: you have a 50% chance of walking away with $100 and a 50% chance of walking away with nothing.
  • 99%^50 ≈ 60.5% chance of losing 50 plays and walking away with nothing
So although there is a chance with Method Two to walk away with up to $49 more than with Method One, you are more likely to walk away with nothing.

Method One: Probability to win 0 = 50%
Method Two: Probability to win 0 = 60.5%

So betting once for 50% is the better option. Which is basically what I've been asking this entire thread. Apologies if my explanation of the game was terrible.

Thanks to all who posted and had the patience to help me lol.
 
  • #47
iDimension said:
Method One: Probability to win 0 = 50%
Method Two: Probability to win 0 = 60.5%

So betting once for 50% is the better option.
The probability to lose everything is not the only possible measure of success. If you measure success by expected value, all options are equally poor.
 
  • #48
iDimension said:
So betting once for 50% is the better option.
I didn't say that. Consider this:

You believe that Method One is the better option, but there is also another gambler Two who chooses Method Two. After playing your games you compare notes.
  • There is a 30.25% chance you both end up with $0
  • There is a 0.31% chance you both end up with exactly $100
  • There is a 30.25% chance you end up with $100 and Two leaves with nothing
  • There is a 0.31% chance you end up with $0 and Two ends up with exactly $100
  • There is a 19.44% chance that you end up with $0 and Two ends up with more than $100
  • There is a 19.44% chance that you end up with $100 and Two ends up with more than $100
So the approximate chances are 31% that you both walk away with the same, 30% that you walk away with more than Two, and 39% that the player of Method Two walks away with more than the player of Method One. Which do you think is the best method now?
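These head-to-head figures can be reproduced with a short calculation (a sketch using the probabilities for the Method Two player derived earlier in the thread):

```python
p_two_zero = 0.99 ** 50                   # Two loses all 50 games
p_two_100  = 0.99 ** 49 * 0.01            # Two wins only on the very last game
p_two_more = 1 - p_two_zero - p_two_100   # Two walks away with more than $100

# Method One is an independent 50/50 for $100 or $0
tie       = 0.5 * p_two_zero + 0.5 * p_two_100   # equal outcomes
one_ahead = 0.5 * p_two_zero                     # One wins $100, Two has $0
two_ahead = p_two_more + 0.5 * p_two_100         # Two beats One's result

print(f"tie {tie:.1%}, One ahead {one_ahead:.1%}, Two ahead {two_ahead:.1%}")
```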
 
  • #49
"What kind of crazy maths is this?" you might ask, "I ask a simple question and get three different answers!"

This is the maths of decision theory. It doesn't have a lot of significance for the natural sciences, but in the field of economics it is very important. Key to decision theory is the concept of a utility function: this is a rigorous way of evaluating different outcomes to decide what is "best". The reason we have three different answers in the game posed by the OP is that we have used three different utility functions:
  • maximising the expected value of the winnings (note that this is rarely used as a utility function because expected values depend on a large number of trials; in decision theory we generally only get to make a decision once), which is $0 for both Methods, so they are equal
  • maximising the probability of (winnings > $0) which is higher for Method One
  • minimising the probability of (regret > $0) which is higher for Method Two (regret is an important concept within decision theory)
 
  • #50
I still think the question is a bit vague.

Bet #1. $500. Odds 50%. Win pays $500 (plus the original $500, net $1000)
Bet #2 $10. Odds 1%. Win pays $990 (plus the original $10, net $1000)

The choice is between making 1 of Bet #1 or 50 of Bet #2? For convenience assume they are not 50 sequential bets. Say you can buy 1 of type Bet #1 or 50 of type Bet #2. There is no time difference or difficulty difference ... it is purely a matter of preference.

I see 50 of Bet #2 as better. You have a chance of winning 50 times. The downside is the same: you lose all your money. The upside on 50 bets is higher.

A lot of people point out that the game of chance proposed is unrealistic. I agree. In a casino, you always have worse odds than the payout. But if it was just a matter of putting the same $500 down, either into a box marked Bet#1, or a box marked Bet#2, and then getting the outcome, I would choose the box marked Bet#2.

I've only casino gambled once, and my goal was to play the game with the fairest odds (craps), and to make the smallest bets. Doing that, some people lose and some people win. I won a small amount. But the house ALWAYS wins in the long run.

Perhaps I misunderstand the problem, but the time element seemed to be a side issue that should be subtracted out. I thought the problem was not how to sequentially gamble, but whether to take 50 betting positions vs 1 betting position, with the odds and payout as described. I think the 50 betting positions is superior.
 
  • #51
Taking the game of roulette as an example, should one bet a large amount on black, or a small amount on 23, many times? The argument goes that one will miss 23 60% of the time, while 40% of the time one will hit 23 one or more times.

I think one loses 2.5% to the house in roulette, so you would lose something like 63% of the time with that strategy, almost 2/3rds. But when you won, you would win a lot back, thanks to multiple hits on some occasions, and the average over time would be the -2.5%.

Betting on black, you'd win ~49% of the time (because it lands on 0 sometimes).
 
  • #52
verty said:
Taking the game of roulette as an example, should one bet a large amount on black, or a small amount on 23, many times? The argument goes that one will miss 23 60% of the time, while 40% of the time one will hit 23 one or more times.

I think one loses 2.5% to the house in roulette, so you would lose something like 63% of the time with that strategy, almost 2/3rds. But when you won, you would win a lot back, thanks to multiple hits on some occasions, and the average over time would be the -2.5%.

Betting on black, you'd win ~49% of the time (because it lands on 0 sometimes).
Using your roulette example, say the numbers 1-50 are black and the numbers 51-100 are red. I was unsure whether the problem, as phrased, was asking if it is better to bet black/red or to take 50 individual numbers (with one spin). There is no difference there. If there were 51 tables and you could bet the number 1 on 50 tables, or black/red on the 51st, I would take the 50 bets on #1.

My reasoning is that the odds of a single win SEEM the same, and then there is a slight chance of hitting twice. Or even the #1 coming up on all 50 tables.

When the odds are slightly against you, the best strategy is to bet small amounts and to have a loss limit. Anyone can have a losing streak or a winning streak. Just don't let a losing streak take you to the poorhouse.
 
  • #53
votingmachine said:
When the odds are slightly against you, the best strategy is to bet small amounts and to have a loss limit. Anyone can have a losing streak or a winning streak. Just don't let a losing streak take you to the poorhouse.
If the odds are against you then a strategy of making a single minimum bet and then walking away is indeed near optimal. However, a strategy that reverses the order of those two steps is even better.
 
  • #54
jbriggs444 said:
If the odds are against you then a strategy of making a single minimum bet and then walking away is indeed near optimal. However, a strategy that reverses the order of those two steps is even better.
Again, that is an all-or-nothing strategy. If you play a game with odds near to a coin flip, you are better off making many small bets. The odds of an outcome far removed from the middle get small.

This is diverging into secondary motivations. If you enjoy gambling, then gamble. If you don't, then don't. I agree that gambling is not all that fun. I've been in Las Vegas for a conference, and I played a small amount of craps. I would not go out of my way to gamble. If you are in Vegas, the best return is to play very small bets, and get a few free drinks (if you like what they have ... I got a free diet coke, and a free coffee). If you don't hit a losing streak, but have mixed results, you walk away about even. And if you like that, then there you go.

I was trying to separate the secondary issues from the original question by phrasing it as $500 that goes into a single betting set-up, where the results take the same amount of time. That removes the question of whether you should just walk away with the money in your wallet intact ... which is what I would do with $500. But if you are obliged to choose, then pick the best option. I think it is the 50 bets.

In a REAL casino, there is no betting strategy that assures you of a win. If your loss limit is $500, and you enjoy the process, then spread the bets out.

But in the synthetic question, there is no enjoyment difference to consider. Just the choice.

In a real casino, you can make a different choice also. Instead of 50 bets at 1%, you can make 50 bets at 49%. The expected outcome for that is a small loss. In a real casino, if you have a $500 loss limit, make many small bets (at the house minimum), and hope for a winning streak. But again, that was not the question, or the assumptions.
 
  • #55
votingmachine said:
Again, that is an all-or-nothing strategy. If you play a game with odds near to a coin flip, you are better off making many small bets. The odds of an outcome far removed from the middle gets small.
A single minimum bet nearly minimizes variance. Playing additional games increases variance. The variance of a sum is the sum of the variances. You minimize that by minimizing the number of [minimum] bets. Betting zero times is optimal if your goal is to minimize variance.
 
  • #56
There is a large amount of mathematics on this problem. Here is a nice reference which shows that the basic problem still remains open. There are some nice theorems saying that bold play is optimal but they don't cover all interesting cases. https://www.researchgate.net/publication/266704722_The_re-opening_of_Dubins_and_Savage_casino_in_the_era_of_diversification
 
  • #57
With an expected value > 0, the Kelly criterion describes the optimal strategy: on each bet where the payoff is 1:1, the proportion of your capital that should be wagered is 2p - 1, where 0.5 < p < 1 is the probability of winning.

https://en.wikipedia.org/wiki/Kelly_criterion
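For even-money bets the Kelly fraction is a one-liner (a sketch; the general criterion also covers other payout ratios):

```python
def kelly_fraction(p):
    """Fraction of bankroll to wager on an even-money bet won with
    probability p; never bet when the edge is zero or negative."""
    return max(0.0, 2 * p - 1)

print(kelly_fraction(0.60))  # 20% edge -> bet 20% of bankroll
print(kelly_fraction(0.50))  # fair coin -> bet nothing
```

For the fair games discussed in this thread (p = 0.5 at every stake), Kelly says the optimal wager is zero.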
 
  • #58
Why are we introducing more variables like the psychology of the bettor, the casino environment, roulette, etc.?

None of this exists in my game; it's purely numbers. The goal of the game is to maximise your chance of NOT walking away with $0 after your $500 has been bet, so the question is simple. Which option gives the better chances?

1 bet of $500
50 games of $10 (played individually, stopping after you win once)
 
  • #59
iDimension said:
Why are we introducing more variables like the psychology of the bettor, the casino environment, roulette, etc.?

None of this exists in my game; it's purely numbers. The goal of the game is to maximise your chance of NOT walking away with $0 after your $500 has been bet, so the question is simple. Which option gives the better chances?

1 bet of $500
50 games of $10 (played individually, stopping after you win once)

Suppose it was a claw game, 1 in 2 or 1 in 100. The 2nd strategy would be better because you would save some money if you win the prize early.
 
  • #60
verty said:
Suppose it was a claw game, 1 in 2 or 1 in 100. The 2nd strategy would be better because you would save some money if you win the prize early.
Note that this time around the goal has been clearly stated: maximize the probability of not losing everything. With that spelled out, the first strategy is clearly superior. The result is probability 0.5 of not losing everything versus a probability of 0.395 with the second method.
 
