Help coming up with the right algebra formula to use.

  • Thread starter wasteofo2
  • #1


So, this may be the simplest question ever asked here, but my brain fails massively and I need help. Here's the situation; any help would be greatly appreciated.

There's a slot machine which randomly charges between $0.01 and $1.00 to play, and will either pay out $1 or nothing. Let's call the price to play X. X is completely random.

The likelihood that the slot machine will pay out $1 on a particular turn can be called Y. Let's assume that one can know Y. The odds for any particular turn are completely random.

So without knowing Y, one would just play the turns priced lower than $0.50, and over time you'd make money.

But knowing Y, you should be able to make bets even on turns that cost close to $1 if you choose only to play those with very favorable odds.

But what is the best way to account for both the potential gain and the risk to come up with the "winning formula"?

Thanks,
Jacob
 

Answers and Replies

  • #2
wasteofo2
Could it be as simple as:
(100/X)*Y > 1 means bet
(100/X)*Y < 1 means don't bet
?
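(If X is the price in cents and Y is the probability of a payout, then (100/X)*Y > 1 rearranges to Y > X/100; in other words, the chance of winning has to be bigger than the price expressed in dollars.)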
 
  • #3
CRGreathouse
Science Advisor
Homework Helper
With odds Y%, you bet when Y > X (X in cents). At least, assuming you're risk-neutral.
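A minimal sketch of that risk-neutral check, with my own variable names, taking the price in dollars and Y as a probability rather than a percent:

Code:
def should_play(price_dollars, win_prob, payout=1.00):
    # Risk-neutral rule: play only if the expected payout beats the price.
    expected_payout = win_prob * payout      # chance of winning times the $1 prize
    return expected_payout > price_dollars

# A $0.40 turn with a 55% chance of paying $1 is worth playing:
print(should_play(0.40, 0.55))   # True  (expected payout $0.55 > $0.40)
# A $0.60 turn with the same chance is not:
print(should_play(0.60, 0.55))   # False (expected payout $0.55 < $0.60)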
 
  • #4
matt grime
Science Advisor
Homework Helper
What do you mean that the odds Y are completely random? Or for that matter that the price is completely random? Unless you put some probability distribution on there, you have no way of knowing what to do. And do you have to agree to play *before* the price is revealed, or do you choose to play or not play after the price is revealed?

Choosing to play the games where the price is less than fifty cents doesn't guarantee you come out ahead in the long run: what if the odds of getting $0 in return were 1 in 1, i.e. you never win? Your 'rule' there tacitly assumes that the game pays out 50% of the time, or more accurately that the expected payout is 50 cents. That is a dangerous assumption. If you don't know the odds of winning then you should *never* play, to be absolutely safe.
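To put a rough number on that warning, here is a quick simulation sketch; the 10% payout chance is an arbitrary assumption, picked only to show how the "play anything under 50 cents" rule can still lose steadily when the machine rarely pays:

Code:
import random

random.seed(0)
plays, spent, won = 0, 0.0, 0.0
for _ in range(100_000):
    price = random.uniform(0.01, 1.00)   # price is shown before you decide
    if price >= 0.50:
        continue                         # the "only play cheap turns" rule
    plays += 1
    spent += price
    if random.random() < 0.10:           # assumed: the machine pays $1 only 10% of the time
        won += 1.0
print(f"played {plays}, spent ${spent:.2f}, won ${won:.2f}, net ${won - spent:.2f}")
# The average price paid is roughly $0.25 per play but the expected return is only $0.10,
# so the net result is a large loss.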
 
  • #5
wasteofo2
matt grime said:
What do you mean that the odds Y are completely random? Or for that matter that the price is completely random? Unless you put some probability distribution on there, you have no way of knowing what to do. And do you have to agree to play *before* the price is revealed, or do you choose to play or not play after the price is revealed?

Choosing to play the games where the price is less than fifty cents doesn't guarantee you come out ahead in the long run: what if the odds of getting $0 in return were 1 in 1, i.e. you never win? Your 'rule' there tacitly assumes that the game pays out 50% of the time, or more accurately that the expected payout is 50 cents. That is a dangerous assumption. If you don't know the odds of winning then you should *never* play, to be absolutely safe.
Guess I explained it poorly at the beginning. Lemme give it another shot, using a different situation that may be clearer.


There are many slot machines, which all pay out either $1 or nothing.

Before each turn, all of the machines display a price-to-play between $0.01 and $1.00. The price is X. Each machine displays different prices at random, but that's unimportant since the price is always known before deciding to play.

What differentiates the machines is that each machine has a different % likelihood that it will pay out; call a machine's % likelihood of a payout Y.


So what I'm looking for is a formula that weighs the possible gain against the risk of losing, and gives a result that will, if followed over enough plays to be statistically significant, yield a profit. Basically a number where I know "if it's below this, don't bet; if it's above this, bet."

Thanks a lot, hope this is clearer,
Jacob

P.S. Would it be as simple as only betting when the value of Y/X is greater than one?
 
  • #6
matt grime
Science Advisor
Homework Helper
Can I write it as I see it? You are given a slot machine; it will cost you X dollars to play (X is somewhere between 0.01 and 1). Given that I know what X is, should I play if the expected payout on that machine is Y dollars? Yes if Y is more than X, no otherwise.
 
  • #7
wasteofo2
matt grime said:
Can I write it as I see it? You are given a slot machine; it will cost you X dollars to play (X is somewhere between 0.01 and 1). Given that I know what X is, should I play if the expected payout on that machine is Y dollars? Yes if Y is more than X, no otherwise.
Y doesn't represent the payout; the payout will only ever be $1 or nothing. Y is the likelihood that you will actually make the $1.
 
  • #8
As long as the price to play is lower than the chance of winning, you should bet.

For example, if there is a $.50 game with a 51% chance of winning, then if you played it 1000 times you would most likely have won about $510 and used $500 for play.

So an algebraic way of stating this would be "Play if Y > X", where X is the cost to play in cents and Y is the % likelihood of winning $1.
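A quick check of that arithmetic and of the threshold (the expected_profit helper is just for illustration; it takes the price in dollars and the win chance as a fraction):

Code:
def expected_profit(price_dollars, win_prob, plays=1, payout=1.00):
    # Expected net result of repeating the same bet `plays` times.
    return plays * (win_prob * payout - price_dollars)

# The $0.50 game with a 51% chance of winning, played 1000 times:
print(expected_profit(0.50, 0.51, plays=1000))   # about 10 -> win ~$510 against $500 spent

# Break-even is exactly win_prob == price, hence "play only if Y > X":
print(expected_profit(0.50, 0.50))               # 0.0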
 
