
Help coming up with the right algebra formula to use.

  1. Jun 17, 2008 #1
    So, this may be the simplest question ever asked here, but my brain fails massively and I need help. Here's the situation; any help would be greatly appreciated.

    There's a slot machine which randomly charges between $0.01 and $1.00 to play, and will either pay out $1 or nothing. Let's call the price to play X. X is completely random.

    The likelihood that the slot machine will pay out $1 on a particular turn can be called Y. Let's assume that one can know Y. The odds for any particular turn are completely random.

    So without knowing Y, one would just play the turns priced lower than $0.50, and over time you'd make money.

    But knowing Y, you should be able to make bets even on turns that cost close to $1 if you choose only to play those with very favorable odds.

    But what is the best way to account for both the potential gain and the risk to come up with the "winning formula"?

    Thanks,
    Jacob
     
    Last edited: Jun 17, 2008
  3. Jun 17, 2008 #2
    Could it be as simple as:
    (100/X) * Y > 1 means bet,
    (100/X) * Y < 1 means don't bet?
     
  4. Jun 17, 2008 #3

    CRGreathouse

    User Avatar
    Science Advisor
    Homework Helper

    With odds Y%, you bet when Y > X (X in cents). At least, assuming you're risk-neutral.
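    To make that rule concrete: a minimal sketch in Python, working in dollars and probabilities rather than cents and percentages (the function name and the $1 payout parameter are just illustrative):

        # Risk-neutral rule: play only when the expected value of a turn is positive.
        # Here win_prob_y is the probability of winning (0 to 1) and price_x is in dollars.
        def should_play(price_x, win_prob_y, payout=1.00):
            # Expected value of one play: win_prob_y * payout - price_x
            return win_prob_y * payout - price_x > 0

        # Example: a turn costing $0.40 with a 45% chance of paying $1
        print(should_play(0.40, 0.45))  # True, since EV = 0.45 - 0.40 = +$0.05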
     
  5. Jun 17, 2008 #4

    matt grime

    User Avatar
    Science Advisor
    Homework Helper

    What do you mean that the odds Y are completely random? Or, for that matter, that the price is completely random? Unless you put some probability distribution on there, you have no way of knowing what to do. And do you have to agree to play *before* the price is revealed, or do you choose to play or not after the price is revealed?

    Choosing to play the games where the price is less than fifty cents doesn't guarantee a profit in the long run: what if the probability of getting $0 in return were 1? Your 'rule' there tacitly assumes that the game pays out 50% of the time, or more accurately that the expected payout is 50 cents. That is a dangerous assumption. If you don't know the odds of winning, then to be absolutely safe you should *never* play.
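    To put numbers on that warning, here is a minimal Python sketch (the prices and probabilities below are made up purely for illustration):

        # Expected profit per play, in dollars: win_prob * payout - price.
        # These numbers are hypothetical, chosen only to illustrate the point above.
        payout = 1.00
        for price, win_prob in [(0.30, 0.10), (0.30, 0.40), (0.90, 0.95)]:
            ev = win_prob * payout - price
            print(f"price=${price:.2f}  win_prob={win_prob:.0%}  EV=${ev:+.2f}")

        # price=$0.30  win_prob=10%  EV=$-0.20  (cheap, but a losing bet)
        # price=$0.30  win_prob=40%  EV=$+0.10  (cheap and profitable)
        # price=$0.90  win_prob=95%  EV=$+0.05  (expensive, yet still profitable)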
     
    Last edited: Jun 17, 2008
  6. Jun 17, 2008 #5
    Guess I explained it poorly at the beginning. Lemme give it another shot, using a different situation that may be clearer.


    There are many slot machines, which all pay out either $1 or nothing.

    Before each turn, all of the machines display a price-to-play between $0.01 and $1.00. The price is X. Each machine displays different prices at random, but that's unimportant since the price is always known before deciding to play.

    What differentiates the machines is that each machine has a different % likelihood of paying out; call a machine's payout likelihood Y.


    So what I'm looking for is a formula that weighs the possible gain against the risk, giving a result that will, if followed over enough turns to be statistically significant, yield a profit. Basically a number where I know "if it's below this, don't bet; if it's above this, bet."

    Thanks a lot, hope this is clearer,
    Jacob

    P.S. Would it be as simple as only betting when the value of Y/X is greater than one?
     
    Last edited: Jun 17, 2008
  7. Jun 17, 2008 #6

    matt grime

    User Avatar
    Science Advisor
    Homework Helper

    Can I write it as I see it? You are given a slot machine; it will cost you X dollars to play (X is somewhere between 0.01 and 1). Given that I know what X is, should I play if the expected payout on that machine is Y dollars? Yes if Y is more than X, no otherwise.
     
  8. Jun 18, 2008 #7
    Y doesn't represent the payout; the payout will only ever be $1 or nothing. Y is the likelihood that you will actually win the $1.
     
  9. Jun 19, 2008 #8
    As long as the price to play is lower than the chance of winning, you should bet.

    For example, if there is a $0.50 game with a 51% chance of winning, then over 1000 plays you would expect to win about $510 while spending $500 to play.

    So an algebraic way of stating this would be "Play if Y > X", where X is the cost to play in cents and Y is the percent likelihood of winning $1.
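    A quick Monte Carlo check of that example (a hypothetical Python sketch, not part of the original calculation):

        import random

        # Simulate 1000 plays of a $0.50 game with a 51% chance of winning $1,
        # averaged over many trials to smooth out the randomness.
        def average_profit(price=0.50, win_prob=0.51, plays=1000, trials=1000):
            total = 0.0
            for _ in range(trials):
                wins = sum(1 for _ in range(plays) if random.random() < win_prob)
                total += wins * 1.00 - plays * price
            return total / trials

        print(average_profit())  # roughly 10.0: about $510 won minus $500 spent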
     