Simple (but interesting) gambling problem

  1. Jul 21, 2006 #1
    I thought about this out of nowhere. I want to make a casino game where the odds on each round are against the house, but the catch is that the game rules force the player to keep playing until the overall odds turn against him.

    For example, make the odds 80% against the house (the player has an 80% chance of winning each round). The player puts down a dollar, and each time he wins the casino pays him another dollar (it doesn't double his money). He is forced to play a certain number of times, and if he ever loses he forfeits his initial dollar.

    Here's how I thought about it (from the player's POV):

    amount of money won = number of times the game was played = X
    probability of winning each time = P
    amount lost = $1

    Number of times the game has to be played in order to break even:

    (amount of money won)*(probability of winning) - (amount lost)*(probability of losing) = 0

    X*P^X - ($1)*(1- P^X) = 0

    which simplifies to: (P^X)*(X+1) - 1 = 0

    All the casino has to do to increase the odds in its favor is to force the player to play at least 1 more time than X (which depends on P). Anyone see any problems with my logic or math?
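    As a sanity check (a sketch I added, not part of the original post), the break-even condition (P^X)*(X+1) = 1 can be probed numerically under the payoff structure described above (win $X after X straight wins, otherwise lose the $1 stake):

    ```python
    # Sketch of the break-even condition above: the player's expected net value
    # is X * P**X - 1 * (1 - P**X), which is zero when (X + 1) * P**X = 1.
    # P = 0.8 is the example probability used in this thread.
    P = 0.8

    def expected_value(x, p=P):
        # Win x dollars if all x rounds are won (prob p**x), else lose the $1 stake.
        return x * p**x - (1 - p**x)

    # Smallest forced number of rounds at which the game tips to the house.
    x = 1
    while expected_value(x) > 0:
        x += 1
    print(x)                             # 12
    print(round(expected_value(11), 4))  # 0.0308, still favors the player
    print(round(expected_value(12), 4))  # -0.1066, now favors the house
    ```

    So for P = 0.8 the break-even point X lies between 11 and 12 rounds: forcing 12 or more rounds gives the house the edge.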

    PS. I initially thought this problem would be a lot simpler than this :smile:.
  3. Jul 21, 2006 #2


    Science Advisor
    Homework Helper

    The net value of the game is still negative for the player.

    There are some casino games that can have odds that *slightly* favor the player, like progressive slot machines under the right circumstances.
  4. Jul 21, 2006 #3
    I don't think I made myself clear (sorry about that). The game is supposed to end up with the final edge against the player. However, I wanted to give the player a better than 50% chance of winning each individual round (i.e. P > .5).
  5. Jul 21, 2006 #4
    To make this clearer:

    I was originally thinking about a game that looks like it favors the player when it actually favors the house. So I thought I would give the player an 80% chance of winning each round but force him to play 4 times, because then his probability of winning all four rounds would be 0.8^4, or about .41 (less than .5).

    This got complicated when I figured out that the house has to pay more when it loses than when it wins. The above rules would actually still be against the house. I then tried to figure out how many times the rules need to force the player to play in order for the game to favor the house.
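    To see why those rules still favor the player (a quick check I added, using the expected-value formula from the first post):

    ```python
    # Check, using the expected-value formula from the first post, that an 80%
    # per-round game with 4 forced rounds still favors the player, even though
    # the chance of winning all 4 rounds (0.8**4, about 0.41) is below one half.
    p, n = 0.8, 4
    p_win_all = p**n                      # probability of surviving all 4 rounds
    ev = n * p_win_all - (1 - p_win_all)  # player's expected net, in dollars
    print(round(p_win_all, 4), round(ev, 4))
    ```

    The expected value is about +$1.05 per game for the player, because the $4 payout on a win outweighs the sub-50% chance of collecting it.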
  6. Jul 21, 2006 #5


    Science Advisor
    Homework Helper

    That's easy:
    If the chance that the player wins one round is [itex]p[/itex], then the chance that the player wins [itex]n[/itex] consecutive rounds is [itex]p^n[/itex]. Assuming that the game starts with a wager of 1, and that the player must win [itex]n[/itex] consecutive rounds to collect [itex]n+1[/itex] dollars (his wager back plus [itex]n[/itex] in winnings), we have the value of the game to the player as:

    [tex]np^n - (1-p^n) = (n+1)p^n - 1 < 0[/tex]

    for the house to have an edge. Solving for [itex]p[/itex] we get:

    [tex]n=1 \rightarrow p<\frac{1}{2}[/tex]
    [tex]n=2 \rightarrow p<\frac{1}{\sqrt{3}}[/tex]
    [tex]n=3 \rightarrow p<\frac{1}{\sqrt[3]{4}}[/tex]
    and so on. It should be easy to see that even the [itex]n=2[/itex] case can have probabilities of winning a round that are greater than .5.

    Notably, since the player is effectively wagering his whole stack (more than 1 dollar) in the later rounds, those rounds are clearly not in the player's favor.
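    The solved thresholds can be verified numerically (my sketch, not from the thread); the house has an edge exactly when [itex](n+1)p^n < 1[/itex], i.e. when [itex]p < (n+1)^{-1/n}[/itex]:

    ```python
    # Numerical check of the solved thresholds p < (n+1)**(-1/n):
    # the house has an edge iff (n+1) * p**n < 1.
    for n in (1, 2, 3):
        p_max = (n + 1) ** (-1.0 / n)
        print(n, round(p_max, 4))
    # n=1 -> 0.5, n=2 -> 1/sqrt(3) ~ 0.5774, n=3 -> 4**(-1/3) ~ 0.63
    ```

    Already at n = 2 the threshold (about 0.577) exceeds one half, so the house can afford per-round win probabilities above 50%.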
    Last edited: Jul 21, 2006
  7. Jul 21, 2006 #6
    That's very interesting. You solved for the probability given the number of rounds instead of the number of rounds given the probability (I can't think of a simple way off the top of my head to solve for n; a calculator can always do it, though). BTW, I got the same formula, but your logic in deriving it was simpler than mine. Thanks! :biggrin: