# Probability problem - from another forum

1. Mar 29, 2014

### Stephen Tashi

This interesting problem was posted on another math forum (http://www.sosmath.com/CBB/viewtopic.php?f=6&t=62620):

The probability that a six-sided (possibly loaded) die lands on the $i$th face is $P_i$. The $P_i$ are unknown. We are given the values of $S_i$ = the probability that the $i$th face is the first to land 100 times in a sequence of independent tosses of the die. Find the $P_i$ as functions of the $S_i$.

2. Mar 29, 2014

### Staff: Mentor

Although you cannot bias a coin, let's start with just heads and tails:

For each k from 100 to 199, we can calculate the probability of getting 100 heads, and likewise for 100 tails. The resulting expressions might look ugly, but they are not hard to set up.

Now we can add a third side to the coin ("corner"). For each k from 100 to 198, we can calculate the probability of getting 100 corners - and for k where (heads+tails) exceeds 100, we know the probability that we stop from the previous step.

I think in this way it is possible to reduce the original 100^6 problem to five problems with ~100-500 cases each, with an explicit formula for each case.

3. Mar 30, 2014

### willem2

I get this for the probability that you get 100 heads first.

$$\sum_{k=0}^{99} \frac {(99+k)!} {99! k!} {P_h}^{100}{P_t}^k$$

The $k$th term is the probability of first getting 99 heads and $k$ tails in any order, and then a single head.

I think you meant for k from 100 to 298?
I don't think it's that easy. I think you get a double summation, where you have to sum over the total number of throws as well as over the number of heads or tails.
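In code, the sum above is a one-liner; `p_heads_first` is a hypothetical helper name, a sketch rather than anything from the thread:

```python
from math import comb

def p_heads_first(ph, n=100):
    """Probability that heads reaches n before tails (the sum above):
    the k-th term is 'n-1 heads and k tails in any order, then a head'."""
    pt = 1.0 - ph
    return sum(comb(n - 1 + k, k) * ph**n * pt**k for k in range(n))
```

For a fair coin this gives 1/2 by symmetry, and the two win probabilities always sum to 1, since one side must reach 100 first.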

Last edited: Mar 30, 2014
4. Mar 30, 2014

### Staff: Mentor

Oh typo, should be 298, yes.
Yeah, you get a sum with the previous sums as summands. So what? Still something you can write down, or calculate.

5. Mar 31, 2014

### bpet

Using the negative binomial CDF, this can be expressed in terms of the regularised incomplete Beta function:

$$1-I_{P_t}(100,100)=I_{P_h}(100,100)$$
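This identity is easy to check numerically. The sketch below computes $I_x(a,b)$ by composite Simpson integration of the Beta density; `betainc_reg` is an assumed helper name (real code would just call `scipy.special.betainc`):

```python
import math

def betainc_reg(a, b, x, steps=20000):
    """Regularized incomplete Beta I_x(a,b) via composite Simpson
    integration of the Beta density (a numerical sketch, not a
    production implementation; steps must be even)."""
    log_norm = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    def f(t):
        if t <= 0.0 or t >= 1.0:
            return 0.0   # density vanishes at the endpoints for a, b > 1
        return math.exp(log_norm + (a - 1) * math.log(t)
                        + (b - 1) * math.log1p(-t))
    h = x / steps
    odd = sum(f((2 * i - 1) * h) for i in range(1, steps // 2 + 1))
    even = sum(f(2 * i * h) for i in range(1, steps // 2))
    return (f(0.0) + 4 * odd + 2 * even + f(x)) * h / 3
```

With $P_h + P_t = 1$, the stated equality is the reflection symmetry $I_x(a,b) = 1 - I_{1-x}(b,a)$ of the regularized incomplete Beta function.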

6. Mar 31, 2014

### Stephen Tashi

Some intuitive musings:

Suppose I'm required to bet on face 1 "winning" in a game of independent random tosses, but I have a choice about which game is played. I can pick from games like "best 1 out of 1", "best out of 7", "first to occur 5 times in a row", "first to happen 100 times", etc.

My intuition is that if I have the advantage that the probability p[1] of face 1 coming up is greater than the other $p_i$, then I have a better chance of winning in the games that can involve more throws than the "1 out of 1" toss game.

My intuition says that my chances of winning any of those games are greatest if the remaining probability 1 - p[1] is divided equally among the other faces (maximizing their competition with each other, so to speak).

One could define "perverse" games where the player with the advantage in the "1 out of 1" game has a disadvantage in games of longer length. However, in natural examples (best n out of m, first n in a row, first to occur n) I think the longer length of the game amplifies the advantage a player holds in the "1 out of 1" game. I wonder if there is a general mathematical argument that this is true.

7. Mar 31, 2014

### Staff: Mentor

Stephen Tashi: Sure, that should be clear, since for longer games the relative standard deviation goes down.

I don't think you can go above 50%, but certainly above 1/6:

Let p1=0.4, p2=0.6, pi=0 otherwise, you need 1s to win.
"1 out of 1" has a winning probability of 0.4, "first to get 100" (="best of 199") has a winning probability of just 0.23%.

8. Mar 31, 2014

### Stephen Tashi

OK, this illustrates my intuition that if you have an advantage in the "1 out of 1" game then the longer games favor you even more. If you are at a disadvantage, you have a greater disadvantage in the longer games.

9. Mar 31, 2014

### willem2

With a six-faced die with probabilities $P_1, \dots, P_6$, the probability that face 1 is the first to reach 100 is:

$$\frac {{P_1}^{100}}{99!} \sum_{k_2 =0}^{99} \frac {{P_2}^{k_2}} {{k_2}!} \sum_{k_3 =0}^{99} \frac {{P_3}^{k_3}} {{k_3}!} \sum_{k_4 =0}^{99} \frac {{P_4}^{k_4}} {{k_4}!} \sum_{k_5 =0}^{99} \frac {{P_5}^{k_5}} {{k_5}!} \sum_{k_6 =0}^{99} \frac {{P_6}^{k_6} (99 + k_2+k_3+k_4+k_5+k_6)!} {{k_6}!}$$

I think I see how to compute this.

I think I can put the result of the innermost summation in a table with about 400 elements, since it depends only on $k_2+k_3+k_4+k_5$.

The result of the two innermost summations depends only on $k_2+k_3+k_4$, so it fits in a 300-element table. The three innermost summations fit in a 200-element table and the four innermost in a 100-element table.
All the tables have to be recomputed whenever any of the probabilities change.

This would make it fast enough to use a root-finding algorithm to get $P_1, \dots, P_6$ from $S_1, \dots, S_6$. Six-variable root finding will be tricky, though.
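The table scheme can be sketched directly in code. To dodge the factorial overflow it works entirely in log space; `prob_first_to_n` and `logsumexp` are hypothetical helper names, and all $P_i$ are assumed strictly positive:

```python
import math

def logsumexp(xs):
    """log(sum(exp(x) for x in xs)), computed without overflow."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def prob_first_to_n(P, face=0, n=100):
    """Probability that `face` is the first to come up n times, using the
    nested sum from this post with the table trick, in log space.
    Assumes every P[i] > 0."""
    others = [p for i, p in enumerate(P) if i != face]
    m = len(others)                      # 5 for a six-sided die
    # innermost table: T[s] = log sum_{k<n} p^k * (n-1+s+k)! / k!
    lp = math.log(others[-1])
    size = (m - 1) * (n - 1) + 1         # s = sum of the other counts
    T = [logsumexp([k * lp + math.lgamma(n + s + k) - math.lgamma(k + 1)
                    for k in range(n)]) for s in range(size)]
    # peel off the remaining non-target faces, one shorter table per face
    for j in range(m - 2, 0, -1):
        lp = math.log(others[j])
        size -= n - 1                    # each table is n-1 entries shorter
        T = [logsumexp([k * lp - math.lgamma(k + 1) + T[t + k]
                        for k in range(n)]) for t in range(size)]
    # outermost sum and the P_face^n / (n-1)! prefactor
    lp = math.log(others[0])
    total = logsumexp([k * lp - math.lgamma(k + 1) + T[k] for k in range(n)])
    return math.exp(n * math.log(P[face]) - math.lgamma(n) + total)
```

For a fair die, symmetry forces each $S_i = 1/6$, which makes a handy sanity check of the implementation.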

10. Mar 31, 2014

### Staff: Mentor

That looks similar to the approach I suggested in post 2, thanks for the formula.

Inverting that looks way too messy, but numeric approaches should work, since increasing/decreasing $P_i$ will always increase/decrease $S_i$.
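A toy version of that numeric inversion: the sketch below replaces the actual map $P \mapsto S$ with a stand-in monotone map ($S_i \propto P_i^2$, which like the real game exaggerates the leader's advantage) and recovers $P$ from $S$ by damped multiplicative updates. Everything here, including the stand-in map, is illustrative only:

```python
def forward(P):
    """Stand-in for the real map P -> S: monotone and advantage-amplifying."""
    sq = [p * p for p in P]
    t = sum(sq)
    return [s / t for s in sq]

def invert(S_target, iters=50):
    """Recover P from S: scale each P_i by sqrt(target_i / current_i),
    then renormalize, and repeat until the S values match."""
    P = [1.0 / len(S_target)] * len(S_target)
    for _ in range(iters):
        S = forward(P)
        P = [p * (t / s) ** 0.5 for p, s, t in zip(P, S, S_target)]
        total = sum(P)
        P = [p / total for p in P]
    return P
```

For the real $S_i(P)$ one would plug the formula from post 9 in as `forward` and likely use a smaller damping exponent than 0.5.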

11. Mar 31, 2014

### willem2

The first problem that pops up when implementing it is that the numbers involved can get larger than the largest double-precision floating-point number. I could use a multiprecision library, but I don't need the extra precision, just a larger exponent.

I need to compute 1000 table values, sum 100 numbers for each table value, repeat six times, once for each $S_i$, and then repeat the whole computation many times to find the root, so I'd rather not use a multiprecision library. Maybe I can store the numbers as logarithms?
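Storing logarithms works well here: keep every table entry as $\log x$ (with `math.lgamma` supplying the log-factorials) and replace every addition with a log-sum-exp. A minimal sketch:

```python
import math

def logsumexp(xs):
    """log(sum(exp(x))): factor out the max so no exp() call overflows."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

# Example: sum_{k=0}^{594} 594!/k! overflows double precision
# (594! alone is far beyond the ~1e308 double limit),
# but the logs of the terms are tame numbers around 3000:
logs = [math.lgamma(595) - math.lgamma(k + 1) for k in range(595)]
log_total = logsumexp(logs)      # log of the astronomically large sum
```

Only relative precision is lost this way, which matches the observation that a larger exponent, not more digits, is what's needed.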