# Flipping unfair coins (harder)

1. Aug 16, 2009

### pagheca

Hi,

I would like to solve this version of the unfair coin problem.

The coin has probability p of heads and 1-p of tails as usual, but the winner is the first player to reach a target number of wins W.

For example, one player chooses heads and the other chooses tails. The players flip the coin until someone reaches, say, W = 5 wins.

For W=2 we have the following possible outcomes:

AA - A wins
ABA - A wins
ABB - B wins
BAA - A wins
BAB - B wins
BB - B wins

Note that cases like AAA and BBB don't occur, because one player has already won by the second flip.

For W = 3 we have:

AAA - A wins
AABA - A wins
AABBA - A wins
AABBB - B wins
ABAA - A wins
ABABA - A wins
ABABB - B wins
ABBAA - A wins
ABBAB - B wins
ABBB - B wins
BAAA - A wins
BAABA - A wins
BAABB - B wins
BABAA - A wins
BABAB - B wins
BABB - B wins
BBAAA - A wins
BBAAB - B wins
BBAB - B wins
BBB - B wins

etc...

The maximum number of flips required to decide the game is clearly W + (W - 1) = 2W - 1.

The number of possible outcomes, if I got it right, should be 2 * sum(k=0, W-1) C(W+k-1, k): for each k, the winner's final flip is preceded by W-1 wins and k losses in some order, giving C(W+k-1, k) sequences.
That is 6 for W = 2, and 20 for W = 3.
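Those counts can be verified by brute force for small W; here is a quick sketch (the function name is mine):

```python
from itertools import product

def outcomes(W):
    """Enumerate all finished game sequences for a race to W wins."""
    finished = set()
    # A game lasts at most 2W - 1 flips; walk every flip sequence of that
    # length and truncate at the flip where one player reaches W wins.
    for seq in product("AB", repeat=2 * W - 1):
        a = b = 0
        for i, c in enumerate(seq):
            a += (c == "A")
            b += (c == "B")
            if a == W or b == W:
                finished.add("".join(seq[: i + 1]))
                break
    return sorted(finished)

print(len(outcomes(2)))  # 6
print(len(outcomes(3)))  # 20
```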

However, I couldn't find the probability of A (or B) winning as a function of W and p, say A(W, p).

Do you know of any algorithm for solving this problem? I'm not an expert, but I guess this is a typical Markov chain problem.
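To be concrete about the Markov chain idea: the state is just the current score (a, b), and the win probability satisfies a simple recursion. A rough Python sketch (the function name is mine):

```python
from functools import lru_cache

def p_A_wins(p, W):
    """P(A is first to W wins), by recursion over score states (a, b)."""
    @lru_cache(maxsize=None)
    def f(a, b):
        if a == W:      # A already won
            return 1.0
        if b == W:      # B already won
            return 0.0
        # next flip: heads (A) with probability p, tails (B) with 1 - p
        return p * f(a + 1, b) + (1 - p) * f(a, b + 1)
    return f(0, 0)

print(p_A_wins(0.5, 5))  # 0.5, as it must be for a fair coin
```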

Thanks
pagheca

Last edited: Aug 16, 2009
2. Aug 16, 2009

### mXSCNT

Well, suppose A wins. Then the winning string contains exactly W repetitions of A and k < W repetitions of B, and its last character is A. The probability that an infinite sequence of flips begins with a given such string is $$p^W (1-p)^k$$, and for each k there are $${W+k-1 \choose k}$$ such strings (choose which of the first W+k-1 flips are B's). So the probability that an infinite sequence of flips begins with a string where A wins is
$$\sum_{k=0}^{W-1} {W+k-1 \choose k} p^W (1-p)^k$$
That might simplify further.
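In code, the sum looks like this (a sketch; the function name is mine):

```python
from math import comb

def p_win(p, W):
    """Probability that the p-side is first to W wins, via the sum above."""
    return sum(comb(W + k - 1, k) * p**W * (1 - p)**k for k in range(W))

# Sanity checks: a fair coin gives 1/2, and W = 2 matches the
# hand count p^2 + 2 p^2 (1 - p).
print(p_win(0.5, 5))                              # 0.5
print(p_win(0.6, 2), 0.6**2 + 2 * 0.6**2 * 0.4)   # both ~0.648
```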

Last edited: Aug 16, 2009
3. Aug 17, 2009

### pagheca

Thanks very much. That's a very good way to solve it.

Actually, I arrived at the same conclusion by examining the cases W = 2 and 3. For W = 2, we have

AA
ABA
ABB
BAA
BAB
BB

A wins in cases 1, 2 and 4 (AA, ABA, BAA). The total probability of these is

p^2 + 2*p^2*(1-p)

By working through W = 3 in the same way, I eventually arrived at the general rule, but your explanation is far more general and elegant. Thank you again.

Now, for anyone interested, I'll explain why I was thinking about this.

I play table tennis with my son. My feeling was that I'm not that much worse than he is, so it was hard for me to explain why I lose almost every game and almost all of our "matches". Now, at least, I know why.

We follow the standard modern rules: the winner of a game is the first to reach 11 points (in my analysis I ignore the fact that at 10-10 the game is extended until someone leads by two), and the "match" is won by the first to win 3 games.

Now, suppose that my son wins 60% of the points against me. That doesn't sound like a big difference, but it actually makes a huge one.

Applying the above formula, and ignoring differential tiredness, psychological effects (e.g. frustration on the parent's side...), etc., my son should win 82.6% of the games and 96%, yes: 96%, of the matches! This is exactly what happens in reality...
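Here is the computation, for anyone who wants to check it (a sketch; names are mine), reusing the sum from the previous post at both levels, points within a game and games within a match:

```python
from math import comb

def p_first_to(p, W):
    """P that the p-side reaches W wins first (deuce at 10-10 ignored)."""
    return sum(comb(W + k - 1, k) * p**W * (1 - p)**k for k in range(W))

p_point = 0.6                       # son wins 60% of points
p_game = p_first_to(p_point, 11)    # first to 11 points wins a game
p_match = p_first_to(p_game, 3)     # first to 3 games wins the match
print(round(p_game, 3), round(p_match, 3))  # 0.826 0.96
```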

This of course applies to ANY competition with these "random walk" rules, like tennis, F1, sailing, etc., and may explain why we so easily get the impression that there is one outstanding player at any given time, even when the real difference between him and the others is not that large: it's a matter of the rules more than anything else.

For example, ignoring psychological effects, it is really difficult for a weaker player to win a tennis match against someone even slightly better than him.

Pagheca

Last edited: Aug 17, 2009