In *Introduction to Probability*, they say that fair* games with an infinite number of states (Markov states, if the game is a Markov process) need not remain fair if an unlimited number of plays is allowed. The example they give is two people, Peter and Paul, tossing a fair coin, betting $.01 per toss, until Peter is ahead one cent. If they both have unlimited funds, then Peter will surely end up one penny ahead.
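The flavor of this game is easy to see in a quick simulation (my own sketch, not from the book; the function name and the cap on tosses are illustrative choices). Peter almost always gets ahead quickly, but the number of tosses required has a very heavy tail, which is exactly why the game can misbehave when unlimited plays are allowed:

```python
import random

def play_until_ahead(max_tosses=10_000, p_heads=0.5):
    """Simulate Peter vs. Paul at one cent per toss until Peter is
    one cent ahead, or until max_tosses (a practical cap standing in
    for 'unlimited plays')."""
    fortune = 0
    for toss in range(1, max_tosses + 1):
        fortune += 1 if random.random() < p_heads else -1
        if fortune == 1:
            return toss  # number of tosses Peter needed
    return None  # did not get ahead within the cap

random.seed(1)
durations = [play_until_ahead() for _ in range(1000)]
finished = [d for d in durations if d is not None]
print(f"finished: {len(finished)} of 1000 games")
print(f"longest finished game: {max(finished)} tosses")
```

Even with a cap of 10,000 tosses, a handful of games typically fail to finish; the probability of needing more than n tosses decays only like 1/sqrt(n), so no finite cap captures every game.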

I've got two questions. First, shouldn't we say that Peter will *almost surely* end up one penny ahead, since there is the possibility (probability 0) that he loses the first toss and never does better than get back to even? Second, does this game even have an expected value? If it does, and if it is zero, then why wouldn't the game still be fair?

I'm guessing the game does not have an expected value (of Peter's gain), since calculating it with different limiting processes gives different results. For example, if you first assume that Peter can only lose some finite amount B, then for a fair coin the expected gain is zero, no matter how large B is. If, however, you allow the coin to be biased and let B tend to infinity, then Peter's expected gain tends to +1 if the coin is biased in his favor and to minus infinity if it is biased against him. Finally, if you let B increase to infinity and the probability of heads approach 1/2 simultaneously in some coupled way, then the answer depends on whether you approach p(Heads) = 1/2 from the left or the right.
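The truncated game can be worked out exactly rather than simulated. Below is a sketch (my own; the function name is illustrative) using the standard gambler's-ruin result: starting at 0 and winning each toss with probability p, the probability of hitting +1 before -B is B/(B+1) when p = 1/2, and (1 - r^B)/(1 - r^(B+1)) with r = (1-p)/p otherwise. The expected gain is then 1·P - B·(1-P):

```python
from fractions import Fraction

def expected_gain(p, B):
    """Exact expected gain for Peter, who quits when one cent ahead
    or when B cents behind (absorbing barrier), winning each toss
    with probability p."""
    p = Fraction(p)
    if p == Fraction(1, 2):
        P = Fraction(B, B + 1)          # fair-coin gambler's ruin
    else:
        r = (1 - p) / p
        P = (1 - r**B) / (1 - r**(B + 1))
    return 1 * P - B * (1 - P)

# Fair coin: exactly zero for every finite B.
print(expected_gain(Fraction(1, 2), 1000))
# Coin biased against Peter: a large expected loss, growing with B.
print(float(expected_gain(Fraction(49, 100), 1000)))
# Coin biased in Peter's favor: expected gain approaching +1.
print(float(expected_gain(Fraction(51, 100), 1000)))
```

Using `Fraction` keeps the fair-coin case exactly zero instead of a floating-point residue, and makes the contrast between the three limits plain: the answer depends entirely on the order in which B and p are sent to their limits.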

* A game is said to be fair if your expected fortune after one round of the game is equal to your starting fortune.