MrAnchovy said:
Well in order to be rigorous it would be nice to first justify the use of a summation by showing that the sample space is at most countably infinite.
Seriously? Office_Shredder's ##P_k## is the probability of busting out on the ##k##th flip. It is indexed by a positive integer, the coin flip number. You either flip the coin or you don't; there are no halfsies on coin flips, let alone an irrational number of coin flips. Of course it's countable.
I agree that if this sum is one, the player almost surely (i.e. with probability 1) will run out of coins. However I do not agree that this implies that the player will, with probability one, play for a finite number of rounds.
I'll modify the rules of the game a bit to make it easier to analyze. What I want is a way to make sense of continuing to flip coins after the player has busted out.
Denote ##s_k## as the size of the player's stack after the ##k##th coin flip, with an initial value of ##s_0 = 1##. Going bust is absorbing: if ##s_k = 0##, the stack size remains zero regardless of the outcomes of all later flips. If ##s_k## is positive, the stack size increases by ##n-1## if the outcome of the next flip is heads but decreases by one if the outcome is tails. Thus ##s_{k+1} = 0## if ##s_k = 0##, regardless of the outcome of the flip; if ##s_k > 0##, then ##s_{k+1} = s_k + n - 1## on heads and ##s_{k+1} = s_k - 1## on tails.
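These update rules can be sketched in a few lines of Python. The function name `stack_sizes` and the `'h'`/`'t'` string encoding of flips are my own illustration, not from the thread; the function returns the whole trajectory ##s_0, s_1, \dots, s_k## for a given toss sequence:

```python
def stack_sizes(flips, n):
    """Stack trajectory s_0, s_1, ..., s_k for a toss string made of
    'h' (heads, +n-1) and 't' (tails, -1); a stack of 0 is absorbing."""
    s = 1                      # initial stack s_0 = 1
    sizes = [s]
    for f in flips:
        if s > 0:              # once bust, later flips change nothing
            s = s + n - 1 if f == 'h' else s - 1
        sizes.append(s)
    return sizes
```

For example, with ##n = 2## the string `'tth'` gives the trajectory `[1, 0, 0, 0]`: the player busts on the first flip and the later flips are meaningless.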
Now for some events. Denote ##b_k## as the set of all coin toss sequences of length ##k## that result in going bust on exactly the ##k##th flip. Denote ##B_k## as the set of all coin toss sequences of length ##k## that result in going bust before the ##k##th flip. Finally, denote ##A_k## as the set of all coin toss sequences of length ##k## that result in ##s_k > 0##.
Clearly, ##B_k##, ##b_k##, and ##A_k## are mutually exclusive events that collectively exhaust the sample space ##\Omega_k##, the set of all possible coin toss sequences of length ##k##. In other words, ##P(A_k) = 1 - \bigl(P(B_k) + P(b_k)\bigr)##. If ##P(B_k) + P(b_k) \to 1## as ##k \to \infty##, then ##P(A_k)## must become vanishingly small as ##k \to \infty##.
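These probabilities can be computed exactly for small ##k##. The sketch below (the function `bust_probabilities` is my own illustration, not from the thread) propagates the exact distribution of the stack size, with 0 absorbing, and records the probability mass that first hits zero at each flip, i.e. ##P(b_k)##:

```python
from fractions import Fraction

def bust_probabilities(n, p, kmax):
    """Return [P(b_1), ..., P(b_kmax)]: the exact probability of going
    bust on exactly flip k, computed by propagating the distribution of
    the stack size (0 is absorbing)."""
    p = Fraction(p)
    dist = {1: Fraction(1)}          # the stack starts at s_0 = 1
    bust = []
    for _ in range(kmax):
        new = {}
        bust_now = Fraction(0)
        for s, pr in dist.items():
            if s == 0:               # already bust: flips change nothing
                new[0] = new.get(0, Fraction(0)) + pr
                continue
            heads, tails = s + n - 1, s - 1
            new[heads] = new.get(heads, Fraction(0)) + pr * p
            new[tails] = new.get(tails, Fraction(0)) + pr * (1 - p)
            if tails == 0:           # this tails is the busting flip
                bust_now += pr * (1 - p)
        bust.append(bust_now)
        dist = new
    return bust
```

For a fair coin with ##n = 2## this gives ##P(b_1) = 1/2##, ##P(b_3) = 1/8##, ##P(b_5) = 1/16##, and hence ##P(A_5) = 1 - 11/16 = 5/16##; the cumulative bust probability ##P(B_k) + P(b_k)## indeed creeps toward 1.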
It should be equally clear that ##b_{k_1}## and ##b_{k_2}## are also mutually exclusive events for all ##k_1 \ne k_2##. Thus ##P(B_k) = \sum_{r<k} P(b_r)##. Note that Office_Shredder's ##P_k## is just ##P(b_k)##, and also note that his sum is valid because of the mutual exclusivity of going bust on flip number ##k_1## versus going bust on some other flip number ##k_2##.
I agree that this was the intention of the challenge, as was confirmed by Office_Shredder in post #; however, that is not how the challenge was originally posed.
Yes, it is. You just read stuff into it that wasn't there.
I think I will modify my answer to ## n > 1 ## and ## p > 0 ##. There are now uncountably many ways to keep playing forever, and the probability that he is still playing after k rounds is ## \ge p^k + k p^{k-1}(1-p) + \binom k 2 p^{k-2}(1-p)^2 + \dots > 0 ##.
There are two things wrong here:
1. There are only a countable number of ways to keep playing forever.
2. Your probability is wrong.
I'll look at item #2 first. You are clearly using a binomial expansion, and that is not correct. Look at what your expression says for the probability that the player is still playing after one round: it evaluates to one. In fact it evaluates to one for all k, since the full expansion is just ##\bigl(p + (1-p)\bigr)^k = 1##. You are ignoring the fact that the player stops playing after going bust (or, in my alteration of the game, the player idly watches meaningless coin flips after going bust).
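To make this concrete, here is the quoted expression with the ellipsis carried out to all ##k+1## binomial terms (an assumption on my part, but the pattern of the first three terms forces it); it is identically 1 for every ##k## and every ##p##:

```python
from math import comb

def quoted_expression(p, k):
    """Sum_{j=0}^{k} C(k, j) * p^(k-j) * (1-p)^j -- the binomial
    expansion of (p + (1-p))^k, which equals 1 for every k."""
    return sum(comb(k, j) * p**(k - j) * (1 - p)**j for j in range(k + 1))
```

So the quoted lower bound says nothing: it bounds the survival probability below by a quantity that never falls below one.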
Regarding the countability of the sample space: first off, it doesn't matter whether the coin toss sequences that form my sets ##b_k##, ##B_k##, and ##A_k## become uncountable in the limit ##k \to \infty##. The sets themselves are mutually exclusive, and the collection of all the ##b_k##, ##B_k##, and ##A_k## is countable. Secondly, there are only a countable number of infinitely long coin toss sequences that lead to the player playing forever. The sample space in the original problem is countable.
I'll use an ordered string comprising the symbols ##h## and ##t## to label a sequence of coin tosses of length ##k##. For example, the string ##t## designates tails (and hence going bust) on the very first toss, while ##h## designates getting heads (and hence continuing the game). Note that the only string that starts with ##t## is the string ##t## itself; the string ##th## is not a valid string in any game with reward ##n##, because the string terminates when the player goes bust. In a valid string ##S##, the number of ##t##s in every initial proper substring ##s## of ##S## is at most ##n-1## times the number of ##h##s in ##s##. (An initial proper substring ##s## of a string ##S## is a substring that starts at the first character of ##S## and ends before ##S## ends.)

What this means is that you can't use a diagonalization argument to prove uncountability. If the ##k##th character of the ##k##th string is an ##h##, flipping that ##h## to a ##t## might mean game over. You have to keep that ##h## if you want to continue your diagonalization argument, and then you can't guarantee that your generated string will differ from every string in my enumerated list.
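That prefix condition is easy to state as code; a minimal sketch (the function name and the `'h'`/`'t'` encoding are my own choices):

```python
def still_alive(s, n):
    """True iff toss string s never busts the player: the stack
    1 + (n-1)*#h - #t stays positive after every prefix, i.e. the
    number of t's never exceeds (n-1) times the number of h's."""
    heads = tails = 0
    for c in s:
        if c == 'h':
            heads += 1
        else:
            tails += 1
        if tails > (n - 1) * heads:    # stack has hit zero: bust
            return False
    return True
```

With ##n = 2##, `still_alive('t', 2)` is false and so is every string extending `'t'`, which is exactly why a diagonal flip from ##h## to ##t## can push the constructed string out of the set of valid strings.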