# Expected number of coin flips before N consecutive heads?

## Homework Statement

"A coin is tossed repeatedly, with $\text{Pr} \{Heads\} = p$ and $\text{Pr} \{Tails\} = 1-p$. Let $X$ represent the number of coin flips until $n$ consecutive coin tosses all reveal a heads. Find $E(X)$."

## Homework Equations

If $P(A)>0$,
$E(X|A)=\sum_{x} x P([X=x]|A)$

My professor has informed me that this equation will not be of any real use to me in this problem.

## The Attempt at a Solution

First, I let $H_k$ denote the event of having achieved $k$ consecutive heads, and $T_k$ denote the event of getting tails on the $k$-th toss. I have to calculate $E(X|H_n)$ and $E(X|H_n^c)$ in order to express the expected value as $E(X)=E(X|H_n)p^n + E(X|H_n^c)(1-p^n)$. So I first tried to calculate $E(X|H_j)$ for $0\leq j <n$, but got lost as to how to proceed. I think I would need a second equation to go along with this one, to solve for $E(X|H_j)$?

$E(X|H_j)=[1+E(X|H_{j+1})]p+E(X|H_0)(1-p)$

Feel free to decline to answer this, or redirect me to my previous topic about $r$ consecutive heads before $s$ consecutive tails. It's very similar, I should think.

What type of distribution do you think this is?

Well, I would hazard a guess at a geometric distribution: if I let $\Pr\{\text{heads } n \text{ consecutive times}\}=p^n$, then $X \sim \text{Geo}(p^n)$. Basically, I think $X$ is geometrically distributed in the sense that trials of $n$ coin tosses are conducted until we get a trial of $n$ coin tosses that are all heads. But then again, if, for example, $n-5$ heads occur at the end of one trial and $5$ heads occur at the beginning of the next trial, that run wouldn't be counted, so I don't think that's right.
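The boundary-overlap worry can be tested directly by simulation. A minimal sketch (the helper name `flips_until_run` is mine, not from the thread), comparing the empirical mean against the block-model guess of $n/p^n$ flips:

```python
import random

def flips_until_run(n, p, rng):
    """Simulate flips until n consecutive heads; return the flip count."""
    count = 0
    streak = 0
    while streak < n:
        count += 1
        if rng.random() < p:
            streak += 1
        else:
            streak = 0
    return count

rng = random.Random(0)
n, p, trials = 3, 0.5, 200_000
empirical = sum(flips_until_run(n, p, rng) for _ in range(trials)) / trials
block_model = n / p**n  # "n flips per geometric block" guess: 24 for n=3, p=0.5
print(empirical, block_model)
```

For $n=3$, $p=1/2$ the block model predicts 24 flips while the simulated mean lands near 14, so modeling $X$ as $n$ tosses per $\text{Geo}(p^n)$ block overestimates the answer.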

verty
Homework Helper
Try to work with strings H*T. I apologize if you already are.

Ray Vickson
Homework Helper
Dearly Missed

Your last equation is almost, but not quite right. Let the states be $0 =$ "start, or start over", and $k =$ "have just gotten $k$th head in a row", for $k=1,2, \ldots, n-1.$ Let $N_j =$ number of tosses to get $n$ heads in a row starting in state $j = 0,1,2, \ldots, n-1$ (and including the last toss). Then for $j < n$ we have
$$N_j = 1 + \begin{cases} N'_{j+1}& \text{if heads} \\N'_0 &\text{if tails} \end{cases},$$
where $N'_{n} \equiv 0.$ Here, the $N'_j$ for $j=0,1,\ldots, n-1$ are independent "copies" of the $N_j$---that is, they have the same probability distributions (hence the same means) as the $N_j.$ Note that the "+1" is common to both cases, because it counts the next toss.

Thus, in your notation we have $E(X|H_j) = 1 + p E(X|H_{j+1}) + q E(X|H_0)$ for $0 \leq j \leq n-1,$ with $E(X|H_n) = 0.$
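This corrected recurrence can be sanity-checked numerically. A minimal sketch (the function name is mine): write $E(X|H_j)=a_j+b_j\,E(X|H_0)$ with $a_n=b_n=0$, back-substitute down to $j=0$, and solve the resulting equation for $E(X|H_0)$:

```python
def expected_flips(n, p):
    """Solve E(X|H_j) = 1 + p*E(X|H_{j+1}) + q*E(X|H_0), with E(X|H_n) = 0,
    by writing E(X|H_j) = a_j + b_j * E(X|H_0) and back-substituting."""
    q = 1 - p
    a, b = 0.0, 0.0               # a_n = b_n = 0
    for _ in range(n):            # step j = n-1, n-2, ..., 0
        a, b = 1 + p * a, p * b + q
    return a / (1 - b)            # e_0 = a_0 + b_0*e_0  =>  e_0 = a_0/(1 - b_0)

print(expected_flips(1, 0.5))  # 2.0: a single fair head takes 2 flips on average
print(expected_flips(3, 0.5))
```

For $n=1$ this returns $1/p$, and for $n=3$, $p=1/2$ it returns 14, the standard answer for three fair heads in a row.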

StoneTemplePython
Gold Member
2019 Award
or redirect me to my previous topic about $r$ consecutive heads before $s$ consecutive tails. It's very similar, I should think.
This problem is intimately connected to your prior one, though making the connections properly requires more machinery. In general it is not an easy problem for most people to get their heads around, even though there is a simple closed-form solution.
= = = = =
Do you know anything about Markov chains, renewal theory, or ordinary generating functions? I strongly suspect you at least know about OGFs... though they are my least favorite way of going after this problem. (There's a martingale approach as well, but I believe it's outside the scope.)

Note that if you take the OGF (or renewal) approach, then one way to solve this is quite similar to your last problem, except instead of using "first step analysis", you use "last step analysis"...

- - - -
edit:
it occurred to me that you could directly build off your prior problem: consider a game where player 1 has to get $n$ consecutive heads to win and player 2 has to get one tail to win, and the game starts over after each round. From the prior problem you know $p$ and $q$, and you can apply total expectation / draw a two-branch tree to get the expected duration of a game. (One branch of the expectation requires calculating the expected duration of a finite, geometrically distributed game, so be careful.) Once you have the expected duration of the game, you should be able to rearrange equations and solve.

This may be what you were trying to do in your original post, and what Ray was getting at, though there's a lot of notation so maybe it's a bit different.

In any case, it pays dividends to try and solve this problem a couple of different ways.

Sorry if I don't respond to everyone in this topic; sometimes, I just feel so overwhelmed by the amount of material being explained here.

Thus, in your notation we have $E(X|H_j) = 1 + p E(X|H_{j+1}) + q E(X|H_0)$ for $0 \leq j \leq n-1$, with $E(X|H_n)=0$.
So here's what I have so far:

$E(X|H_{n-1})=1+qE(X|H_0)$
$E(X|H_{n-2})=p[1+qE(X|H_0)]+1+qE(X|H_0)=(1+p)(1+qE(X|H_0))$
$E(X|H_{n-3})=p(1+p)(1+qE(X|H_0))+1+qE(X|H_0)=(p^2+p)(1+qE(X|H_0))+1+qE(X|H_0)=(p^2+p+1)(1+qE(X|H_0))$

In general: $E(X|H_s)=(1+qE(X|H_0))\sum_{k=0}^{n-(s+1)} p^k$ for $0\leq s < n$

So: $E(X|H_0)=(1+qE(X|H_0))\sum_{k=0}^{n-1} p^k=\sum_{k=0}^{n-1} p^k+(qE(X|H_0))\sum_{k=0}^{n-1} p^k$
and: $E(X|H_0)=\frac{\sum_{k=0}^{n-1} p^k}{1-q\sum_{k=0}^{n-1} p^k}$

So we have:

$E(X|H_s)=[1+q(\frac{\sum_{k=0}^{n-1} p^k}{1-q\sum_{k=0}^{n-1} p^k})]\sum_{k=0}^{n-(s+1)} p^k=(\frac{1}{1-q\sum_{k=0}^{n-1} p^k})\sum_{k=0}^{n-(s+1)} p^k$ for $0\leq s < n$

I have to say that the denominator of the first term in this equation looks a lot like the complement of a geometric distribution, as was pointed out at the beginning of this topic. Though it also worries me that the denominator might be zero, so I have a strong feeling that this isn't correct. I think I might have messed up when calculating $E(X|H_s)$.

Ray Vickson
Homework Helper
Dearly Missed
Sums like $\sum_{k=0}^m p^k$ are completely standard, and are (or used to be) given in early algebra courses. Alternatively, for $|p| < 1$ you can write that
$$\sum_{k=0}^m p^k = \sum_{k=0}^{\infty} p^k - \sum_{k=m+1}^{\infty} p^k,$$
and the last summation has the form $p^{m+1} \sum_{j=0}^{\infty} p^j.$
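The decomposition is quick to confirm numerically (the values of $p$ and $m$ below are arbitrary):

```python
p, m = 0.3, 7
finite = sum(p**k for k in range(m + 1))   # sum_{k=0}^{m} p^k, computed directly
tail = p**(m + 1) / (1 - p)                # p^(m+1) * sum_{j>=0} p^j
via_series = 1 / (1 - p) - tail            # full geometric series minus its tail
print(finite, via_series)
```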

But I'm very concerned about the first term in the expression: $(\frac{1}{1-q\sum_{k=0}^{n-1} p^k})\sum_{k=0}^{n-(s+1)} p^k$ for $0\leq s<n$.

It has a zero denominator if it's distributed geometrically.

Ray Vickson
Homework Helper
Dearly Missed
But I'm very concerned about the first term in the expression: $(\frac{1}{1-q\sum_{k=0}^{n-1} p^k})\sum_{k=0}^{n-(s+1)} p^k$ for $0\leq s<n$.

It has a zero denominator if it's distributed geometrically.
No: $1-q \sum_{k=0}^{n-1} p^k \neq 0.$ Find out the formula for $\sum_{k=0}^{n-1} p^k$ and check this for yourself. The $p$ and $q = 1-p$ are just some numbers in $(0,1)$, so do not have any kind of "distribution" at all. Besides, a geometric distribution goes on forever, but your summation stops at a finite value $k = n-1.$ You would be right about a zero denominator if you had $n = \infty,$ but you don't have that.

Besides, a geometric distribution goes on forever, but your summation stops at a finite value $k=n-1$.
Okay, so let me try and compare: $q\sum_{k=0}^\infty p^k = q(1+\frac{p}{1-p})=q(\frac{1}{1-p})=1$, whereas $q\sum_{k=0}^{n-1} p^k=q(\frac{1-p^n}{1-p})=1-p^n<1$, so the denominator is $1-q\sum_{k=0}^{n-1}p^k=p^n\neq 0$. That makes a lot more sense, now that I'm telescoping my sums correctly.
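The denominator can also be double-checked numerically: for any $p\in(0,1)$, $1-q\sum_{k=0}^{n-1}p^k$ comes out to exactly $p^n$, small for large $n$ but never zero. A quick sketch (values of $p$ and $n$ are arbitrary):

```python
for p in (0.2, 0.5, 0.9):
    q = 1 - p
    for n in (1, 3, 10):
        denom = 1 - q * sum(p**k for k in range(n))
        # geometric sum: q*(1 - p^n)/(1 - p) = 1 - p^n, so denom = p^n
        assert abs(denom - p**n) < 1e-12
        print(p, n, denom)
```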

Well, this took like two days to figure out, and it would probably have taken longer if I hadn't had this forum to help me.

So thank you all for your help, even though I was unable to respond to all of you.

StoneTemplePython
Gold Member
2019 Award
Well, this took like two days to figure out, and it would have probably taken longer had I not this forum to help me.

So thank you all for your help, even though I was unable to respond to all of you.
Please show your final answer. Many on the thread would appreciate at least that much. Furthermore, there are some important and simple links to your prior problem that I want to suggest with the final answer in hand... up until now it just looks like a bunch of symbol manipulation which isn't great for understanding or memory.

Since the process starts with zero consecutive heads, $E(X)=E(X|H_0)=\frac{\sum_{k=0}^{n-1} p^k}{1-q\sum_{k=0}^{n-1} p^k}$

StoneTemplePython
Gold Member
2019 Award
Since the process starts with zero consecutive heads, $E(X)=E(X|H_0)=\frac{\sum_{k=0}^{n-1} p^k}{1-q\sum_{k=0}^{n-1} p^k}$
which simplifies to.....? There's no need for a series in the final answer.
- - - -
What I'm getting at here, btw, is that you now have a process for calculating, say, $\bar{X}$ for a run of $r$ heads. You can re-run your process to calculate $\bar{Y}$, the average time until a run of $s$ tails.

What you should confirm for yourself is that the results from your prior problem line up here. In particular:

$P(E)=\frac{p^r+qp^{r-1}(1-q^{s-1})}{1+(1-q^{s-1})(p^{r-1}-1)} =\frac{\frac{1}{\bar{X}}}{\frac{1}{\bar{X}} + \frac{1}{\bar{Y}}} = \frac{\bar{Y}}{\bar{X} + \bar{Y}}$

In general this result holds for various streaks/runs problems, though if there is overlap in the pattern (say HTH vs HHH) then additional work is required; it is easy in this case because $TT\ldots T$ is disjoint from, i.e. has no overlap with, $HH\ldots H$.

which simplifies to.....?
The best that I could simplify it to is: $E(X)=p^{-n}\sum_{k=0}^{n-1} p^k=\frac{1-p^n}{p^n(1-p)}$ (using $1-q\sum_{k=0}^{n-1}p^k=p^n$), and I think I'm going to leave it at that. I have many other assignments due this week that I must get started on.
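As a final sanity check, the derived $E(X|H_0)=\frac{\sum_{k=0}^{n-1} p^k}{1-q\sum_{k=0}^{n-1} p^k}$ can be compared with a direct Monte Carlo estimate (the function name and the particular values of $n$, $p$, and the trial count are mine, chosen for illustration):

```python
import random

def simulate_mean(n, p, trials, seed=0):
    """Monte Carlo estimate of the mean number of flips until n heads in a row."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        streak = 0
        while streak < n:
            total += 1
            streak = streak + 1 if rng.random() < p else 0
    return total / trials

n, p = 4, 0.6
q = 1 - p
S = sum(p**k for k in range(n))
e0 = S / (1 - q * S)  # E(X|H_0) from the derivation in this thread
print(e0, simulate_mean(n, p, 100_000))
```

The two numbers should agree to within sampling error, and $1-qS$ here equals $p^n$, matching the discussion above.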