Expected number of coin flips before N consecutive heads?

  • Thread starter Eclair_de_XII
In summary, a coin is tossed repeatedly and we want the expected number of tosses until ##n## consecutive heads appear. The definition ##E(X|A)=\sum_{x} x P([X=x]|A)## is not directly useful here. Instead, the states can be defined as ##0## for "start, or start over" and ##k## for having just gotten the ##k##th head in a row, and ##N_j## denotes the number of tosses to get ##n## heads in a row starting in state ##j##. Conditioning on the next toss gives ##E(X|H_j) = 1 + pE(X|H_{j+1}) + qE(X|H_0)##, with ##E(X|H_n) = 0##. This problem is closely related to an earlier thread about getting ##r## consecutive heads before ##s## consecutive tails.
  • #1
Eclair_de_XII

Homework Statement


"A coin is tossed repeatedly, with ##\text{Pr} \{Heads\} = p## and ##\text{Pr} \{Tails\} = 1-p##. Let ##X## represent the number of coin flips until ##n## consecutive coin tosses all reveal a heads. Find ##E(X)##."

Homework Equations


If ##P(A)>0##,
##E(X|A)=\sum_{x} x P([X=x]|A)##

My professor has informed me that this equation will not be of any real use to me in this problem.

The Attempt at a Solution


First, I let ##H_k## denote the event of having achieved ##k## consecutive heads, and ##T_k## denote the event of getting a tails on the ##k##-th toss. So, I have to calculate ##E(X|H_n)## and ##E(X|H_n^c)## in order to express the expected value as ##E(X)=E(X|H_n)p^n + E(X|H_n^c)(1-p^n)##. I then tried to calculate ##E(X|H_j)## for ##0\leq j <n##, but got lost as to how to proceed. I think I would need a second equation to go along with this one to solve for ##E(X|H_j)##?

##E(X|H_j)=[1+E(X|H_{j+1})]p+E(X|H_0)(1-p)##

Feel free to decline to answer this, or redirect me to my previous topic about ##r## consecutive heads before ##s## consecutive tails. It's very similar, I should think.
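In case it helps to sanity-check whatever formula comes out of this, the expectation is easy to estimate by simulation. A minimal Python sketch (the helper names here are just illustrative):

Code:
import random

def flips_until_run(n, p):
    """Flip a p-biased coin until n consecutive heads appear; return the number of flips."""
    flips = 0    # total tosses so far
    streak = 0   # current run of consecutive heads
    while streak < n:
        flips += 1
        if random.random() < p:   # heads with probability p
            streak += 1
        else:                     # a tails resets the run
            streak = 0
    return flips

def estimate_expected_flips(n, p, trials=200_000):
    """Monte Carlo estimate of E(X)."""
    return sum(flips_until_run(n, p) for _ in range(trials)) / trials

# For a fair coin and n = 2 this should come out close to 6.
print(estimate_expected_flips(2, 0.5))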
 
  • #2
What type of distribution do you think this is?
 
  • #3
Well, I would hazard a guess at "geometric distribution": if I let ##\Pr\{\text{Heads } n \text{ consecutive times}\}=p^n##, then ##X \sim \text{Geo}(p^n)##. Basically, I think ##X## is geometrically distributed in the sense that trials of ##n## coin tosses are conducted until we get a trial of ##n## coin tosses that are all heads. But then again, if, for example, ##n-5## heads occur at the end of one trial and ##5## heads occur at the beginning of the next trial, that run wouldn't be counted, so I don't think that's right.
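To make that concern concrete: if the tosses really were grouped into disjoint blocks of ##n##, the number of blocks until an all-heads block would be geometric with success probability ##p^n##, giving ##n \cdot \frac{1}{p^n} = \frac{n}{p^n}## expected tosses. For ##p=\frac{1}{2}## and ##n=2## that is ##8##, whereas the true expected number of tosses for two consecutive heads with a fair coin is ##6##, so ignoring runs that straddle block boundaries really does overestimate the answer.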
 
  • #4
Try to work with strings H*T. I apologize if you already are.
 
  • #5
Eclair_de_XII said:
##E(X|H_j)=[1+E(X|H_{j+1})]p+E(X|H_0)(1-p)##
Your last equation is almost, but not quite right. Let the states be ##0 = ## "start, or start over", and ##k = ## "have just gotten ##k##th head in a row", for ##k=1,2, \ldots, n-1.## Let ##N_j = ## number of tosses to get ##n## heads in a row starting in state ##j = 0,1,2, \ldots, n-1## (and including the last toss). Then for ##j < n## we have
$$N_j = 1 + \begin{cases} N'_{j+1}& \text{if heads} \\N'_0 &\text{if tails} \end{cases},$$
where ##N'_{n} \equiv 0.## Here, the ##N'_j## for ##j=0,1,\ldots, n-1## are independent "copies" of the ##N_j##---that is, they have the same probability distributions (hence the same means) as the ##N_j.## Note that the "+1" is common to both cases, because it counts the next toss.

Thus, in your notation we have ##E(X|H_j) = 1 + p E(X|H_{j+1}) + q E(X|H_0)## for ##0 \leq j \leq n-1,## with ##E(X|H_n) = 0.##
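For what it's worth, this system is easy to solve numerically by writing ##E(X|H_j) = a_j + b_j\,E(X|H_0)## and back-substituting from ##j=n## down to ##j=0##. A minimal Python sketch (the function name is just illustrative):

Code:
def expected_flips(n, p):
    """Solve E_j = 1 + p*E_{j+1} + q*E_0 with E_n = 0, by writing E_j = a_j + b_j*E_0."""
    q = 1.0 - p
    a, b = 0.0, 0.0                   # E_n = 0, so a_n = b_n = 0
    for _ in range(n):                # back-substitute j = n-1, n-2, ..., 0
        a, b = 1.0 + p * a, p * b + q
    return a / (1.0 - b)              # E_0 = a_0 + b_0*E_0  =>  E_0 = a_0 / (1 - b_0)

# expected_flips(2, 0.5) -> 6.0, expected_flips(3, 0.5) -> 14.0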
 
  • #6
Eclair_de_XII said:
or redirect me to my previous topic about ##r## consecutive heads before ##s## consecutive tails. It's very similar, I should think.

This problem is intimately connected to your prior one. To make the connections properly needs more machinery though. In general, not an easy problem for most people to get their heads around, though there is a simple closed form solution.
= = = = =
Do you know anything about Markov chains, renewal theory, or ordinary generating functions (OGFs)? I strongly suspect you at least know about OGFs... though they are my least favorite way of going after this problem. (There's a martingale approach as well, but I believe it's outside the scope.)

Note that if you take the OGF (or renewal) approach, then one way to solve this is quite similar to your last problem, except instead of using "first step analysis", you use "last step analysis"...

- - - -
edit:
it occurred to me you could directly build off your prior problem in the sense of having a game where player 1 has to get ##n## consecutive heads to win and player 2 has to get one tails to win, and then the game starts over. From the prior problem you know ##p## and ##q##, and you can apply total expectation / draw a two-branch tree here to get the expected duration of a game. (One branch of the expectation will require calculating the expected duration of a finite, geometrically distributed game -- so be careful). Once you have the expected duration of the game, you should be able to re-arrange equations and solve.

This may be what you were trying to do in your original post, and what Ray was getting at, though there's a lot of notation so maybe it's a bit different.

In any case, it pays dividends to try and solve this problem a couple of different ways.
 
  • #7
Sorry if I don't respond to everyone in this topic; sometimes, I just feel so overwhelmed by the amount of material being explained here.

Ray Vickson said:
Thus, in your notation we have ##E(X|H_j) = 1 + p E(X|H_{j+1}) + q E(X|H_0)## for ##0 \leq j \leq n-1##, with ##E(X|H_n)=0##.

So here's what I have so far:

##E(X|H_{n-1})=1+qE(X|H_0)##
##E(X|H_{n-2})=p[1+qE(X|H_0)]+1+qE(X|H_0)=(1+p)(1+qE(X|H_0))##
##E(X|H_{n-3})=p(1+p)(1+qE(X|H_0))+1+qE(X|H_0)=(p^2+p)(1+qE(X|H_0))+1+qE(X|H_0)=(p^2+p+1)(1+qE(X|H_0))##

In general: ##E(X|H_s)=(1+qE(X|H_0))\sum_{k=0}^{n-(s+1)} p^k## for ##0\leq s < n##

So: ##E(X|H_0)=(1+qE(X|H_0))\sum_{k=0}^{n-1} p^k=\sum_{k=0}^{n-1} p^k+(qE(X|H_0))\sum_{k=0}^{n-1} p^k##
and: ##E(X|H_0)=\frac{\sum_{k=0}^{n-1} p^k}{1-q\sum_{k=0}^{n-1} p^k}##

So we have:

##E(X|H_s)=[1+q(\frac{\sum_{k=0}^{n-1} p^k}{1-q\sum_{k=0}^{n-1} p^k})]\sum_{k=0}^{n-(s+1)} p^k=(\frac{1}{1-q\sum_{k=0}^{n-1} p^k})\sum_{k=0}^{n-(s+1)} p^k## for ##0\leq s < n##

I have to say that the denominator of the first term in this equation looks a lot like the complement of a geometric distribution, as was pointed out at the beginning of this topic. Though it also worries me that the denominator is zero, so I have a strong feeling that this isn't correct. I think I might have messed up when calculating ##E(X|H_s)##.
 
  • #8
Eclair_de_XII said:
##E(X|H_0)=\frac{\sum_{k=0}^{n-1} p^k}{1-q\sum_{k=0}^{n-1} p^k}##

I have to say that the denominator of the first term in this equation looks a lot like the complement of a geometric distribution, as was pointed out at the beginning of this topic. Though it also worries me that the denominator is zero, so I have a strong feeling that this isn't correct. I think I might have messed up when calculating ##E(X|H_s)##.

Sums like ##\sum_{k=0}^m p^k## are completely standard, and are (or used to be) given in early algebra courses. Alternatively, for ##|p| < 1## you can write that
$$\sum_{k=0}^m p^k = \sum_{k=0}^{\infty} p^k - \sum_{k=m+1}^{\infty} p^k,$$
and the last summation has the form ##p^{m+1} \sum_{j=0}^{\infty} p^j.##
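In closed form, for ##p \neq 1##,
$$\sum_{k=0}^m p^k = \frac{1-p^{m+1}}{1-p}.$$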
 
  • #9
But I'm very concerned about the first term in the expression: ##(\frac{1}{1-q\sum_{k=0}^{n-1} p^k})\sum_{k=0}^{n-(s+1)} p^k## for ##0\leq s<n##.

It has a zero denominator if it's distributed geometrically.
 
  • #10
Eclair_de_XII said:
But I'm very concerned about the first term in the expression: ##(\frac{1}{1-q\sum_{k=0}^{n-1} p^k})\sum_{k=0}^{n-(s+1)} p^k## for ##0\leq s<n##.

It has a zero denominator if it's distributed geometrically.

No: ##1-q \sum_{k=0}^{n-1} p^k \neq 0.## Find out the formula for ##\sum_{k=0}^{n-1} p^k## and check this for yourself. The ##p## and ##q = 1-p## are just some numbers in ##(0,1)##, so do not have any kind of "distribution" at all. Besides, a geometric distribution goes on forever, but your summation stops at a finite value ##k = n-1.## You would be right about a zero denominator if you had ##n = \infty,## but you don't have that.
 
  • #11
Ray Vickson said:
Besides, a geometric distribution goes on forever, but your summation stops at a finite value ##k=n-1##.

Okay, so let me try and compare: ##q\sum_{k=0}^\infty p^k = q(1+\frac{p}{1-p})=q(\frac{1}{1-p})=1##, whereas ##q\sum_{k=0}^{n-1} p^k=q\left(1+\frac{p(1-p^{n-1})}{1-p}\right)=q+p-p^n=1-p^n\neq 1##. That makes a little more sense, if I'm telescoping my sums correctly.
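So the denominator in post #7 is ##1-q\sum_{k=0}^{n-1}p^k = 1-(1-p^n) = p^n##, and the expression there simplifies to
$$E(X|H_0)=\frac{\sum_{k=0}^{n-1} p^k}{1-q\sum_{k=0}^{n-1} p^k}=\frac{(1-p^n)/(1-p)}{p^n}=\frac{1-p^n}{(1-p)\,p^n},$$
which is finite and positive for every ##0<p<1##.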
 
  • #12
Well, this took like two days to figure out, and it would have probably taken longer had I not this forum to help me.

So thank you all for your help, even though I was unable to respond to all of you.
 
  • #13
Eclair_de_XII said:
Well, this took like two days to figure out, and it would have probably taken longer had I not this forum to help me.

So thank you all for your help, even though I was unable to respond to all of you.

Please show your final answer. Many on the thread would appreciate at least that much. Furthermore, there are some important and simple links to your prior problem that I want to suggest with the final answer in hand... up until now it just looks like a bunch of symbol manipulation which isn't great for understanding or memory.
 
  • #14
StoneTemplePython said:
Please show your final answer.

##E(X)=\sum_{s=0}^{n-1} E(X|H_s)P(H_s)=\sum_{s=0}^{n-1} p^s (\frac{\sum_{k=0}^{n-(s+1)} p^k}{1-q\sum_{k=0}^{n-1} p^k})##
 
  • #15
Eclair_de_XII said:
##E(X)=\sum_{s=0}^{n-1} E(X|H_s)P(H_s)=\sum_{s=0}^{n-1} p^s (\frac{\sum_{k=0}^{n-(s+1)} p^k}{1-q\sum_{k=0}^{n-1} p^k})##

which simplifies to...? There's no need for a series in the final answer.
- - - -
What I'm getting at here, btw, is that you now have a process for calculating ##\bar{X}##, say, for a run of ##r## heads. You can re-run your process to calculate ##\bar{Y}##, the average time until a run of ##s## tails.

What you should confirm for yourself is that the results from your prior problem line up here. In particular:

##P(E)=\frac{p^r+qp^{r-1}(1-q^{s-1})}{1+(1-q^{s-1})(p^{r-1}-1)} =\frac{\frac{1}{\bar{X}}}{\frac{1}{\bar{X}} + \frac{1}{\bar{Y}}} = \frac{\bar{Y}}{\bar{X} + \bar{Y}}##

In general this result holds for various streaks / runs problems, though if there is overlap in the pattern (say HTH vs HHH) then additional work is required -- it is easy in this case because ##TT...T## is disjoint / has no overlap with ##HH...H## .
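If you want to check this identity numerically, here is a rough Python sketch (the function names are just illustrative); it estimates ##P(E)## by simulating the race between a run of ##r## heads and a run of ##s## tails and compares it with ##\bar{Y}/(\bar{X}+\bar{Y})##, using the closed forms ##\bar{X}=\frac{1-p^r}{(1-p)p^r}## and ##\bar{Y}=\frac{1-q^s}{(1-q)q^s}##:

Code:
import random

def heads_run_wins(r, s, p):
    """Flip until a run of r heads or a run of s tails occurs; True if the heads run comes first."""
    h = t = 0
    while True:
        if random.random() < p:
            h, t = h + 1, 0
            if h == r:
                return True
        else:
            h, t = 0, t + 1
            if t == s:
                return False

def check_identity(r, s, p, trials=200_000):
    q = 1.0 - p
    x_bar = (1 - p**r) / (q * p**r)   # expected flips until a run of r heads
    y_bar = (1 - q**s) / (p * q**s)   # expected flips until a run of s tails
    predicted = y_bar / (x_bar + y_bar)
    simulated = sum(heads_run_wins(r, s, p) for _ in range(trials)) / trials
    print(f"r={r}, s={s}, p={p}:  predicted {predicted:.4f}  simulated {simulated:.4f}")

check_identity(2, 3, 0.5)
check_identity(3, 2, 0.6)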
 
  • #16
StoneTemplePython said:
which simplifies to...?

The best that I could simplify it to is: ##E(X)=p^{-n}\sum_{s=0}^{n-1} \sum_{k=0}^{n-(s+1)}p^{k+s}##, and I think I'm going to leave it at that. I have many other assignments due this week that I must get started on.
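Whichever form you settle on, it is worth checking it against the simplified single-state value and against simulation; a short Python sketch (names illustrative) that evaluates the double sum above alongside ##\frac{1-p^n}{(1-p)p^n}##:

Code:
def posted_double_sum(n, p):
    """Evaluate p^{-n} * sum_{s=0}^{n-1} sum_{k=0}^{n-s-1} p^(k+s), as written above."""
    return sum(p**(k + s) for s in range(n) for k in range(n - s)) / p**n

def single_state_value(n, p):
    """E(X|H_0) = (1 - p^n) / ((1 - p) * p^n), the simplified form from post #7."""
    return (1 - p**n) / ((1 - p) * p**n)

# Comparing these two (and the Monte Carlo estimate from the sketch in post #1)
# for small n is a quick way to see whether a candidate series really reduces
# to the single-state value E(X|H_0).
print(posted_double_sum(2, 0.5), single_state_value(2, 0.5))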
 

What does "Expected number of coin flips before N consecutive heads" mean?

The expected number of coin flips before N consecutive heads refers to the average number of times a coin must be flipped before getting N consecutive heads in a row.

Why is it important to calculate the expected number of coin flips before N consecutive heads?

Knowing the expected number of coin flips can help us understand the probability of getting N consecutive heads and can be useful in predicting outcomes in various scenarios, such as gambling or statistical analysis.

How is the expected number of coin flips before N consecutive heads calculated?

The expected number of coin flips before N consecutive heads can be calculated by conditioning on the outcome of the next flip, which gives a recurrence over the current run length; for a coin with heads probability ##p##, solving that recurrence yields ##E(X) = \frac{1-p^N}{(1-p)p^N}##.

What factors influence the expected number of coin flips before N consecutive heads?

The expected number of coin flips depends on the required run length N and the probability of getting heads on each flip; it grows rapidly as N increases or as the heads probability decreases.

Is the expected number of coin flips before N consecutive heads the same for all coins?

No, it can vary depending on the properties of the coin, such as weight distribution or shape, which can affect the probability of getting heads on each flip.
