Expected Number of Flips for Biased Coin | Homework Question

  • Thread starter: ephedyn
SUMMARY

The discussion focuses on calculating the expected number of flips for a biased coin that lands heads with probability p and tails with probability 1-p. The expected number of flips to match the initial flip is 2, regardless of p, while the expected number of flips to land a different side is (2p^2 - 2p + 1) / (p(1-p)). The participants also address the extreme cases where p approaches 0 or 1, noting that the negative values obtained at first come from an incorrect application of L'Hôpital's rule, since the limit is not an indeterminate form. The correct conclusion is that the expected number of flips diverges to infinity in both limits.

PREREQUISITES
  • Understanding of geometric random variables
  • Familiarity with probability mass functions (pmf)
  • Basic knowledge of expectation calculations in probability theory
  • Concept of L'Hôpital's rule in calculus
NEXT STEPS
  • Study geometric random variables and their properties
  • Learn about probability mass functions and their applications
  • Explore advanced expectation calculations in probability theory
  • Review L'Hôpital's rule and its correct applications in limits
USEFUL FOR

Students studying probability theory, mathematicians interested in random variables, and educators looking for examples of expectation calculations in biased scenarios.

ephedyn
Homework Statement
A biased coin lands heads with probability p and tails with probability 1-p.

(1) What is the expected number of flips, after the initial flip, to land a match to the initial flip?
(2) What is the expected number of flips, after the initial flip, to land a different side from the initial flip? Comment on the extreme values of p.

The attempt at a solution
Without loss of generality, assume we land heads on the initial flip. Let N_H be the number of flips required until we land heads again. Since N_H is a geometric random variable with pmf f(x) = p(1-p)^{x-1}, we have E[N_H]=\sum^{\infty}_{x=1} x \cdot f(x)= \sum^{\infty}_{x=1} px(1-p)^{x-1}=\dfrac{p}{(1-(1-p))^2}=\dfrac{1}{p}, and similarly E[N_T]=\dfrac{1}{1-p} for tails. Let H and T denote the events of landing heads and tails on the initial flip, respectively.

So (1) for matching flips, the expected number of flips is P(H) \times \dfrac{1}{p} + P(T) \times \dfrac{1}{1-p} = p \times \dfrac{1}{p} + (1-p) \times \dfrac{1}{1-p}=2.

Similarly, (2) for different flips, the expected number of flips is P(T) \times \dfrac{1}{p} + P(H) \times \dfrac{1}{1-p} = \dfrac{1-p}{p} + \dfrac{p}{1-p}=\dfrac{p^2+(1-p)^2}{p(1-p)}=\dfrac{2p^2-2p+1}{p(1-p)}.
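[Editor's note: both closed forms above can be sanity-checked with a quick Monte Carlo simulation. This is a sketch added for illustration, not part of the original thread; the helper name flips_until is my own.]

```python
import random

def flips_until(match: bool, p: float, rng: random.Random) -> int:
    """Flip until we see the same side as the initial flip (match=True)
    or the opposite side (match=False); return the number of flips
    after the initial one."""
    first = rng.random() < p  # True = heads, which lands with probability p
    n = 0
    while True:
        n += 1
        same = (rng.random() < p) == first
        if same == match:
            return n

rng = random.Random(0)
p, trials = 0.3, 200_000
avg_match = sum(flips_until(True, p, rng) for _ in range(trials)) / trials
avg_diff = sum(flips_until(False, p, rng) for _ in range(trials)) / trials

print(round(avg_match, 2))  # should be near 2, for any choice of p
print(round(avg_diff, 2))   # should be near (2p^2 - 2p + 1) / (p(1-p))
```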

For the extreme cases, by L'Hôpital's rule, we have \lim_{p\rightarrow0} \dfrac{2p^2-2p+1}{p(1-p)}=\lim_{p\rightarrow0} \dfrac{4p-2}{1-2p}=-2 and similarly \lim_{p\rightarrow1} \dfrac{4p-2}{1-2p}=-2.

So I realize I must be doing something wrong, because I'm getting negative expectation values in the final part. Any guidance on my working?

Thanks!
 
ephedyn said:
So I realize I must be doing something wrong, because I'm getting negative expectation values in the final part. Any guidance on my working?

L'Hôpital's rule does not apply, because you do not have an indeterminate form like 0/0 or ∞/∞.
 
Ah! You're right! So the expected number of flips grows without bound in both cases. Makes sense intuitively, since it becomes nearly impossible to land the other side on a flip. Thanks!

Does the rest of my approach make sense? I'm not too convinced about taking P[H] \times E[N_H] + P[T] \times E[N_T] because a property that allows me to do this seems to be missing from my memory.
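[Editor's note: the divergence just discussed is easy to see numerically. A quick sketch, evaluating the formula from the thread at values of p approaching 0:]

```python
# Expected flips to land the opposite side, from the derivation above.
f = lambda p: (2 * p * p - 2 * p + 1) / (p * (1 - p))

# The value blows up roughly like 1/p as p -> 0 (and, by the p <-> 1-p
# symmetry of the formula, like 1/(1-p) as p -> 1).
for p in (0.5, 0.1, 0.01, 0.001):
    print(f"p = {p}: {f(p):.2f}")
```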
 
Fwiw, your answer can be written as p/(1-p) + (1-p)/p.
ephedyn said:
Does the rest of my approach make sense? I'm not too convinced about taking P[H] \times E[N_H] + P[T] \times E[N_T] because a property that allows me to do this seems to be missing from my memory.
Yes, that's fine. You can justify it by considering the probability that it takes N tosses given the outcome of the initial toss: P[N|H] and P[N|T]. Then P[N] = P[N|H]P[H] + P[N|T]P[T], and E[N|H] = Ʃ_N N·P[N|H], etc., so summing N·P[N] over N gives E[N] = E[N|H]P[H] + E[N|T]P[T].
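[Editor's note: the conditioning argument can be illustrated with a short simulation that estimates E[N|H] and E[N|T] separately, then combines them by the law of total expectation. A sketch for the "different side" case, assuming heads lands with probability p:]

```python
import random

rng = random.Random(1)
p, trials = 0.3, 200_000
counts = {True: [], False: []}  # keyed by the initial flip (True = heads)

for _ in range(trials):
    first = rng.random() < p
    n = 0
    while True:
        n += 1
        if (rng.random() < p) != first:  # the opposite side finally lands
            break
    counts[first].append(n)

e_given_h = sum(counts[True]) / len(counts[True])    # estimates E[N|H] = 1/(1-p)
e_given_t = sum(counts[False]) / len(counts[False])  # estimates E[N|T] = 1/p
p_h = len(counts[True]) / trials                     # empirical P(H)
combined = e_given_h * p_h + e_given_t * (1 - p_h)   # law of total expectation
print(round(e_given_h, 2), round(e_given_t, 2), round(combined, 2))
```

The combined estimate should agree with the closed form (2p^2 - 2p + 1)/(p(1-p)) from the thread.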
 
