Expected Number of Flips for Biased Coin | Homework Question

  • Thread starter: ephedyn

Homework Help Overview

The problem involves a biased coin that lands heads with probability p and tails with probability 1-p. Participants discuss the expected number of flips, after the initial flip, needed to land (1) the same side as the initial flip and (2) the opposite side, along with the behavior at extreme values of p.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants are examining the expected values for matching and different flips, using geometric random variables and probability mass functions. Questions arise regarding the validity of their calculations, particularly when approaching extreme values of p.

Discussion Status

Some participants express uncertainty about their calculations leading to negative expectation values and seek guidance on their reasoning. Others provide insights into the justification of their approaches and discuss the applicability of certain mathematical properties.

Contextual Notes

There is mention of L'Hospital's rule in relation to limits, with some participants questioning its applicability in this context. The discussion reflects a mix of attempts to clarify mathematical reasoning and explore underlying assumptions.

ephedyn
Homework Statement
A biased coin lands heads with probability p and tails with probability 1-p.

(1) What is the expected number of flips, after the initial flip, to land a match to the initial flip?
(2) What is the expected number of flips, after the initial flip, to land a different side from the initial flip? Comment on the extreme values of p.

The attempt at a solution
Without loss of generality, assume we land heads on the initial flip. Let N_H be the number of flips required until we land heads again. Since N_H is a geometric random variable, with pmf f(x) = p(1-p)^{x-1}, then E[N_H]=\sum^{\infty}_{x=1} x \cdot f(x)= \sum^{\infty}_{x=1} px(1-p)^{x-1}=\dfrac{p}{(1-(1-p))^2}=\dfrac{1}{p} and similarly we have \dfrac{1}{1-p} for tails. Let H and T denote the event of landing heads and tails on the initial flip respectively.

So (1) for matching flips, the expected number of flips is P(H) \times \dfrac{1}{p} + P(T) \times \dfrac{1}{1-p} = p \times \dfrac{1}{p} + (1-p) \times \dfrac{1}{1-p}=2.
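[Editor's note: the result that part (1) equals 2 for every p can be sanity-checked with a quick Monte Carlo simulation. This is a minimal sketch; `flips_to_match`, the seed, and the trial count are made-up names and parameters, not from the thread.]

```python
import random

def flips_to_match(p, rng):
    # One trial: take the initial flip, then count flips until the
    # same side comes up again.
    first_is_heads = rng.random() < p
    flips = 0
    while True:
        flips += 1
        if (rng.random() < p) == first_is_heads:
            return flips

rng = random.Random(0)
trials = 200_000
for p in (0.1, 0.5, 0.9):
    avg = sum(flips_to_match(p, rng) for _ in range(trials)) / trials
    print(f"p={p}: average = {avg:.3f}")  # close to 2 for every p
```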

Similarly, (2) for different flips, the expected number of flips is P(T) \times \dfrac{1}{p} + P(H) \times \dfrac{1}{1-p} = \dfrac{1-p}{p} + \dfrac{p}{1-p}=\dfrac{p^2+(1-p)^2}{p(1-p)}=\dfrac{2p^2-2p+1}{p(1-p)}

For the extreme cases, by L'Hopital's rule, we have \lim_{p\rightarrow0} \dfrac{2p^2-2p+1}{p(1-p)}=\lim_{p\rightarrow0} \dfrac{4p-2}{1-2p}=-2 and similarly, \lim_{p\rightarrow1} \dfrac{4p-2}{1-2p}=-2

So I realize I must be doing something wrongly because I'm getting negative expectation values in the final part. Any guidance on my working?

Thanks!
 
ephedyn said:
For the extreme cases, by L'Hopital's rule, we have \lim_{p\rightarrow0} \dfrac{2p^2-2p+1}{p(1-p)}=\lim_{p\rightarrow0} \dfrac{4p-2}{1-2p}=-2 and similarly, \lim_{p\rightarrow1} \dfrac{4p-2}{1-2p}=-2

So I realize I must be doing something wrongly because I'm getting negative expectation values in the final part. Any guidance on my working?

L'Hospital's rule does not apply, because you do not have something like 0/0 or ∞/∞.
 
Ah! You're right! So the expected number grows without bound in both extreme cases, which makes sense intuitively: it becomes nearly impossible to land the other side on any given flip. Thanks!

Does the rest of my approach make sense? I'm not too convinced about taking P[H] \times E[N_H] + P[T] \times E[N_T], because I can't recall the property that justifies it.
 
Fwiw, your answer can be written as p/(1-p) + (1-p)/p.
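[Editor's note: this closed form p/(1-p) + (1-p)/p can likewise be checked by simulation. A minimal sketch; `flips_to_differ` and the parameters are illustrative names, not from the thread.]

```python
import random

def flips_to_differ(p, rng):
    # One trial: take the initial flip, then count flips until the
    # opposite side comes up.
    first_is_heads = rng.random() < p
    flips = 0
    while True:
        flips += 1
        if (rng.random() < p) != first_is_heads:
            return flips

rng = random.Random(1)
trials = 200_000
for p in (0.2, 0.5, 0.8):
    avg = sum(flips_to_differ(p, rng) for _ in range(trials)) / trials
    exact = p / (1 - p) + (1 - p) / p
    print(f"p={p}: simulated = {avg:.3f}, exact = {exact:.3f}")
```

Pushing p toward 0 or 1 makes `exact` blow up, matching the divergence discussed above.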
ephedyn said:
Does the rest of my approach make sense? I'm not too convinced about taking P[H] \times E[N_H] + P[T] \times E[N_T] because a property that allows me to do this seems to be missing from my memory.
Yes, that's fine. You can justify it by considering the probability that it takes N tosses given the outcome of the initial toss, P[N|H] and P[N|T]. By the law of total probability, P[N] = P[N|H]P[H] + P[N|T]P[T], and E[N|H] = \sum_N N \, P[N|H], etc.
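[Editor's note: this conditioning argument can be verified numerically by averaging the flip counts separately for each initial outcome and recombining them with the weights P[H] = p and P[T] = 1-p. A sketch using part (2) with p = 0.3; all names here are illustrative.]

```python
import random

# Estimate E[N|H] and E[N|T] for the "different side" problem,
# then recombine them via the law of total expectation.
rng = random.Random(2)
p = 0.3
totals = {"H": 0, "T": 0}
counts = {"H": 0, "T": 0}
for _ in range(200_000):
    side = "H" if rng.random() < p else "T"
    flips = 1
    # Keep flipping while we match the initial side; the flip that
    # finally differs is included in the count.
    while ("H" if rng.random() < p else "T") == side:
        flips += 1
    totals[side] += flips
    counts[side] += 1

e_given_h = totals["H"] / counts["H"]  # near 1/(1-p)
e_given_t = totals["T"] / counts["T"]  # near 1/p
recombined = p * e_given_h + (1 - p) * e_given_t
print(e_given_h, e_given_t, recombined)  # recombined near p/(1-p) + (1-p)/p
```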
 
