Expected Number of Flips for Biased Coin | Homework Question

Thread starter: ephedyn
Homework Statement
A biased coin lands heads with probability p and tails with probability 1-p.

(1) What is the expected number of flips, after the initial flip, to land a match to the initial flip?
(2) What is the expected number of flips, after the initial flip, to land a different side from the initial flip? Comment on the extreme values of p.

The attempt at a solution
Suppose the initial flip lands heads. Let N_H be the number of flips required until we land heads again. Since N_H is a geometric random variable, with pmf f(x) = p(1-p)^{x-1}, then E[N_H]=\sum^{\infty}_{x=1} x \cdot f(x)= \sum^{\infty}_{x=1} px(1-p)^{x-1}=\dfrac{p}{(1-(1-p))^2}=\dfrac{1}{p}, and similarly we have \dfrac{1}{1-p} for tails. Let H and T denote the events of landing heads and tails on the initial flip, respectively.
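For reference, the series step follows from the standard derivative-of-a-geometric-series identity: \sum^{\infty}_{x=1} x r^{x-1} = \dfrac{d}{dr}\dfrac{r}{1-r} = \dfrac{1}{(1-r)^2}, so with r = 1-p we get E[N_H] = \dfrac{p}{(1-(1-p))^2} = \dfrac{1}{p}.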

So (1) for matching flips, the expected number of flips is P(H) \times \dfrac{1}{p} + P(T) \times \dfrac{1}{1-p} = p \times \dfrac{1}{p} + (1-p) \times \dfrac{1}{1-p}=2.

Similarly, (2) for different flips, the expected number of flips is P(T) \times \dfrac{1}{p} + P(H) \times \dfrac{1}{1-p} = \dfrac{1-p}{p} + \dfrac{p}{1-p}=\dfrac{p^2+(1-p)^2}{p(1-p)}=\dfrac{2p^2-2p+1}{p(1-p)}

For the extreme cases, by L'Hopital's rule, we have \lim_{p\rightarrow0} \dfrac{2p^2-2p+1}{p(1-p)}=\lim_{p\rightarrow0} \dfrac{4p-2}{1-2p}=-2 and similarly, \lim_{p\rightarrow1} \dfrac{4p-2}{1-2p}=-2

So I realize I must be doing something wrong, because I'm getting negative expectation values in the final part. Any guidance on my working?

Thanks!
 
ephedyn said:
For the extreme cases, by L'Hopital's rule, we have \lim_{p\rightarrow0} \dfrac{2p^2-2p+1}{p(1-p)}=\lim_{p\rightarrow0} \dfrac{4p-2}{1-2p}=-2 and similarly, \lim_{p\rightarrow1} \dfrac{4p-2}{1-2p}=-2

So I realize I must be doing something wrong, because I'm getting negative expectation values in the final part. Any guidance on my working?

L'Hospital's rule does not apply here, because you do not have an indeterminate form like 0/0 or ∞/∞: the numerator tends to 1 while the denominator tends to 0.
 
Ah! You're right! So the expected number grows without bound in both cases. Makes sense intuitively, since it becomes nearly impossible to land the other side on any single flip. Thanks!
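Spelling out the corrected limits: \lim_{p\rightarrow0^{+}} \dfrac{2p^2-2p+1}{p(1-p)}=+\infty and \lim_{p\rightarrow1^{-}} \dfrac{2p^2-2p+1}{p(1-p)}=+\infty, since in each case the numerator tends to 1 while the denominator tends to 0 from above.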

Does the rest of my approach make sense? I'm not too convinced about taking P[H] \times E[N_H] + P[T] \times E[N_T] because a property that allows me to do this seems to be missing from my memory.
 
Fwiw, your answer can be written as p/(1-p) + (1-p)/p.
ephedyn said:
Does the rest of my approach make sense? I'm not too convinced about taking P[H] \times E[N_H] + P[T] \times E[N_T] because a property that allows me to do this seems to be missing from my memory.
Yes, that's fine. You can justify it with the law of total expectation: condition on the outcome of the initial toss, so P[N = n] = P[N = n|H]P[H] + P[N = n|T]P[T] and E[N|H] = \sum_n n\,P[N = n|H], which together give E[N] = E[N|H]P[H] + E[N|T]P[T].
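If you want to check this numerically, here is a quick Monte Carlo sketch (the function name and parameters are just illustrative, not from any library):

```python
import random

def avg_flips(p, match, trials=200_000, seed=0):
    """Average number of flips *after* the initial flip until we see a side
    that matches (match=True) or differs from (match=False) the initial flip."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        first = rng.random() < p                  # True = heads, probability p
        n = 0
        while True:
            n += 1
            same = (rng.random() < p) == first    # does this flip match the first?
            if same == match:
                break
        total += n
    return total / trials

p = 0.3
print(avg_flips(p, match=True))    # should hover around 2.0, independent of p
print(avg_flips(p, match=False))   # should hover around 2.76 for p = 0.3
print(p/(1 - p) + (1 - p)/p)       # 2.7619..., the closed form from this thread
```

The match=True estimate stays near 2 for any p in (0, 1), while the match=False estimate blows up as p approaches 0 or 1, consistent with the limits above.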
 