# Expectation Value- Mean Time to Failure

## Homework Statement

(a) Suppose we flip a fair coin until two Tails in a row come up. What is the expected number, NTT, of flips we perform? Hint: Let D be the tree diagram for this process. Explain why D = H · D + T · (H · D + T). Use the Law of Total Expectation.

(b) Suppose we flip a fair coin until a Tail immediately followed by a Head come up. What is the expected number, NTH, of flips we perform?

(c) Suppose we now play a game: flip a fair coin until either TT or TH first occurs. You win if TT comes up first, lose if TH comes up first. Since TT takes 50% longer on average to turn up, your opponent agrees that he has the advantage. So you tell him you're willing to play if you pay him $5 when he wins, but he merely pays you a 20% premium, that is, $6, when you win. If you do this, you're sneakily taking advantage of your opponent's untrained intuition, since you've gotten him to agree to unfair odds. What is your expected profit per game?

## Homework Equations

Law of Total Expectation
##E[R] = \sum_i E[R | A_i] Pr{A_i }##

## The Attempt at a Solution

I have the solutions, and am confused over part a.
Solution a):
NTT = 6.
From D and Total Expectation:
## N_{TT} = \frac{1}{2} * [1+N_{TT} ] + \frac{1}{2} * (1 + \frac{1}{2} * [1+N_{TT} ] + \frac{1}{2} *1 ) ##

So, it's my understanding that to answer these kinds of questions, the tree is D, while H/T are the probabilities of heads/tails, which are each 1/2. If the first toss is heads, that doesn't further you towards your goal at all, and basically leads you to a sub-tree that is identical to D. That is where the H*D comes from in the hint. If you get a Tails on the first toss, then on the subsequent toss you have the possibility of tossing a heads and starting all over (H*D), or a chance of getting a tails, which ends the game (T), for a combination of T*(H*D + T). So the hint's equation seems reasonable to me, because you can combine the expressions for tossing heads first and tossing tails first to get: D = H*D + T*(H*D + T)

My issue is that I am not clear what to do with that equality. It seems like you want to replace the instances of D on the right-hand side of the equation with [1 + NTT], because anytime D appears on the RHS, that means you have tossed a heads and are back to the beginning situation, so you will need NTT more turns to end the game from that point. The "1+" before the NTT indicates that you wasted a turn because you tossed a heads.

The LHS "D" is replaced with NTT to indicate the number of turns.

Is this reasonable so far? My issue with this problem is the 1 added inside the parentheses: the second "1" to appear in this equation, reading left to right, immediately after the opening parenthesis "(":
## N_{TT} = \frac{1}{2} * [1+N_{TT} ] + \frac{1}{2} * (1 + \frac{1}{2} * [1+N_{TT} ] + \frac{1}{2} *1 ) ##

I do not understand where that 1 came from. It is not being added to an NTT term, so I don't think my prior logic applies. I don't see it in any other similar problems, but it appears to be critical to this problem.
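For what it's worth, the target value NTT = 6 can be sanity-checked with a quick Monte Carlo simulation. This is my own sketch, not part of the given solution:

```python
import random

def flips_until_tt(rng=random):
    """Flip a fair coin until two tails in a row appear; return the flip count."""
    flips = 0
    prev_tail = False
    while True:
        flips += 1
        tail = rng.random() < 0.5  # True = tails, probability 1/2
        if tail and prev_tail:
            return flips
        prev_tail = tail

random.seed(0)
trials = 200_000
avg = sum(flips_until_tt() for _ in range(trials)) / trials
print(avg)  # should be close to 6
```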

mfb
Mentor
Expanding D=HD+THD+TT, I get:
##N=\frac{1}{2} (N+1) + \frac{1}{4} (N+2) + \frac{1}{4} 2## which leads to the correct answer N=6. Maybe the first 1 is taking into account the "T" throw that leads to the whole bracket? I added it to both parts separately, but that does not change the result.
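As a quick arithmetic check (my own addition), plugging N = 6 into that expansion does balance:

```python
# Verify that N = 6 satisfies N = (1/2)(N+1) + (1/4)(N+2) + (1/4)*2.
N = 6
rhs = 0.5 * (N + 1) + 0.25 * (N + 2) + 0.25 * 2
print(rhs)  # 6.0
```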

A very nice part (c).

Is the reasoning behind your having N+1 in the first parenthesis vs. N+2 in the second that, in the case of a tails and then a heads, 2 turns are wasted? I did not take that into account; I treated it the same as the heads-by-itself case. That could contribute to my error.

Also, why is there a 2 at the end multiplying the TT?

mfb
Mentor
Is the reasoning behind your having N+1 in the first parenthesis vs. N+2 in the second that, in the case of a tails and then a heads, 2 turns are wasted?
Right.
Also, why is there a 2 at the end multiplying the TT?
TT is two flips as well.

Ray Vickson
Homework Helper
Dearly Missed

You can view this as a Markov chain problem (even if you have not yet taken Markov chains). Here is how. For part (a) there are three "states": S (start, or start over), T (one tail on the way to the two-tail goal), and TT (game over, stop).

Let s = the expected number of tosses to reach state TT starting from S, and t = the expected number of tosses to reach TT starting from T. When at state S we toss once and reach either state S again (if we get heads) or state T (if we get tails). Thus, s = 1 + (1/2)s + (1/2)t. When at state T we toss once and either reach state S again (and need s more tosses) or reach TT (and need 0 more tosses). Thus, t = 1 + (1/2)s + (1/2)·0 = 1 + (1/2)s. So we have the two equations s = 1 + (1/2)s + (1/2)t and t = 1 + (1/2)s, and solving them gives s = 6, t = 4.

You can write similar equations for part (b), except that when at state T we next either reach state TH (stop) or reach state T again. Thus, for part (b) the equations are s = 1 + (1/2)s + (1/2)t, t = 1 + (1/2)t.
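The two small linear systems above can be solved mechanically by substitution; here is a sketch (my own addition) using exact fractions, which reproduces s = 6, t = 4 for part (a) and gives s = 4, t = 2 for part (b):

```python
from fractions import Fraction

half = Fraction(1, 2)

# Part (a): s = 1 + s/2 + t/2 and t = 1 + s/2.
# Substituting t into the first equation: s = 1 + s/2 + (1 + s/2)/2,
# so s * (1 - 3/4) = 3/2, giving s = 6 and then t = 4.
s_a = Fraction(3, 2) / (1 - Fraction(3, 4))
t_a = 1 + half * s_a
print("part (a):", s_a, t_a)  # 6 and 4

# Part (b): s = 1 + s/2 + t/2 and t = 1 + t/2.
# The second equation alone gives t = 2; substituting into the first gives s = 4.
t_b = Fraction(1) / (1 - half)
s_b = (1 + half * t_b) / (1 - half)
print("part (b):", s_b, t_b)  # 4 and 2
```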

QuietMind
Could you elaborate a bit more on this?
TT is two flips as well.

When would you decide to multiply by the number of flips? N+2 I understand because you are adding 2 to account for the flips that already occurred, but I don't see where multiplication comes into this.

mfb
Mentor
When would you decide to multiply by the number of flips?
Everywhere. N+1 and N+2 are the expected number of flips as well. TT has an expected 2 flips (it always has exactly 2).

QuietMind
This is a very similar problem that I'd like to check my work on to see if I understand the Markov Chain approach. If it's inappropriate to follow up with a question like this, I apologize, but I think the question is very similar and starting a new topic would be redundant.

## Homework Statement

The most common way to build a local area network nowadays is an Ethernet: basically, a single wire to which we can attach any number of computers. The communication protocol is quite simple: anyone who wants to talk to another computer broadcasts a message on the wire, hoping the other computer will hear it. The problem is that if more than one computer broadcasts at once, a collision occurs that garbles all messages we are trying to send. The transmission only works if exactly one machine broadcasts at one time.

Let’s consider a simple example. There are n machines connected by an ethernet, and each wants to broadcast a message. We can imagine time divided into a sequence of intervals, each of which is long enough for one message broadcast.

Suppose each computer flips an independent coin, and decides to broadcast with probability p.

(a) What is the probability that exactly one message gets through in a given interval? Hint: Consider the event Ai that machine i transmits but no other does.

(b) What is the expected time it takes for machine i to get a message through?

## The Attempt at a Solution

(a) For a particular machine i, the probability that it decides to send is p. The probability that no other machine decides to send is
##(1-p)^{n-1}##
so for a particular machine i, the probability its message is clearly sent is
##p(1-p)^{n-1}##
There are n machines total, each with the same probability of clearly sending a message, and these events are disjoint, so the total probability is
##np(1-p)^{n-1}##

(b) Let each unit of time be called a "turn", where each computer flips a coin and decides whether to send. It's asking about a particular machine i, which has probability ##p(1-p)^{n-1}## of sending a message through clearly.

Let E = the expected number of turns. Then
##E = 1 + p(1-p)^{n-1} \cdot 0 + \left(1 - p(1-p)^{n-1}\right) E##
The first term, 1, accounts for the current coin flip. The middle term covers the case where the message is sent clearly, in which case no further turns are needed, so I multiplied by 0. The last term covers the case where the message is not sent or is garbled, which means another E expected turns will be required.
Solving this gives me
##E = \frac{1}{p(1-p)^{n-1}}##

Does that seem right?
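As a sanity check on part (b), a short simulation (my own addition, with arbitrarily chosen values of n and p) agrees with the formula 1/(p(1-p)^(n-1)):

```python
import random

def turns_until_through(n, p, rng=random):
    """Simulate slotted broadcasts until machine 0 alone transmits; return turn count."""
    turns = 0
    while True:
        turns += 1
        sends = [rng.random() < p for _ in range(n)]
        if sends[0] and not any(sends[1:]):
            return turns

random.seed(1)
n, p = 5, 0.2
trials = 50_000
avg = sum(turns_until_through(n, p) for _ in range(trials)) / trials
expected = 1 / (p * (1 - p) ** (n - 1))
print(avg, expected)  # the simulated average should be close to the formula
```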

Ray Vickson
Homework Helper
Dearly Missed

Yes. You are getting the mean of a geometric random variable with success parameter ##P = p(1-p)^{n-1}##.

For future reference: it is considered against PF standards (rules?) to post a new question under an old thread. It would be much better to start a new thread.