Expectation Value - Mean Time to Failure

In summary, the expected number of flips for (a) is 6, found by applying the Law of Total Expectation to the tree diagram D = H · D + T · (H · D + T): each D on the right-hand side is replaced by 1 + NTT to account for the wasted flips. For (b), the same method gives an expected number of flips of 4. In (c), the expected profit per game is $0.50, obtained by exploiting the opponent's untrained intuition with unfair odds.
  • #1
QuietMind

Homework Statement


(a) Suppose we flip a fair coin until two Tails in a row come up. What is the expected number, ##N_{TT}##, of flips we perform? Hint: Let D be the tree diagram for this process. Explain why D = H · D + T · (H · D + T). Use the Law of Total Expectation.

(b) Suppose we flip a fair coin until a Tail immediately followed by a Head comes up. What is the expected number, ##N_{TH}##, of flips we perform?

(c) Suppose we now play a game: flip a fair coin until either TT or TH first occurs. You win if TT comes up first, lose if TH comes up first. Since TT takes 50% longer on average to turn up, your opponent agrees that he has the advantage. So you tell him you’re willing to play if you pay him $5 when he wins, but he merely pays you a 20% premium, that is, $6, when you win. If you do this, you’re sneakily taking advantage of your opponent’s untrained intuition, since you’ve gotten him to agree to unfair odds. What is your expected profit per game?

Homework Equations


Law of Total Expectation
##E[R] = \sum_i E[R \mid A_i] \Pr\{A_i\}##

The Attempt at a Solution


I have the solutions and am confused about part (a).
Solution a):
##N_{TT} = 6##.
From D and Total Expectation:
## N_{TT} = \frac{1}{2} * [1+N_{TT} ] + \frac{1}{2} * (1 + \frac{1}{2} * [1+N_{TT} ] + \frac{1}{2} *1 ) ##

It is my understanding that to answer these kinds of questions, the tree is D, while H and T are the probabilities of heads and tails, each 1/2. If the first toss is heads, that doesn't move you toward your goal at all, and basically leads you to a sub-tree identical to D; that is where the H·D in the hint comes from. If you get a Tails on the first toss, then on the subsequent toss you either toss a heads and start all over (H·D), or toss a tails and end the game (T), for a combination of T·(H·D + T). So the hint's equation seems reasonable to me, because combining the expressions for tossing heads first and tossing tails first gives: D = H·D + T·(H·D + T)

My issue is that I am not clear what to do with that equality. It seems like you want to replace each instance of D on the right-hand side of the equation with ##[1 + N_{TT}]##, because anytime D appears on the RHS, you have tossed a heads and are back in the starting situation, needing ##N_{TT}## more flips to end the game from that point. The "1 +" before ##N_{TT}## indicates that you wasted a turn by tossing that heads.

The D on the LHS is replaced with ##N_{TT}##, the expected number of turns.

Is this reasonable so far? My issue with this problem is the "1" inside the parentheses, i.e. the second "1" in the equation reading left to right, just after the opening parenthesis:
## N_{TT} = \frac{1}{2} * [1+N_{TT} ] + \frac{1}{2} * (1 + \frac{1}{2} * [1+N_{TT} ] + \frac{1}{2} *1 ) ##

I do not understand where that 1 came from. It is not being added to an ##N_{TT}## term, so I don't think my prior logic applies. I don't see it in any other similar problems, but it appears to be critical to this one.
 
  • #2
Expanding D=HD+THD+TT, I get:
##N=\frac{1}{2} (N+1) + \frac{1}{4} (N+2) + \frac{1}{4} \cdot 2##, which leads to the correct answer N = 6. Maybe the first 1 is taking into account the "T" throw that leads to the whole bracket? I added it to both parts separately, but that does not change the result.
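As a quick sanity check on N = 6 (my own sketch, not part of the assigned solution), a short Monte Carlo simulation of the flip-until-TT process should give a sample mean close to 6:

```python
import random

def flips_until_tt(rng):
    """Flip a fair coin until two Tails in a row; return the number of flips."""
    flips = 0
    prev_tail = False
    while True:
        flips += 1
        tail = rng.random() < 0.5  # True means Tails
        if tail and prev_tail:
            return flips
        prev_tail = tail

rng = random.Random(0)
trials = 200_000
mean = sum(flips_until_tt(rng) for _ in range(trials)) / trials
print(round(mean, 1))  # should land close to 6
```

With this many trials the standard error is about 0.01, so the sample mean pins down the answer fairly tightly.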

A very nice part (c).
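For reference, the $0.50 claimed for part (c) follows from one observation the thread leaves implicit: once the first Tail appears, the very next flip settles the game (Tails gives TT, Heads gives TH), so despite TT's longer average waiting time, each player wins with probability 1/2:

##E[\text{profit}] = \frac{1}{2}(+6) + \frac{1}{2}(-5) = +0.50## dollars per game.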
 
  • #3
So the reasoning behind your having N+1 in the first term versus N+2 in the second is that, in the case of a tails and then a heads, 2 turns are wasted? I did not take that into account; I treated it the same as the heads-by-itself case. That could be the source of my error.

Also, why is there a 2 at the end multiplying the TT?
 
  • #4
QuietMind said:
The reasoning behind your having N+1 in the first parenthesis vs N+2 in the second, is that because in the case of a tails and then a heads, that is 2 turns wasted?
Right.
QuietMind said:
Also, why is there a 2 at the end multiplying the TT?
TT is two flips as well.
 
  • #5
QuietMind said:
I do not understand where that 1 came from. It is not being added to an NTT term, so I don't think my prior logic applies. I don't see it in any other similar problems, but it appears to be critical to this problem.

You can view this as a Markov chain problem (even if you have not yet taken Markov chains). Here is how. For part (a) there are three "states" S (start, or start over), T (one tail on the way along to the 2-tail goal) and TT (game over, stop).

Let s = the expected number of tosses to reach state TT starting from S, and t = the expected number of tosses to reach TT starting from T. When at state S we toss once and reach either state S again (if we get heads) or state T (if we get tails). Thus, s = 1 + (1/2)s + (1/2)t. When at state T we toss once and either reach state S again (and need s more tosses) or reach TT (and need 0 more tosses). Thus, t = 1 + (1/2)s + (1/2)·0 = 1 + (1/2)s. So we have two equations, s = 1 + (1/2)s + (1/2)t and t = 1 + (1/2)s, and solving them gives s = 6, t = 4.

You can write similar equations for part (b), except that when at state T we next either reach state TH (stop) or reach state T again. Thus, for part (b) the equations are s = 1 + (1/2)s + (1/2)t, t = 1 + (1/2)t.
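The two small systems above can be solved exactly; here is a sketch of that algebra using exact fractions (my own illustration, not from the textbook):

```python
from fractions import Fraction

half = Fraction(1, 2)

# Part (a): s = 1 + s/2 + t/2 and t = 1 + s/2.
# Substituting t into the first equation: s = 1 + s/2 + (1 + s/2)/2,
# i.e. (1/4) s = 3/2, so s = 6.
s_a = Fraction(3, 2) / Fraction(1, 4)
t_a = 1 + half * s_a
print(s_a, t_a)  # 6 4

# Part (b): t = 1 + t/2 gives t = 2; then s = 1 + s/2 + 1, so (1/2) s = 2.
s_b = Fraction(2) / half
print(s_b)  # 4
```

This confirms the claim in the problem statement: TT takes 6 flips on average, 50% longer than the 4 flips for TH.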
 
  • #6
Could you elaborate a bit more on this?
mfb said:
TT is two flips as well.

When would you decide to multiply by the number of flips? N+2 I understand because you are adding 2 to account for the flips that already occurred, but I don't see where multiplication comes into this.
 
  • #7
QuietMind said:
When would you decide to multiply by the number of flips?
Everywhere. N+1 and N+2 are the expected number of flips as well. TT has an expected 2 flips (it always has exactly 2).
 
  • #8
This is a very similar problem that I'd like to check my work on to see if I understand the Markov Chain approach. If it's inappropriate to follow up with a question like this, I apologize, but I think the question is very similar and starting a new topic would be redundant.

Homework Statement

The most common way to build a local area network nowadays is an Ethernet: basically, a single wire to which we can attach any number of computers. The communication protocol is quite simple: anyone who wants to talk to another computer broadcasts a message on the wire, hoping the other computer will hear it. The problem is that if more than one computer broadcasts at once, a collision occurs that garbles all messages we are trying to send. The transmission only works if exactly one machine broadcasts at one time.

Let’s consider a simple example. There are n machines connected by an ethernet, and each wants to broadcast a message. We can imagine time divided into a sequence of intervals, each of which is long enough for one message broadcast.

Suppose each computer flips an independent coin, and decides to broadcast with probability p.

(a) What is the probability that exactly one message gets through in a given interval? Hint: Consider the event Ai that machine i transmits but no other does.

(b) What is the expected time it takes for machine i to get a message through?

The Attempt at a Solution



(a) For a particular machine i, the probability that it decides to send is p. The probability that no other machine decides to send is
##(1-p)^{n-1}##
so for a particular machine i, the probability its message is clearly sent is
##p(1-p)^{n-1}##
There are n machines total, each with the same probability of clearly sending a message, so the total probability is
##np(1-p)^{n-1}##

(b) Let each unit of time be called a "turn", where each computer flips a coin and decides whether to send. It's asking about a particular machine i, which has probability ##p(1-p)^{n-1}## of sending a message through clearly.

Let E denote the expected number of turns. Then
##E = 1 + p(1-p)^{n-1} \cdot 0 + \left(1 - p(1-p)^{n-1}\right) E##
The first term, 1, accounts for the current coin flip. The middle term is the case where the message gets through clearly, in which case no further turns are needed, so I multiplied by 0. The last term is the case where the message is not sent or is garbled, meaning another E expected turns are required.
Solving for E gives me
##E = \frac{1}{p(1-p)^{n-1}}##

Does that seem right?
 
  • #9
QuietMind said:
Expected # turns = ##\frac{1}{p(1-p)^{n-1}}##

Does that seem right?

Yes. You are getting the mean of a geometric random variable with success parameter ##P = p(1-p)^{n-1}##.
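As an illustration of that geometric mean (my own sketch, with arbitrary example values n = 5 and p = 0.2), simulating the protocol reproduces 1/P:

```python
import random

def turns_until_success(n, p, rng):
    """Count turns until machine 0 broadcasts while no other machine does.
    Each turn, every machine independently broadcasts with probability p."""
    turns = 0
    while True:
        turns += 1
        sends = [rng.random() < p for _ in range(n)]
        if sends[0] and not any(sends[1:]):
            return turns

n, p = 5, 0.2  # arbitrary example values
rng = random.Random(1)
trials = 50_000
mean = sum(turns_until_success(n, p, rng) for _ in range(trials)) / trials
expected = 1 / (p * (1 - p) ** (n - 1))
print(round(expected, 2))  # 1/(0.2 * 0.8^4), about 12.21
# the simulated mean should come out close to this value
```
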

For future reference: it is considered against PF standards (rules?) to post a new question under an old thread. It would be much better to start a new thread.
 

1. What is the concept of expectation value in relation to mean time to failure?

The expectation value, also known as the expected value or mean, is a statistical measure that represents the average value of a random variable. In the context of mean time to failure, the expectation value represents the average amount of time until a system or component fails.

2. How is the expectation value calculated for mean time to failure?

The expectation value for mean time to failure is calculated by multiplying the probability of failure by the time to failure for each possible outcome, and then summing these values. Mathematically, it can be represented as E(T) = ∑(t * P(t)), where E(T) is the expectation value, t is the time to failure, and P(t) is the probability of failure at time t.
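As a toy illustration of that sum (hypothetical numbers, not from any real system), suppose a component fails at t = 1, 2, or 3 time units with probabilities 0.2, 0.5, and 0.3:

```python
# Hypothetical discrete failure-time distribution (made-up numbers):
failure_dist = {1: 0.2, 2: 0.5, 3: 0.3}

# E(T) = sum over t of t * P(t)
mttf = sum(t * prob for t, prob in failure_dist.items())
print(round(mttf, 2))  # 0.2 + 1.0 + 0.9 = 2.1
```
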

3. What is the significance of the expectation value in determining system reliability?

The expectation value is an important measure in determining system reliability because it represents the average time until a failure occurs. A higher expectation value indicates a more reliable system, since on average it runs longer before failing.

4. Can the expectation value for mean time to failure be used to predict the exact time of failure?

No, the expectation value cannot predict the exact time of failure. It represents the average time until failure, but the actual failure time may vary due to random factors or unexpected events. It is a statistical measure that provides an estimate of the system's reliability.

5. How can the expectation value for mean time to failure be improved for a system?

The expectation value for mean time to failure can be improved by increasing the reliability of individual components within the system, reducing the probability of failure for each component. Additionally, implementing regular maintenance and monitoring can help detect and address potential issues before they lead to failure, thus increasing the overall reliability of the system.
