
I'm programming a process that runs in iterations with a probability of stopping at every iteration. As it gets closer to the desired run-time, this probability should increase so that if the process is run many times over, the runtimes will form a normal distribution, with most of them having the desired/mean duration (number of iterations for which it ran) and very few of them either stopping at start or running for twice as long (at the close-to-zero edges of the bell-curve).

How can I get the probability for stopping at every iteration as the duration increases and the process iterates along the curve? I think this might be a more mathematical interpretation of my problem:


I tried finding the probability that the process does NOT stop at a given iteration by taking the cumulative distribution function for the duration and dividing out the probability of surviving the previous iterations, but the resulting run-times were skewed to the left: most runs had a duration under 20, with 20 being my desired mean.

Stephen Tashi said:
> Let P(S_i) be the conditional probability that the process stops at the end of step i, given that it has reached step i. Let X be the random variable whose value is the number of steps that occur before the process stops. You want a formula for assigning the values of P(S_i), i = 1, 2, ..., so that the distribution of X will be approximately normal. You also seem to have a specific mean and standard deviation that you want to attain.
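One standard way to get the P(S_i) that Stephen Tashi describes is the discrete hazard rate of the target distribution: the probability mass the normal distribution puts on step i, divided by the probability of surviving past step i−1. Below is a minimal, stdlib-only sketch under assumed parameters (mean 20, standard deviation 4, both hypothetical); the function names `stop_probability` and `run_once` are my own, not from the thread.

```python
import math
import random

def normal_cdf(x, mu, sigma):
    """Normal CDF via the error function (standard library only)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def stop_probability(i, mu=20.0, sigma=4.0):
    """Discrete hazard rate of a normal distribution at step i:
    P(S_i) = P(i - 0.5 < X <= i + 0.5) / P(X > i - 0.5),
    i.e. the chance of stopping at step i given the process reached it."""
    survive = 1.0 - normal_cdf(i - 0.5, mu, sigma)
    if survive <= 0.0:
        return 1.0  # far past the mean: numerically certain to stop
    mass = normal_cdf(i + 0.5, mu, sigma) - normal_cdf(i - 0.5, mu, sigma)
    return min(1.0, mass / survive)

def run_once(mu=20.0, sigma=4.0):
    """Iterate, stopping at each step with the hazard-rate probability.
    The resulting durations are approximately Normal(mu, sigma)."""
    i = 1
    while random.random() >= stop_probability(i, mu, sigma):
        i += 1
    return i

if __name__ == "__main__":
    random.seed(0)
    durations = [run_once() for _ in range(10_000)]
    print(sum(durations) / len(durations))  # close to the target mean of 20
```

Note that the hazard rate of a normal distribution is increasing, which matches the requirement that the stopping probability grows as the run-time approaches the target; the left skew in the original attempt typically comes from dividing the CDF itself rather than the per-step probability mass by the survival probability.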
