Is Discarding the Higher Random Number in a Stochastic Queue Simulation Correct?


Discussion Overview

The discussion revolves around the correctness of an algorithm for simulating a stochastic queue using a birth-death process, specifically focusing on the treatment of random numbers generated from exponential distributions representing time until the next birth and death. Participants explore the implications of discarding the higher random number and how it may affect the accuracy of the simulation.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant describes their algorithm for simulating a birth-death process, emphasizing the use of two random numbers from exponential distributions to determine the next event.
  • Another participant suggests that discarding the longer interval time could distort the distributions, advocating for generating a new interval time only for the event that just occurred.
  • A later reply raises concerns about the implications of using the higher number for the next state, noting that the rate parameters differ across states and could lead to inaccuracies.
  • Further discussion highlights the need for births and deaths to be generated independently, with the suggestion that each birth should immediately assign a time of death to maintain the correct distribution.
  • One participant proposes simulating the process as two Poisson processes with a fixed time interval as a potentially better approach.

Areas of Agreement / Disagreement

Participants express differing views on the validity of discarding the higher random number, with some arguing against it while others acknowledge the complexity introduced by varying rate parameters. The discussion remains unresolved regarding the best approach to accurately simulate the queue process.

Contextual Notes

Participants note that the dependence on different rate parameters for births and deaths complicates the simulation, and there are concerns about maintaining the desired exponential distribution of events. The limitations of the current algorithm in accurately reflecting the dynamics of the queue are acknowledged.

maka89
Hello guys!
I am learning a bit of queueing theory in one of my courses and decided to try some simulations, even though they are not mandatory. (The curriculum only covers the steady state, where the process can be treated mathematically with ease.)

I'm looking at a birth-death process in which the times until the next birth and the next death are both exponentially distributed.

My algorithm:
---------------------------------------------------------------------------------------------------------------------------------------------
Initialization: Set the state to n=0 (the number of people in the queue or in service), set the current time t=0, and choose an end-of-simulation time t_{max}.

Loop until the time t has reached t_{max}:
- Get two random numbers from exponential distributions: one representing the time until the next death, the other the time until the next birth. The two numbers have different rate parameters.
- Choose the lower of the two numbers as \Delta t.
- Set the state to n+1 or n-1, depending on which of the two numbers was chosen.
- Update the time t \rightarrow t +\Delta t.
- Save the state and time period.
End loop.
----------------------------------------------------------------------------------------------------------------------------------------------

This process is repeated many times, and used to create probability densities for each of the states.
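For concreteness, the loop above can be sketched in Python. The state-dependent rate functions `birth_rate(n)` and `death_rate(n)` are placeholders, and the guard for an empty queue (where no death can occur) is an assumption not spelled out in the algorithm:

```python
import random

def simulate(birth_rate, death_rate, t_max, seed=None):
    """Simulate the birth-death queue as described above: draw two
    exponential times, keep the smaller as dt, discard the other."""
    rng = random.Random(seed)
    n, t = 0, 0.0
    history = [(0.0, 0)]          # list of (time, state) pairs
    while t < t_max:
        t_birth = rng.expovariate(birth_rate(n))
        # With an empty queue there is no one to die, so only a birth can occur.
        t_death = rng.expovariate(death_rate(n)) if n > 0 else float("inf")
        if t_birth < t_death:
            dt, n = t_birth, n + 1
        else:
            dt, n = t_death, n - 1
        t += dt
        history.append((t, n))
    return history
```

Repeating this many runs and binning the time spent in each state would give the empirical state probabilities mentioned above.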

My question:
My algorithm simply discards the higher of the two randomly generated numbers. The state is then updated and two new random numbers are generated. Is this correct, or should the value of the second number be taken into account somehow after changing to the next state? The rate parameters of the probability distributions are not generally the same at each state.
 
It seems like that might distort the distributions so that larger interval times are more likely to be rejected. There is no reason to throw away the longer interval time. Generate only a new interval time for the one that just happened (birth or death) and compare it with the other remaining time to see which happens first. Then there would be no question about the distributions being distorted.

PS. With exponential interval times, your current algorithm might be correct, but that is a special case. Unless you can verify it is correct, avoid throwing away the longer interval time.
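A sketch of this suggestion, assuming for simplicity that the birth and death rates are constants (the state-dependence problem is raised in the next reply): each event keeps its scheduled absolute time, and only the event that just fired is redrawn.

```python
import random

def simulate_keep_residual(birth_rate, death_rate, t_max, seed=None):
    """Keep two scheduled absolute event times; after an event fires,
    regenerate only that event's next time and keep the other."""
    rng = random.Random(seed)
    n, t = 0, 0.0
    next_birth = rng.expovariate(birth_rate)
    next_death = float("inf")     # empty queue: no death scheduled
    history = [(0.0, 0)]
    while t < t_max:
        if next_birth < next_death:
            t, n = next_birth, n + 1
            next_birth = t + rng.expovariate(birth_rate)
            if next_death == float("inf"):
                # First customer in an empty queue: schedule a death now.
                next_death = t + rng.expovariate(death_rate)
        else:
            t, n = next_death, n - 1
            next_death = (t + rng.expovariate(death_rate)) if n > 0 else float("inf")
        history.append((t, n))
    return history
```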
 
Okay, thanks.

I do see a problem with this, though: the rate parameters of the exponential distributions differ between states, and that isn't taken into account if you reuse the time interval generated in the previous state. Say you get two random numbers: 1 min (birth) and 40 min (death). In the next state it doesn't seem reasonable to keep the higher number for the death, since the death rate in that state might be much higher (say 1 per second).

With my method there is the following problem, though: if a person is, say, 0.5 seconds away from birth when another has just "died", that person simply disappears. Maybe this is OK with exponential distributions? I don't know. Or maybe the problem is a bit harder than I imagined xD
 
maka89 said:
maybe the problem is a bit harder than i imagined
I agree. The next birth time should only be generated when a birth occurs, so the series of births is generated independently of the deaths and the births will have an exponential distribution. If you generate an exceptionally long time until the next birth event, there may be several deaths before that birth occurs. If you are simulating a situation with a limited pool of objects, you will need to be sure that there is an object available to die. The best way to simulate this is an object-oriented approach where each object generated by a birth is immediately assigned a time of death, using the appropriate parameters for each type of event that is generated. The new times of birth and death are then scheduled on the timeline. (Actually, there is no reason to schedule an object's death until its time of birth is reached.) But that "life expectancy" is not what you are given for modeling the time of death, so the distribution of deaths will not have the desired exponential distribution.

I think that the closest you can reasonably come to retaining exponential death distributions is to generate the next birth every time a birth occurs, and generate a new death every time a death occurs -- each using the correct parameters. Then only record a death if there is a live object. But if it hits that constraint, the final distribution of deaths may not have the same exponential distribution that you started with.
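A rough sketch of the object-oriented scheduling described above, using an event heap: each birth immediately draws that object's own exponential lifetime. As noted, this "lifetime" model need not reproduce the death distribution the queue is supposed to have; the rates `birth_rate` and `death_rate` are assumed constants here.

```python
import heapq
import random

def simulate_object_lifetimes(birth_rate, death_rate, t_max, seed=None):
    """Each birth schedules the next birth and that object's own death;
    births thus form an independent exponential stream."""
    rng = random.Random(seed)
    events = [(rng.expovariate(birth_rate), +1)]  # min-heap of (time, +1 birth / -1 death)
    n, t = 0, 0.0
    history = [(0.0, 0)]
    while events:
        t, delta = heapq.heappop(events)
        if t > t_max:
            break
        n += delta
        if delta == +1:
            # Schedule this object's death and the next birth.
            heapq.heappush(events, (t + rng.expovariate(death_rate), -1))
            heapq.heappush(events, (t + rng.expovariate(birth_rate), +1))
        history.append((t, n))
    return history
```

Because every death event is scheduled strictly after the birth it belongs to, the state count never goes negative, which addresses the "object available to die" concern for an unlimited pool.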
 
A better way to simulate this would be as two Poisson processes with a single fixed time interval.
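One reading of this suggestion (the post does not spell it out) is a time-discretized simulation: chop time into small fixed steps `dt` and let a birth or a death occur in each step with probability approximately rate times `dt`. A sketch under that assumption, with constant rates `lam` and `mu`:

```python
import random

def simulate_fixed_step(lam, mu, t_max, dt=1e-3, seed=None):
    """Time-discretized approximation: in each small step dt, a birth
    occurs with probability ~lam*dt and a death with probability ~mu*dt.
    Requires lam*dt and mu*dt to be well below 1."""
    rng = random.Random(seed)
    n, t = 0, 0.0
    history = [(0.0, 0)]
    for _ in range(round(t_max / dt)):
        t += dt
        if rng.random() < lam * dt:
            n += 1
        if n > 0 and rng.random() < mu * dt:
            n -= 1
        history.append((t, n))
    return history
```

The discretization introduces an O(dt) bias (at most one birth and one death per step), so `dt` must be small compared to 1/lam and 1/mu for the results to approximate the continuous-time process.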
 
