## Main Question or Discussion Point

Hello. From the formula in the following theorem, I am to deduce the mean delay of a customer arriving at a queue.

Theorem 1

Suppose that customers arrive at a single-server queue according to a Poisson process with mean rate q, and that service times are exponentially distributed with mean Q^-1.

Then, provided that the traffic intensity x = q/Q < 1,

the queue length distribution in equilibrium is

P_n = (1-x)x^n (n >= 0),

where n is the number of customers in the system, including any in service.

Some notes on notation:

Q = the capacity of the queue, i.e. the maximum mean rate of service

q = the mean arrival rate (customers/sec)

d = the mean delay incurred by a customer in the queue, equal to the difference between the time it takes to pass through the queue and the time it would take in the absence of other customers.

This is how I have approached the solution so far.

Suppose that the delay is proportional to the number of customers in the system. Then d = kn for some proportionality constant k.

Now I proceeded by considering a particular case, namely setting q = 0.3 and Q = 1:

P(n=0) = 700/1000 (so 700 times out of 1000 there is no delay)

P(n=1) = 210/1000 (210 times out of 1000 there is a delay of k seconds)

P(n=2) = 63/1000 (63 times out of 1000 there is a delay of 2k seconds)
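These values can be checked numerically from the theorem's formula. A quick sketch in Python (the function name `P` is mine):

```python
# Numeric check of the equilibrium probabilities P_n = (1 - x) * x**n
# for the particular case q = 0.3, Q = 1 (so x = q/Q = 0.3).
q, Q = 0.3, 1.0
x = q / Q

def P(n):
    """Equilibrium probability of n customers in the system."""
    return (1 - x) * x**n

print([round(P(n), 3) for n in range(3)])  # [0.7, 0.21, 0.063]

# Sanity check: the probabilities over all n form a geometric series
# summing to 1 (the tail beyond n = 100 is negligible here).
print(round(sum(P(n) for n in range(100)), 6))  # 1.0
```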

But then I got stuck: how do I find the mean delay for a customer arriving at this queue? I know I am to take an average of the above values, but I am not really sure how to proceed.
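The averaging step I have in mind would, I think, weight each possible delay kn by its probability P_n and sum; for a geometric distribution this sum has the closed form E[n] = x/(1-x), so the mean delay would be kx/(1-x). A numeric sketch (Python; k = 1 is an arbitrary choice for illustration, since k is unknown here):

```python
# Sketch of the averaging step: mean delay = sum over n of (k*n) * P_n,
# where P_n = (1 - x) * x**n, for the case q = 0.3, Q = 1.
q, Q, k = 0.3, 1.0, 1.0   # k = 1 chosen purely for illustration
x = q / Q

# Truncate the infinite sum at n = 1000; the tail is negligible for x = 0.3.
mean_delay = sum(k * n * (1 - x) * x**n for n in range(1000))
print(round(mean_delay, 6))  # 0.428571

# Closed form: the mean of the geometric distribution is x / (1 - x).
print(round(k * x / (1 - x), 6))  # 0.428571
```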
