Poisson inflow, constant outflow question

  • Context: Graduate
  • Thread starter: pmsrw3
  • Tags: Constant, Poisson

Discussion Overview

The discussion revolves around a problem involving a tube where raindrops fall into it at a Poisson rate and flow out at a constant rate. Participants explore the mean amount of water in the tube at steady-state, relating it to concepts in queueing theory and stochastic processes.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant describes a Poisson process for raindrop inflow and proposes the formula <v> = i/(2(1 - i)) for the mean volume in the tube, but seeks a proof or rationale for this result.
  • Another participant draws parallels to queueing theory, questioning the scheduling of output drops and suggesting that the problem resembles a single-server model with Poisson arrivals.
  • A different participant presents a mathematical formulation involving steady-state equations and probabilities related to the number of drops in the tube.
  • There is a discussion about the implications of continuous versus discrete time models, with one participant suggesting a Markov chain approach to analyze the problem.
  • Some participants express confusion over specific mathematical steps and seek clarification on deriving certain relationships, such as the steady-state probabilities.
  • One participant acknowledges a typo in their earlier formula but maintains that the end result remains valid.
  • Another participant mentions their findings from queueing theory literature, indicating a method to derive the mean volume and confirming the initial numerical results.

Areas of Agreement / Disagreement

Participants initially disagree over the form of the steady-state equations and how standard queueing-theory results apply. A typo in one recurrence is acknowledged and corrected, and by the end of the thread the original numerical result is confirmed analytically.

Contextual Notes

Some participants highlight limitations in their understanding of the mathematical derivations and the need for corrections in their models. The discussion reflects a mix of assumptions and conditions that are not fully resolved.

Who May Find This Useful

Readers interested in queueing theory, stochastic processes, and mathematical modeling in physics or engineering contexts may find this discussion relevant.

pmsrw3
The following problem came up in my work.

You have a tube, open at the top. Raindrops fall into the mouth of the tube at a mean rate i per second, 0 <= i < 1, in a Poisson process. There's a hole in the bottom of the tube. When there's water in the tube, it flows out at a constant rate of 1 drop/sec. (Ignore that this is not physically plausible. You can imagine that there's a peristaltic pump hooked up if it makes you feel better.) What is the mean amount of water in the tube at steady-state?

I came up with a complicated solution to this (which I will describe, if anyone wishes) in the form of a sum. But when I evaluated my solution numerically, I found that, within round-off error, computed values equaled the simple result <v> = i/(2(1 - i)). This is also plausible and has correct limiting behavior at 0 and 1. Since then I've been trying hard without success to come up with a proof or even fairly good rationale for that formula.

Thanks in advance for any ideas.
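(For readers who want to reproduce the numerical check: here is a minimal event-driven Monte Carlo sketch of the tube. This code is not from the thread; the function name and parameters are mine. Unit drops arrive in a Poisson process with rate i, and water drains at rate 1 whenever the tube is nonempty.)

```python
import random

def mean_volume(i, n_arrivals=200_000, seed=1):
    """Time-averaged volume of the tube: unit drops arrive in a Poisson
    process with rate i; the tube drains at rate 1 while nonempty."""
    rng = random.Random(seed)
    v = 0.0      # current volume, in drops
    t = 0.0      # elapsed time
    area = 0.0   # integral of v(t) dt
    for _ in range(n_arrivals):
        tau = rng.expovariate(i)   # time until the next drop
        if v >= tau:               # tube stays nonempty for the whole interval
            area += v * tau - tau * tau / 2.0
            v -= tau
        else:                      # tube runs dry after v seconds
            area += v * v / 2.0
            v = 0.0
        t += tau
        v += 1.0                   # the drop that just arrived
    return area / t
```

For i = 0.5 this comes out close to i/(2(1 - i)) = 0.5, consistent with the conjectured formula.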
 
The problem resembles the queueing theory model where there is a single server, Poisson arrivals and a constant service time. However, the result for "average number of customers in the queue" (for example p8 of this PDF: http://www.google.com/url?sa=t&sour...LHloQ4&usg=AFQjCNFWSUKGueI5N_p_E4GxhKRThY14HQ ) doesn't look like your answer.

Are the output drops scheduled at even increments? For example, if the output drop rate is 1 per second, suppose there is no water in the tube at time t = 0 and a drop arrives at time t = 3/4. Does the output drop happen at t = 1 or t = (1 + 3/4)?
 
Writing it as [itex]N_{t+1}=\max(0,N_t+K_t-1)[/itex] where [itex]K_t[/itex] is Poisson, we have the steady state equations for [itex]q_n=P[N=n][/itex]
[tex]q_{n+1}=\frac{1}{p_0}-\sum_{k=0}^n p_{n+1-k}q_k - \delta_n p_0q_0[/tex]
where the [itex]p_j[/itex] are the Poisson probabilities. Taking the expectation of N leads to
[tex]q_0=(1-\lambda)e^{\lambda}[/tex]
and taking the expectation of N^2 leads to
[tex]E[N]=\frac{\lambda^2}{2(1-\lambda)}[/tex]
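(Both formulas are easy to verify numerically. The sketch below is mine, not from the thread: it iterates the distribution of the chain N_{t+1} = max(0, N_t + K_t - 1) on a truncated state space, which is harmless here because the stationary tail decays rapidly for lambda < 1.)

```python
import math

def stationary_dist(lam, cap=60, iters=2000):
    """Iterate the law of N' = max(0, N + K - 1), K ~ Poisson(lam),
    on states 0..cap until it is numerically stationary."""
    p = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(31)]
    q = [0.0] * (cap + 1)
    q[0] = 1.0                      # start empty
    for _ in range(iters):
        nq = [0.0] * (cap + 1)
        for n, qn in enumerate(q):
            if qn:
                for k, pk in enumerate(p):
                    j = max(0, n + k - 1)
                    if j <= cap:
                        nq[j] += qn * pk
        q = nq
    return q

lam = 0.5
q = stationary_dist(lam)
mean = sum(n * qn for n, qn in enumerate(q))
# compare with q_0 = (1 - lam) e^lam and E[N] = lam^2 / (2 (1 - lam))
```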
 
Stephen Tashi said:
The problem resembles the queueing theory model where there is a single server, Poisson arrivals and a constant service time. However, the result for "average number of customers in the queue" (for example p8 of this PDF: http://www.google.com/url?sa=t&sour...LHloQ4&usg=AFQjCNFWSUKGueI5N_p_E4GxhKRThY14HQ ) doesn't look like your answer.

Are the output drops scheduled at even increments? For example, if the output drop rate is 1 per second, suppose there is no water in the tube at time t = 0 and a drop arrives at time t = 3/4. Does the output drop happen at t = 1 or t = (1 + 3/4)?
No, it's in continuous time -- the drops arrive when they arrive, and the output flow is continuous. However, you can derive a discrete time Markov chain by having a clock that ticks when the first drop falls in (assume we start empty), at 1s intervals thereafter until the tube drains, then waits until the next drop for the next tick. So, for instance if drops arrived at 1.1, 1.3, 1.5, 4.6, 5.9, 6.4, ... ticks would occur at 1.1, 2.1, 3.1, 4.1, 4.6, 5.6, 5.9, 6.9, 7.9, ... At each tick the number of drops will be an integer (1, 2, 1, 0, 1, 0, 1, 1, 0, ...). You can solve for the steady-state occupancies of this chain. The mean will not be the mean of the continuous time system, because the 0's will be underweighted. (In the discrete time case, the 0 state always lasts 1 tick, whereas in the original problem it lasts on average 1/i sec.) But it is straightforward to correct for that. That's how I solved the system. (I also did a little Monte Carlo to check my results.)

I'll have to look into queueing theory -- it's probably what I need, or very close. Unfortunately, that PDF you link to just gives formulas without derivations.

bpet said:
Taking the expectation of N leads to
[tex]q_0=(1-\lambda)e^{\lambda}[/tex]
Could you explain this step, please? I mean, I know [itex]E[N] = \sum{nq_n}[/itex], but how do you get from there to [itex]q_0=(1-\lambda)e^{\lambda}[/itex]?
 
pmsrw3 said:
Could you explain this step, please? I mean, I know [itex]E[N] = \sum{nq_n}[/itex], but how do you get from there to [itex]q_0=(1-\lambda)e^{\lambda}[/itex]?

For that you just apply [itex]E[N]=\sum_{n=0}^{\infty}(n+1)q_{n+1}[/itex] to the recurrence relation and after some algebra the E[N] terms cancel.

In continuous time where a new drop is added with probability [itex]\lambda dt[/itex], split the state N=0 into {N=0, no drops falling} with prob q and {N=0, one drop falling} with prob q_0, and the steady state relations give you [itex]q_0=\lambda q, q_1=\lambda q_0, q_{n+2}=(1+\lambda)q_{n+1}-\lambda q_n[/itex]. Applying E[N] to the recurrence relation gives [itex]q=1-\lambda[/itex] and E[N^2] gives
[tex]E[N]=\frac{\lambda^2}{1-\lambda}[/tex].
The notes in Stephen Tashi's link imply the formula [itex]E[M]=(1-q)+E[N] = \lambda+\tfrac{\lambda^2}{1-\lambda}[/itex] which includes the drop that is falling.
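(A quick way to sanity-check this, not given in the thread: the recurrence with those boundary conditions has the geometric solution q_n = (1 - lambda) lambda^{n+1}, which together with q = 1 - lambda sums to 1 and reproduces E[N] = lambda^2/(1 - lambda). The closed form is my own observation; the script below verifies it numerically.)

```python
lam = 0.3
q = 1 - lam                                            # P[N = 0, no drop falling]
qn = [(1 - lam) * lam ** (n + 1) for n in range(200)]  # conjectured q_n

assert abs(qn[0] - lam * q) < 1e-12        # boundary condition q_0 = lam * q
assert abs(qn[1] - lam * qn[0]) < 1e-12    # boundary condition q_1 = lam * q_0
for n in range(150):                       # recurrence q_{n+2} = (1+lam) q_{n+1} - lam q_n
    assert abs(qn[n + 2] - ((1 + lam) * qn[n + 1] - lam * qn[n])) < 1e-12

total = q + sum(qn)                        # should be 1 (tail beyond n = 200 is negligible)
mean_N = sum(n * p for n, p in enumerate(qn))   # should be lam^2 / (1 - lam)
```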
 
bpet said:
Writing it as [itex]N_{t+1}=\max(0,N_t+K_t-1)[/itex] where [itex]K_t[/itex] is Poisson, we have the steady state equations for [itex]q_n=P[N=n][/itex]
[tex]q_{n+1}=\frac{1}{p_0}-\sum_{k=0}^n p_{n+1-k}q_k - \delta_n p_0q_0[/tex]
where the [itex]p_j[/itex] are the Poisson probabilities.
I am not getting this for the SS conditions. Here is my stochastic matrix:
[tex]P=\left(
\begin{array}{cccc}
p_0+p_1 & p_2 & p_3 & \ldots \\
p_0 & p_1 & p_2 & \ldots \\
0 & p_0 & p_1 & \ldots \\
\vdots & \vdots & \vdots & \ddots
\end{array}
\right)[/tex]
From there (my lambda's are your qs), the SS condition is:
[itex]\lambda _0=\left(p_0+p_1\right)\lambda _0+p_0\lambda _1[/itex]
[itex]\lambda _1=p_2\lambda _0+p_1\lambda _1+p_0\lambda _2[/itex]
...
[itex]\lambda _n=\delta _{0n}p_0\lambda _0+\sum_{k=0}^{n+1}p_{n+1-k}\lambda _k[/itex]

Solving:
[tex]\lambda _{n+1}=-\delta _{0n}\lambda _0+\frac{\lambda _n}{p_0}-\frac{1}{p_0}\sum_{k=0}^{n}p_{n+1-k}\lambda _k[/tex]

Where am I going wrong?
 
No, you're right; there was a typo in my formula (though the end result still holds).
 
OK, I've got it. I looked up queueing theory in Norris (Markov Chains) and roughly followed his treatment for an M/G/1 queue. (That's more general than my problem.) It turns out he begins by discretizing the problem in almost exactly the way I did; the only difference is that he doesn't have a tick for a drop falling into an empty tube (which has to be corrected for later when computing mean volume). The update equation for this Markov chain is

[tex]X_{n+1}=X_n+Y_{n+1}-\mathbb{I}_{X_n>0}[/tex]

Y is Poisson with mean i and [itex]\mathbb{I}_{X_n>0}[/itex] is an indicator that is 1 if there's anything in the tube, 0 otherwise. Define the probability generating function for [itex]X_n[/itex]

[tex]G(z)=\mathbb{E}\left[z^{X_n}\right]=\underset{i=0}{\overset{\infty }{\sum }}\pi_iz^i[/tex]

Then

[tex]zG(z)=\mathbb{E}\left[z^{X_{n+1}+1}\right]=\mathbb{E}\left[z^{X_n+Y_{n+1}+\mathbb{I}_{X_n=0}}\right][/tex]
[tex]zG(z)=e^{i(z-1)}\left(\pi_0z+G(z)-\pi_0\right)[/tex]

([itex]\pi_i[/itex] here is the steady-state probability that [itex]X_n=i[/itex].) From here you eventually get to

[tex]G(z)=\frac{(i-1) (z-1) e^{i(z-1)}}{e^{i(z-1)}-z}[/tex]

and of course you can differentiate that to get mean X. Since the ticks are not equally spaced in time (the interval after [itex]X_n = 0[/itex] is always longer), and since there's always an extra drop that falls in during that interval, it takes some more fiddling to get to mean volume. But it does, in fact, eventually reduce to [itex]\frac{1}{2}\frac{i}{1-i}[/itex].

I still suspect, when I see such a simple answer emerge after a painfully long calculation, that I'm missing a more elegant way to get there. But, elegant or no, I have managed to prove what the numerical calculations said.

Thanks for putting me on the right track!
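(As a further sanity check, one can compare the mean computed from this G(z) against a direct iteration of the chain X_{n+1} = X_n + Y_{n+1} - I_{X_n>0}. The code below is mine, not from the thread. Since G has a removable singularity at z = 1, G'(1) is estimated by a central finite difference; my own differentiation of G gives G'(1) = i + i^2/(2(1 - i)), the mean of the embedded chain, which is not yet the mean volume.)

```python
import math

def g(z, i):
    """G(z) = (i-1)(z-1) e^{i(z-1)} / (e^{i(z-1)} - z), from the post."""
    e = math.exp(i * (z - 1.0))
    return (i - 1.0) * (z - 1.0) * e / (e - z)

def chain_mean(i, cap=60, iters=3000):
    """Stationary mean of X' = X + Y - 1{X>0}, Y ~ Poisson(i),
    iterated on the truncated state space 0..cap."""
    p = [math.exp(-i) * i**k / math.factorial(k) for k in range(31)]
    q = [0.0] * (cap + 1)
    q[0] = 1.0
    for _ in range(iters):
        nq = [0.0] * (cap + 1)
        for n, qn in enumerate(q):
            if qn:
                out = 1 if n > 0 else 0        # one drop leaves only if X > 0
                for k, pk in enumerate(p):
                    j = n + k - out
                    if j <= cap:
                        nq[j] += qn * pk
        q = nq
    return sum(n * qn for n, qn in enumerate(q))

i = 0.5
h = 1e-4
mean_pgf = (g(1.0 + h, i) - g(1.0 - h, i)) / (2.0 * h)  # G'(1) by central difference
mean_chain = chain_mean(i)
pi0 = g(0.0, i)   # G(0) = pi_0, which should equal 1 - i
```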
 
pmsrw3 said:
[tex]zG(z)=e^{i(z-1)}\left(\pi_0z+G(z)-\pi_0\right)[/tex]

From here you eventually get to

[tex]G(z)=\frac{(i-1) (z-1) e^{i(z-1)}}{e^{i(z-1)}-z}[/tex]

and of course you can differentiate that to get mean X.

That's interesting. I'm guessing you solved for [itex]\pi_0[/itex] by finding the derivative of

[tex]zG(z)=e^{i(z-1)}\left(\pi_0z+G(z)-\pi_0\right)[/tex]

at z=1. If so, the mean could be found without explicitly solving for G by taking the second derivative at z=1.
 
bpet said:
That's interesting. I'm guessing you solved for [itex]\pi_0[/itex] by finding the derivative of

[tex]zG(z)=e^{i(z-1)}\left(\pi_0z+G(z)-\pi_0\right)[/tex]

at z=1. If so, the mean could be found without explicitly solving for G by taking the second derivative at z=1.
There's a simpler way to get [itex]\pi_0[/itex]. At every tick, X changes by [itex]Y-\mathbb{I}_{X>0}[/itex]. At steady-state, this has to average to 0. [itex]Y[/itex] has mean [itex]i[/itex], and [itex]\mathbb{I}_{X>0}[/itex] has mean [itex]\mathbb{P}\left[X>0 \right]=1-\pi_0[/itex], so there you have it.

Norris doesn't do it this way, actually. He requires [itex]G(1)=1[/itex] (evident from the definition) and solves for [itex]\pi_0[/itex]. But this gives [itex]G(1)=0/0[/itex] and he has to use L'Hopital's Rule, so it is in essence the same as what you suggest.
 
