Pedestrian wait time pdf

In summary, the conversation discusses pedestrians arriving at a crossing signal at a given rate and the wait time of each pedestrian. The wait time of the first pedestrian is exactly T, while the wait time of each subsequent pedestrian depends on that pedestrian's own arrival time within the interval of length T started by the first. The objective is to find the wait time density functions of all pedestrians after the first one.
  • #36
Mehmood_Yasir said:
##P(W_k \leq w)=1-P(t_k < T-w)##
##P(W_k \leq w)=\sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!}##
Right. That's what you are going to get when you integrate ##\frac{ \lambda^k {(T-w)}^{k-1} e^{-\lambda (T-w)} }{(k-1)!}##. So what are your limits of integration? (What limits do you have to evaluate ##\sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!}## at?)
 
  • #37
Mehmood_Yasir said:
I am a bit confused as I am plotting the pdf of ##W_k## and ##t_k## in MATLAB for the above process. For the time vector ##t## on X-axis, I plot the pdf of ##t_k## using its density function ##f_{t_k} (t)= \frac{ {(\lambda t)}^k e^{-\lambda t}} {k!}## for time vector ##0\leq t \leq T##.
I am not sure what you mean here. In the context of this problem, ##t_k## can't be greater than T, so it would not make much sense to plot ##f_{t_k} (t)= \frac{ {(\lambda t)}^k e^{-\lambda t}} {k!}##. Should you be plotting ##f_{t_k|t_k \leq T} (t)## instead?
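For reference, a minimal MATLAB sketch of plotting that conditional density, using the Erlang(##k,\lambda##) arrival-time density (the values of ##\lambda##, ##T## and ##k## below are illustrative only):

```matlab
% Minimal sketch of plotting f_{t_k | t_k <= T}(t) using the Erlang(k,lambda)
% arrival-time density; lambda, T, k are illustrative values, not from the problem.
lambda = 2; T = 1; k = 3;
t = linspace(0, T, 500);
f = lambda^k .* t.^(k-1) .* exp(-lambda*t) / factorial(k-1);   % Erlang(k,lambda) density
C = gammainc(lambda*T, k);     % P(t_k <= T): Erlang(k,lambda) CDF at T, the normalizing constant
plot(t, f / C);
xlabel('t'); ylabel('f_{t_k | t_k \leq T}(t)');
```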
 
  • #38
tnich said:
Right. That's what you are going to get when you integrate ##\frac{ \lambda^k {(T-w)}^{k-1} e^{-\lambda (T-w)} }{(k-1)!}##. So what are your limits of integration?
The limits should be from ##0## to ##T## for the integration.

(What limits do you have to evaluate ##\sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!}## at?)
 
  • #39
Mehmood_Yasir said:
The limits should be from ##0## to ##T## for the integration.

(What limits do you have to evaluate ##\sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!}## at?)
Then what do you get for the value of your integral?
 
  • #40
tnich said:
I am not sure what you mean here. In the context of this problem, ##t_k## can't be greater than T, so it would not make much sense to plot ##f_{t_k} (t)= \frac{ {(\lambda t)}^k e^{-\lambda t}} {k!}##. Should you be plotting ##f_{t_k|t_k \leq T} (t)## instead?
Right, yes, it is in fact ##f_{t_k|t_k \leq T} (t)## that I am plotting.
 
  • #41
Mehmood_Yasir said:
Right, yes, it is in fact ##f_{t_k|t_k \leq T} (t)## that I am plotting.
So you will need to normalize that pdf, too.
 
  • #42
tnich said:
Then what do you get for the value of your integral?
This is a long slog. I tried, but it looks like a never-ending integral.
 
  • #43
Mehmood_Yasir said:
This is a long slog. I tried, but it looks like a never-ending integral.
I think you are making it too hard. You already showed that
##\frac{d}{dw} \sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!} = \frac{ \lambda^k {(T-w)}^{k-1} e^{-\lambda (T-w)} }{(k-1)!}##

So ##\int_0^T {\frac{ \lambda^k {(T-w)}^{k-1} e^{-\lambda (T-w)} }{(k-1)!}dw}##
##=\left. \sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!} \right|_0^T##
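A quick numerical check of this identity (with illustrative values of ##\lambda##, ##T## and ##k##):

```matlab
% Numerical check: the integral equals the sum evaluated between the limits.
% lambda, T, k are illustrative values only.
lambda = 2; T = 1; k = 3;
f = @(w) lambda^k .* (T - w).^(k-1) .* exp(-lambda*(T - w)) / factorial(k-1);
S = @(w) exp(-lambda*(T - w)) * sum((lambda*(T - w)).^(0:k-1) ./ factorial(0:k-1));
fprintf('integral = %.6f, S(T) - S(0) = %.6f\n', integral(f, 0, T), S(T) - S(0));
```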
 
  • #44
tnich said:
I think you are making it too hard. You already showed that
##\frac{d}{dw} \sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!} = \frac{ \lambda^k {(T-w)}^{k-1} e^{-\lambda (T-w)} }{(k-1)!}##

So ##\int_0^T {\frac{ \lambda^k {(T-w)}^{k-1} e^{-\lambda (T-w)} }{(k-1)!}dw}##
##=\left. \sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!} \right|_0^T##
Inserting the upper limit ##T## in the sum minus the lower limit ##0## in the sum gives ##\sum_{m=0}^{k-1} \frac{(\lambda T)^m e^{-\lambda T}}{m!}##.
 
  • #45
Mehmood_Yasir said:
Inserting the upper limit ##T## in the sum minus the lower limit ##0## in the sum gives ##\sum_{m=0}^{k-1} \frac{(\lambda T)^m e^{-\lambda T}}{m!}##.
There is a minus sign as well: ##-\sum_{m=0}^{k-1} \frac{(\lambda T)^m e^{-\lambda T}}{m!}##.
 
  • #46
Mehmood_Yasir said:
There is a minus sign as well: ##-\sum_{m=0}^{k-1} \frac{(\lambda T)^m e^{-\lambda T}}{m!}##.
That is ##C##, and since it is not 1, do I need to multiply by ##\frac{1}{C}## in ##\frac d {dw} P(W_k \leq w | W_k \leq T)= \frac{ \frac{ \lambda^k {(T-w)}^{k-1} e^{-\lambda (T-w)} }{(k-1)!} }{1-e^{-\lambda T}} ##?
 
  • #47
Mehmood_Yasir said:
That is ##C##, and since it is not 1, do I need to multiply by ##\frac{1}{C}## in ##\frac d {dw} P(W_k \leq w | W_k \leq T)= \frac{ \frac{ \lambda^k {(T-w)}^{k-1} e^{-\lambda (T-w)} }{(k-1)!} }{1-e^{-\lambda T}} ##?
When is this going to end, so that I finally get the pdf or cdf of ##W_k##?
 
  • #48
Mehmood_Yasir said:

Homework Statement


Pedestrians arrive at a signal to cross a road at a rate of ##\lambda## arrivals per minute. When the first pedestrian arrives at the signal, he waits exactly time ##T##; we therefore say the first pedestrian arrives at time ##0##. When the time reaches ##T##, the light flashes and all other pedestrians who have arrived within ##T## cross the road together with the first pedestrian. The same process then repeats.

What is the PDF of the wait time of each pedestrian, i.e., of the ##1^{st}##, ##2^{nd}##, ##3^{rd}, \ldots## pedestrians?

Homework Equations

The arrival time of the ##k^{th}## Poisson arrival has density ##f_{X_k} (t)=\frac{\lambda^{k} t^{k-1} e^{-\lambda t}}{(k-1)!}=\text{Erlang}(k,\lambda; t)##.

The Attempt at a Solution


As stated, the first pedestrian arrives at time 0 and waits exactly for time ##T##. After time ##T##, all pedestrians who have arrived at the crossing cross the street together with the first one.

Now consider only the wait time of each pedestrian, in order to find the density function of the wait time.

Since the first one waits exactly ##T## minutes, its wait time density function is just an impulse function shifted to ##T##, i.e., ##f_{W_1} (t)=\delta (t-T)##.

Since the arrival time of the second pedestrian in the Poisson process has density ##Erlang(1,\lambda; t)=\lambda e^{-\lambda t}##, on average he waits ##T-E[X_1|X_1<T]##, where ##E[X_1|X_1<T]## is the conditional mean of his arrival time given that it is less than ##T##. What is the pdf of the wait time of the second pedestrian? If the arrival time of the second pedestrian follows an ##Erlang(1,\lambda; t)## distribution, can I say that his wait time pdf should be ##\lambda e^{-\lambda (t-(T-E[X_1|X_1<T]))}##? Similarly, what about the wait time density functions of the third, fourth, ... pedestrians?

I think the problem is a lot harder than you realize. Let ##T_0=0, T_1, T_2, \ldots## be the arrival times of customers ##0,1,2, \ldots.## We have that ##T_k \sim \text{Erlang}(\lambda,k),## for ##k \geq 1##, so its density function is
$$ f_k(t) = \frac{\lambda^k t^{k-1} e^{-\lambda t}}{(k-1)!}, \; k=1,2,3, \ldots . $$

Customer #1 waits
$$W_1 = \begin{cases}
T - T_1 & \text{if} \; T_1 < T \\
T & \text{if} \; T_1 > T
\end{cases}
$$
This is true because customer #1 either arrives in the first T-interval, or else starts a new one.
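Since ##T_1 \sim \text{Exp}(\lambda)##, this case split already gives a mixed distribution for ##W_1##:
$$P(W_1 = T) = P(T_1 > T) = e^{-\lambda T}, \qquad f_{W_1}(w) = f_{T_1}(T-w) = \lambda e^{-\lambda (T-w)}, \quad 0 < w < T.$$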

Customer # 2 waits
$$W_2 = \begin{cases}
T-T_2 & \text{if} \; T_2 < T \\
T & \text{if} \; T_1 < T, T_2 > T \\
T_1+T-T_2 & \text{if} \; T_1 > T, T_2 < T_1+T \\
T & \text{if} \; T_1 > T, T_2 > T_1+T
\end{cases}
$$

The cases are: (i) customer #2 comes in the first T-interval; (ii) customer #1 does not start a new T-interval but customer #2 does; (iii) customer #2 falls in a new T-interval started by customer #1; and (iv) customer #2 starts a third T-interval.

So, already by the time we get to customer #2 the problem becomes complicated.

I will let you play with the cases for customer #3, and worry about later customers.
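A small Monte Carlo sketch of these case splits (the parameter values are illustrative only):

```matlab
% Monte Carlo sketch of the case splits above; lambda and T are illustrative.
lambda = 2; T = 1; nRuns = 1e5;
W1 = zeros(nRuns, 1); W2 = zeros(nRuns, 1);
for r = 1:nRuns
    X  = -log(rand(2,1)) / lambda;     % exponential inter-arrival times
    T1 = X(1); T2 = X(1) + X(2);       % arrival times of customers 1 and 2
    if T1 < T
        W1(r) = T - T1;
    else
        W1(r) = T;
    end
    if T2 < T
        W2(r) = T - T2;                % (i)   customer 2 in the first T-interval
    elseif T1 < T
        W2(r) = T;                     % (ii)  customer 2 starts the second T-interval
    elseif T2 < T1 + T
        W2(r) = T1 + T - T2;           % (iii) customer 2 falls in the interval started by 1
    else
        W2(r) = T;                     % (iv)  customer 2 starts a third T-interval
    end
end
fprintf('P(W1 = T): simulated %.4f, e^{-lambda T} = %.4f\n', mean(W1 == T), exp(-lambda*T));
fprintf('P(W2 = T): simulated %.4f\n', mean(W2 == T));
```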
 
  • #49
Ray Vickson said:
I think the problem is a lot harder than you realize. Let ##T_0=0, T_1, T_2, \ldots## be the arrival times of customers ##0,1,2, \ldots.## We have that ##T_k \sim \text{Erlang}(\lambda,k),## for ##k \geq 1##, so its density function is
$$ f_k(t) = \frac{\lambda^k t^{k-1} e^{-\lambda t}}{(k-1)!}, \; k=1,2,3, \ldots . $$

Hmm, I have been thinking about it overnight. What is your opinion if I say the following: consider only the time interval ##0## to ##T##; the first pedestrian pushes the button to start the process, and we now forget about him. The remaining pedestrians arrive according to a Poisson process with rate ##\lambda##. If the arrival time of the ##k^{th}## pedestrian is Erlang(##k,\lambda; t##) distributed, then the remaining time, i.e., the time from his arrival epoch until ##T##, should also be Erlang distributed with parameters ##k## and ##\lambda## but evaluated at ##T-t##, i.e., Erlang(##k,\lambda; T-t##) distributed. This is just to illustrate the idea that for the remaining time I am looking from ##T## back towards ##0##.
 
  • #50
Mehmood_Yasir said:
Hmm, I have been thinking about it overnight. What is your opinion if I say the following: consider only the time interval ##0## to ##T##; the first pedestrian pushes the button to start the process, and we now forget about him. The remaining pedestrians arrive according to a Poisson process with rate ##\lambda##. If the arrival time of the ##k^{th}## pedestrian is Erlang(##k,\lambda; t##) distributed, then the remaining time, i.e., the time from his arrival epoch until ##T##, should also be Erlang distributed with parameters ##k## and ##\lambda## but evaluated at ##T-t##, i.e., Erlang(##k,\lambda; T-t##) distributed. This is just to illustrate the idea that for the remaining time I am looking from ##T## back towards ##0##.

As I have indicated in post #48, the true situation is much more complicated than that, so your approach will give an approximation to the true answer. You would need to investigate further to assess the accuracy of the approximation.

On the other hand, you might be able to do something with an "equilibrium" analysis, where you look at the limiting distribution of wait times ##W_n## for customers with large ##n##. For such customers far in the future, all the little details of entangled waiting times for customers ##1,2,3,\ldots ## will likely become unimportant, and the problem can be looked at much more simply. For example, you could look at the conditional distribution of waiting time for customer ##n##, given that customer ##n## is part of a group of ##k+1## customers who cross together. You could weight that by the probability of such a group, which you have found already.
 
  • #51
Ray Vickson said:
As I have indicated in post #48, the true situation is much more complicated than that, so your approach will give an approximation to the true answer. You would need to investigate further to assess the accuracy of the approximation.

On the other hand, you might be able to do something with an "equilibrium" analysis, where you look at the limiting distribution of wait times ##W_n## for customers with large ##n##. For such customers far in the future, all the little details of entangled waiting times for customers ##1,2,3,\ldots ## will likely become unimportant, and the problem can be looked at much more simply. For example, you could look at the conditional distribution of waiting time for customer ##n##, given that customer ##n## is part of a group of ##k+1## customers who cross together. You could weight that by the probability of such a group, which you have found already.
Actually, I do not have much idea how to assess the accuracy of the approximation and need to explore further. But I just developed a program that runs this process 1 million times, and interestingly the empirical cdf of ##W_k## matches the cdf of my approximation. Would that be enough to verify the approximation?
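A sketch of one possible way to set up such a check (this is not the original program; ##\lambda##, ##T##, ##k## and the number of runs are illustrative): each run simulates a single window ##(0,T]## started by the first pedestrian and records the wait of the ##k##-th later arrival, if there is one.

```matlab
% One possible version of the check described here; parameters are illustrative.
lambda = 2; T = 1; k = 2; nRuns = 1e5;
W = nan(nRuns, 1);
for r = 1:nRuns
    t = 0; n = 0;
    while true
        t = t - log(rand)/lambda;            % next arrival after the first pedestrian
        if t > T, break; end
        n = n + 1;
        if n == k, W(r) = T - t; break; end  % wait time of the k-th later arrival
    end
end
W = W(~isnan(W));                            % keep runs with at least k later arrivals
w  = linspace(0, T, 200);
Fk = @(t) gammainc(lambda*t, k);             % Erlang(k,lambda) CDF
Fcond = (Fk(T) - Fk(T - w)) / Fk(T);         % P(W_k <= w | t_k <= T)
Femp  = mean(W <= w, 1);                     % empirical CDF (implicit expansion, R2016b+)
plot(w, Fcond, '-', w, Femp, '--');
legend('conditional CDF', 'empirical CDF'); xlabel('w');
```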
 
  • #52
Mehmood_Yasir said:
Actually, I do not have much idea how to assess the accuracy of the approximation and need to explore further. But I just developed a program that runs this process 1 million times, and interestingly the empirical cdf of ##W_k## matches the cdf of my approximation. Would that be enough to verify the approximation?

Verifying approximations by comparison to simulation has a long and honorable tradition in stochastic modelling---especially when exact computations are virtually impossible. However, I am not able to judge whether your method gives a valid assessment, because you have not said enough about how you set up the simulations.

After thinking about it some more, I can see how one could compute an exact distribution for the waiting time of customers 1,2,3, ..., as well as limiting waiting time distributions for customers far in the future. (By compute, I mean compute numerically, not necessarily by getting a nice formula!)

The waiting-time distribution is mixed discrete-continuous, with a continuous part for ##0 < W < T## and a discrete part ##P(W = T)## (if the customer starts a new T-period). We can model this system as a discrete-parameter, continuous-state Markov chain: the state-space is ##(0,T]##. Let ##W_n## denote the wait-time random variable of the ##n##th customer, ##C_n.## The relationship between the wait-times ##W_n## and ##W_{n+1}## can be obtained from the fact that the inter-arrival time between customers ##C_n## and ##C_{n+1}## is ##X \sim \text{Exp}(\lambda),## independent of the past. This implies
$$W_{n+1} = \begin{cases}
W_n - X & \text{if} \: X < W_n \\
T & \text{if} \; X > W_n
\end{cases}
$$
This just says that either customer ##C_{n+1}## comes in the same ##T##-interval as customer ##C_n## (and waits for less time than does ##C_n##), or else ##C_{n+1}## starts a new ##T##-interval and thus waits for time ##T##.
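For what it is worth, this recursion is also trivial to simulate directly (a minimal sketch with illustrative parameters):

```matlab
% Minimal sketch of simulating the recursion directly; lambda and T are illustrative.
lambda = 2; T = 1; nCustomers = 1e6;
W = zeros(nCustomers, 1);
W(1) = T;                          % customer 0 starts a new T-interval
for n = 1:nCustomers-1
    X = -log(rand) / lambda;       % exponential inter-arrival time
    if X < W(n)
        W(n+1) = W(n) - X;         % same T-interval, shorter wait
    else
        W(n+1) = T;                % starts a new T-interval
    end
end
fprintf('long-run fraction with W = T: %.4f\n', mean(W == T));
```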

The dynamics of ##\{W_n\}## is about as simple as it can get, and all the difficulties cited in post #48 simply go away---just by changing the point-of-view and exploiting the "memoryless" property of Poisson arrivals.

For actual computation it is probably best to "discretize" the problem, say by splitting up the waiting-time interval ##(0,T]## into ##N## parts ##\{T/N, 2T/N, \ldots, NT/N = T \}## and making the customer inter-arrival times be geometric with parameter ##p = \lambda T/N##. That turns the continuous-state Markov process into a stationary, ##N##-state Markov chain. Numerous efficient methods are available for computing almost any imaginable thing about such processes.
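A sketch of such a computation (illustrative ##\lambda##, ##T##, ##N##; state ##j## corresponds to ##W = jT/N##, and the geometric approximation assumes ##p = \lambda T/N \ll 1##):

```matlab
% Discretized chain sketched above; parameters are illustrative.
lambda = 2; T = 1; N = 200;
p = lambda*T/N;                              % per-step arrival probability
P = zeros(N, N);
for j = 1:N
    if j > 1
        i = 1:j-1;                           % arrival i steps later, still within the wait
        P(j, j - i) = (1 - p).^(i - 1) * p;  % W_{n+1} = W_n - X
    end
    P(j, N) = (1 - p)^(j - 1);               % X >= W_n: start a new T-interval, W_{n+1} = T
end
v = zeros(1, N); v(N) = 1;                   % customer 0 waits exactly T
for n = 1:5                                  % wait-time distribution of the next few customers
    v = v * P;
    fprintf('customer %d: P(W = T) = %.4f\n', n, v(N));
end
```

For the first customer this reproduces ##P(W_1 = T) = (1-p)^{N-1} \approx e^{-\lambda T}##, as expected.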
 
  • #53
Mehmood_Yasir said:
When is this going to end, so that I finally get the pdf or cdf of ##W_k##?
I had not realized that you did not come to a conclusion in this thread. The expression you have been using for C is not correct. You are still assuming that the waiting time of the kth pedestrian is exponential, though we have shown that it is not. I showed you a way to determine C in post #43, though I don't think you recognized that. Once you have C, you can complete your equation for ##f_{W_k}(t)##, the pdf of the waiting time of the kth pedestrian to arrive. This is what you need for the new thread you have started about the waiting time of the last pedestrian to arrive.

I see that @Ray Vickson has proposed a different approach to solving the problem. Perhaps he is interpreting the question differently. I think the interpretation we have been using is the correct one, since it leads directly to the solution of the next part of the problem posted at https://www.physicsforums.com/forums/calculus-and-beyond-homework.156/.

I would suggest that in the future, you post the whole problem instead of dribbling it out bit by bit. That would make it much easier for the people trying to help you to correctly interpret the problem statement.
 
  • #54
tnich said:
I had not realized that you did not come to a conclusion in this thread. The expression you have been using for C is not correct. You are still assuming that the waiting time of the kth pedestrian is exponential, though we have shown that it is not. I showed you a way to determine C in post #43, though I don't think you recognized that. Once you have C, you can complete your equation for ##f_{W_k}(t)##, the pdf of the waiting time of the kth pedestrian to arrive. This is what you need for the new thread you have started about the waiting time of the last pedestrian to arrive.

I see that @Ray Vickson has proposed a different approach to solving the problem. Perhaps he is interpreting the question differently. I think the interpretation we have been using is the correct one, since it leads directly to the solution of the next part of the problem posted at https://www.physicsforums.com/forums/calculus-and-beyond-homework.156/.

I would suggest that in the future, you post the whole problem instead of dribbling it out bit by bit. That would make it much easier for the people trying to help you to correctly interpret the problem statement.

I am interpreting the question exactly as stated in post #1, which wanted to find the waiting-time distributions for customers 1, 2, 3, 4, ... . A Markov process model works perfectly for this purpose: we can find customer waiting-times, but not customer arrival times or other details. The simple Markov process model just keeps waiting-time information and dispenses with everything else. Again, the logic is very simple: customer 0 starts a new T-interval, so ##W_0 = T##. Customer #1 either arrives in the same T-interval, or else starts a new T-interval of his own. So, if ##X## is the inter-arrival time between customers 0 and 1, we have ##W_1 = W_0 - X## if ##X < W_0## (same T-interval, but less waiting), or else ##W_1 = T## if ##X > W_0## (starts a new T-interval). It works the same way for any two successive customers ##n## and ##n+1##.

I have looked at a bunch of examples, by discretizing the Markov process, so we have a geometric inter-arrival distribution instead of exponential. Of course, the first waiting-time distribution is like a reversed-exponential but with a point-mass at ##W = T## (NOT a re-normalized exponential on the interval!). Depending on the size of ##T## relative to ##1/\lambda##, the second waiting-time distribution resembles a reversed 2-Erlang, but with some differences caused by the "overspill" effects when customers start new T-intervals. By the time the customer number reaches approximately the mean number of arrivals that would have occurred in interval T, the distributions start to deviate wildly from Erlangs or reversed Erlangs.

For later and later customers the waiting-time distributions become more nearly a mix of a uniform distribution on ##(0,T)## and a finite probability mass at ##W = T##. This is in accordance with what was found in the other thread, where we finally settled on the fact that in the long-run, there is a probability of ##P_0 = 1/(1 + \lambda T)## for a customer to arrive when there are no others waiting (and so starts a new T-interval, hence waits for time ##W = T##), or else there is a probability ##P_1 = \lambda T/(1+\lambda T)## to join other customers. One can argue intuitively that (again, in the long-run) when a customer arrival is part of a group that have arrived in a common T-interval, their conditional arrival times are actually uniform over the interval (another magic property of Poisson processes).
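Indeed, that limiting mix can be checked directly against the chain above: if the stationary distribution has an atom ##a## at ##W = T## and density ##g(w)## on ##(0,T)##, stationarity requires
$$g(w) = a\,\lambda e^{-\lambda (T-w)} + \int_w^T g(v)\, \lambda e^{-\lambda (v-w)}\, dv, \qquad 0 < w < T.$$
Substituting the constant density ##g(w) \equiv a\lambda## makes the right-hand side equal ##a\lambda e^{-\lambda (T-w)} + a\lambda \left(1 - e^{-\lambda (T-w)}\right) = a\lambda##, so a uniform continuous part is consistent, and the normalization ##a + a\lambda T = 1## gives ##a = 1/(1+\lambda T)## and ##g \equiv \lambda/(1+\lambda T)##, matching ##P_0## and ##P_1## above.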
 
  • #55
Ray Vickson said:
One can argue intuitively that (again, in the long-run) when a customer arrival is part of a group that have arrived in a common T-interval, their conditional arrival times are actually uniform over the interval (another magic property of Poisson processes).

Yes. Conditioning on the number of arrivals = n in some fixed time period ##t## gives rise to uniform random variables, and a ton of symmetry...
 
  • #56
Ray Vickson said:
I am interpreting the question exactly as stated in post #1, which wanted to find the waiting-time distributions for customers 1, 2, 3, 4, ... . A Markov process model works perfectly for this purpose: we can find customer waiting-times, but not customer arrival times or other details. The simple Markov process model just keeps waiting-time information and dispenses with everything else. Again, the logic is very simple: customer 0 starts a new T-interval, so ##W_0 = T##. Customer #1 either arrives in the same T-interval, or else starts a new T-interval of his own. So, if ##X## is the inter-arrival time between customers 0 and 1, we have ##W_1 = W_0 - X## if ##X < W_0## (same T-interval, but less waiting), or else ##W_1 = T## if ##X > W_0## (starts a new T-interval). It works the same way for any two successive customers ##n## and ##n+1##.

I have looked at a bunch of examples, by discretizing the Markov process, so we have a geometric inter-arrival distribution instead of exponential. Of course, the first waiting-time distribution is like a reversed-exponential but with a point-mass at ##W = T## (NOT a re-normalized exponential on the interval!). Depending on the size of ##T## relative to ##1/\lambda##, the second waiting-time distribution resembles a reversed 2-Erlang, but with some differences caused by the "overspill" effects when customers start new T-intervals. By the time the customer number reaches approximately the mean number of arrivals that would have occurred in interval T, the distributions start to deviate wildly from Erlangs or reversed Erlangs.

For later and later customers the waiting-time distributions become more nearly a mix of a uniform distribution on ##(0,T)## and a finite probability mass at ##W = T##. This is in accordance with what was found in the other thread, where we finally settled on the fact that in the long-run, there is a probability of ##P_0 = 1/(1 + \lambda T)## for a customer to arrive when there are no others waiting (and so starts a new T-interval, hence waits for time ##W = T##), or else there is a probability ##P_1 = \lambda T/(1+\lambda T)## to join other customers. One can argue intuitively that (again, in the long-run) when a customer arrival is part of a group that have arrived in a common T-interval, their conditional arrival times are actually uniform over the interval (another magic property of Poisson processes).
I understand your argument, and I have no evidence to offer that your interpretation is not correct. I assumed, perhaps incorrectly, that the question was asking for the waiting time of the kth customer to arrive within the interval [0,T].
 
  • #57
Mehmood_Yasir said:
Just to be clear, for any particular value of ##\lambda##, let's assume ##m=4## more pedestrians arrived within time ##T## and all 5 (including the first) crossed the street after ##T##. The wait time of the first is clearly ##T##, and hence so is his wait time density function. My objective is to find the wait time density functions of all subsequent pedestrians, i.e., the 2nd, 3rd, 4th and 5th in the above assumption. Generally speaking, this ##m## can take any value, as it is unconditional in my problem. But I am at least interested in finding the wait time density functions of the 2nd, 3rd and 4th pedestrians after the first one. When you mention the wait time of the ##k^{th}## pedestrian, then ##1<k\leq m##, referring to the wait times of all subsequent pedestrians except the first, whose wait time I already know.

This makes me think again about my own problem. Amazing. As I said explicitly in post #3 above with an example, assume 4 pedestrians arrive within the time window ##T## that is started by the first one. So I am looking to find the waiting time distributions of the pedestrians who will cross after ##T##, which means the 2nd, 3rd, 4th and 5th in the above example. I ignore the first one, since he always sees time ##T##, which gives an impulse function shifted by ##T## as his waiting time distribution function; that is pretty simple, and it is true for every pedestrian who starts a new ##T##-interval in the future. When I say an impulse function shifted by ##T##, it exactly reflects the time he will now wait until the light flashes, which is his waiting time ##T##. In @tnich's interpretation, the ##k^{th}## customer is one who is going to cross the road once the light flashes, and this ##k## can take the values 2, 3, 4, 5, .... I am least interested in the maximum value of ##k##, i.e., how many of them cross, as that depends on ##\lambda## and the length ##T##. By the memoryless property of the Poisson process, the process then restarts afresh with the next pedestrian who starts a new ##T##-interval, and the rest proceeds in the same way, except that we may have a different max(##k##). Sticking to one arrival time (or inter-arrival time) distribution, I wonder why your interpretations are different. I think you both have the same target, but with different approaches. But if you say
tnich said:
that the question was asking for the waiting time of the kth customer to arrive within the interval [0,T].
that is not right, since the waiting time of pedestrian ##k## means how long he stays from now until the light flashes and he crosses, which is the waiting time in the usual sense. The time pedestrian ##k## takes to arrive within ##(0,T]## is Erlang(##k##) distributed, conditioned on being less than ##T##; of that I am pretty sure.
 
