Probability and pedestrian wait time density function

In summary, the probability that a randomly arriving pedestrian has crossed the crossing in a group of ##k+1## pedestrians is ##\frac{(k+1) {(\lambda T)}^k e^{-\lambda T} } {k! (1+\lambda T)}##. This follows from the Poisson formula for ##k## arrivals in time ##T##, combined with the probability ##\frac{1}{1+\lambda T}## that a given pedestrian is the one who starts the timer.
  • #1
Mehmood_Yasir

Homework Statement


Pedestrians approach a signal at the crossing in a Poisson manner with arrival rate ##\lambda## arrivals per minute. The first pedestrian arriving at the signal starts a timer ##T## and then waits for time ##T##. A light is flashed after time ##T##, and all waiting pedestrians who arrived within the duration ##T## must cross.

What is the probability that a randomly arriving pedestrian has crossed the crossing in a group of ##k+1## pedestrians? The group is called ##k+1## because the first pedestrian starts the timer ##T## and ##k## more pedestrians arrive within time ##T##.

The answer given is ##\frac{(k+1) {(\lambda T)}^k e^{-\lambda T} } {k! (1+\lambda T)}##

I could not understand this answer. Can someone kindly explain it to me?

Homework Equations


Poisson formula for general ##k## arrivals in time ##T##,
##P_k= \frac{{(\lambda T)}^k e^{-\lambda T} } {k! }##
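As a quick numerical sketch (with ##\lambda## and ##T## as arbitrary illustrative values, not from the problem statement), this pmf can be evaluated directly:

```python
from math import exp, factorial

def poisson_pmf(k, lam, T):
    """P(k arrivals in time T) for a Poisson process with rate lam."""
    return (lam * T) ** k * exp(-lam * T) / factorial(k)

# Illustrative values: lam = 2 arrivals/min, T = 1.5 min.
probs = [poisson_pmf(k, 2.0, 1.5) for k in range(50)]
print(abs(sum(probs) - 1.0) < 1e-9)  # the pmf sums to 1 over k
```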

The Attempt at a Solution


The total number of pedestrians who have crossed the crossing after the light is flashed is ##k+1##.

We know that for a Poisson process, the probability of ##k## arrivals in a given time interval ##T## is ## \frac{{(\lambda * T)}^k e^{-\lambda T} } {k! }##.

The probability that ##k+1## pedestrians have crossed the crossing is ## \frac{{(\lambda T)}^{k+1} e^{-\lambda T} } {(k+1)! }##

What is the probability that a randomly arriving pedestrian has crossed the crossing in a group of ##(k+1)##?


 
  • #2
Mehmood_Yasir said:
The group is called ##k+1## because the first pedestrian starts the timer ##T## and ##k## more pedestrians arrive within time ##T##.
If the first pedestrian arrives at time 0 and k more pedestrians arrive in the interval (0, T], how many pedestrians cross together at time T?
 
  • #3
tnich said:
If the first pedestrian arrives at time 0 and k more pedestrians arrive in the interval (0, T], how many pedestrians cross together at time T?
Let's step through it. Suppose you are the pedestrian in question. What is the probability that you are one who pushes the button?
 
  • #4
tnich said:
If the first pedestrian arrives at time 0 and k more pedestrians arrive in the interval (0, T], how many pedestrians cross together at time T?
##k+1##
 
  • #5
tnich said:
Let's step through it. Suppose you are the pedestrian in question. What is the probability that you are one who pushes the button?
Then, should it be ##\frac{1}{k+1}##? Because the total is ##k+1##, the probability that I push the button to start the timer should then be ##\frac{1}{k+1}##.
 
  • #6
Mehmood_Yasir said:
Then, should it be ##\frac{1}{k+1}##?
No, that's not what I was thinking of. For a Poisson arrival process with arrival rate λ, the distribution of the time to the first arrival (and the time between successive arrivals) has pdf ##f(t) = λe^{-λt}##. If you are going to be the one to press the button, you have to arrive before anyone else does. So on average, what is the amount of time between a group of pedestrians crossing and the next pedestrian arriving?
 
Last edited:
  • #7
tnich said:
No, that's not what I was thinking of. For a Poisson arrival process with arrival rate λ, the distribution of the time to the first arrival (and the time between successive arrivals) has pdf ##f(t) = \lambda e^{-\lambda t}##.
Yes, this is correct.
tnich said:
If you are going to be the one to press the button, you have to arrive before anyone else does. So on average, what is the amount of time between a group of pedestrians crossing and the next pedestrian arriving?
By independence, the amount of time between a group of pedestrians crossing and the next pedestrian arriving should be exactly Erlang##(1,\lambda)## (i.e., exponentially) distributed, with density function ##f(t)=\lambda e^{-\lambda t}##, meaning no arrival occurred before me. The expected amount of time will then be ##\frac{1}{\lambda}##. Am I right?
 
  • #8
Mehmood_Yasir said:
Yes, this is correct.

By independence, the amount of time between a group of pedestrians crossing and the next pedestrian arriving should be exactly Erlang##(1,\lambda)## (i.e., exponentially) distributed, with density function ##f(t)=\lambda e^{-\lambda t}##, meaning no arrival occurred before me. The expected amount of time will then be ##\frac{1}{\lambda}##. Am I right?
Yes. Now consider that the probability that you press the button is the probability that you arrive in one of those small intervals between a group of pedestrians crossing and the next pedestrian arriving. If you consider cycles of the walk signal, then the time it takes for n cycles would be ##\sum_{k=1}^n(t_k + T)##, where ##t_k## is the time until the first arrival in cycle k, and T is the time from pushing the button until the end of the cycle. Look at
##1-\frac{T}{\lim_{n\to\infty} \frac1n\sum_{k=1}^n(t_k + T)}##. That would be the probability of being the one to push the button. Use the law of large numbers to simplify it.
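A small simulation sketch (my addition; ##\lambda = 2## and ##T = 1.5## are arbitrary illustrative values) of the cycle picture above: over many cycles, the fraction of all crossing pedestrians who were the first arrival of their cycle settles near ##\frac{1}{1+\lambda T}##:

```python
import random

def button_fraction(lam, T, n_cycles, seed=0):
    """Fraction of pedestrians who press the button over n_cycles cycles.

    Each cycle has one first arrival (the button presser) plus however
    many further arrivals occur during the wait of length T.
    """
    rng = random.Random(seed)
    pressers, total = 0, 0
    for _ in range(n_cycles):
        # Count arrivals in (0, T] by summing exponential gaps.
        extra, t = 0, rng.expovariate(lam)
        while t <= T:
            extra += 1
            t += rng.expovariate(lam)
        pressers += 1
        total += 1 + extra
    return pressers / total

lam, T = 2.0, 1.5
est = button_fraction(lam, T, 200_000)
print(est, 1 / (1 + lam * T))  # the two numbers should be close
```

With these values the target is ##\frac{1}{1+3} = 0.25##, and the estimate lands close to it.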
 
  • #9
Mehmood_Yasir said:
What is the probability that a randomly arriving pedestrian has crossed the crossing in a group of ##k+1## pedestrians? The answer given is ##\frac {(k+1){(\lambda T)}^{k} e^{-\lambda T}} {k!(1+\lambda T)}##

OK, I got your point. I have not solved it yet using the law of large numbers. But is it the same probability that I referred to at the beginning of the question, i.e., the probability that a randomly arriving pedestrian has crossed the street crossing in a group of ##k+1## pedestrians, for which the given answer is ##\frac {(k+1){(\lambda T)}^{k} e^{-\lambda T}} {k!(1+\lambda T)}##?
 
  • #10
tnich said:
Yes. Now consider that the probability that you press the button is the probability that you arrive in one of those small intervals between a group of pedestrians crossing and the next pedestrian arriving. If you consider cycles of the walk signal, then the time it takes for n cycles would be ##\sum_{k=1}^n(t_k + T)##, where ##t_k## is the time until the first arrival in cycle k, and T is the time from pushing the button until the end of the cycle. Look at
##1-\frac{T}{\lim_{n\to\infty} \frac1n\sum_{k=1}^n(t_k + T)}##. That would be the probability of being the one to push the button. Use the law of large numbers to simplify it.
Actually, back that up a step or two and it may be more obvious.
##P(##pushing button##)= 1-\lim_{n\to\infty}\frac{nT} {\sum_{k=1}^n(t_k + T)}##
 
  • #11
Mehmood_Yasir said:
OK, I got your point. I have not solved it yet using the law of large numbers. But is it the same probability that I referred to at the beginning of the question, i.e., the probability that a randomly arriving pedestrian has crossed the street crossing in a group of ##k+1## pedestrians, for which the given answer is ##\frac {(k+1){(\lambda T)}^{k} e^{-\lambda T}} {k!(1+\lambda T)}##?
Nope. Just a step along the way. Try the LOLN.
 
  • #12
I have to leave in a few minutes, so I'll give you the outline of the rest. Once you apply the LOLN and use your result from post #7 for the mean time to the first arrival, you can break the original problem into two cases: 1) you arrive first, and 2) someone else arrives first. Call the time of first arrival time 0.

In case 1), how many people have to arrive in the interval (0, T] for k+1 people to cross at the end of the interval? (You wouldn't count yourself in that number because you have already arrived.) What is the probability of that? Call it ##P(##crossing in a group of K+1##|##arrived first##)##.

In case 2), you arrive after time 0. Someone else arrived at time 0. How many people must arrive in the interval (0, T] for k+1 people to cross at the end of the interval? (You would not include yourself in that number either, since you already counted yourself.) What is the probability of that? Call it ##P(##crossing in a group of K+1##|##did not arrive first##)##.

Now simplify the equation
##P(##crossing in a group of K+1##) = ##
##P(##crossing in a group of K+1##|##arrived first##)P(##arrived first##) ##
##+P(##crossing in a group of K+1##|##did not arrive first##)P(##did not arrive first##)##
 
  • #13
tnich said:
I have to leave in a few minutes, so I'll give you the outline of the rest. Once you apply the LOLN and use your result from post #7 for the mean time to the first arrival, you can break the original problem into two cases: 1) you arrive first, and 2) someone else arrives first. Call the time of first arrival time 0.

In case 1), how many people have to arrive in the interval (0, T] for k+1 people to cross at the end of the interval? (You wouldn't count yourself in that number because you have already arrived.) What is the probability of that? Call it ##P(##crossing in a group of K+1##|##arrived first##)##.

In case 2), you arrive after time 0. Someone else arrived at time 0. How many people must arrive in the interval (0, T] for k+1 people to cross at the end of the interval? (You would not include yourself in that number either, since you already counted yourself.) What is the probability of that? Call it ##P(##crossing in a group of K+1##|##did not arrive first##)##.

Now simplify the equation
##P(##crossing in a group of K+1##) = ##
##P(##crossing in a group of K+1##|##arrived first##)P(##arrived first##) ##
##+P(##crossing in a group of K+1##|##did not arrive first##)P(##did not arrive first##)##
OK, let me solve it and I will come back to you. Thank you very much in advance.
 
  • #14
tnich said:
Yes. Now consider that the probability that you press the button is the probability that you arrive in one of those small intervals between a group of pedestrians crossing and the next pedestrian arriving. If you consider cycles of the walk signal, then the time it takes for n cycles would be ##\sum_{k=1}^n(t_k + T)##, where ##t_k## is the time until the first arrival in cycle k, and T is the time from pushing the button until the end of the cycle. Look at
##1-\frac{T}{\lim_{n\to\infty} \frac1n\sum_{k=1}^n(t_k + T)}##. That would be the probability of being the one to push the button. Use the law of large numbers to simplify it.
Mehmood_Yasir said:
OK, let me solve it and I will come back to you. Thank you very much in advance.
tnich said:
I have to leave in a few minutes, so I'll give you the outline of the rest. Once you apply the LOLN
I tried it; let me know whether the way I did it is right or wrong. By using the LOLN,
##=1-\frac{T}{\lim_{n\to\infty} \frac1n\sum_{k=1}^n(t_k + T)}##
##=1-\frac{T} { \lim_{n\to\infty} \frac1n (t_1+t_2+...+t_n) + \lim_{n\to\infty} nT }##
##=\frac{1} {1+ \lambda T }##

tnich said:
and use your result from post #7 for the mean time to the first arrival, you can break the original problem into two cases, 1) you arrive first, and 2) someone else arrives first. Call the time of first arrival time 0.

In case 1), how many people have to arrive in the interval (0, T] for k+1 people to cross at the end of the interval? (You wouldn't count yourself in that number because you have already arrived.) What is the probability of that? Call it ##P(##crossing in a group of K+1##|##arrived first##)##.
Since ##k+1## pedestrians will cross the street crossing after time ##T##, where the first one arrives at 0 and ##k## more arrive within time ##T##, I think here ##P(##crossing in a group of K+1##|##arrived first##)## should be ##P(##k more arrivals in ##T)##, which will be ##\frac{e^{-\lambda T} {(\lambda T)}^k }{k!}##

tnich said:
In case 2), you arrive after time 0. Someone else arrived at time 0. How many people must arrive in the interval (0, T] for k+1 people to cross at the end of the interval? (You would not include yourself in that number either, since you already counted yourself.) What is the probability of that? Call it ##P(##crossing in a group of K+1##|##did not arrive first##)##.
I think again it should simply be ##P(##k more arrivals in ##T)##, which will be ##\frac{e^{-\lambda T} {(\lambda T)}^k }{k!}##

tnich said:
Now simplify the equation
##P(##crossing in a group of K+1##) = ##
##P(##crossing in a group of K+1##|##arrived first##)P(##arrived first##) ##
##+P(##crossing in a group of K+1##|##did not arrive first##)P(##did not arrive first##)##

##P(##crossing in a group of K+1##|##arrived first##)P(##arrived first##) = \frac {e^{-\lambda T} {(\lambda T)}^k } {k!} \frac{1} {1+ \lambda T } ##
##P(##crossing in a group of K+1##|##did not arrive first##)P(##did not arrive first##)= \frac{e^{-\lambda T} {(\lambda T)}^k }{k!} \frac{k} {1+ \lambda T }##
##P(##crossing in a group of K+1##) = \frac{e^{-\lambda T} {(\lambda T)}^k }{k!} \frac{1} {1+ \lambda T } + \frac{e^{-\lambda T} {(\lambda T)}^k }{k!} \frac{k} {1+ \lambda T }##
##P(##crossing in a group of K+1##) = \frac{k+1} {1+ \lambda T } \frac{e^{-\lambda T} {(\lambda T)}^k }{k!} ##

If this is correct, then my other question is related to the density function of pedestrian wait time.
As said, the first pedestrian arrives at time 0 and pushes the button, starting a wait of duration ##T##. After time ##T##, all pedestrians who have arrived at the crossing cross the street, including the first one. Now consider only the wait time of each pedestrian, in order to find the density function of the wait time. As the first one waits exactly ##T## minutes, we can say its wait-time density function is just an impulse shifted to ##T##, i.e., ##f_{W_1} (t)=\delta (t-T)##.

Since the density function of the arrival time of the second pedestrian following the Poisson process is Erlang##(1,\lambda)##, i.e., ##f(t)=\lambda e^{-\lambda t}##, what is the wait-time density function of the second pedestrian? The second pedestrian waits for ##T-E[X_1]##, where ##E[X_1]## is the mean arrival time of the second pedestrian after the clock is started.
 
Last edited:
  • #15
Mehmood_Yasir said:
I tried it; let me know whether the way I did it is right or wrong. By using the LOLN,
##=1-\frac{T}{\lim_{n\to\infty} \frac1n\sum_{k=1}^n(t_k + T)}##
##=1-\frac{T} { \lim_{n\to\infty} \frac1n (t_1+t_2+...+t_n) + \lim_{n\to\infty} nT }##
This step is not quite right. What does the LOLN tell you about ##\lim_{n\to\infty} \frac1n (t_1+t_2+...+t_n) ##?
What is ##\frac 1 n \sum_{k=1}^n T##?

Mehmood_Yasir said:
##=\frac{1} {1+ \lambda T }##

Since ##k+1## pedestrians will cross the street crossing after time ##T##, where the first one arrives at 0 and ##k## more arrive within time ##T##, I think here ##P(##crossing in a group of K+1##|##arrived first##)## should be ##P(##k more arrivals in ##T)##, which will be ##\frac{e^{-\lambda T} {(\lambda T)}^k }{k!}##
This part is right.

Mehmood_Yasir said:
I think again it should simply be ##P(##k more arrivals in ##T)##, which will be ##\frac{e^{-\lambda T} {(\lambda T)}^k }{k!}##
This is not right for case 2. You have already counted the person who arrived first and pushed the button, and yourself. The other arrivals within the interval (0,T] are independent of your arrival and the total number of people crossing needs to be k + 1.

Mehmood_Yasir said:
##P(##crossing in a group of K+1##|##arrived first##)P(##arrived first##) = \frac {e^{-\lambda T} {(\lambda T)}^k } {k!} \frac{1} {1+ \lambda T } ##
##P(##crossing in a group of K+1##|##did not arrive first##)P(##did not arrive first##)= \frac{e^{-\lambda T} {(\lambda T)}^k }{k!} \frac{k} {1+ \lambda T }##
This can't be right. If the probability that you arrived first is ##\frac{1} {1+ \lambda T }##, what is the probability that you didn't arrive first?
 
  • #16
tnich said:
This step is not quite right. What does the LOLN tell you about ##lim_{n\to\infty} \frac1n (t_1+t_2+...+t_n) ##?
What is ##\frac 1 n \sum_{k=1}^n T##?
##\frac 1 n \sum_{k=1}^n T## should be ##\frac 1 n (n T)=T##.
##\lim_{n\to\infty} \frac1n (t_1+t_2+...+t_n)##, I think, will approach ##\frac{1}{\lambda}## as ##n\to\infty##.
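This LLN answer is easy to illustrate with a quick sketch (my addition; the value of ##\lambda## is arbitrary): the sample mean of exponential interarrival gaps approaches ##\frac{1}{\lambda}##:

```python
import random

rng = random.Random(3)
lam = 2.0                      # illustrative arrival rate
n = 1_000_000
# Sample mean of n exponential(lam) interarrival times.
avg = sum(rng.expovariate(lam) for _ in range(n)) / n
print(avg)  # close to 1/lam = 0.5
```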

tnich said:
This is not right for case 2. You have already counted the person who arrived first and pushed the button, and yourself. The other arrivals within the interval (0,T] are independent of your arrival and the total number of people crossing needs to be k + 1.
Isn't it the probability of ##k-1## more arrivals in time ##T##, i.e., ##\frac{{(\lambda T)}^{k-1} e^{-\lambda T}}{(k-1)!}##?
tnich said:
This can't be right. If the probability that you arrived first is ##\frac{1} {1+ \lambda T }##, what is the probability that you didn't arrive first?
This should be then ##1- \frac{1} {1+ \lambda T }##
 
Last edited:
  • #17
Mehmood_Yasir said:
##\frac 1 n \sum_{k=1}^n T## should be ##\frac 1 n (n T)=T##.
##\lim_{n\to\infty} \frac1n (t_1+t_2+...+t_n)##, I think, will approach ##\frac{1}{\lambda}## as ##n\to\infty##. Isn't it the probability of ##k-1## more arrivals in time ##T##, i.e., ##\frac{{(\lambda T)}^{k-1} e^{-\lambda T}}{(k-1)!}##?

This should be then ##1- \frac{1} {1+ \lambda T }##
Yes. So how do you get from there to P(crossing in a group of k+1)?
 
  • #18
tnich said:
Yes. So how do you get from there to P(crossing in a group of k+1)?
##P(##crossing in a group of K+1##|##arrived first##)P(##arrived first##) = \frac {e^{-\lambda T} {(\lambda T)}^k } {k!} \frac{1} {1+ \lambda T } ##
##P(##crossing in a group of K+1##|##did not arrive first##)P(##did not arrive first##)= \frac{e^{-\lambda T} {(\lambda T)}^{k-1} }{(k-1)!} (1-\frac{1} {1+ \lambda T })##
By adding above expressions, I came to this expression,
##P(##crossing in a group of K+1##) = \frac{k+1} {1+ \lambda T } \frac{e^{-\lambda T} {(\lambda T)}^k }{k!} ##
which I think is right now. Am I right?
 
  • #19
Mehmood_Yasir said:
##P(##crossing in a group of K+1##|##arrived first##)P(##arrived first##) = \frac {e^{-\lambda T} {(\lambda T)}^k } {k!} \frac{1} {1+ \lambda T } ##
##P(##crossing in a group of K+1##|##did not arrive first##)P(##did not arrive first##)= \frac{e^{-\lambda T} {(\lambda T)}^{k-1} }{(k-1)!} (1-\frac{1} {1+ \lambda T })##
By adding above expressions, I came to this expression,
##P(##crossing in a group of K+1##) = \frac{k+1} {1+ \lambda T } \frac{e^{-\lambda T} {(\lambda T)}^k }{k!} ##
which I think is right now. Am I right?
Now it's just an algebra problem. If you work through the algebra to the final solution, you will know whether it's right.
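The algebra can also be sanity-checked numerically (a sketch of mine; ##\lambda = 2## and ##T = 1.5## are arbitrary illustrative values): the two-case sum should reduce to the given closed form:

```python
from math import exp, factorial

lam, T = 2.0, 1.5            # illustrative values
p_first = 1 / (1 + lam * T)  # P(arrived first)

def p_k(k):
    """Poisson probability of k arrivals in time T."""
    return (lam * T) ** k * exp(-lam * T) / factorial(k)

# Case 1 (arrived first): k more arrivals; case 2: k-1 more arrivals.
for k in range(1, 10):
    two_case = p_k(k) * p_first + p_k(k - 1) * (1 - p_first)
    target = (k + 1) * p_k(k) / (1 + lam * T)
    assert abs(two_case - target) < 1e-12
print("two-case decomposition matches the closed form for k = 1..9")
```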
 
  • #20
tnich said:
Now it's just an algebra problem. If you work through the algebra to the final solution, you will know whether it's right.
I just completed it; it's right. Thank you very much, the way you taught me was really nice. I learned a couple more things while working through to the end of this proof. Thank you again.
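As an independent check (a simulation sketch of mine, not from the thread; parameter values are illustrative): simulate many cycles, treat every pedestrian who crossed as equally likely to be "you", and tabulate the size of the group each one crossed in. The empirical fractions should track ##\frac{(k+1)(\lambda T)^k e^{-\lambda T}}{k!(1+\lambda T)}##:

```python
import random
from math import exp, factorial

def group_size_dist(lam, T, n_cycles, kmax, seed=1):
    """Empirical P(a uniformly chosen pedestrian crossed in a group of k+1),
    indexed by k = number of extra arrivals during the wait T."""
    rng = random.Random(seed)
    counts = [0] * (kmax + 1)
    total = 0
    for _ in range(n_cycles):
        extra, t = 0, rng.expovariate(lam)
        while t <= T:                 # count arrivals during the wait T
            extra += 1
            t += rng.expovariate(lam)
        size = 1 + extra              # the button presser plus extra arrivals
        total += size
        if extra <= kmax:
            counts[extra] += size     # every member of this group counts
    return [c / total for c in counts]

def formula(k, lam, T):
    """(k+1) (lam T)^k e^(-lam T) / (k! (1 + lam T))"""
    return (k + 1) * (lam * T) ** k * exp(-lam * T) / (factorial(k) * (1 + lam * T))

lam, T = 2.0, 1.5                     # illustrative values
emp = group_size_dist(lam, T, 100_000, kmax=8)
for k in range(9):
    print(k, round(emp[k], 4), round(formula(k, lam, T), 4))
```

Each row prints the empirical and theoretical probabilities side by side; they agree to a few decimal places.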
 
  • #21
May I ask another question related to the same problem, i.e., the density function of pedestrian wait time? I also mentioned it in another thread.
As said, the first pedestrian arrives at time 0 and pushes the button, starting a wait of duration ##T##. After time ##T##, all pedestrians who have arrived at the crossing cross the street, including the first one. Now consider only the wait time of each pedestrian, in order to find their wait-time density function. As the first one waits exactly ##T## minutes, we can say its wait-time density function is just an impulse shifted to ##T##, i.e., ##f_{W_1} (t)=\delta (t-T)##.

For the second pedestrian, the arrival-time density following the Poisson process is Erlang##(1,\lambda)##, i.e., ##f(t)=\lambda e^{-\lambda t}##, so the second pedestrian will wait for ##T-E[X_1|X_1\leq T]##, where ##E[X_1|X_1\leq T]## is the mean of the conditional arrival time of the second pedestrian after the clock is started. What is the wait-time density function of the second pedestrian? My idea: since the arrival-time density function is ##f_{X_2} (t)=\lambda e^{-\lambda t}##, the wait-time density function will be just the arrival-time density shifted by the wait time ##T-E[X_1|X_1\leq T]##. Thus the wait-time density would be ##f_{W_2} (t)=\lambda e^{-\lambda (t-(T-E[X_1|X_1\leq T]))}##. Is this correct? Similarly, what about the third and fourth pedestrians?
 
Last edited:
  • #22
Since the problem is solved, I'll mention a much simpler approach is to explicitly set this up as a Bayes problem. To cut to the chase and simplify things, I'm ignoring the +1 at the start. (Easy justification: it isn't too hard to adjust this to include a +1, but the result is cleaner here, and/or it does not matter for large k.)

So ignoring the +1 that kickstarts the whole thing, the Bayes approach is:
- - -

##\text{prior} = \begin{bmatrix}
p_{\lambda}(k=0, t)\\
p_{\lambda}(k=1, t)\\
p_{\lambda}(k=2, t)\\
\vdots\\
p_{\lambda}(k=k, t)\\
\vdots\\
\end{bmatrix}##

##\text{likelihood function} =
\begin{bmatrix}
0\\
1\\
2\\
\vdots\\
k\\
\vdots\\
\end{bmatrix}##

##\text{posterior} \propto \text{likelihood function} \circ \text{prior} = \begin{bmatrix}
(0)p_{\lambda}(k=0, t)\\
(1)p_{\lambda}(k=1, t)\\
(2)p_{\lambda}(k=2, t)\\
\vdots\\
(k)p_{\lambda}(k=k, t)\\
\vdots\\
\end{bmatrix} ##

where ##\circ## denotes element-wise multiplication.

The posterior needs to be normalized. We immediately recognize that its sum gives the mean of a Poisson, i.e. ##\lambda t##.

Hence

##\text{posterior} = \begin{bmatrix}
0\\
\frac{1}{\lambda t} p_{\lambda}(k=1, t)\\
\frac{2}{\lambda t}p_{\lambda}(k=2, t)\\
\vdots\\
\frac{k}{\lambda t}p_{\lambda}(k=k, t)\\
\vdots\\
\end{bmatrix}##

and there's the result.
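The normalization step can be checked numerically (a sketch of mine; ##\lambda## and ##t## are arbitrary illustrative values): the unnormalized entries ##k\,p_{\lambda}(k, t)## sum to the Poisson mean ##\lambda t##, so dividing by ##\lambda t## yields a proper distribution:

```python
from math import exp, factorial

def poisson(k, lam, t):
    """Poisson pmf: P(k arrivals in time t) at rate lam."""
    return (lam * t) ** k * exp(-lam * t) / factorial(k)

lam, t = 2.0, 1.5                        # illustrative values
unnormalized = [k * poisson(k, lam, t) for k in range(60)]
print(sum(unnormalized))                 # ~ lam * t = 3.0
posterior = [u / (lam * t) for u in unnormalized]
print(abs(sum(posterior) - 1.0) < 1e-9)  # now a proper distribution
```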
 
  • #23
StoneTemplePython said:
Since the problem is solved, I'll mention a much simpler approach is to explicitly set this up as a Bayes problem. To cut to the chase and simplify things, I'm ignoring the +1 at the start. (Easy justification: it isn't too hard to adjust this to include a +1, but the result is cleaner here, and/or it does not matter for large k.)
You left me in the dust there. Which problem are you solving? (There is another thread open for the waiting time problem, which is not solved yet. https://www.physicsforums.com/threads/pedestrian-wait-time-pdf.946340/#post-5989691)
 
  • #24
tnich said:
You left me in the dust there. Which problem are you solving? (There is another thread open for the waiting time problem, which is not solved yet. https://www.physicsforums.com/threads/pedestrian-wait-time-pdf.946340/#post-5989691)
I think he referred another approach to get to the solution of probability that a randomly arriving pedestrian has crossed the road in the group of ##k+1## by ignoring the first pedestrian who pushes the button.
 
  • #25
tnich said:
You left me in the dust there. Which problem are you solving? (There is another thread open for the waiting time problem, which is not solved yet. https://www.physicsforums.com/threads/pedestrian-wait-time-pdf.946340/#post-5989691)

Just trying to give another way of getting to the original result of
##\frac{(k+1) {(\lambda T)}^k e^{-\lambda T} } {k! (1+\lambda T)} = \frac{(k+1) p_{\lambda}(k=k, t)} { (1+\lambda T)} ##

but to cut to the chase, I solved the easier problem of ignoring the +1 at time zero -- i.e. I showed

##= \frac{k\, p_{\lambda}(k=k, t)} { \lambda T}##, which you can see in the ##(k+1)##st spot in the posterior vector below

##\text{posterior} = \begin{bmatrix}
0\\
\frac{1}{\lambda t} p_{\lambda}(k=1, t)\\
\frac{2}{\lambda t}p_{\lambda}(k=2, t)\\
\vdots\\
\frac{k}{\lambda t}p_{\lambda}(k=k, t)\\
\vdots\\
\end{bmatrix}##

(I mostly try to reserve capital letters for random variables but otherwise the notation lines up). The fact that Poisson is memoryless allows for a clean attack for "random incidence" problems -- in particular it allows you to skip some subtleties about mixing times or limits with uniform distributions.

Mehmood_Yasir said:
I think he referred another approach to get to the solution of probability that a randomly arriving pedestrian has crossed the road in the group of ##k+1## by ignoring the first pedestrian who pushes the button.

This was my intent. For some people a Bayes approach is extremely intuitive. For a lot of people it isn't very intuitive at all.
 
  • #26
StoneTemplePython said:
Since the problem is solved, I'll mention a much simpler approach is to explicitly set this up as a Bayes problem. To cut to the chase and simplify things, I'm ignoring the +1 at the start. (Easy justification: it isn't too hard to adjust this to include a +1, but the result is cleaner here, and/or it does not matter for large k.)

So ignoring the +1 that kickstarts the whole thing, the Bayes approach is:
- - -

##\text{prior} = \begin{bmatrix}
p_{\lambda}(k=0, t)\\
p_{\lambda}(k=1, t)\\
p_{\lambda}(k=2, t)\\
\vdots\\
p_{\lambda}(k=k, t)\\
\vdots\\
\end{bmatrix}##

##\text{likelihood function} =
\begin{bmatrix}
0\\
1\\
2\\
\vdots\\
k\\
\vdots\\
\end{bmatrix}##

##\text{posterior} \propto \text{likelihood function} \circ \text{prior} = \begin{bmatrix}
(0)p_{\lambda}(k=0, t)\\
(1)p_{\lambda}(k=1, t)\\
(2)p_{\lambda}(k=2, t)\\
\vdots\\
(k)p_{\lambda}(k=k, t)\\
\vdots\\
\end{bmatrix} ##

where ##\circ## denotes element-wise multiplication.

The posterior needs to be normalized. We immediately recognize that its sum gives the mean of a Poisson, i.e. ##\lambda t##.

Hence

##\text{posterior} = \begin{bmatrix}
0\\
\frac{1}{\lambda t} p_{\lambda}(k=1, t)\\
\frac{2}{\lambda t}p_{\lambda}(k=2, t)\\
\vdots\\
\frac{k}{\lambda t}p_{\lambda}(k=k, t)\\
\vdots\\
\end{bmatrix}##

and there's the result.

You have left me behind here: I don't understand your prior (prior of what?). I don't see where the likelihood vector comes from (likelihood of what?)

I used to teach this stuff in a graduate level operations research program, and over the years I did a lot of Bayesian stuff, but this has me stumped!
 
  • #27
Ray Vickson said:
You have left me behind here: I don't understand your prior (prior of what?). I don't see where the likelihood vector comes from (likelihood of what?)

I used to teach this stuff in a graduate level operations research program, and over the years I did a lot of Bayesian stuff, but this has me stumped!

This is not a good sign on my part! Let me see if I can re-state the problem in a way that bridges the gap.
 
  • #28
There are analytic and linguistic issues lurking in this problem, and in a manner Odysseus would appreciate, it seems that I cannot sail past both of them.

Mehmood_Yasir said:
What is the probability that a randomly arriving pedestrian has crossed the crossing in a group of ##k## pedestrians?

note: I edited it to say ##k## for the slightly simpler problem to be discussed below.
- - - -
The crux of the problem is what does "randomly arriving" mean? Random with respect to what? I.e. what's the distribution? My read on @tnich 's approach was that it is a person averaged result. (Slight digression: the problem itself is actually quite close to computing the time averaged age, slow truck effect, etc. for general renewal problems, though the problem is one 'moment' down from such a thing, and it wants a specific outcome not the summed total. What follows is basically how I think about this for general renewals -- the argument should be able to be streamlined since the Poisson process is an idealized renewal.)

The technical issue is that the problem doesn't actually ask for a person averaged result. It says that there is a "randomly arriving" person and that implies a distribution. I believe the problem falls under the umbrella of problems called "random incidence". Depending on what you want this can make the problem intuitive or mathematically awkward.

The link between the person averaged result and a comparable distribution is the uniform distribution involving large enough n iterations. Depending on needs, this can be awkward because for any given problem, we always insist on large enough ##n## but can't actually pass a limit to it. If we must pass a limit, then the distribution, and in turn the question, collapses -- because the "randomly arriving" person doesn't have a distribution (or well defined process) so far as I can tell.

For what it's worth, I think that if you want a uniform distribution, you just insist that ##n## is sufficiently large, and leave it as a parameter to optimize later on if needed -- in this case, if you are so inclined, you should be able to lazily do so via a Chebyshev bound, though it isn't needed.

- - - - -
Let me see if I can just directly state the result:

Supposing we have a large number of trials (i.e. sufficiently large ##n##), we can think of ourselves as randomly selecting some person who has arrived and making our selection "at random" -- meaning, I believe, uniformly at random.

My read on the question is that it wants to know the probability that the person we selected was in a group size of ##k##.

Before selecting a person "at random" we have a prior belief about the group size for any person we talk to and that prior belief is directly given by the Poisson distribution, since a Poisson process generates these arrivals.

##\text{prior} =
\begin{bmatrix}
\text{probability(group size 0, time length of t) }\\
\text{probability(group size 1, time length of t) }\\
\text{probability(group size 2, time length of t) }\\
\vdots\\
\text{probability(group size k, time length of t) }\\
\vdots\\
\end{bmatrix}
= \begin{bmatrix}
p_{\lambda}(k=0, t)\\
p_{\lambda}(k=1, t)\\
p_{\lambda}(k=2, t)\\
\vdots\\
p_{\lambda}(k=k, t)\\
\vdots\\
\end{bmatrix}##

But the 'paradoxes' that come into play with these sorts of problems are because your sampling unit is by people, so the likelihood of selecting the person you are talking to is directly proportional to the group size that the person came in. This immediately gives the Likelihood function.

The result here should, to whatever desired precision, agree with the "person averaged" result from tnich. But the whole question comes down to what a "random arriving" person means and what sort of distribution that implies. The question itself is silent on this.

- - - -
simpler example:

To simplify the thought experiment, consider a modified Bernoulli trial that always has 1 or 2 people in an arriving epoch, with each outcome having ##50\%## probability. Run the experiment many times, and put all those people in a "hat". Shuffle them. Reach into the hat, select someone "at random", i.e. off the top, and ask the person how big their group was.

You will have about twice as many people from the 2-sized groups as from the 1-sized groups in your "hat", with the end result that ##\frac{1}{3}## of the people in the hat came from a group of size 1 and ##\frac{2}{3}## came from a group of size 2. This is your (posterior) probability estimate for the person chosen.
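This hat experiment is easy to simulate (a sketch of mine; the group count and number of draws are chosen arbitrarily):

```python
import random

def hat_experiment(n_groups, seed=2):
    """Size-biased sampling: fill a hat with every arriving person,
    then draw people uniformly and ask their group size."""
    rng = random.Random(seed)
    hat = []
    for _ in range(n_groups):
        size = rng.choice([1, 2])   # each group size with probability 1/2
        hat.extend([size] * size)   # each member reports their group's size
    draws = [rng.choice(hat) for _ in range(100_000)]
    return draws.count(1) / len(draws)

frac_size1 = hat_experiment(50_000)
print(frac_size1)  # close to 1/3, not 1/2
```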
 
  • #29
StoneTemplePython said:
Supposing we have a large number of trials (i.e. sufficiently large ##n##), we can think of ourselves as randomly selecting some person who has arrived and making our selection "at random" -- meaning, I believe, uniformly at random.
I agree that the problem statement is subject to different interpretations. I read it as, "If I walk up to the intersection with no previous knowledge of the state of the process, what is the probability that the size of the group I cross with is k+1?" This does not seem to me to be the same as averaging over all of the people who cross over a large period of time. I think it is an average over my arrival time. Since this is a renewal process, it is possible to find this probability by looking at the limit of the average over ##n## renewal intervals as ##n\to \infty##.
 
  • #30
StoneTemplePython said:
There are analytic and linguistic issues lurking in this problem, and in a manner Odysseus would appreciate, it seems that I cannot sail past both of them.
note: I edited it to say ##k## for the slightly simpler problem to be discussed below.
- - - -
The crux of the problem is what does "randomly arriving" mean? Random with respect to what? I.e. what's the distribution? My read on @tnich 's approach was that it is a person averaged result. (Slight digression: the problem itself is actually quite close to computing the time averaged age, slow truck effect, etc. for general renewal problems, though the problem is one 'moment' down from such a thing, and it wants a specific outcome not the summed total. What follows is basically how I think about this for general renewals -- the argument should be able to be streamlined since the Poisson process is an idealized renewal.)

The technical issue is that the problem doesn't actually ask for a person averaged result. It says that there is a "randomly arriving" person and that implies a distribution. I believe the problem falls under the umbrella of problems called "random incidence". Depending on what you want this can make the problem intuitive or mathematically awkward.

The link between the person-averaged result and a comparable distribution is a uniform distribution over sufficiently many iterations ##n##. Depending on needs, this can be awkward because for any given problem we insist on large enough ##n## but can't actually pass to the limit. If we must pass a limit, then the distribution, and in turn the question, collapses -- because the "randomly arriving" person doesn't have a well-defined distribution (or process) so far as I can tell.

For what it's worth, I think that if you want a uniform distribution, you just insist that ##n## is sufficiently large, and leave it as a parameter to optimize later on if needed -- in this case, if you are so inclined, you should be able to do so lazily via a Chebyshev bound, though it isn't needed.

- - - - -
Let me see if I can just directly state the result:

Supposing we have a large number of trials (i.e. sufficiently large ##n##), we can think of ourselves as randomly selecting some person who has arrived and making our selection "at random" -- meaning, I believe, uniformly at random.

My read on the question is that it wants to know the probability that the person we selected was in a group size of ##k##.

Before selecting a person "at random" we have a prior belief about the group size for any person we talk to and that prior belief is directly given by the Poisson distribution, since a Poisson process generates these arrivals.

##\text{prior} =
\begin{bmatrix}
\text{probability(group size 0, time length of t) }\\
\text{probability(group size 1, time length of t) }\\
\text{probability(group size 2, time length of t) }\\
\vdots\\
\text{probability(group size k, time length of t) }\\
\vdots\\
\end{bmatrix}
= \begin{bmatrix}
p_{\lambda}(k=0, t)\\
p_{\lambda}(k=1, t)\\
p_{\lambda}(k=2, t)\\
\vdots\\
p_{\lambda}(k=k, t)\\
\vdots\\
\end{bmatrix}##

But the 'paradoxes' that come into play with these sorts of problems are because your sampling unit is by people, so the likelihood of selecting the person you are talking to is directly proportional to the group size that the person came in. This immediately gives the Likelihood function.

The result here should, to whatever desired precision, agree with the "person averaged" result from tnich. But the whole question comes down to what a "randomly arriving" person means and what sort of distribution that implies. The question itself is silent on this.

- - - -
simpler example:

To simplify the thought experiment, consider a modified Bernoulli trial that always has 1 or 2 people in an arriving epoch, with each outcome having 50% probability. Run the experiment many times, and put all those people in a "hat". Shuffle them. Reach into the hat and select someone "at random", i.e. off the top, and ask the person how big their group was.

You will have about twice as many people from the 2 sized group than from the 1 sized group in your 'hat' with the end result of a posterior distribution of ##\frac{1}{3}## of the people in the hat came from a group size of 1 and ##\frac{2}{3}## came from a group size of 2. This is your (posterior) probability estimate for the person chosen.
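This 'hat' thought experiment is easy to sanity check numerically; here is a minimal Python sketch (the 50/50 group sizes and the trial counts are just the assumed setup above):

```python
import random

# Monte Carlo version of the 'hat' experiment: each trial a group of size
# 1 or 2 arrives (50/50 probability), every person goes into the hat tagged
# with their group size, and we then draw people uniformly and record the
# group size they report.
random.seed(0)
hat = []
for _ in range(100_000):
    size = random.choice([1, 2])
    hat.extend([size] * size)  # one hat entry per person

draws = [random.choice(hat) for _ in range(100_000)]
p1 = draws.count(1) / len(draws)
p2 = draws.count(2) / len(draws)
print(p1, p2)  # close to 1/3 and 2/3
```

About twice as many hat entries come from the size-2 groups, which is exactly the size-proportional likelihood described above.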

The solution given by the OP (with help from tnich) is the type of thing I would have hoped to get from my students (or at least, the "A" students) on an assignment or take-home final. Basically, it is a renewal problem or a somewhat weird queueing problem, wherein a customer arriving to an empty queue opens a "gate" that stays open for time T and admits subsequent arrivals. The gate closes at time T, and the waiting customers immediately exit the system. This setup satisfies the requirements of PASTA ("Poisson Arrivals see Time Averages"), so in the long run an arriving customer sees what an average customer sees. That gives a nice "equilibrium" analysis that really emphasizes the system behavior, without the need for a Bayesian interpretation.
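That equilibrium description can also be brute-forced. A minimal Python sketch (the rate, window, and target group size are arbitrary assumed values) compares the person-averaged fraction from simulation against the closed form ##\frac{(k+1)(\lambda T)^k e^{-\lambda T}}{k!\,(1+\lambda T)}##:

```python
import math
import random

# Each renewal cycle: one person opens the gate, then a Poisson(lam*T) number
# of extra people arrive before it closes.  A randomly chosen *person* lands
# in a group of size k+1 with probability proportional to (k+1) * P(Poisson = k).
random.seed(1)
lam, T, k = 2.0, 1.0, 3   # assumed values for the check
cycles = 200_000

total_people = 0
people_in_target_groups = 0
for _ in range(cycles):
    # sample Poisson(lam*T) by counting exponential interarrivals within [0, T)
    extras, t = 0, random.expovariate(lam)
    while t < T:
        extras += 1
        t += random.expovariate(lam)
    size = extras + 1
    total_people += size
    if size == k + 1:
        people_in_target_groups += size

simulated = people_in_target_groups / total_people
formula = (k + 1) * (lam * T) ** k * math.exp(-lam * T) / (math.factorial(k) * (1 + lam * T))
print(simulated, formula)
```

The two printed numbers should agree to a couple of decimal places, consistent with the PASTA reading of the problem.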
 
  • #31
Ray Vickson said:
The solution given by the OP (with help from tnich) is the type of thing I would have hoped to get from my students (or at least, the "A" students) on an assignment or take-home final. Basically, it is a renewal problem or a somewhat weird queueing problem, wherein a customer arriving to an empty queue opens a "gate" that stays open for time T and admits subsequent arrivals. The gate closes at time T, and the waiting customers immediately exit the system. This setup satisfies the requirements of PASTA ("Poisson Arrivals see Time Averages"), so in the long run an arriving customer sees what an average customer sees. That gives a nice "equilibrium" analysis that really emphasizes the system behavior, without the need for a Bayesian interpretation.

It's probably just pattern recognition that led me down this road...

My read on the problem is, roughly, that you have exponential random variables ##Y_i## with parameter ##\lambda## and Poisson random variables ##X_i## with parameters ##t, \lambda##. If you run it forward you have a renewal process that goes

##X_1, Y_1, X_2, Y_2, ... ##

or equivalently ##X_i, Y_i \to \text{renew}##

If you look at this as a process that counts time, you have

##\text{deterministic } \{\text{time} = t\}, Y_i \to \text{renew}##

however if you're interested in counting people, it reads

##X_i, \text{deterministic } \{\text{person} = 1\} \to \text{renew}##

There are really two distinct dimensions embedded in here, and what is amusing is how depending on which feature you focus on, the other random variable becomes deterministic. For some reason, I honed in on the second case, but I recognize that typical renewal problems really are time denominated.

My approach should correspond to 'people averaging' whereas I infer that tnich took the first one of time averaging. I find it even more amusing that the two approaches give the same result.
 
  • #32
StoneTemplePython said:
It's probably just pattern recognition that led me down this road...

My read on the problem is, roughly, that you have exponential random variables ##Y_i## with parameter ##\lambda## and Poisson random variables ##X_i## with parameters ##t, \lambda##. If you run it forward you have a renewal process that goes

##X_1, Y_1, X_2, Y_2, ... ##

or equivalently ##X_i, Y_i \to \text{renew}##

If you look at this as a process that counts time, you have

##\text{deterministic } \{\text{time} = t\}, Y_i \to \text{renew}##

however if you're interested in counting people, it reads

##X_i, \text{deterministic } \{\text{person} = 1\} \to \text{renew}##

There are really two distinct dimensions embedded in here, and what is amusing is how depending on which feature you focus on, the other random variable becomes deterministic. For some reason, I honed in on the second case, but I recognize that typical renewal problems really are time denominated.

My approach should correspond to 'people averaging' whereas I infer that tnich took the first one of time averaging. I find it even more amusing that the two approaches give the same result.
But do they give the same result? I don't see that we have converged on the same result, yet.
 
  • #33
tnich said:
But do they give the same result? I don't see that we have converged on the same result, yet.

They do give the same results -- I just need to not be lazy, and allow the plus one adjustment.

Mehmood_Yasir said:
What is the probability that a randomly arriving padestrian has crossed the crossing in a group of ##k+1## padestrians...

The answer given is ##\frac{(K+1) {(\lambda * T)}^k e^{-\lambda T} } {k! (1+\lambda T)}##

##= \frac{(K+1)}{(1+\lambda T)} \frac{(\lambda T)^k e^{-\lambda T} }{k! } = \frac{(k+1)}{(1+\lambda t)} \Big(\frac{(\lambda t)^k e^{-\lambda t} }{k! }\Big) = \frac{(k+1)}{(1+\lambda t)} p_{\lambda}(k=k, t)##

(with a slight notation change along the way there)
- - - -
so the focus is on recovering ##\frac{(k+1)}{(1+\lambda t)} p_{\lambda}(k=k, t)## as the probability for a group size of ##k + 1##. Let's re-run my people oriented approach and include the plus one increment this time around.
- - - -
##\text{prior} =
\begin{bmatrix}
\text{P group size is 1}\\
\text{P group size is 2}\\
\text{P group size is 3}\\
\vdots\\
\text{P group size is k+1}\\
\vdots\\
\end{bmatrix}
=\begin{bmatrix}
p_{\lambda}(k=0, t)\\
p_{\lambda}(k=1, t)\\
p_{\lambda}(k=2, t)\\
\vdots\\
p_{\lambda}(k=k, t)\\
\vdots\\
\end{bmatrix}##

The justification is that you have a Poisson process -- except it is shifted by +1 via the deterministic random variable (##Y_i##) that must have a payoff of 1 person (and for avoidance of doubt remember that if we're interested in time, ##Y_i## has a finite first moment with respect to time).

Notice in particular that the prior probability associated with group size of ##(k+1)## is ##p_{\lambda}(k=k, t)##
- - - -

Now the new likelihood function looks a lot like the old one

##
\text{likelihood function} =
\begin{bmatrix}
1\\
2\\
3\\
\vdots\\
k+1\\
\vdots\\
\end{bmatrix}=
\begin{bmatrix}
0\\
1\\
2\\
\vdots\\
k\\
\vdots\\
\end{bmatrix} +
\begin{bmatrix}
1\\
1\\
1\\
\vdots\\
1\\
\vdots\\
\end{bmatrix}
##

and we now have

##\text{posterior} \propto \text{likelihood function} \circ \text{prior} = \begin{bmatrix}
(1)p_{\lambda}(k=0, t)\\
(2)p_{\lambda}(k=1, t)\\
(3)p_{\lambda}(k=2, t)\\
\vdots\\
(k+1)p_{\lambda}(k=k, t)\\
\vdots\\
\end{bmatrix}
= \begin{bmatrix}
(0)p_{\lambda}(k=0, t)\\
(1)p_{\lambda}(k=1, t)\\
(2)p_{\lambda}(k=2, t)\\
\vdots\\
(k)p_{\lambda}(k=k, t)\\
\vdots\\
\end{bmatrix} +
\begin{bmatrix}
(1)p_{\lambda}(k=0, t)\\
(1)p_{\lambda}(k=1, t)\\
(1)p_{\lambda}(k=2, t)\\
\vdots\\
(1)p_{\lambda}(k=k, t)\\
\vdots\\
\end{bmatrix}
##

Everything is real and non-negative, so we can split the summation when looking for our normalizing constant -- and we have the underlying power series of the exponential function, ensuring absolute convergence -- so we get a sum equal to
##\big(\lambda t \big) + \big(1\big) = \big(\lambda t + 1\big)##

-- i.e. when splitting the summation, we immediately recognize the expected value of the Poisson distribution and that the probabilities of a Poisson sum to one.
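The splitting of the sum is easy to verify numerically; a quick Python check (the value of ##\lambda t## is an arbitrary assumption):

```python
import math

# Numeric check of the normalizing constant: sum over k of
# (k+1) * P(Poisson(lam*t) = k) should equal E[K] + 1 = lam*t + 1.
rate = 2.5  # lam * t, an arbitrary assumed value
total = sum((k + 1) * rate**k * math.exp(-rate) / math.factorial(k)
            for k in range(150))
print(total)  # ~ 3.5, i.e. rate + 1
```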

Putting this all together, we get:

##
\text{posterior} =
\begin{bmatrix}
\text{P group size is 1}\\
\text{P group size is 2}\\
\text{P group size is 3}\\
\vdots\\
\text{P group size is k+1}\\
\vdots\\
\end{bmatrix}
= \begin{bmatrix}
\frac{1}{\lambda t + 1} p_{\lambda}(k=0, t)\\
\frac{2}{\lambda t + 1}p_{\lambda}(k=1, t)\\
\frac{3}{\lambda t + 1}p_{\lambda}(k=2, t)\\
\vdots\\
\frac{k+1}{\lambda t+1}p_{\lambda}(k=k, t)\\
\vdots\\
\end{bmatrix}##
- - - -
in particular notice

##\text{P group size is k+1} = \frac{k+1}{\lambda t+1}p_{\lambda}(k=k, t)##

which is the result we were targeting.
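Tabulating the posterior for an assumed value of ##\lambda t## confirms it is a proper distribution:

```python
import math

# Entry j of the posterior holds P(group size is j+1)
#   = (j+1)/(lam*t + 1) * P(Poisson(lam*t) = j).
rate = 2.0  # lam * t, an arbitrary assumed value
posterior = [(j + 1) / (rate + 1) * rate**j * math.exp(-rate) / math.factorial(j)
             for j in range(100)]
print(sum(posterior))  # ~ 1.0, a proper distribution
```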

- - - - -
edit:
a much later thought that can streamline a lot of this and put things in standard language:

the problem can be addressed, succinctly, as a renewal-rewards process. Treating the number of people in an arrival epoch as a discrete random variable ##X##, we can see that ##X = W + 1##, where ##W## is Poisson with parameter ##\lambda## and time ##t##. The process probabilistically starts over immediately after each arrival epoch.

For rewards, we set up a reward of ##1## per person if they are in an arrival epoch with ##k+1## people and zero otherwise. Equivalently, we define the event ##A_n## where there are ##k+1## people in the nth arrival epoch and of course have an associated indicator random variable ##\mathbb I_{A_n}##, and the reward for epoch ##n## is given by ##R_n := \mathbb I_{A_n}\cdot X_n##

So we compute
##E\big[X_1\big] = E\big[W_1 \big] + 1 = \lambda t + 1##
##E\big[R_1\big] = E\big[\mathbb I_{A_1}\cdot X_1\big] = p_\lambda(k=k, t) \cdot (k+1) ##

but the basic renewal rewards theorems tell us

##\lim_{n \to \infty } \frac{r(n)}{n} = \frac{E[R_1]}{E[X_1]} = \frac{(k+1) \cdot p_\lambda(k=k, t)}{\lambda t + 1}=\lim_{n \to \infty } \frac{E[r(n)]}{n} ##, where ##r(n)## is the cumulative reward after ##n## people have arrived (indexing the limit by people keeps it distinct from the fixed window length ##t##),

which is the answer
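The renewal-rewards computation can be mirrored in simulation; here is a minimal Python sketch (##\lambda##, ##T##, and ##k## are arbitrary assumed values) that estimates ##E[R_1]/E[X_1]## cycle by cycle:

```python
import math
import random

# Each cycle has X_n = W_n + 1 people, with W_n ~ Poisson(lam*T); the reward is
# R_n = X_n when the cycle has exactly k+1 people, else 0.  The long-run reward
# per person should approach E[R_1]/E[X_1].
random.seed(2)
lam, T, k = 1.5, 1.0, 2   # assumed values for the check
n_cycles = 200_000

sum_X = sum_R = 0
for _ in range(n_cycles):
    # sample W_n ~ Poisson(lam*T) by counting exponential interarrivals in [0, T)
    w, t = 0, random.expovariate(lam)
    while t < T:
        w += 1
        t += random.expovariate(lam)
    x = w + 1
    sum_X += x
    if x == k + 1:
        sum_R += x

avg_people = sum_X / n_cycles  # should be near E[X_1] = lam*T + 1 = 2.5
estimate = sum_R / sum_X       # long-run reward per person
exact = (k + 1) * (lam * T) ** k * math.exp(-lam * T) / (math.factorial(k) * (1 + lam * T))
print(avg_people, estimate, exact)
```

The simulated per-person reward and the closed form should agree to a couple of decimal places.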
 

