Pedestrian wait time pdf

In summary, the conversation discusses the arrival rate of pedestrians at a signal for crossing a road and the wait time of each pedestrian. The wait time of the first pedestrian is exactly ##T##, while the wait time of each subsequent pedestrian depends on that pedestrian's own arrival time before the light flashes at ##T##. The objective is to find the wait-time density functions of all pedestrians after the first one.
  • #1
Mehmood_Yasir

Homework Statement


Pedestrians arrive at a signal for crossing a road at a rate of ##\lambda## arrivals per minute. When the first pedestrian arrives at the signal, he waits exactly for time ##T##; we therefore say the first pedestrian arrives at time ##0##. When time ##T## is reached, the light flashes and all other pedestrians who have arrived within ##T## cross the road together with the first pedestrian. The same process then repeats.

What is the PDF of the wait time of each pedestrian, i.e., of the ##1^{st}##, ##2^{nd}##, ##3^{rd}##, ..., pedestrian?

Homework Equations

The Poisson arrival-time density function is ##f_{X_k} (t)=\frac{{(\lambda t)}^{k} e^{-\lambda t}}{k!}=Erlang(k,\lambda; t)##.

The Attempt at a Solution


As said, the first pedestrian arrives at time 0 and waits exactly for time ##T##. After time ##T##, all pedestrians who have arrived at the crossing cross the street together with the first one.

Now consider only the wait time of each pedestrian, in order to find its density function.

Since the first one waits exactly ##T## minutes, its wait-time density function is just an impulse function shifted to ##T##, i.e., ##f_{W_1} (t)=\delta (t-T)##.

Since the probability density function of the arrival time of the second pedestrian, following a Poisson process, is ##Erlang(1,\lambda; t)=\lambda e^{-\lambda t}##, he waits for ##T-E[X_1|X_1<T]##, where ##E[X_1|X_1<T]## is the mean of the conditional arrival time of the second pedestrian given that it is less than ##T##. What is the pdf of the wait time of the second pedestrian? Can I say that, since the arrival time of the second pedestrian follows an ##Erlang(1,\lambda; t)=\lambda e^{-\lambda t}## distribution, the wait-time pdf should be the shifted version ##\lambda e^{-\lambda (t-(T-E[X_1|X_1<T]))}##? Similarly, what about the wait-time density functions of the third, fourth, ... pedestrians?
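For intuition, here is a minimal simulation sketch of the setup (illustrative only; the values ##\lambda=2## arrivals/min, ##T=1## min, ##k=2## and the cycle count are assumed example choices, not part of the problem). In each signal cycle it generates Poisson arrivals after time ##0## and records the wait time ##T-t_k## of the pedestrian whose arrival is the ##k##-th one after the first pedestrian.

Code (MATLAB):
% Illustrative simulation sketch; lambda, T, k and nCycles are assumed example values.
lambda = 2;  T = 1;  k = 2;  nCycles = 1e5;
waits = [];                              % collected wait times T - t_k
for c = 1:nCycles
    t = 0;  arrivals = [];
    while true
        t = t + (-log(rand)/lambda);     % exponential inter-arrival time
        if t > T, break; end
        arrivals(end+1) = t;             % arrival within the current cycle
    end
    if numel(arrivals) >= k
        waits(end+1) = T - arrivals(k);  % wait time of the k-th arrival after time 0
    end
end
histogram(waits, 50, 'Normalization', 'pdf')   % empirical wait-time density
xlabel('w'); ylabel('estimated f_{W_k}(w)')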
 
  • #2
Mehmood_Yasir said:
As said, the first pedestrian arrives at time 0 and waits exactly for time ##T##. After time ##T##, all pedestrians who have arrived at the crossing cross the street together with the first one.

Now consider only the wait time of each pedestrian, in order to find its density function.

Since the first one waits exactly ##T## minutes, its wait-time density function is just an impulse function shifted to ##T##, i.e., ##f_{W_1} (t)=\delta (t-T)##.
I agree.

Mehmood_Yasir said:
Since the probability density function of the arrival time of the second pedestrian, following a Poisson process, is ##Erlang(1,\lambda; t)=\lambda e^{-\lambda t}##, he waits for ##T-E[X_1|X_1<T]##, where ##E[X_1|X_1<T]## is the mean of the conditional arrival time of the second pedestrian given that it is less than ##T##. What is the pdf of the wait time of the second pedestrian? Can I say that, since the arrival time of the second pedestrian follows an ##Erlang(1,\lambda; t)=\lambda e^{-\lambda t}## distribution, the wait-time pdf should be the shifted version ##\lambda e^{-\lambda (t-(T-E[X_1|X_1<T]))}##? Similarly, what about the wait-time density functions of the third, fourth, ... pedestrians?
I think the first question you need to answer is: are you finding the wait time of the ##k##th pedestrian, or of an arbitrary pedestrian other than the first one?

For the wait time of the second pedestrian, your assumption that the second pedestrian will wait for time ##T-E[X_1|X_1\leq T]## is not correct. Start by defining ##W_k## as the wait time of the ##k##th pedestrian in terms of the arrival time ##t_k##. Then find ##P(W_k \leq w)##.
 
  • #3
tnich said:
I agree. I think the first question you need to answer is: are you finding the wait time of the ##k##th pedestrian, or of an arbitrary pedestrian other than the first one?

For the wait time of the second pedestrian, your assumption that the second pedestrian will wait for time ##T-E[X_1|X_1\leq T]## is not correct. Start by defining ##W_k## as the wait time of the ##k##th pedestrian in terms of the arrival time ##t_k##. Then find ##P(W_k \leq w)##.

Just to be clear: for any particular value of ##\lambda##, let's assume ##m=4## more pedestrians arrived within time ##T## and all 5 (including the first) crossed the street after ##T##. The wait time of the first is clearly ##T##, and so its wait-time density function is clear as well. My objective is to find the wait-time density functions of all subsequent pedestrians, i.e., the 2nd, 3rd, 4th and 5th in the above assumption. Generally speaking, ##m## can take any value, as it is unconditioned in my problem, but I am at least interested in finding the wait-time density functions of the 2nd, 3rd, and 4th pedestrians after the first one. Regarding the wait time of any ##k^{th}## pedestrian: ##1<k\leq m##, i.e., the wait time of all subsequent pedestrians except the first, which I already know.

What you asked
tnich said:
Start by defining ##W_k## as the wait time of the ##k##th pedestrian in terms of the arrival time ##t_k##. Then find ##P(W_k \leq w)##.
I think ##P(W_k \leq w)=1-P(W_k > w)=1-e^{-\lambda w}##. Is it correct?
 
  • #4
Mehmood_Yasir said:
Just to be clear: for any particular value of ##\lambda##, let's assume ##m=4## more pedestrians arrived within time ##T## and all 5 (including the first) crossed the street after ##T##. The wait time of the first is clearly ##T##, and so its wait-time density function is clear as well. My objective is to find the wait-time density functions of all subsequent pedestrians, i.e., the 2nd, 3rd, 4th and 5th in the above assumption. Generally speaking, ##m## can take any value, as it is unconditioned in my problem, but I am at least interested in finding the wait-time density functions of the 2nd, 3rd, and 4th pedestrians after the first one. Regarding the wait time of any ##k^{th}## pedestrian: ##1<k\leq m##, i.e., the wait time of all subsequent pedestrians except the first, which I already know.

What you asked

I think ##P(W_k \leq w)=1-P(W_k > w)=1-e^{-\lambda w}##. Is it correct?
What is the wait time of the kth pedestrian in terms of his arrival time?
 
  • #5
tnich said:
What is the wait time of the kth pedestrian in terms of his arrival time?
If ##t_k## is the arrival time of the ##k^{th}## pedestrian, then the wait time is ##T-t_k##.
 
  • #6
Mehmood_Yasir said:
If ##t_k## is the arrival time of the ##k^{th}## pedestrian, then the wait time is ##T-t_k##.
Right. And your next step is?
 
  • #7
tnich said:
Right. And your next step is?
Wait time density function of ##k^{th}## pedestrian.
 
  • #8
Mehmood_Yasir said:
Wait time density function of ##k^{th}## pedestrian.
Your next step is to substitute ##T - t_k## for ##W_k##.
 
  • #9
tnich said:
Your next step is to substitute ##T - t_k## for ##W_k##.
Do you mean replacing
##P(W_k \leq w)=1-P(W_k > w)=1-e^{-\lambda w}## with
##P((T-t_k) \leq t)=1-P((T-t_k) > t)=1-e^{-\lambda t}##?
 
  • #10
Mehmood_Yasir said:
Do you mean replacing
##P(W_k \leq w)=1-P(W_k > w)=1-e^{-\lambda w}## with
##P((T-t_k) \leq t)=1-P((T-t_k) > t)##
It's good up to here. Now you can use the distribution of ##t_k## to get the distribution of ##W_k##.
 
  • #11
tnich said:
It's good up to here. Now you can use the distribution of ##t_k## to get the distribution of ##W_k##.
Is the density function of ##t_k## distributed as ##Erlang(k,\lambda;t)= \frac{{(\lambda t)}^{k} e^{-\lambda t}} {k!}##?
 
  • #12
Mehmood_Yasir said:
Is the density function of ##t_k## distributed as ##Erlang(k,\lambda;t)= \frac{{(\lambda t)}^{k} e^{-\lambda t}} {k!}##?
Yes, but remember that is the Erlang pdf, not the cdf.
 
  • #13
tnich said:
Yes, but remember that is the Erlang pdf, not the cdf.
For the cdf, it will be ##F_{t_k}=1-\sum_{m=0}^{k-1} \frac{{(\lambda t)}^{m} e^{-\lambda t}} {m!}##,
but at the moment I am looking for the pdf of the wait time. So if the pdf of the arrival time ##t_k## is ##\frac{{(\lambda t)}^k e^{-\lambda t}} {k!}##, is the pdf of ##W_k=T-t_k## something like a shifted arrival pdf? I am not sure about this, actually. Is ##f_{W_k}=\frac{{(\lambda (T-t))}^k e^{-\lambda (T-t)}} {k!}## correct?

And is the resulting cdf ##F_{W_k}= 1-\sum_{m=0}^{k-1} \frac{{(\lambda (T-t))}^{m} e^{-\lambda (T-t)}} {m!}##?
 
  • #14
Mehmood_Yasir said:
For the cdf, it will be ##F_{t_k}=1-\sum_{m=0}^{k-1} \frac{{(\lambda t)}^{m} e^{-\lambda t}} {m!}##,
but at the moment I am looking for the pdf of the wait time. So if the pdf of the arrival time ##t_k## is ##\frac{{(\lambda t)}^k e^{-\lambda t}} {k!}##, is the pdf of ##W_k=T-t_k## something like a shifted arrival pdf? I am not sure about this, actually. Is ##f_{W_k}=\frac{{(\lambda (T-t))}^k e^{-\lambda (T-t)}} {k!}## correct?

And is the resulting cdf ##F_{W_k}= 1-\sum_{m=0}^{k-1} \frac{{(\lambda (T-t))}^{m} e^{-\lambda (T-t)}} {m!}##?
You keep making guesses, but guessing rarely gives the right answer. I suggest that you work the problem through to a solution and then check to make sure the solution makes sense. Then you will know you have the right answer.

So here is a method for getting the pdf of a random variable ##U## given the cdf of another random variable ##V##.
1) Express ##V## as a function of ##U##: ##V = g(U)##

2) Express the probability of ##U## in terms of the probability of ##V##:
##P(U\leq u)##
##=P(g^{-1}(V)\leq u)=\begin{cases}
P(V\leq g(u)) & \text{if } V=g(U) \text{ is an increasing function of } U \\
P(V\geq g(u)) & \text{if } V=g(U) \text{ is a decreasing function of } U
\end{cases}##
(Why do you need to consider whether g(U) is increasing or decreasing?)

3) Given that ##F(v)## is the cdf of ##V##, substitute ##F(g(u))## for ##P(V\leq g(u))## in your equation for ##P(U\leq u)##. (If ##g(U)## is decreasing, you may need to express ##P(V\geq g(u))## in terms of ##P(V\leq g(u))## first.)

4) Differentiate ##P(U\leq u)## with respect to ##u## to get a function h(u).

5) Integrate ##h(u)## over the range of ##u## to get a constant
##C=\int_{u_{min}}^{u_{max}}h(u)\,du##

If ##C = 1##, then ##h(u)## is your pdf. If not, then ##\frac 1 C h(u)## is your pdf. (Why?)
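As a purely numerical illustration of steps 2) to 5) (not a closed-form answer; the values ##\lambda = 2## and ##T = 1## are assumed, and the simplest decreasing case ##g(u) = T - u## with ##V## exponential is used), one can evaluate ##P(U\leq u)## on a grid, differentiate it by finite differences, and normalize:

Code (MATLAB):
% Numerical sketch of steps 2)-5) for a decreasing g(u) = T - u with V ~ Exp(lambda).
% lambda and T are assumed example values.
lambda = 2;  T = 1;
u  = linspace(0, T, 1001);
Fv = @(v) 1 - exp(-lambda*v);   % cdf of V (exponential)
Pu = 1 - Fv(T - u);             % steps 2)-3): P(U <= u) = P(V >= g(u)) = 1 - F(g(u))
h  = gradient(Pu, u);           % step 4): numerical derivative of P(U <= u)
C  = trapz(u, h);               % step 5): integral of h over the range of u
pdfU = h / C;                   % normalized pdf of U
plot(u, pdfU); xlabel('u'); ylabel('f_U(u)')

Here gradient and trapz merely stand in for the analytic differentiation and integration of the recipe above.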
 
  • #15
tnich said:
You keep making guesses, but guessing rarely gives the right answer. I suggest that you work the problem through to a solution and then check to make sure the solution makes sense. Then you will know you have the right answer.

So here is a method for getting the pdf of a random variable ##U## given the cdf of another random variable ##V##.
1) Express ##V## as a function of ##U##: ##V = g(U)##

2) Express the probability of ##U## in terms of the probability of ##V##:
##P(U\leq u)##
##=P(g^{-1}(V)\leq u)=\begin{cases}
P(V\leq g(u)) & \text{if } V=g(U) \text{ is an increasing function of } U \\
P(V\geq g(u)) & \text{if } V=g(U) \text{ is a decreasing function of } U
\end{cases}##
(Why do you need to consider whether g(U) is increasing or decreasing?)

3) Given that ##F(v)## is the cdf of ##V##, substitute ##F(g(u))## for ##P(V\leq g(u))## in your equation for ##P(U\leq u)##. (If ##g(U)## is decreasing, you may need to express ##P(V\geq g(u))## in terms of ##P(V\leq g(u))## first.)

4) Differentiate ##P(U\leq u)## with respect to ##u## to get a function h(u).

5) Integrate ##h(u)## over the range of ##u## to get a constant
##C=\int_{u_{min}}^{u_{max}}h(u)\,du##

If ##C = 1##, then ##h(u)## is your pdf. If not, then ##\frac 1 C h(u)## is your pdf. (Why?)

let me see again to get to the answer of pdf of ##W_k##
 
  • #16
You keep making guesses, but guessing rarely gives the right answer. I suggest that you work the problem through to a solution and then check to make sure the solution makes sense. Then you will know you have the right answer.

So here is a method for getting the pdf of a random variable ##U## given the cdf of another random variable ##V##.
say U ##\to## ##W_k##, V ##\to## ##t_k##, u ##\to## ##w##, v ##\to## ##t##
1) Express ##V## as a function of ##U##: ##V = g(U)##
##W_k=T-t_k##
##t_k=T-W_k##

2) Express the probability of ##U## in terms of the probability of ##V##:
##P(U\leq u)##
##=P(g^{-1}(V)\leq u)=\begin{cases}
P(V\leq g(u)) & \text{if } V=g(U) \text{ is an increasing function of } U \\
P(V\geq g(u)) & \text{if } V=g(U) \text{ is a decreasing function of } U
\end{cases}##
##P(W_k \leq w)=P(T-t_k \leq w)=1-P(T-t_k > w)##

2)
(Why do you need to consider whether g(U) is increasing or decreasing?)
It should be increasing, as a cdf is always an increasing function.

3) Given that ##F(v)## is the cdf of ##V##,
##F(v)=F(t)=1-\sum_{m=0}^{k-1} \frac{{(\lambda t)}^m e^{-\lambda t}}{m!}##,

substitute ##F(g(u))## for ##P(V\leq g(u))## in your equation for ##P(U\leq u)##. (If ##g(U)## is decreasing, you may need to express ##P(V\geq g(u))## in terms of ##P(V\leq g(u))## first.)
##F(g(u))=F(T-w)= 1-\sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!} ##

4) Differentiate ##P(U\leq u)## with respect to ##u## to get a function h(u).
## \frac{d}{dw} P(W_k \leq w)= \frac{d}{dw} P(T-t_k \leq w)=\frac{d}{dw} (1-P(T-t_k > w))##
## =\frac{d}{dw} (1-e^{-\lambda w})##
## =\lambda e^{-\lambda w}##

5) Integrate ##h(u)## over the range of ##u## to get a constant
##C=\int_{u_{min}}^{u_{max}}h(u)\,du##
##C=\int_{0}^{T}\lambda e^{-\lambda w} dw##
##C=1- e^{-\lambda T}##

If ##C = 1##, then ##h(u)## is your pdf. If not, then ##\frac 1 C h(u)## is your pdf. (Why?)
##\frac 1 C## is a normalization such that the pdf integrates to 1.
So the pdf of ##W## is ## f_W=\frac{ \lambda e^{-\lambda w} } {1- e^{-\lambda T}}##
## f_W=\frac{ \lambda e^{-\lambda (T-t)} } {1- e^{-\lambda T}}##
 
  • #17
Mehmood_Yasir said:
say U ##\to## ##W_k##, V ##\to## ##t_k##, u ##\to## ##w##, v ##\to## ##t##

##W_k=T-t_k##
##t_k=T-W_k##
##P(W_k \leq w)=P(T-t_k \leq w)=1-P(T-t_k > w)##
The point is to get an equation for cdf of ##W_k## in terms of the cdf of ##t_k##.
You are not quite there yet. You need to end up with an expression with a term like ##P(t_k < \text{ something})## in it.

Mehmood_Yasir said:
It should be increasing, as a cdf is always an increasing function.
##g(u)## is not a cdf. It is the relationship between ##u## and ##v##. The point of this question is: why do you need to say ##P(W_k \leq w)=1-## something?

Mehmood_Yasir said:
##F(v)=F(t)=1-\sum_{m=0}^{k-1} \frac{{(\lambda t)}^m e^{-\lambda t}}{m!}##,

##F(g(u))=F(T-w)= 1-\sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!} ##
This will not be right until you get step 2) right.
Mehmood_Yasir said:
## \frac{d}{dw} P(W_k \leq w)= \frac{d}{dw} P(T-t_k \leq w)=\frac{d}{dw} (1-P(T-t_k > w))##
## =\frac{d}{dw} (1-e^{-\lambda w})##
## =\lambda e^{-\lambda w}##
Here, you need to differentiate the result of step 3) (when you get it figured out).
Mehmood_Yasir said:
##C=\int_{0}^{T}\lambda e^{-\lambda w} dw##
##C=1- e^{-\lambda T}##
##\frac 1 C## is a normalization such that the pdf integrates to 1.
So the pdf of ##W## is ## f_W=\frac{ \lambda e^{-\lambda w} } {1- e^{-\lambda T}}##
## f_W=\frac{ \lambda e^{-\lambda (T-t)} } {1- e^{-\lambda T}}##
You have the right idea here but you are applying it to the wrong distribution.
 
  • #18
tnich said:
The point is to get an equation for cdf of ##W_k## in terms of the cdf of ##t_k##.
You are not quite there yet. You need to end up with an expression with a term like ##P(t_k < \text{ something})## in it.
##P(W_k \leq w)=P(T-t_k \leq w)##
##P(W_k \leq w)=P(t_k \geq T-w)##
##P(W_k \leq w)=1-P(t_k < T-w)##
The rest I will look at again.
 
  • #19
tnich said:
You keep making guesses, but guessing rarely gives the right answer. I suggest that you work the problem through to a solution and then check to make sure the solution makes sense. Then you will know you have the right answer.

So here is a method for getting the pdf of a random variable ##U## given the cdf of another random variable ##V##.
1) Express ##V## as a function of ##U##: ##V = g(U)##
##t_k=T-W_k##
2) Express the probability of ##U## in terms of the probability of ##V##:
##P(U\leq u)##
##=P(g^{-1}(V)\leq u)=\begin{cases}
P(V\leq g(u)) & \text{if } V=g(U) \text{ is an increasing function of } U \\
P(V\geq g(u)) & \text{if } V=g(U) \text{ is a decreasing function of } U
\end{cases}##
##P(W_k \leq w)=P(T-t_k \leq w)##
##P(W_k \leq w)=P(t_k \geq T-w)##
##t_k## is a decreasing function of ##W_k## on the interval ##0## to ##T##: as ##W_k## increases, ##t_k## decreases. So,
##P(W_k \leq w)=P(t_k \geq T-w)##

3) Given that ##F(v)## is the cdf of ##V##,
##F(v)=F(t)=1-\sum_{m=0}^{k-1} \frac{{(\lambda t)}^m e^{-\lambda t}}{m!}##,

substitute ##F(g(u))## for ##P(V\leq g(u))## in your equation for ##P(U\leq u)##. (If ##g(U)## is decreasing, you may need to express ##P(V\geq g(u))## in terms of ##P(V\leq g(u))## first.)
I am a bit stuck in this part,
##v=g(u)##
##t=T-w##
##F(g(u))=1-\sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!}##
now
##P(U\leq u)=P(W_k \leq w)##
##P(W_k \leq w)=P(t_k \geq T-w)##
since ##t_k## is decreasing
##P(W_k \leq w)=1-P(t_k < T-w)##
##P(W_k \leq w)=1-(1-\sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!})##
##P(W_k \leq w)=\sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!}##
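Just as a sanity check (illustrative only; ##\lambda=2##, ##T=1##, ##k=3## are assumed example values), this cdf expression can be evaluated on a grid of ##w## values:

Code (MATLAB):
% Evaluate the cdf expression P(W_k <= w) derived above on a grid.
% lambda, T and k are assumed example values.
lambda = 2;  T = 1;  k = 3;
w = linspace(0, T, 1001);
F = zeros(size(w));
for m = 0:k-1
    F = F + (lambda*(T - w)).^m .* exp(-lambda*(T - w)) / factorial(m);
end
plot(w, F); xlabel('w'); ylabel('P(W_k \leq w)')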
 
  • #20
Mehmood_Yasir said:
##t_k=T-W_k##

##P(W_k \leq w)=P(T-t_k \leq w)##
##P(W_k \leq w)=P(t_k \geq T-w)##
##t_k## is a decreasing function of ##W_k## on the interval ##0## to ##T##: as ##W_k## increases, ##t_k## decreases. So,
##P(W_k \leq w)=P(t_k \geq T-w)##
##F(v)=F(t)=1-\sum_{m=0}^{k-1} \frac{{(\lambda t)}^m e^{-\lambda t}}{m!}##,
I am a bit stuck in this part,
##v=g(u)##
##t=T-w##
##F(g(u))=1-\sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!}##
now
##P(U\leq u)=P(W_k \leq w)##
##P(W_k \leq w)=P(t_k \geq T-w)##
since ##t_k## is decreasing
##P(W_k \leq w)=1-P(t_k < T-w)##
##P(W_k \leq w)=1-(1-\sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!})##
##P(W_k \leq w)=\sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!}##
That's what I got, too, so I think you are on the right track. Keep going. Differentiate your result.
 
  • #21
tnich said:
That's what I got, too, so I think you are on the right track. Keep going. Differentiate your result.
Taking the derivative with respect to ##w## to get ##h(u)##; it is getting rather complex:
## =\frac{d}{dw}P(W_k \leq w)##
##=\frac{d}{dw} \sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!}##
##= \sum_{m=0}^{k-1} \frac{d}{dw} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!}##
##= \sum_{m=0}^{k-1} \frac{1}{m!} \frac{d}{dw} {(\lambda (T-w))}^m e^{-\lambda (T-w)}##
using product rule.
##= \sum_{m=0}^{k-1} \frac{1}{m!} \Big(-m \lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)} + {(\lambda^m (T-w))}^m \lambda e^{-\lambda (T-w)} \Big)##
##= \sum_{m=0}^{k-1} \frac{\lambda(T-w) - m}{m!} \Big(\lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)} \Big)##
 
  • #22
Mehmood_Yasir said:
Taking the derivative with respect to ##w## to get ##h(u)##; it is getting rather complex:
## =\frac{d}{dw}P(W_k \leq w)##
##=\frac{d}{dw} \sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!}##
##= \sum_{m=0}^{k-1} \frac{d}{dw} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!}##
##= \sum_{m=0}^{k-1} \frac{1}{m!} \frac{d}{dw} {(\lambda (T-w))}^m e^{-\lambda (T-w)}##
using product rule.
##= \sum_{m=0}^{k-1} \frac{1}{m!} \Big(-m \lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)} + {(\lambda^m (T-w))}^m \lambda e^{-\lambda (T-w)} \Big)##
Try stopping here and multiplying through by ##\frac 1 {m!}##. You should find that most of the terms of the summation cancel out.
 
  • #23
tnich said:
Try stopping here and multiplying through by ##\frac 1 {m!}##. You should find that most of the terms of the summation cancel out.
Also, think about the term for ##m=0##. What is ##\frac d {dw} \frac{{(\lambda (T-w))}^0 e^{-\lambda (T-w)}}{0!}##?
 
  • #24
tnich said:
Also, think about the term for ##m=0##. What is ##\frac d {dw} \frac{{(\lambda (T-w))}^0 e^{-\lambda (T-w)}}{0!}##?
##\lambda e^{-\lambda (T-w)}##
 
  • #25
tnich said:
Try stopping here and multiplying through by ##\frac 1 {m!}##. You should find that most of the terms of the summation cancel out.
##= \sum_{m=0}^{k-1} \frac{1}{m!} \Big(-m \lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)} + {(\lambda^m (T-w))}^m \lambda e^{-\lambda (T-w)} \Big)##
##= \sum_{m=0}^{k-1} \frac{1}{m!} \Big(-m \lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)}\Big) + \frac{1}{m!} \Big( {(\lambda^m (T-w))}^m \lambda e^{-\lambda (T-w)} \Big)##
I see that only ##m## in the first term is cancelled
##= \sum_{m=0}^{k-1} \frac{1}{(m-1)!} \Big(- \lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)}\Big) + \frac{1}{m!} \Big( {(\lambda^m (T-w))}^m \lambda e^{-\lambda (T-w)} \Big)##
 
  • #26
Mehmood_Yasir said:
##= \sum_{m=0}^{k-1} \frac{1}{m!} \Big(-m \lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)} + {(\lambda^m (T-w))}^m \lambda e^{-\lambda (T-w)} \Big)##
##= \sum_{m=0}^{k-1} \frac{1}{m!} \Big(-m \lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)}\Big) + \frac{1}{m!} \Big( {(\lambda^m (T-w))}^m \lambda e^{-\lambda (T-w)} \Big)##
I see that only ##m## in the first term is cancelled
##= \sum_{m=0}^{k-1} \frac{1}{(m-1)!} \Big(- \lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)}\Big) + \frac{1}{m!} \Big( {(\lambda^m (T-w))}^m \lambda e^{-\lambda (T-w)} \Big)##
What is ## \frac{1}{(m-1)!} \Big(- \lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)}\Big)## when ##m = p##, and what is
##\frac{1}{m!} \Big( {(\lambda^m (T-w))}^m \lambda e^{-\lambda (T-w)} \Big)## when ##m = p-1##?

Oh, and you have made a mistake on this part ##{(\lambda^m (T-w))}^m##.
 
  • #27
tnich said:
Also, think about the term for ##m=0##. What is ##\frac d {dw} \frac{{(\lambda (T-w))}^0 e^{-\lambda (T-w)}}{0!}##?
I think your term should have another ##\lambda## in the numerator for ##m=0##:
##\frac{d}{dw} \frac{{(\lambda (T-w))}^0 \lambda e^{-\lambda (T-w)}}{0!}##?
##=\lambda^2 e^{-\lambda (T-w)} ##
 
  • #28
Mehmood_Yasir said:
I think your term should have another ##\lambda## in the numerator for ##m=0##:
##\frac{d}{dw} \frac{{(\lambda (T-w))}^0 \lambda e^{-\lambda (T-w)}}{0!}##?
##=\lambda^2 e^{-\lambda (T-w)} ##
##x^0=##?
 
  • #29
Mehmood_Yasir said:
##= \sum_{m=0}^{k-1} \frac{1}{m!} \Big(-m \lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)} + {(\lambda^m (T-w))}^m \lambda e^{-\lambda (T-w)} \Big)##
##= \sum_{m=0}^{k-1} \frac{1}{m!} \Big(-m \lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)}\Big) + \frac{1}{m!} \Big( {(\lambda^m (T-w))}^m \lambda e^{-\lambda (T-w)} \Big)##
I see that only ##m## in the first term is cancelled
##= \sum_{m=0}^{k-1} \frac{1}{(m-1)!} \Big(- \lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)}\Big) + \frac{1}{m!} \Big( {(\lambda^m (T-w))}^m \lambda e^{-\lambda (T-w)} \Big)##
after correcting the typo,
##= \sum_{m=0}^{k-1} \frac{1}{(m-1)!} \Big(- \lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)}\Big) + \frac{1}{m!} \Big( {(\lambda (T-w))}^m \lambda e^{-\lambda (T-w)} \Big)##
You are right; if I expand the series,
##= \sum_{m=0}^{k-1} \frac{1}{(m-1)!} \Big(- \lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)}\Big) + \sum_{m=0}^{k-1} \frac{1}{m!} \Big( {(\lambda (T-w))}^m \lambda e^{-\lambda (T-w)} \Big)##

##= - \Big(0+\lambda e^{-\lambda (T-w)} +\lambda^2 (T-w) e^{-\lambda (T-w)}+...+ {\lambda^{k-1}{(T-w)}^{k-2} e^{-\lambda (T-w)}} {(k-2)!} \Big) + \Big( \lambda e^{-\lambda (T-w)} +\lambda^2 (T-w) e^{-\lambda (T-w)}+...+ {\lambda^{k-1}{(T-w)}^{k-2} e^{-\lambda (T-w)}} {(k-2)!} + {\lambda^{k}{(T-w)}^{k-1} e^{-\lambda (T-w)}} {(k-1)!}\Big)##
so we are left only with this nicely,
##= {\lambda^{k}{(T-w)}^{k-1} e^{-\lambda (T-w)}} {(k-1)!}\Big)##
 
  • #30
tnich said:
##x^0=##?
Oh, I see what you mean. You think I left out a λ. But I am just taking the 0th term from ##\frac{d}{dw} \sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!}##. There is no extra λ there.
Mehmood_Yasir said:
after correcting the typo,
##= \sum_{m=0}^{k-1} \frac{1}{(m-1)!} \Big(- \lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)}\Big) + \frac{1}{m!} \Big( {(\lambda (T-w))}^m \lambda e^{-\lambda (T-w)} \Big)##
You are right; if I expand the series,
##= \sum_{m=0}^{k-1} \frac{1}{(m-1)!} \Big(- \lambda^m {(T-w)}^{m-1}e^{-\lambda (T-w)}\Big) + \sum_{m=0}^{k-1} \frac{1}{m!} \Big( {(\lambda (T-w))}^m \lambda e^{-\lambda (T-w)} \Big)##

##= - \Big(0+\lambda e^{-\lambda (T-w)} +\lambda^2 (T-w) e^{-\lambda (T-w)}+...+ {\lambda^{k-1}{(T-w)}^{k-2} e^{-\lambda (T-w)}} {(k-2)!} \Big) + \Big( \lambda e^{-\lambda (T-w)} +\lambda^2 (T-w) e^{-\lambda (T-w)}+...+ {\lambda^{k-1}{(T-w)}^{k-2} e^{-\lambda (T-w)}} {(k-2)!} + {\lambda^{k}{(T-w)}^{k-1} e^{-\lambda (T-w)}} {(k-1)!}\Big)##
so we are left only with this nicely,
##= {\lambda^{k}{(T-w)}^{k-1} e^{-\lambda (T-w)}} {(k-1)!}\Big)##
Yes, except you left off \frac. Now you need to normalize. You can also think of this as finding ##\frac d {dw} P(W_k \leq w | W_k \leq T)##. (Why?)
 
  • #31
tnich said:
Oh, I see what you mean. You think I left out a λ. But I am just taking the 0th term from ##\frac{d}{dw} \sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!}##. There is no extra λ there.
That's correct.
Yes, except you left off \frac.
Oh, sorry. The actual result is ##= \frac{ \lambda^k {(T-w)}^{k-1} e^{-\lambda (T-w)} }{(k-1)!}##

Now you need to normalize. You can also think of this as finding ##\frac d {dw} P(W_k \leq w | W_k \leq T)##. (Why?)
Yes, ##= \frac{ \frac{ \lambda^k {(T-w)}^{k-1} e^{-\lambda (T-w)} }{(k-1)!} }{1-P(W_k>T)} ##
##= \frac{ \frac{ \lambda^k {(T-w)}^{k-1} e^{-\lambda (T-w)} }{(k-1)!} }{1-e^{-\lambda T}} ##

because ##W_k## can never be greater than ##T## and can have a maximum value of ##T##
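As a quick numerical cross-check of the normalization (illustrative only; ##\lambda=2##, ##T=1##, ##k=3## are assumed example values), the unnormalized density obtained from the differentiation above can be evaluated on a grid and the constant of step 5) computed numerically rather than in closed form:

Code (MATLAB):
% Numerical normalization check following step 5) of the earlier recipe.
% lambda, T and k are assumed example values.
lambda = 2;  T = 1;  k = 3;
w = linspace(0, T, 2001);
h = lambda.^k .* (T - w).^(k-1) .* exp(-lambda*(T - w)) / factorial(k-1);  % unnormalized density
C = trapz(w, h);                 % normalizing constant from step 5)
f_Wk = h / C;                    % normalized wait-time pdf on [0, T]
plot(w, f_Wk); xlabel('w'); ylabel('f_{W_k}(w)')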
 
  • #32
Mehmood_Yasir said:
That's correct.

Oh, sorry. The actual result is ##= \frac{ \lambda^k {(T-w)}^{k-1} e^{-\lambda (T-w)} }{(k-1)!}##
Yes, ##= \frac{ \frac{ \lambda^k {(T-w)}^{k-1} e^{-\lambda (T-w)} }{(k-1)!} }{1-P(W_k>T)} ##
##= \frac{ \frac{ \lambda^k {(T-w)}^{k-1} e^{-\lambda (T-w)} }{(k-1)!} }{1-e^{-\lambda T}} ##
For plotting the pdf of ##t_k## and the pdf of ##W_k## against a time vector, e.g., ##t=0:0.01:T## (minutes), on the x-axis, I need to replace ##w## in the pdf of ##W_k## with ##t## such that

##f_{W_k} (t)= \frac{ \frac{ \lambda^k {(T-t)}^{k-1} e^{-\lambda (T-t)} }{(k-1)!} }{1-e^{-\lambda T}} ##

because ##W_k## can never be greater than ##T## and can have a maximum value of ##T##
Again, thank you very much @tnich.
 
  • #33
Mehmood_Yasir said:
For plotting the pdf of ##t_k## and the pdf of ##W_k## against a time vector, e.g., ##t=0:0.01:T## (minutes), on the x-axis, I need to replace ##w## in the pdf of ##W_k## with ##t## such that

##f_{W_k} (t)= \frac{ \frac{ \lambda^k {(T-t)}^{k-1} e^{-\lambda (T-t)} }{(k-1)!} }{1-e^{-\lambda T}} ##
That is not correct. From post #20, what is ##P(W_k\leq w)##?
 
  • #34
tnich said:
That is not correct. From post #20, what is ##P(W_k\leq w)##?

##P(W_k \leq w)=1-P(t_k < T-w)##
##P(W_k \leq w)=\sum_{m=0}^{k-1} \frac{{(\lambda (T-w))}^m e^{-\lambda (T-w)}}{m!}##
 
  • #35
tnich said:
That is not correct. From post #20, what is ##P(W_k\leq w)##?
I am a bit confused, as I am plotting the pdfs of ##W_k## and ##t_k## in MATLAB for the above process. For the time vector ##t## on the x-axis, I plot the pdf of ##t_k## using its density function ##f_{t_k} (t)= \frac{ {(\lambda t)}^k e^{-\lambda t}} {k!}## for ##0\leq t \leq T##. That is right, as it should be. Now, for plotting the pdf of ##W_k## using its density function ##f_{W_k} (w)## as given earlier in terms of ##w##: considering ##0\leq t\leq T##, what is ##w## in ##f_{W_k} (w)## in terms of ##t##? Should ##w## in ##f_{W_k} (w)## be replaced with ##T-t## to plot against ##t##?
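For reference, here is a minimal MATLAB plotting sketch (the values ##\lambda=2##, ##T=1##, ##k=3## are assumed examples; the formulas are taken as written in the posts above). It evaluates the arrival-time density against ##t## and the wait-time density against its own argument on the same ##0\leq t \leq T## grid:

Code (MATLAB):
% Plotting sketch for the two densities discussed above.
% lambda, T and k are assumed example values.
lambda = 2;  T = 1;  k = 3;
t = 0:0.01:T;                                                   % time axis, as in the post
f_tk = (lambda*t).^k .* exp(-lambda*t) / factorial(k);          % arrival-time density as written above
f_Wk = lambda.^k .* (T - t).^(k-1) .* exp(-lambda*(T - t)) ...
       / (factorial(k-1) * (1 - exp(-lambda*T)));               % wait-time density from post #32, x-axis read as w
plot(t, f_tk, t, f_Wk)
legend('f_{t_k}(t)', 'f_{W_k}(w), axis read as w')
xlabel('t (minutes)')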
 
