Calculating Transition Probabilities & Expected Values of a Markov Bus Chain

  • #1

Homework Statement


-A bus moves along an infinite route with stops at n = 0, 1, 2, 3, ...
-The bus can hold up to B people.
-The number of people waiting at stop n, Y_n, is distributed Poisson(10) and is independent of the numbers waiting at all other stops.
-At any given stop, each passenger has probability p of getting off, and the passengers are mutually independent.
-Assume that after the passengers get off at stop n, everyone waiting at that stop gets on, as long as there are spots remaining.
-[tex] \{X_n; n \geq 0\} [/tex] is a Markov chain representing the number of passengers on the bus after it leaves stop n.


1) What are the transition probabilities [tex]p_{i,j}[/tex]?
2) What is [tex] E[X_n] [/tex] for n = 0, 1, 2, ..., 20 if p = 0.4 and B = 20?

Homework Equations




The Attempt at a Solution


(1)
[tex]X_n = \min\{X_{n-1} - L_n + Y_n, B\}[/tex]
where [tex]L_n \sim \text{Binomial}(X_{n-1}, p)[/tex] is the number of people who get off at stop n.
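
A quick way to sanity-check whatever comes out of parts (1) and (2) is to simulate this recursion directly. Here is a minimal sketch in Python (the helper name simulate_path is just for illustration; p = 0.4, B = 20, Poisson mean 10 are the part (2) values, and the starting state X_0 = 0 is an assumption, since the problem doesn't fix one):

[code]
import numpy as np

rng = np.random.default_rng(0)

def simulate_path(n_stops=20, p=0.4, B=20, lam=10, x0=0):
    """Simulate X_1, ..., X_{n_stops} via X_n = min(X_{n-1} - L_n + Y_n, B)."""
    x = x0
    path = []
    for _ in range(n_stops):
        L = rng.binomial(x, p)    # passengers who get off at this stop
        Y = rng.poisson(lam)      # passengers waiting at this stop
        x = min(x - L + Y, B)     # everyone boards until the bus is full
        path.append(x)
    return path
[/code]

Averaging the path over many simulated runs should roughly match the E[X_n] values asked for in part (2).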

The transition probabilities need to be broken up into cases.

For i > 0 and j < i:
[tex] p_{i,j} = \sum_{k=i-j}^i {i \choose k} p^k(1-p)^{i-k} \left( \frac{10^{j-i+k} e^{-10}}{(j-i+k)!}\right)[/tex]

For i >= 0 and i <= j < B:
[tex] p_{i,j} = \sum_{k=0}^i {i \choose k} p^k(1-p)^{i-k} \left( \frac{10^{j-i+k} e^{-10}}{(j-i+k)!}\right)[/tex]

For i >= 0 and j = B:
[tex] p_{i,B} = \sum_{k=0}^i {i \choose k} p^k(1-p)^{i-k} \left(\sum_{h=B-i+k}^{\infty}\frac{10^{h} e^{-10}}{h!}\right)[/tex]
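
For what it's worth, here is a rough numerical sketch of these formulas in Python with scipy (the helper name transition_matrix is just for illustration; B = 20, p = 0.4, and Poisson mean 10 are taken from part (2)):

[code]
import numpy as np
from scipy.stats import binom, poisson

def transition_matrix(B=20, p=0.4, lam=10):
    """One-step matrix with entry [i, j] = P(X_n = j | X_{n-1} = i)."""
    P = np.zeros((B + 1, B + 1))
    for i in range(B + 1):
        for k in range(i + 1):                    # k of the i passengers get off
            pk = binom.pmf(k, i, p)
            for j in range(B):                    # j < B: exactly j - i + k people board
                if j - i + k >= 0:
                    P[i, j] += pk * poisson.pmf(j - i + k, lam)
            P[i, B] += pk * poisson.sf(B - i + k - 1, lam)   # j = B: at least B - i + k waiting
    return P
[/code]

Each row of the resulting matrix should sum to 1, which is a quick check that the cases above don't overlap or leave gaps.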



(2)
I feel reasonably okay about question (1), although maybe it's completely wrong. My main concern is part (2). If I want to use [tex]X_n = \min\{X_{n-1} - L_n + Y_n, B\} [/tex] to find the expected value, I get [tex] E[X_n] =\min\{E[X_{n-1}] - E[L_n] + E[Y_n], B\},[/tex]

which seems like it should be doable, except that [tex]L_n \sim \text{Binomial}(X_{n-1}, p)[/tex], so the parameter is a random variable. Am I allowed to say [tex]E[L_n] = E[X_{n-1}]\,p[/tex]?
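
Writing out the conditioning I have in mind, in case it helps (I'm not sure this step is justified):

[tex] E[L_n] = E\big[\,E[L_n \mid X_{n-1}]\,\big] = E[\,p\,X_{n-1}\,] = p\,E[X_{n-1}] [/tex]

Even if that is fine, I'm also not sure whether the [tex]\min\{\,\cdot\,, B\}[/tex] can be pulled outside the expectation the way I wrote it above.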

We haven't learned anything about having random variables in sum boundaries, so I'm a bit clueless. Thank you.
  • #2

I guess in part (2) you are supposed to use the numerical values of B and p to get a numerical ##21 \times 21## one-step transition matrix ##P = \left( p_{i,j} \right) ##. Then, if you are given a starting state ##X_0 \in \{ 0, \ldots, 20 \}##, say ##X_0 = i_0##, the distribution of ##X_n## is just row ##i_0## of the matrix ##P^n##, so the expected value is readily computable. This is more easily obtained by starting with initial state probability (row) vector ##v_0 = (0,0, \ldots,1,0 \ldots, 0)##, where the 1 is in position ##i_0##. Then the row vector ##v_n## of state probabilities at time ##n## is given by ##v_n = v_{n-1} P##, so we can get ##v_1, v_2, \ldots, v_{20}## recursively, without the need to compute the whole matrix ##P^n## for each ##n##. In general, the values of ##EX_n = \sum_{k=0}^{20} k v_n(k)## will depend on the starting state ##i_0##.
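
A minimal sketch of that recursion in Python (the helper names expected_values and transition_matrix, the latter from the sketch in post #1, are just for illustration; the starting state is an assumption, since the thread doesn't specify one):

[code]
import numpy as np

def expected_values(P, i0=0, n_steps=20):
    """Return E[X_1], ..., E[X_{n_steps}] via v_n = v_{n-1} P."""
    states = np.arange(P.shape[0])
    v = np.zeros(P.shape[0])
    v[i0] = 1.0                          # v_0: all probability mass on the start state i0
    expectations = []
    for _ in range(n_steps):
        v = v @ P                        # one-step update of the state distribution
        expectations.append(states @ v)  # E[X_n] = sum_k k * v_n(k)
    return expectations
[/code]

For example, expected_values(transition_matrix(), i0=0) would give E[X_1], ..., E[X_20] for that starting state, under those assumptions.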
 
