
Markov Chain

  1. Jan 28, 2015 #1

    O_o


    1. The problem statement, all variables and given/known data
    - A bus moves along an infinite route with stops at n = 0, 1, 2, 3, ...
    - The bus can hold up to B people.
    - The number of people waiting at stop n, ##Y_n##, is Poisson(10)-distributed and independent of the numbers waiting at all other stops.
    - At any given stop, each passenger on the bus has probability p of getting off, and the passengers are mutually independent.
    - Assume that after the passengers get off at stop n, everyone waiting at that stop gets on as long as there are spots remaining.
    [tex] \{X_n; n \geq 0\} [/tex] is a Markov chain representing the number of passengers on the bus after it leaves stop n.


    1) What are the transition probabilities ##p_{ij}##?
    2) What is [tex] E[X_n] [/tex] for n = 0, 1, 2, ..., 20 if p = 0.4 and B = 20?

    2. Relevant equations


    3. The attempt at a solution
    (1)
    [tex]X_n = \min\{X_{n-1} - L_n + Y_n,\; B\}[/tex]
    where ##L_n \sim \text{Binomial}(X_{n-1}, p)## is the number of people who get off at stop n.

    The transition probabilities need to be broken up into cases.

    For ##i > 0## and ##0 \leq j < i##:
    [tex] p_{i,j} = \sum_{k=i-j}^i {i \choose k} p^k(1-p)^{i-k} \, \frac{10^{j-i+k} e^{-10}}{(j-i+k)!}[/tex]

    For ##i \geq 0## and ##i \leq j < B## (the ##i = 0## case reduces to just the Poisson term):
    [tex] p_{i,j} = \sum_{k=0}^i {i \choose k} p^k(1-p)^{i-k} \, \frac{10^{j-i+k} e^{-10}}{(j-i+k)!}[/tex]

    For ##i \geq 0## and ##j = B##:
    [tex] p_{i,B} = \sum_{k=0}^i {i \choose k} p^k(1-p)^{i-k} \left(\sum_{h=B-i+k}^{\infty}\frac{10^{h} e^{-10}}{h!}\right)[/tex]
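    To sanity-check these formulas I also threw together a rough numerical sketch (my own code, not part of the assignment; it assumes B = 20 and p = 0.4 from part 2 and uses scipy). It builds the ##(B+1)\times(B+1)## matrix straight from the case analysis above and checks that every row sums to 1:

[code]
import numpy as np
from scipy.stats import binom, poisson

B, p, lam = 20, 0.4, 10.0   # capacity, alighting probability, Poisson mean (values from part 2)

P = np.zeros((B + 1, B + 1))
for i in range(B + 1):          # i = passengers on board when the bus arrives at the stop
    for j in range(B + 1):      # j = passengers on board when the bus leaves the stop
        total = 0.0
        for k in range(i + 1):  # k = number of passengers who get off
            need = j - i + k    # number of boarders required to end up at j
            if need < 0:
                continue        # cannot reach j if only k people get off
            if j < B:
                arrive = poisson.pmf(need, lam)      # exactly `need` people are waiting
            else:
                arrive = poisson.sf(need - 1, lam)   # at least `need` are waiting (bus fills up)
            total += binom.pmf(k, i, p) * arrive
        P[i, j] = total

assert np.allclose(P.sum(axis=1), 1.0)  # every row of a transition matrix should sum to 1
[/code]

    The three cases above collapse into the `need < 0`, `j < B`, and `j = B` branches here, and the row sums do come out to 1.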



    (2)
    I feel reasonably okay about question (1), although maybe it's completely wrong. My main concern is part (2). If I want to use [tex]X_n = \min\{X_{n-1} - L_n + Y_n, B\} [/tex] to find the expected value, I get [tex] E[X_n] =\min\{E[X_{n-1}] - E[L_n] + E[Y_n], B\},[/tex]

    which seems like it should be doable, except that ##L_n \sim \text{Binomial}(X_{n-1}, p)##, so the parameter is itself a random variable. Am I allowed to say [tex]E[L_n] = E[X_{n-1}]\,p\,?[/tex]

    We haven't learned anything about random variables appearing as parameters (or in sum limits), so I'm a bit clueless. Thank you.
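    Edit: to convince myself about that last step, I ran a quick numerical check (my own throwaway sketch, not part of the assignment). If ##E[L_n \mid X_{n-1}] = p\,X_{n-1}##, the tower property would give ##E[L_n] = p\,E[X_{n-1}]##, and the simulation below seems consistent with that, but I'd still like to know whether the argument is actually valid:

[code]
import numpy as np

rng = np.random.default_rng(0)
B, p = 20, 0.4

# Draw X_{n-1} from some arbitrary distribution on {0, ..., B},
# then L_n | X_{n-1} ~ Binomial(X_{n-1}, p).
x_prev = rng.integers(0, B + 1, size=200_000)
leavers = rng.binomial(x_prev, p)

print(leavers.mean(), p * x_prev.mean())   # the two averages come out essentially equal
[/code]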
     
    Last edited: Jan 28, 2015
  2. Jan 28, 2015 #2

    Ray Vickson

    Science Advisor
    Homework Helper

    I guess in part (2) you are supposed to use the numerical values of B and p to get a numerical ##21 \times 21## one-step transition matrix ##P = \left( p_{i,j} \right) ##. Then, if you are given a starting state ##X_0 \in \{ 0, \ldots, 20 \}##, say ##X_0 = i_0##, the distribution of ##X_n## is just row ##i_0## of the matrix ##P^n##, so the expected value is readily computable. This is more easily obtained by starting with initial state probability (row) vector ##v_0 = (0,0, \ldots,1,0 \ldots, 0)##, where the 1 is in position ##i_0##. Then the row vector ##v_n## of state probabilities at time ##n## is given by ##v_n = v_{n-1} P##, so we can get ##v_1, v_2, \ldots, v_{20}## recursively, without the need to compute the whole matrix ##P^n## for each ##n##. In general, the values of ##EX_n = \sum_{k=0}^{20} k v_n(k)## will depend on the starting state ##i_0##.
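    In code, that recursion might look something like the following rough sketch (reusing the matrix ##P## from the sketch in post #1, and a placeholder starting state ##i_0 = 0##, since the problem as quoted doesn't specify ##X_0##):

[code]
import numpy as np

B = 20
i0 = 0                           # placeholder starting state; the problem as quoted doesn't give X_0
states = np.arange(B + 1)

# P is assumed to be the (B+1) x (B+1) one-step transition matrix built earlier.
v = np.zeros(B + 1)
v[i0] = 1.0                      # v_0: all probability mass on the starting state
expected = []
for n in range(21):              # n = 0, 1, ..., 20
    expected.append(states @ v)  # E[X_n] = sum_k k * v_n(k)
    v = v @ P                    # v_{n+1} = v_n P  (one-step update, no need to form P^n)

print(expected)
[/code]

    Repeating this for each possible ##i_0## shows how ##E[X_n]## depends on the starting state.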
     