Markov Chain

  1. Please Help! Markov Chain

    1. The problem statement, all variables and given/known data
    Let X_0 be a random variable with values in a countable set I. Let Y_1, Y_2, ... be a
    sequence of independent random variables, uniformly distributed on [0, 1].
    Suppose we are given a function G : I x [0, 1] -> I
    and define inductively, for n >= 0, X_{n+1} = G(X_n, Y_{n+1}).
    Show that (X_n)_{n>=0} is a Markov chain and
    express its transition matrix P in terms of G.



    2. Relevant equations



    3. The attempt at a solution
    I know that I need to show that X_{n+1} depends on the past only through X_n, by checking
    the condition in the definition of a Markov chain, and then
    find a formula for P(X_{n+1} = j | X_n = i) in terms of G.

    Actually, my background in Markov chains is a bit lacking, so I have no idea how to find such a formula for P in terms of G. How do I handle the terms involving G?
    Could anybody give me some hints or an answer?
     
  2. fi1

    Re: Please Help! Markov Chain

    I guess you only need to show P[X_{n+1} = x_{n+1} | X_0 = x_0, X_1 = x_1, ..., X_n = x_n] = P[X_{n+1} = x_{n+1} | X_n = x_n] (the Markov property).

    In your case it holds: X_{n+1} = G(X_n, Y_{n+1}) depends on the past only through X_n, and Y_{n+1} is independent of Y_1, ..., Y_n (and hence of X_0, ..., X_n). So P[X_{n+1} = x_{n+1} | X_n = x_n] = P[G(x_n, Y) = x_{n+1}], where Y is uniform on [0, 1]. In other words, the transition matrix is p_{ij} = Leb{y in [0, 1] : G(i, y) = j}, the Lebesgue measure (length) of the set of y's that G maps from i to j.
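
    To make this concrete, here is a minimal Python sketch (my own illustration, not from the problem; the particular G is a made-up example) that estimates the transition probabilities p_{ij} = P[G(i, Y) = j] by Monte Carlo and compares them with the exact lengths Leb{y : G(i, y) = j}:

    Code (Python):
        import random
        from collections import Counter

        # Made-up example of G on I = {0, 1, 2}:
        # from state i, jump to (i + 1) % 3 if y < (i + 1)/4, else stay at i,
        # so the exact probabilities are p_{i,(i+1)%3} = (i + 1)/4
        # and p_{ii} = 1 - (i + 1)/4.
        def G(i, y):
            return (i + 1) % 3 if y < (i + 1) / 4 else i

        def simulate(x0, n_steps):
            """Run X_{k+1} = G(X_k, Y_{k+1}) with Y_k i.i.d. Uniform[0, 1]."""
            x, path = x0, [x0]
            for _ in range(n_steps):
                x = G(x, random.random())   # X_{k+1} = G(X_k, Y_{k+1})
                path.append(x)
            return path

        def estimate_row(i, n_samples=100_000):
            """Monte Carlo estimate of row i: p_{ij} = P[G(i, Y) = j]."""
            counts = Counter(G(i, random.random()) for _ in range(n_samples))
            return {j: counts[j] / n_samples for j in sorted(counts)}

        for i in range(3):
            print(f"row {i}: estimated {estimate_row(i)}, "
                  f"exact p to state {(i + 1) % 3} is {(i + 1) / 4}")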
     