
Markov Chain

  1. Sep 10, 2009 #1
    Please Help! Markov Chain

    1. The problem statement, all variables and given/known data
    Let X_0 be a random variable with values in a countable set I. Let Y_1, Y_2, ... be a
    sequence of independent random variables, uniformly distributed on [0, 1].
    Suppose we are given a function G : I x [0, 1] -> I
    and define inductively, for n >= 0, X_{n+1} = G(X_n, Y_{n+1}).
    Show that (X_n)_{n>=0} is a Markov chain and
    express its transition matrix P in terms of G.

    2. Relevant equations

    3. The attempt at a solution
    I know that I need to show that X_{n+1} depends only on X_n, by checking the condition in the definition of a Markov chain, and then
    find a formula for P(X_{n+1} = j | X_n = i) in terms of G.

    My background in Markov chains is a bit weak, so I don't see how to find a formula for P in terms of G. How do I handle the terms involving G?
    Could anybody give me some hints?
  3. Sep 5, 2010 #2



    Re: Please Help! Markov Chain

    I guess you only need to show that P[X_{n+1} = x_{n+1} | X_0 = x_0, X_1 = x_1, ..., X_n = x_n] = P[X_{n+1} = x_{n+1} | X_n = x_n] (the Markov property).

    In your case this holds, since X_{n+1} is a function only of X_n and Y_{n+1}, and the Y_i are independent of one another and of X_0. Conditioning on the whole history therefore gives no more information than conditioning on X_n alone. The transition probabilities are P[X_{n+1} = j | X_n = i] = P[G(i, Y) = j], where Y is a uniform random variable on [0, 1]; equivalently, the matrix entry is p(i, j) = Leb{y in [0, 1] : G(i, y) = j}, the Lebesgue measure of the set of y-values that G maps from i to j.
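    One concrete way to see the formula p(i, j) = Leb{y : G(i, y) = j} is to pick a toy G and check by simulation that the empirical transition frequencies match the lengths of the corresponding y-intervals. The particular G below (a three-state example on I = {0, 1, 2}) is an invented illustration, not part of the problem, so treat this as a sketch:

```python
import random

def G(i, y):
    """Toy update rule G : I x [0, 1] -> I with I = {0, 1, 2}.

    The interval [0, 1] is split so that, from any state i:
      y in [0, 0.5)   -> stay at i          (probability 0.5)
      y in [0.5, 0.8) -> move to (i+1) % 3  (probability 0.3)
      y in [0.8, 1]   -> move to (i+2) % 3  (probability 0.2)
    """
    if y < 0.5:
        return i
    elif y < 0.8:
        return (i + 1) % 3
    else:
        return (i + 2) % 3

def estimate_row(i, n=200_000, seed=0):
    """Estimate row i of the transition matrix by Monte Carlo:
    draw n uniform y's and count how often G(i, y) lands in each state."""
    rng = random.Random(seed)
    counts = [0, 0, 0]
    for _ in range(n):
        counts[G(i, rng.random())] += 1
    return [c / n for c in counts]

row0 = estimate_row(0)
print(row0)  # close to [0.5, 0.3, 0.2], the interval lengths above
```

    The estimated frequencies converge to the interval lengths, which is exactly the statement p(i, j) = Leb{y : G(i, y) = j} for this choice of G.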