Please Help! Markov Chain 1. The problem statement, all variables and given/known data Let X_{0} be a random variable with values in a countable set I. Let Y_{1}, Y_{2}, ... be a sequence of independent random variables, uniformly distributed on [0, 1]. Suppose we are given a function G : I x [0, 1] -> I and define inductively for n >= 0, X_{n+1} = G(X_{n}, Y_{n+1}). Show that (X_{n})_{n>=0} is a Markov chain and express its transition matrix P in terms of G. 2. Relevant equations 3. The attempt at a solution I know that I need to show that X_{n+1} depends only on X_{n}, by checking the condition in the definition of a Markov chain, and then find a formula for P(X_{n+1} = j | X_{n} = i) in terms of G. My background in Markov chains is a little lacking, so I don't know how to find such a formula for P in terms of G. How do I handle the function G? Can anybody give me some hints or an answer?
Re: Please Help! Markov Chain You need to show the Markov property: P[X_{n+1} = x_{n+1} | X_{0} = x_{0}, X_{1} = x_{1}, ..., X_{n} = x_{n}] = P[X_{n+1} = x_{n+1} | X_{n} = x_{n}]. In your case it holds, because X_{n+1} = G(X_{n}, Y_{n+1}) is a function of X_{n} and Y_{n+1} only, and Y_{n+1} is independent of X_{0}, ..., X_{n}. Then P[X_{n+1} = j | X_{n} = i] = P[G(i, Y) = j], where Y is uniform on [0, 1]. In other words, the transition matrix entry is p_{ij} = Leb({y in [0, 1] : G(i, y) = j}), the Lebesgue measure of the set of y-values that G maps from state i to state j.
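To see the formula p_{ij} = Leb({y : G(i, y) = j}) in action, here is a small sketch (my own made-up example, not from the problem) with I = {0, 1} and a threshold-style G: from state 0 we jump to 1 when y < 0.3, and from state 1 we jump to 0 when y < 0.5. By the formula, the rows of P should be (0.7, 0.3) and (0.5, 0.5); simulating the recursion X_{n+1} = G(X_{n}, Y_{n+1}) and counting transition frequencies should reproduce these values approximately.

```python
import random

def G(i, y):
    """Hypothetical update function on I = {0, 1}.

    From state 0: go to 1 iff y < 0.3, so p_01 = Leb([0, 0.3)) = 0.3.
    From state 1: go to 0 iff y < 0.5, so p_10 = Leb([0, 0.5)) = 0.5.
    """
    if i == 0:
        return 1 if y < 0.3 else 0
    return 0 if y < 0.5 else 1

def estimate_P(G, states, n_steps=200_000, seed=0):
    """Run X_{n+1} = G(X_n, Y_{n+1}) with Y's i.i.d. Uniform[0, 1]
    and return the empirical transition probabilities."""
    rng = random.Random(seed)
    counts = {i: {j: 0 for j in states} for i in states}
    x = states[0]
    for _ in range(n_steps):
        y = rng.random()          # Y_{n+1} ~ Uniform[0, 1]
        x_next = G(x, y)
        counts[x][x_next] += 1
        x = x_next
    # normalize each row of counts to get probabilities
    P = {}
    for i in states:
        total = sum(counts[i].values())
        P[i] = {j: counts[i][j] / total for j in states}
    return P

P = estimate_P(G, [0, 1])
print(P)  # rows should be close to (0.7, 0.3) and (0.5, 0.5)
```

The function names here (`G`, `estimate_P`) are just for illustration; the point is that the empirical frequencies converge to the Lebesgue-measure formula for p_{ij}.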