Please Help! Markov Chain

1. The problem statement, all variables and given/known data

Let X_0 be a random variable with values in a countable set I. Let Y_1, Y_2, ... be a

sequence of independent random variables, uniformly distributed on [0, 1].

Suppose we are given a function G : I x [0, 1] -> I

and define inductively, for n >= 0, X_{n+1} = G(X_n, Y_{n+1}).

Show that (X_n)_{n>=0} is a Markov chain and

express its transition matrix P in terms of G.

2. Relevant equations

3. The attempt at a solution

I know that I need to show that X_{n+1} depends only on X_n, by checking the condition in the definition of a Markov chain, and then

try to find a formula for P(X_{n+1} = j | X_n = i) in terms of G.

Honestly, my background in Markov chains is a bit weak, so I don't see how to find a formula for P in terms of G. How do I handle the terms involving G?
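[Editor's hint, not part of the original post.] A sketch of the key step: since Y_{n+1} is uniform on [0, 1] and independent of (X_0, ..., X_n), conditioning on the whole history reduces to a statement about G and Y_{n+1} alone:

$$P(X_{n+1}=j \mid X_0=i_0,\dots,X_{n-1}=i_{n-1},X_n=i) \;=\; P\big(G(i,\,Y_{n+1})=j\big) \;=\; \mathrm{Leb}\{u \in [0,1] : G(i,u)=j\},$$

which does not depend on $i_0,\dots,i_{n-1}$ (the Markov property), and suggests the candidate transition matrix $p_{ij} = \mathrm{Leb}\{u \in [0,1] : G(i,u) = j\}$.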

Could anybody give me some hints, or an answer?
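[Editor's note.] A quick numerical sanity check can make the candidate formula p_ij = Leb{u : G(i,u) = j} concrete. The sketch below is purely illustrative (the two-state space and this particular G are made up, not from the problem): it simulates the update X_{n+1} = G(X_n, Y_{n+1}) by drawing uniform samples for Y and estimates each row of P by Monte Carlo.

```python
import random

# Hypothetical example: I = {0, 1}. Reading off where u falls in [0, 1],
# state 0 jumps to 1 with probability 0.3; state 1 jumps to 0 with probability 0.5.
def G(i, u):
    if i == 0:
        return 1 if u < 0.3 else 0
    return 0 if u < 0.5 else 1

# p_ij should equal the length (Lebesgue measure) of {u in [0,1] : G(i,u) = j};
# estimate row i of P by sampling many uniform Y values and tallying G(i, Y).
def estimate_row(i, n=100_000, seed=0):
    rng = random.Random(seed)
    counts = {0: 0, 1: 0}
    for _ in range(n):
        counts[G(i, rng.random())] += 1
    return [counts[0] / n, counts[1] / n]

row0 = estimate_row(0)  # close to [0.7, 0.3]
row1 = estimate_row(1)  # close to [0.5, 0.5]
print(row0, row1)
```

The estimated rows agree with the lengths of the sets {u : G(i, u) = j}, which is exactly the identification of P in terms of G that the problem asks for.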

**Physics Forums - The Fusion of Science and Community**

