- **Calculus & Beyond Homework**
(*http://www.physicsforums.com/forumdisplay.php?f=156*)

- - **Markov Chain**
(*http://www.physicsforums.com/showthread.php?t=336317*)

**Please Help! Markov Chain**

**1. The problem statement, all variables and given/known data**
Let X_{0} be a random variable with values in a countable set I. Let Y_{1}, Y_{2}, ... be a sequence of independent random variables, uniformly distributed on [0, 1]. Suppose we are given a function G : I x [0, 1] -> I and define inductively, for n >= 0, X_{n+1} = G(X_{n}, Y_{n+1}). Show that (X_{n})_{n>=0} is a Markov chain and express its transition matrix P in terms of G.

**2. Relevant equations**

**3. The attempt at a solution**
I know that I need to show that X_{n+1} depends only on X_{n} by checking the condition in the definition of a Markov chain, and then find a formula for P(X_{n+1} = j | X_{n} = i) in terms of G. My background on Markov chains is a little weak, so I don't know how to find such a formula. How do I handle G here? Could anybody give me some hints or an answer?
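The recursion X_{n+1} = G(X_{n}, Y_{n+1}) can be simulated directly, which may help build intuition for why the construction yields a Markov chain. A minimal sketch, where the three-state function G below is a hypothetical example chosen for illustration (it is not part of the problem):

```python
import random

# Hypothetical example: state space I = {0, 1, 2}, and an update rule
# that moves to the next state (mod 3) when y < 0.5 and stays put otherwise.
def G(i, y):
    return (i + 1) % 3 if y < 0.5 else i

def simulate(x0, n_steps, seed=None):
    """Run the recursion X_{n+1} = G(X_n, Y_{n+1}) for n_steps steps."""
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(n_steps):
        y = rng.random()   # Y_{n+1} ~ Uniform[0, 1], independent of the past
        x = G(x, y)        # X_{n+1} = G(X_n, Y_{n+1})
        path.append(x)
    return path

print(simulate(0, 10, seed=42))
```

Note that at each step the next state is computed from the current state and a fresh, independent uniform draw only; nothing earlier in the path is consulted. That is exactly the structure the Markov property formalizes.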

Re: Please Help! Markov Chain

I guess you only need to show that P[X_{n+1} = x_{n+1} | X_{1} = x_{1}, X_{2} = x_{2}, ..., X_{n} = x_{n}] = P[X_{n+1} = x_{n+1} | X_{n} = x_{n}] (the Markov property). In your case it holds, since X_{n+1} = G(X_{n}, Y_{n+1}) is a function only of X_{n} and Y_{n+1}, and the Y_{i} are independent random variables. Then P[X_{n+1} = x_{n+1} | X_{n} = x_{n}] = P[G(x_{n}, y) = x_{n+1}], where y is the uniformly distributed random variable; these probabilities are the entries of the transition matrix P.


Powered by vBulletin Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.

© 2014 Physics Forums