Physics Forums


hsong9 Sep10-09 08:29 PM

Please Help! Markov Chain
 
1. The problem statement, all variables and given/known data
Let X_0 be a random variable with values in a countable set I. Let Y_1, Y_2, ... be a
sequence of independent random variables, uniformly distributed on [0, 1].
Suppose we are given a function G : I x [0, 1] -> I
and define inductively, for n >= 0, X_{n+1} = G(X_n, Y_{n+1}).
Show that (X_n)_{n>=0} is a Markov chain and
express its transition matrix P in terms of G.



2. Relevant equations



3. The attempt at a solution
I know that I need to show that X_{n+1} depends only on X_n, by checking the condition in the definition of a Markov chain, and then find a formula for P(X_{n+1} = j | X_n = i) in terms of G.

Honestly, my background in Markov chains is a bit weak, so I have no idea how to find such a formula for P in terms of G. How do I handle G here? Could anybody give me some hints or an answer?
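
To make the setup concrete, here is a minimal Python sketch of the construction X_{n+1} = G(X_n, Y_{n+1}). The particular G below (a lazy reflecting random walk on I = {0, 1, 2, 3, 4}) is just a made-up example, not part of the problem:

Code:

import random

# Hypothetical example of G: a lazy reflecting random walk on I = {0, 1, 2, 3, 4}.
# The uniform draw y decides whether the chain steps left, right, or stays.
def G(i, y):
    if y < 1/3:
        return max(i - 1, 0)   # step left, reflecting at 0
    if y < 2/3:
        return min(i + 1, 4)   # step right, reflecting at 4
    return i                   # stay put

x = 0                          # X_0
path = [x]
for _ in range(10):
    y = random.random()        # Y_{n+1} ~ Uniform[0, 1], independent of the past
    x = G(x, y)                # X_{n+1} = G(X_n, Y_{n+1})
    path.append(x)
print(path)                    # e.g. [0, 1, 1, 0, 1, 2, 2, 3, 2, 2, 1]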

fi1 Sep5-10 03:06 PM

Re: Please Help! Markov Chain
 
I guess you should only need to show P[X_{n+1} = x_{n+1} | X_0 = x_0, X_1 = x_1, ..., X_n = x_n] = P[X_{n+1} = x_{n+1} | X_n = x_n] (the Markov property).

In your case it holds, since X_{n+1} = G(X_n, Y_{n+1}) is a function only of X_n and Y_{n+1}, and the Y_i are independent of each other and of X_0, ..., X_n. Hence P[X_{n+1} = x_{n+1} | X_n = x_n] = P[G(x_n, Y) = x_{n+1}], where Y is a uniformly distributed random variable on [0, 1]. Equivalently, the transition matrix has entries p_{ij} = P[G(i, Y) = j], i.e. the Lebesgue measure of the set {y in [0, 1] : G(i, y) = j}.
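
To see the formula in action, here is a rough Monte Carlo check in Python that estimates p_{ij} = P[G(i, Y) = j] by repeated sampling; the specific G is the same kind of made-up example as in the first post, not something given by the problem:

Code:

import random
from collections import Counter

# Hypothetical G again: a lazy reflecting random walk on I = {0, 1, 2, 3, 4}.
def G(i, y):
    if y < 1/3:
        return max(i - 1, 0)
    if y < 2/3:
        return min(i + 1, 4)
    return i

# Estimate row i of the transition matrix, p_ij = P[G(i, Y) = j],
# from many independent draws of Y ~ Uniform[0, 1].
def estimate_row(i, trials=100_000):
    counts = Counter(G(i, random.random()) for _ in range(trials))
    return {j: counts[j] / trials for j in sorted(counts)}

print(estimate_row(2))   # roughly {1: 0.333, 2: 0.333, 3: 0.333}
print(estimate_row(0))   # roughly {0: 0.667, 1: 0.333}

Each estimated row sums to 1, as a row of a stochastic matrix should.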

