
Proving a process is Markovian

  1. Mar 14, 2010 #1
    Hi
    In van Kampen's "Stochastic Processes in Physics and Chemistry" it says (page 79) that any two given non-negative functions p_1(y,t) and p_{1|1}(y2,t2 | y1,t1) that satisfy
    1. the Chapman-Kolmogorov equation:
       p_{1|1}(y3,t3 | y1,t1) = ∫ p_{1|1}(y3,t3 | y2,t2) p_{1|1}(y2,t2 | y1,t1) dy2, and
    2. p_1(y2,t2) = ∫ p_{1|1}(y2,t2 | y1,t1) p_1(y1,t1) dy1
    uniquely define a Markov process. Unfortunately, it doesn't provide a proof.
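    If I understand the claim correctly, the two functions are supposed to determine the entire hierarchy of joint distributions; assuming the time ordering t1 ≤ t2 ≤ ... ≤ tn, the n-time joint distribution would be built as
    [tex]
    P_n(y_1,t_1;\dots;y_n,t_n) \;=\; p_1(y_1,t_1)\,\prod_{k=2}^{n} p_{1|1}(y_k,t_k \mid y_{k-1},t_{k-1}),
    [/tex]
    and conditions 1 and 2 are presumably what make this hierarchy consistent when intermediate variables are integrated out.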

    My question is, how can this be proved? I have been trying to find a way to get to the next probability function, p_{1|2}(y3,t3 | y1,t1; y2,t2), in order to show that the Markov property holds, and I just can't find a way to do it.

    Thanks
     
  3. Mar 17, 2010 #2
    Compare the distribution functions: they are multiplicative in the equations you've quoted. That factorization across successive times is what makes the process 'forgetful', i.e. Markov.
    For a complete proof, verify that the equations define a Markov process in the discrete case. One can then pass over to step functions and to measurable functions.
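    For instance, a quick numerical version of that discrete check (the state-space size and the particular transition matrices below are made up, just for illustration):
    [code]
import numpy as np

# Made-up finite-state example: build the three-time joint distribution from a
# one-time marginal and one-step conditionals, then check the Markov property.
rng = np.random.default_rng(0)
n = 4  # number of states (arbitrary)

def random_stochastic(n):
    """Random column-stochastic matrix: entry [j, i] = p(j at later time | i at earlier time)."""
    m = rng.random((n, n))
    return m / m.sum(axis=0, keepdims=True)

p1 = rng.random(n); p1 /= p1.sum()   # marginal at t1
T21 = random_stochastic(n)           # p_{1|1} for t1 -> t2
T32 = random_stochastic(n)           # p_{1|1} for t2 -> t3

# Condition 1 (Chapman-Kolmogorov) holds by construction if the t1 -> t3
# conditional is defined as the composition of the two steps.
T31 = T32 @ T21
# Condition 2: the marginal at t2 is obtained by propagating the marginal at t1.
p2 = T21 @ p1

# Three-time joint built as the product hierarchy
#   P3(y1, y2, y3) = p1(y1) * p_{1|1}(y2 | y1) * p_{1|1}(y3 | y2)
P3 = np.einsum('i,ji,kj->ijk', p1, T21, T32)   # axes: (y1, y2, y3)
assert np.isclose(P3.sum(), 1.0)

# Markov property: P(y3 | y1, y2) equals p_{1|1}(y3 | y2), independent of y1.
P12 = P3.sum(axis=2)                 # joint of (y1, y2)
cond = P3 / P12[:, :, None]          # P(y3 | y1, y2)
print(np.allclose(cond, T32.T[None, :, :]))       # True

# Integrating out y2 reproduces the (y1, y3) joint built from T31 -- this is
# exactly where Chapman-Kolmogorov enters.
print(np.allclose(P3.sum(axis=1), (T31 * p1).T))  # True

# The y2 marginal of P3 agrees with condition 2.
print(np.allclose(P3.sum(axis=(0, 2)), p2))       # True
    [/code]
    The same bookkeeping, with integrals in place of the sums, is essentially what the continuous-state proof has to formalize.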
     