Proving a process is Markovian

  • Thread starter znbhckcs
  • #1
Hi
In Van Kampen's "Stochastic Processes in Physics and Chemistry" it says (page 79) that any two non-negative functions p1(y1,t1) and p1|1(y2,t2 | y1,t1) that satisfy:
1. the Chapman-Kolmogorov equation: p1|1(y3,t3 | y1,t1) = ∫ p1|1(y3,t3 | y2,t2) p1|1(y2,t2 | y1,t1) dy2, and
2. p1(y2,t2) = ∫ p1|1(y2,t2 | y1,t1) p1(y1,t1) dy1
uniquely define a Markov process. But unfortunately it doesn't provide a proof.

My question is, how can you prove it? I have been trying to find a way to get to the next conditional probability, p1|2(y3,t3 | y2,t2; y1,t1), in order to show that the Markov property holds, and I just can't find a way to do it.

Thanks
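For concreteness, the hierarchy one is trying to build from p1 and p1|1 can be written out via the standard Kolmogorov-extension route (a proof sketch, not Van Kampen's own wording):

```latex
% Define the n-time joint distributions (for t_1 < t_2 < \dots < t_n) by the product formula
p_n(y_n,t_n;\dots;y_1,t_1)
  = \left[\prod_{k=2}^{n} p_{1|1}(y_k,t_k \mid y_{k-1},t_{k-1})\right] p_1(y_1,t_1).
% Condition 2 makes this family consistent when the earliest time is integrated out;
% the Chapman-Kolmogorov equation (condition 1) makes it consistent when any
% intermediate time is integrated out, e.g.
\int p_{1|1}(y_3,t_3 \mid y_2,t_2)\, p_{1|1}(y_2,t_2 \mid y_1,t_1)\, dy_2
  = p_{1|1}(y_3,t_3 \mid y_1,t_1),
% so by the Kolmogorov extension theorem the hierarchy defines a stochastic process.
% That process is Markov by construction:
p_{1|2}(y_3,t_3 \mid y_2,t_2; y_1,t_1)
  = \frac{p_3(y_3,t_3; y_2,t_2; y_1,t_1)}{p_2(y_2,t_2; y_1,t_1)}
  = p_{1|1}(y_3,t_3 \mid y_2,t_2).
```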
 

Answers and Replies

  • #2
Compare the distribution functions: they are multiplicative in the equations you've quoted. This factorization of the joint probabilities over time means the process is 'forgetful', i.e. Markov.
For a complete proof, verify that the equations define a Markov process in the discrete case. One can then pass over to step functions and then to measurable functions.
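As a sanity check on the discrete case mentioned above, here is a minimal sketch. The 3-state transition matrix and initial distribution are made-up illustration values (not from the thread); it verifies both conditions and that the joint distribution built purely from p1 and p1|1 is consistent with them:

```python
import numpy as np

# Hypothetical 3-state one-step transition matrix, T[i, j] = p1|1(j | i)
T = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# Condition 1 (Chapman-Kolmogorov, discrete form): the two-step transition
# probability is obtained by summing over the intermediate state y2,
# which is exactly matrix multiplication.
two_step = np.einsum('ij,jk->ik', T, T)   # explicit sum over y2
assert np.allclose(two_step, T @ T)

# Condition 2: the one-time distribution propagates as p1(t2) = p1(t1) T
# and stays normalized.
p1 = np.array([0.2, 0.5, 0.3])
p1_next = p1 @ T
assert np.isclose(p1_next.sum(), 1.0)

# Build the three-time joint distribution from p1 and the transitions alone,
# i.e. the Markov construction p3[i, j, k] = p1[i] * T[i, j] * T[j, k].
p3 = p1[:, None, None] * T[:, :, None] * T[None, :, :]

# Marginalizing out the middle time recovers the two-step (Chapman-Kolmogorov)
# law, so the hierarchy is consistent.
assert np.allclose(p3.sum(axis=1), p1[:, None] * two_step)
print("discrete-case consistency checks passed")
```

The same bookkeeping, with sums replaced by integrals, is what the continuous-state proof has to carry out.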
 
