Proving a process is Markovian

In summary, the thread discusses Van Kampen's claim that two non-negative functions satisfying the Chapman-Kolmogorov equation and the consistency condition for the one-point distribution uniquely define a Markov process. The book gives no proof of this claim, and the original poster asks how to prove it. The reply suggests comparing the distribution functions and verifying the equations in the discrete case in order to establish the Markov property.
  • #1
znbhckcs
Hi
In Van Kampen's "Stochastic Processes in Physics and Chemistry" it says (page 79) that any two given non-negative functions $p_1(y_1,t_1)$ and $p_{1|1}(y_2,t_2\,|\,y_1,t_1)$ that satisfy:
1. the Chapman-Kolmogorov equation,
$$p_{1|1}(y_3,t_3\,|\,y_1,t_1)=\int p_{1|1}(y_3,t_3\,|\,y_2,t_2)\,p_{1|1}(y_2,t_2\,|\,y_1,t_1)\,dy_2,$$
2. the consistency condition,
$$p_1(y_2,t_2)=\int p_{1|1}(y_2,t_2\,|\,y_1,t_1)\,p_1(y_1,t_1)\,dy_1,$$
define a Markov process uniquely. But unfortunately it doesn't provide a proof.

My question is: how can you prove it? I have been trying to find a way to obtain the next conditional probability, $p_{1|2}(y_3,t_3\,|\,y_2,t_2;\,y_1,t_1)$, in order to show that the Markov property holds, and I just can't find a way to do it.

Thanks
 
  • #2
Compare the distribution functions: they are multiplicative in the equations you've quoted. That factorization over successive times is exactly what makes the process 'forgetful', i.e. Markov.
For a complete proof, verify that the equations define a Markov process in the discrete case. One can then pass over to step functions and to measurable functions.
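(Editorial sketch of the construction hinted at above, not part of the original reply; it assumes $t_1<t_2<t_3$ and Van Kampen's notation.) One can define the joint distributions directly from the two given functions,
$$p_2(y_2,t_2;y_1,t_1)=p_{1|1}(y_2,t_2\,|\,y_1,t_1)\,p_1(y_1,t_1),$$
$$p_3(y_3,t_3;y_2,t_2;y_1,t_1)=p_{1|1}(y_3,t_3\,|\,y_2,t_2)\,p_{1|1}(y_2,t_2\,|\,y_1,t_1)\,p_1(y_1,t_1),$$
and similarly for higher $n$. Condition 2 guarantees that integrating $p_2$ over $y_1$ returns $p_1(y_2,t_2)$, and the Chapman-Kolmogorov equation (condition 1) guarantees that integrating $p_3$ over the intermediate variable $y_2$ returns $p_2(y_3,t_3;y_1,t_1)$, so the hierarchy is self-consistent. The Markov property then follows by construction:
$$p_{1|2}(y_3,t_3\,|\,y_2,t_2;y_1,t_1)=\frac{p_3(y_3,t_3;y_2,t_2;y_1,t_1)}{p_2(y_2,t_2;y_1,t_1)}=p_{1|1}(y_3,t_3\,|\,y_2,t_2).$$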
 

What is a Markovian process?

A Markovian process is a stochastic process in which the future state of the system depends only on the current state, and not on any previous states. This means that the system has no memory and only the current state is relevant in predicting future states.
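One common way to write this property, in the notation used in the thread above (for ordered times $t_1<t_2<\dots<t_n$):
$$p_{1|n-1}(y_n,t_n\,|\,y_{n-1},t_{n-1};\dots;y_1,t_1)=p_{1|1}(y_n,t_n\,|\,y_{n-1},t_{n-1}).$$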

How do you prove that a process is Markovian?

To prove that a process is Markovian, you must show that it satisfies the Markov property: the probability of transitioning to a new state depends only on the current state, not on any earlier states. In practice this is done analytically, by showing that the conditional distributions factor as above, or empirically, by checking that conditioning on additional past states does not change the estimated transition probabilities.
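As an illustration of the empirical route (an editorial sketch, not from the original thread; the three-state chain, its transition matrix, and the trajectory length are invented for the example), one can simulate a chain and compare transition frequencies conditioned on one past state with those conditioned on two:

```python
import numpy as np

# Sketch: empirically check the Markov property for a simulated 3-state chain.
rng = np.random.default_rng(0)
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])   # true one-step transition matrix

# simulate a long trajectory
n_steps = 200_000
x = np.empty(n_steps, dtype=int)
x[0] = 0
for t in range(1, n_steps):
    x[t] = rng.choice(3, p=P[x[t - 1]])

# estimate P(next | current)
counts1 = np.zeros((3, 3))
for t in range(n_steps - 1):
    counts1[x[t], x[t + 1]] += 1
P1 = counts1 / counts1.sum(axis=1, keepdims=True)

# estimate P(next | current, previous); for a Markov chain this should not
# depend on the previous state, up to sampling noise
counts2 = np.zeros((3, 3, 3))
for t in range(1, n_steps - 1):
    counts2[x[t - 1], x[t], x[t + 1]] += 1
P2 = counts2 / counts2.sum(axis=2, keepdims=True)

print("max |P(next|cur,prev) - P(next|cur)| =",
      np.max(np.abs(P2 - P1[None, :, :])))
```

For a genuinely Markovian chain the printed difference shrinks as the trajectory gets longer; a persistent gap would indicate memory beyond the current state.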

What are some examples of Markovian processes?

Some examples of Markovian processes include the random walk, the Poisson process, and Brownian motion; the Kalman filter, for instance, relies on an underlying Markovian state-space model. Many natural and man-made systems are also modeled as approximately Markovian, such as stock prices, weather patterns, and traffic flow.
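A minimal illustration of two of these examples (editorial sketch; the rate and lengths are arbitrary). In both cases each new value is generated from the current state alone:

```python
import numpy as np

rng = np.random.default_rng(1)

# simple symmetric random walk: next position = current position +/- 1
steps = rng.choice([-1, 1], size=1000)
walk = np.concatenate(([0], np.cumsum(steps)))

# Poisson process with rate lam: exponential waiting times between events,
# so the time to the next event does not depend on past events
lam = 2.0
waits = rng.exponential(1.0 / lam, size=1000)
event_times = np.cumsum(waits)

print("final walk position:", walk[-1])
print("events in [0, 10]:", np.searchsorted(event_times, 10.0))
```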

What is the importance of proving that a process is Markovian?

Proving that a process is Markovian is important because it allows for simpler and more efficient modeling and analysis of complex systems. It also helps in making accurate predictions about the future behavior of the system, which can be useful in decision-making processes.

What are the limitations of Markovian processes?

Markovian processes have the limitation of assuming that the future state of the system depends only on the current state. This may not hold in real-world scenarios, where external factors and past events can also influence the future behavior of a system. Additionally, the simplest Markov models (Markov chains) assume a discrete set of states, which may not be suitable for some continuous systems, although continuous-state Markov processes such as diffusions also exist.
