Proof that a stochastic process isn't a Markov Process

In summary: the process is not a Markov process. To prove that, the original poster needs a counterexample showing that P(X_{3(m-1)+3}=1|X_{3(m-1)+2}=1 \mbox{ and }X_{3(m-1)+1}=1) does not equal P(X_{3(m-1)+3}=1|X_{3(m-1)+2}=1); the Chapman-Kolmogorov equations are then verified separately by plugging in the conditional probabilities, all of which equal 1/2.
  • #1
gesteves
I've been trying to solve this problem for a week now, but haven't been able to. Basically I need to prove that a certain process satisfies Chapman-Kolmogorov equations, yet it isn't a Markov Process (it doesn't satisfy the Markovian Property).

I attached the problem as a .doc below.

Please, I really need a little help here.
 

Attachments

  • problem.doc
  • #2
Wong
hi gesteves!

I read your question, and I think it is readily seen not to be Markov, because [tex]P(X_{3(m-1)+3}=1|X_{3(m-1)+2}=1 \mbox{ and }X_{3(m-1)+1}=1)[/tex] does not equal [tex]P(X_{3(m-1)+3}=1|X_{3(m-1)+2}=1)[/tex]. In other words, since [tex]X_{3(m-1)+1}, X_{3(m-1)+2}, X_{3(m-1)+3}[/tex] all give information about the *same draw* from the mth box, these variables are most probably not independent, and the proof should take this into account.

Also note that [tex]X_{3(n-1)+i}[/tex] and [tex]X_{3(m-1)+j}[/tex], [tex]1\leq i, j\leq 3[/tex], are independent when m and n are different, as they correspond to different draws. It follows that [tex]P(X_{3(n-1)+i}=l|X_{3(m-1)+j}=k) = P(X_{3(n-1)+i}=l) = \frac{1}{2}[/tex] for [tex]0\leq l,k\leq 1[/tex] and [tex]m \neq n[/tex].

As for the case when m and n are the same, it is necessary to calculate the probabilities explicitly. Remarkably, you will find that [tex]P(X_{3(m-1)+i}=l|X_{3(m-1)+j}=k)=\frac{1}{2}[/tex] for [tex]1\leq i,j\leq 3[/tex], [tex]i\neq j[/tex], [tex]0\leq l,k \leq 1[/tex]. For example, [tex]P(X_{3(1-1)+2}=1|X_{3(1-1)+1}=1)=P(\mbox{1 or 2 in the first draw}|\mbox{1 or 4 in the first draw}) = \frac{1}{2}[/tex].

Since all the conditional probabilities involved equal 1/2, I think the Chapman-Kolmogorov assertion holds.
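
Since the attached problem.doc is not reproduced in the thread, here is a minimal enumeration sketch of the check described above, under an assumed single-draw model: a ticket labelled 1 to 4 is drawn uniformly at random from the mth box, and the three indicators of that draw correspond to the ticket sets {1, 4}, {1, 2} and {2, 4}. The first two sets come from the example above; the third is an assumption chosen to be consistent with the probabilities quoted in this thread (posts #2 and #3).

[code]
from fractions import Fraction
from itertools import product

# Hypothetical single-draw model (the problem.doc is not reproduced here):
# a ticket labelled 1..4 is drawn uniformly at random from the m-th box, and
#   X_{3(m-1)+1} = 1  iff  the ticket is 1 or 4,
#   X_{3(m-1)+2} = 1  iff  the ticket is 1 or 2,
#   X_{3(m-1)+3} = 1  iff  the ticket is 2 or 4   (this set is an assumption).
TICKETS = [1, 2, 3, 4]
SETS = [{1, 4}, {1, 2}, {2, 4}]

def cond(i, l, j, k):
    """P(X_i = l | X_j = k) within one draw, computed by counting tickets."""
    given = [t for t in TICKETS if (t in SETS[j]) == bool(k)]
    hits = sum((t in SETS[i]) == bool(l) for t in given)
    return Fraction(hits, len(given))

# Every pairwise conditional probability within a draw equals 1/2,
# exactly as claimed above.
for i, j in product(range(3), repeat=2):
    if i == j:
        continue
    for l, k in product((0, 1), repeat=2):
        assert cond(i, l, j, k) == Fraction(1, 2)
print("All pairwise conditional probabilities within a draw are 1/2.")
[/code]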
 
  • #3
Hi Wong,

Thanks for your quick reply! If I understood correctly, all I need to prove that it isn't a Markov Process is a counterexample that shows that [tex]P(X_{3(m-1)+3}=1|X_{3(m-1)+2}=1 \mbox{ and }X_{3(m-1)+1}=1)[/tex] doesn't equal [tex]P(X_{3(m-1)+3}=1|X_{3(m-1)+2}=1)[/tex]. For m = 1, [tex]P(X_{3}=1|X_{2}=1 \mbox{ and }X_{1}=1) = 0[/tex] and [tex]P(X_{3}=1|X_{2}=1)=1/2[/tex]. Therefore it isn't a Markov Process.
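
Under the same assumed ticket model as in the sketch in post #2 (X_1, X_2, X_3 indicating a first-draw ticket in {1, 4}, {1, 2} and {2, 4}; the third set is an assumption, since the attached problem is not reproduced here), this counterexample can be checked by direct enumeration:

[code]
from fractions import Fraction

# Same assumed first-draw model as in the earlier sketch: a ticket 1..4 is
# drawn uniformly, X_1 = [ticket in {1,4}], X_2 = [ticket in {1,2}],
# X_3 = [ticket in {2,4}] (the last set is an assumption).
TICKETS = [1, 2, 3, 4]
X1, X2, X3 = {1, 4}, {1, 2}, {2, 4}

def cond_prob(target, *given):
    """P(ticket in target | ticket in all of `given`), by counting."""
    allowed = [t for t in TICKETS if all(t in g for g in given)]
    return Fraction(sum(t in target for t in allowed), len(allowed))

print(cond_prob(X3, X2, X1))  # 0   -- P(X_3=1 | X_2=1, X_1=1)
print(cond_prob(X3, X2))      # 1/2 -- P(X_3=1 | X_2=1), so the process is not Markov
[/code]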

But how can I prove that it satisfies Chapman-Kolmogorov? I'll try to prove it on my own, but I could use some pointers.

Thanks in advance.
 
  • #4
Wong
Yes, gesteves, you got the non-Markov part.

As for the Chapman-Kolmogorov part, you may first think of the form of the equation. If I am not mistaken, the Chapman-Kolmogorov equation says that [tex]P(X_{m+n+l}=i|X_{l}=j) = \sum_{k}P(X_{m+n+l}=i|X_{m+l}=k)P(X_{m+l}=k|X_{l}=j)[/tex]. In my first post, I already gave you the various conditional probabilities for the equation. You may just "plug in" and see whether the LHS agrees with the RHS.
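
For instance, plugging in 1/2 for every conditional probability (as computed in post #2), for positive m and n both sides reduce to the same value: [tex]\sum_{k=0}^{1}P(X_{m+n+l}=i|X_{m+l}=k)\,P(X_{m+l}=k|X_{l}=j) = \tfrac{1}{2}\cdot\tfrac{1}{2} + \tfrac{1}{2}\cdot\tfrac{1}{2} = \tfrac{1}{2} = P(X_{m+n+l}=i|X_{l}=j)[/tex].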
 
  • #5
I finally finished it. Thanks for all your help.
 

1) What is a stochastic process?

A stochastic process is a collection of random variables indexed by time (or another parameter) that models the evolution of a random quantity. Stochastic processes are used to model real-world phenomena such as stock prices or weather patterns.

2) What is a Markov process?

A Markov process (called a Markov chain when the state space is discrete) is a type of stochastic process in which the future state of the system depends only on its current state, not on any previous states. This is the Markov property, also described as memorylessness.
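
For a discrete-time process this is usually written as [tex]P(X_{n+1}=j|X_{n}=i_{n}, X_{n-1}=i_{n-1}, \ldots, X_{0}=i_{0}) = P(X_{n+1}=j|X_{n}=i_{n})[/tex] for all times n and all states.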

3) How can you prove that a stochastic process isn't a Markov process?

To prove that a stochastic process is not a Markov process, you demonstrate that the future state of the system depends on more than just its current state. Concretely, you exhibit a counterexample: a time and a set of states for which the conditional distribution of the next state changes when earlier states are conditioned on as well, as in post #3 above.
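
As a generic illustration (the toy process below is made up for this answer; it is not the process from the thread), one way to exhibit such a dependence is to simulate the process and compare conditional frequencies:

[code]
import random

random.seed(0)

# A made-up binary process that is deliberately NOT Markov:
# X_{n+1} = 1 with probability 0.9 if X_n == X_{n-1}, and with probability 0.1 otherwise.
def simulate(n_steps):
    xs = [0, 1]
    for _ in range(n_steps):
        p = 0.9 if xs[-1] == xs[-2] else 0.1
        xs.append(1 if random.random() < p else 0)
    return xs

xs = simulate(200_000)

def estimate(cur, prev=None):
    """Estimate P(X_{n+1}=1 | X_n=cur [, X_{n-1}=prev]) by counting transitions."""
    hits = total = 0
    for n in range(1, len(xs) - 1):
        if xs[n] == cur and (prev is None or xs[n - 1] == prev):
            total += 1
            hits += xs[n + 1]
    return hits / total

# If the process were Markov, conditioning on X_{n-1} would not change anything;
# here the estimates clearly differ, so the Markov property fails.
print(estimate(cur=1))          # about 0.5
print(estimate(cur=1, prev=1))  # about 0.9
print(estimate(cur=1, prev=0))  # about 0.1
[/code]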

4) What are some common examples of stochastic processes that are not Markov processes?

Many real-world phenomena, such as stock prices, weather patterns, and biological systems, are usually not well modeled as Markov processes, because their evolution depends on more of their history and on external factors than the current state alone can capture.

5) How important is it to accurately model a process as a Markov process?

Modeling a process as a Markov process is valuable when the Markov property (at least approximately) holds, because it greatly simplifies the mathematical analysis and can provide insight into the behavior of the system. When the Markov property does not hold, a model that accounts for the relevant history or external factors is needed to predict the system's behavior accurately.
