How to prove a process is Markov?

  • Thread starter vale
In summary, the conversation is about proving whether a given process is Markov. The process follows two dynamics on alternating intervals: a mean-reverting Brownian bridge and a mean-reverting proportional-volatility process. The lengths and occurrences of the intervals are determined by an exogenous bootstrap procedure. The poster wants to know how to prove mathematically that the transition probabilities are independent of past realizations. A possible solution is suggested: discretize the dynamics and show that, since the Brownian increments are independent of the past, the process depends on its history only through the current value.
  • #1
vale
This is my first time here, so... Hi everybody!

I have very little time to figure out the following problem, and I am wondering if some of you can give me any help, or just suggest some good reading material...

The question is how you can prove that a process [tex]P_t[/tex], given its dynamics, is Markov.
In short, my process follows, on alternating intervals, a mean-reverting Brownian bridge, [tex]dP_t = \frac{\alpha}{G-t}(Q-P_t)dt + \sigma dW_t [/tex], and a mean-reverting proportional-volatility process: [tex]dP_t = K(\theta -P_t)dt + \nu dW_t [/tex]. The lengths of the intervals and their occurrences are determined by an exogenous bootstrap procedure, which I believe doesn't cause any problems: being a resampling procedure with replacement, it doesn't generate any dependence on the past history...
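For concreteness, dynamics like these can be simulated with a simple Euler-Maruyama scheme. This is only a sketch: all parameter values below are hypothetical, and a fixed alternation schedule stands in for the exogenous bootstrap. Note that each step uses only the current value, the current time, and the regime flag.

```python
import numpy as np

def euler_maruyama_step(p, t, regime, dt, rng,
                        alpha=1.0, G=10.0, Q=1.0, sigma=0.2,  # bridge params (hypothetical)
                        K=0.5, theta=1.0, nu=0.2):            # reversion params (hypothetical)
    """One Euler-Maruyama step; the drift/volatility switch with the regime flag.

    regime 0: mean-reverting Brownian bridge  dP = alpha/(G-t) (Q-P) dt + sigma dW
    regime 1: mean-reverting process          dP = K (theta-P) dt + nu dW
    """
    dW = rng.normal(0.0, np.sqrt(dt))
    if regime == 0:
        drift, vol = alpha / (G - t) * (Q - p), sigma
    else:
        drift, vol = K * (theta - p), nu
    return p + drift * dt + vol * dW

rng = np.random.default_rng(0)
dt, p, t = 0.01, 0.5, 0.0
path = [p]
# alternate regimes every 100 steps, as a stand-in for the bootstrap schedule
for i in range(500):
    regime = (i // 100) % 2
    p = euler_maruyama_step(p, t, regime, dt, rng)
    path.append(p)
    t += dt
print(len(path), round(path[-1], 4))
```

The horizon here stays well below G, so the bridge drift alpha/(G-t) never blows up; a real implementation would need care near t = G.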

How should I proceed, in your opinion? Any hints?

Thank you very much in advance,
Vale
 
  • #2
Could this be of any help? When you say prove, do you mean empirically or mathematically?
 
  • #3
Thank you for the reference and the reply!

Actually I meant a mathematical proof...
I think I should show somehow that the transition probabilities are independent of the past realizations... but I don't know how to retrieve them from the dynamics... :uhh:

Many thanks...
Vale
 
  • #4
I guess I'd argue W(t) is independent of the past. Then equate Eq. (1) to Eq. (2) and solve for P(t). It'll be a function of t, W(t) and some constants. Since W is independent of the past, so's P(t).

{P.S. Oh, whoops! You said "on alternating intervals." Does that mean the two Eq's do not hold simultaneously?}

{P.P.S. In that case, discretize the dynamics on each interval as:

P(t+1) = P(t) + dP(t) = P(t) + a dt P(t) + b dW(t) = [a dt + 1] P(t) + b dW(t).

Then

E[P(t+1) | P(t), P(t-1), ..., P(0)] = (a dt + 1) E[P(t) | P(t), P(t-1), ..., P(0)] + b E[dW(t) | P(t), P(t-1), ..., P(0)] = (a dt + 1) P(t) + b E[dW(t)]. QED

The last step is based on two premises: (i) E[X | X, Y, Z, ...] = X, and (ii) dW(t) is independent of the past history, so E[dW(t) | P(t), P(t-1), ..., P(0)] = E[dW(t)].}
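The argument above can be sanity-checked numerically: under the discretized update, the conditional mean and variance of P(t+1) are functions of P(t) alone, so two histories ending at the same value yield identical one-step conditional moments. A minimal sketch (the constants a, b, dt are hypothetical):

```python
a, b, dt = -0.5, 0.2, 0.01  # hypothetical linearised drift, volatility, step size

def cond_moments(p_now):
    """Conditional mean and variance of P(t+1) given P(t) = p_now, under
    P(t+1) = (a*dt + 1) * P(t) + b * dW(t), with dW ~ N(0, dt)."""
    mean = (a * dt + 1.0) * p_now
    var = b**2 * dt
    return mean, var

# two 'histories' that end at the same current value but differ earlier
history_A = [0.1, 0.7, 1.3]
history_B = [2.5, -0.4, 1.3]
print(cond_moments(history_A[-1]) == cond_moments(history_B[-1]))
```

Because cond_moments takes only the terminal value, the earlier entries of each history cannot influence the one-step distribution, which is exactly the Markov property for this linear update.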
 

1. What is a Markov process?

A Markov process is a stochastic model that describes a sequence of events where the probability of each event only depends on the current state and not on the past events. It is also known as a Markov chain.

2. How do you prove that a process is Markov?

To prove that a process is Markov, we need to show that the probability of transitioning from one state to another depends only on the current state and not on the previous states. Formally, this means verifying the Markov property: the conditional distribution of the next state given the entire history coincides with the conditional distribution given the current state alone.
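As a concrete empirical illustration of this check, one can simulate a two-state chain and compare transition frequencies conditioned on one versus two past states; for a true Markov chain the extra conditioning changes nothing. The transition matrix below is hypothetical (for a continuous-state process like the one in this thread, the analogous check would bin the state space):

```python
import numpy as np

rng = np.random.default_rng(42)
P = np.array([[0.9, 0.1],   # hypothetical transition matrix
              [0.3, 0.7]])

# simulate a long trajectory of the chain
n = 200_000
x = np.empty(n, dtype=int)
x[0] = 0
for t in range(1, n):
    x[t] = rng.random() < P[x[t - 1], 1]

def cond_prob(prev, cur=1, nxt=1):
    """Empirical P(X_{t+1}=nxt | X_t=cur, X_{t-1}=prev)."""
    mask = (x[:-2] == prev) & (x[1:-1] == cur)
    return (x[2:][mask] == nxt).mean()

# both estimates should be close to P[1, 1] = 0.7, regardless of X_{t-1}
print(round(cond_prob(0), 3), round(cond_prob(1), 3))
```

If the two estimates differed systematically, the extra past state would carry information and the process would not be Markov.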

3. What are the key elements of a Markov process?

The key elements of a Markov process are the states, transition probabilities, and initial state distribution. The states represent the possible outcomes of the process, the transition probabilities determine the likelihood of moving from one state to another, and the initial state distribution determines the probability of starting in each state.

4. Can a process be both Markov and non-Markov?

No, a process cannot be both Markov and non-Markov. A process is either Markov or not, depending on whether it satisfies the Markov property or not. If the Markov property is satisfied, the process is considered Markov, and if not, it is considered non-Markov.

5. What are some applications of Markov processes?

Markov processes have various applications in different fields, such as finance, economics, biology, and computer science. They are used to model and analyze various systems, including stock prices, weather patterns, population dynamics, and machine learning algorithms.
