How to prove a process is Markov?
#1
Aug 3, 2005, 11:15 AM

P: 4

This is my first time here, so... Hi everybody!
I've very little time to figure out the following problem ... and Im wandering if some of you can give me any help or just suggest me any good reading material... The question is how you can prove a process [tex] P_t[/tex], given the dynamics, is Markov. In short my process is on alternate intervals, a mean reverting brownian bridge [tex]dP_t = \frac{\alpha}{Gt}(QP_t)dt + \sigma dW_t [/tex], and a mean reverting proportional volatility process : [tex]dP_t = K(\theta P_t)dt + \nu dW_t [/tex]. The length of the intervals and their occurence is determined by an exogenous bootstrap procedure, which I believe, doesn't give any problems, being a resampling procedure with replacement, it doesn't generate any dependence with the past history... How should I procede on your opinion? Any hints ? Thank you very much in advance, Vale 


#2
Aug 3, 2005, 03:38 PM

Sci Advisor
HW Helper
P: 2,483

Could this be of any help? When you say prove, do you mean empirically or mathematically?



#3
Aug 4, 2005, 04:44 AM

P: 4

Thank you for the reference and the reply!
Actually, I meant a mathematical proof... I think I should somehow show that the transition probabilities are independent of the past realizations, but I don't know how to derive them from the dynamics. Many thanks... Vale


#4
Aug 4, 2005, 09:54 AM

Sci Advisor
HW Helper
P: 2,483

I guess I'd argue W(t) is independent of the past. Then equate Eq. (1) to Eq. (2) and solve for P(t). It'll be a function of t, W(t) and some constants. Since W is independent of the past, so's P(t).
{P.S. Oh, whoops! You said "on alternating intervals." Does that mean the two Eq's do not hold simultaneously?} {P.P.S. In that case: P(t+1) = P(t) + dP(t) = P(t) + a(dt) P(t) + b dW(t) = [a(dt)+1] P(t) + b dW(t). Then E[P(t+1) | P(t), P(t-1), ..., P(0)] = [a(dt)+1] E[P(t) | P(t), P(t-1), ..., P(0)] + b E[dW(t) | P(t), P(t-1), ..., P(0)] = [a(dt)+1] P(t) + b E[dW(t)]. QED. The last step is based on two premises: (i) E[X | X, Y, Z, ...] = X, and (ii) dW(t) is independent of the past history, so E[dW(t) | P(t), P(t-1), ..., P(0)] = E[dW(t)].}
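The P.P.S. argument can also be checked numerically. The sketch below (hypothetical coefficients a and b, and a simple Euler discretization P(t+1) = [a·dt+1] P(t) + b dW(t), purely for illustration) simulates two ensembles with very different histories and verifies that the conditional mean of P(t+1) given P(t) matches [a·dt+1]·P(t) in both, i.e. it doesn't care where P(0) was:

```python
import numpy as np

rng = np.random.default_rng(1)
a, b, dt = -0.5, 0.3, 0.01           # hypothetical coefficients
n_steps, n_paths = 50, 200_000

def simulate(p0):
    # P(t+1) = (a*dt + 1) P(t) + b dW(t), applied to a whole ensemble at once
    p = np.full(n_paths, p0, dtype=float)
    hist = [p.copy()]
    for _ in range(n_steps):
        p = (a * dt + 1) * p + b * rng.normal(0.0, np.sqrt(dt), n_paths)
        hist.append(p.copy())
    return np.array(hist)

ens_lo = simulate(0.0)   # ensemble whose history starts at P(0) = 0
ens_hi = simulate(0.5)   # ensemble whose history starts at P(0) = 0.5

# Condition on P(t) landing in a narrow bin; the mean of P(t+1) should be
# (a*dt + 1) * P(t) in BOTH ensembles, regardless of where P(0) was.
t = 30
for ens in (ens_lo, ens_hi):
    mask = np.abs(ens[t] - 0.25) < 0.02
    cond_mean = ens[t + 1][mask].mean()
    predicted = (a * dt + 1) * ens[t][mask].mean()
    print(cond_mean, predicted)   # the two columns agree in both ensembles
```

The agreement holds exactly because the fresh increment dW(t) is drawn independently of everything that came before, which is premise (ii) above.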

