Difference between martingale and markov chain


by ait.abd
Tags: chain, difference, markov, martingale
ait.abd
#1
Mar2-11, 03:00 PM
P: 25
What is the difference between a martingale and a Markov chain? As far as I can tell, if a process is a martingale, then the expected future value depends on the current value of the process, while in a Markov chain the probability distribution of the future value (not just the expected value) depends only on the current value. Are the following true?
1. Martingales are a subset of Markov processes, because there can be many Markov processes whose expected future value is not equal to the current value.
2. If a martingale is strictly a Markov process, then the only difference is that in a Markov process we relate the probability of a future value to past observations, while in a martingale we relate the expected future value to all past observations.

If I have not explained my confusion clearly, please elaborate generally on the main differences between the two.

Thanks
mathman
#2
Mar2-11, 03:58 PM
Sci Advisor
P: 5,942
You seem to have the correct idea. A martingale is a special kind of Markov process. As you appear to understand, the distribution of the future of a Markov process depends only on the current state and is independent of previous states. Also, as you know, a martingale includes in its definition that the expected future value is the current value.
The first of your two statements is true. The second is a little ambiguous: "past" in the definition should mean the latest known state, not anything before that.
mathman
#3
Mar3-11, 04:08 PM
Sci Advisor
P: 5,942
I looked at the definition of a martingale carefully, and it seems to me that it does NOT have to be a Markov process. For example, consider a sequence of random variables, each of which has a normal distribution with some mean and variance. To be a martingale, the mean of each variable has to be the value of the previous variable. However, the variance could depend on the entire sequence up to that point, so the process would not be Markov.
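A quick numerical sketch of this construction (my own illustration, not part of the original post, with an arbitrary history-dependent volatility rule): each step is normal with mean equal to the current value, but its standard deviation depends on the running maximum of the path, so the process is a martingale without being Markov.

```python
import random
import statistics

random.seed(0)

def simulate_path(n_steps=50, x0=0.0):
    # X_{k+1} ~ Normal(mean=X_k, sd=s_k): the conditional mean is the
    # current value, so the process is a martingale by construction.
    xs = [x0]
    running_max = x0
    for _ in range(n_steps):
        # The volatility s_k depends on the running maximum of the whole
        # path (a hypothetical rule chosen just for illustration), so the
        # law of the next step is NOT a function of the current value alone.
        s = 0.2 if running_max > 0.1 else 0.1
        xs.append(xs[-1] + random.gauss(0.0, s))
        running_max = max(running_max, xs[-1])
    return xs

# If the martingale property holds, E[X_n] stays at X_0 = 0 for every n.
paths = [simulate_path() for _ in range(20000)]
means = [statistics.fmean(p[n] for p in paths) for n in range(51)]
drift = max(abs(m) for m in means)
print(f"largest deviation of the sample mean from X_0: {drift:.4f}")
```

Averaged over many paths, the sample mean of X_n stays close to the starting value at every step, even though two paths at the same current value can have different next-step distributions depending on their histories.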

BWV
#4
Mar3-11, 04:16 PM
P: 328



A martingale can have "memory": take a Brownian motion with stochastic, autoregressive variance (i.e., a GARCH model). That would not be a Markov process, but it could still be a martingale.
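For concreteness, a minimal discrete-time sketch of that idea (my own illustration, not from the post): GARCH(1,1)-style increments with conditional mean zero, so the running sum is a martingale, while its conditional variance depends on past returns that the current value alone does not determine.

```python
import random
import statistics

random.seed(1)

def garch_sum(n_steps=200, omega=0.05, alpha=0.1, beta=0.85):
    # Increments r_k = sigma_k * z_k with z_k ~ N(0, 1) and
    # sigma_k^2 = omega + alpha * r_{k-1}^2 + beta * sigma_{k-1}^2.
    # E[r_k | past] = 0, so X_n = r_1 + ... + r_n is a martingale,
    # but X_n alone is not Markov: the next variance depends on the
    # last return, which X_n does not determine.  (The pair
    # (X_n, sigma_n^2) IS Markov.)
    var = omega / (1.0 - alpha - beta)  # start at the long-run variance
    x, r = 0.0, 0.0
    for _ in range(n_steps):
        var = omega + alpha * r * r + beta * var
        r = random.gauss(0.0, var ** 0.5)
        x += r
    return x

# The martingale property implies E[X_200] = X_0 = 0.
finals = [garch_sum() for _ in range(10000)]
print(f"sample mean of X_200: {statistics.fmean(finals):.3f}")
```

The parameter values here are arbitrary (chosen so alpha + beta < 1, keeping the variance stationary); the point is only that zero conditional mean gives the martingale property while the volatility recursion carries memory of the path.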

