What is the difference between a martingale and a Markov chain? As far as I understand, if a process is a martingale, then the expected future value, given the history, equals the current value of the process, while for a Markov chain the probability distribution of the future value (not just the expected value) depends only on the current value. Are the following true?

1. Martingales are a subset of Markov processes, because there are many Markov processes whose expected future value does not equal their current value.

2. If a martingale is strictly a Markov process, then the only difference is that for a Markov process we specify the probability distribution of the future value given past observations, while for a martingale we specify the expected future value given all past observations.

If I have not managed to explain my confusion clearly, please describe in general terms the main differences between the two.
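To make my question concrete, here is a minimal sketch of the distinction as I understand it (my own illustration, not a definitive characterization): a simple symmetric random walk is both Markov and a martingale, while its square is still Markov but fails the martingale property because its conditional expectation drifts upward.

```python
# Simple symmetric random walk X_n, with steps +1 or -1 each w.p. 1/2.
# It is Markov AND a martingale:
#   E[X_{n+1} | X_n = x] = 0.5*(x+1) + 0.5*(x-1) = x.
# Its square S_n = X_n^2 is Markov but NOT a martingale:
#   E[S_{n+1} | X_n = x] = 0.5*(x+1)^2 + 0.5*(x-1)^2 = x^2 + 1.

def cond_exp_next_walk(x):
    """E[X_{n+1} | X_n = x] for the symmetric walk."""
    return 0.5 * (x + 1) + 0.5 * (x - 1)

def cond_exp_next_square(x):
    """E[X_{n+1}^2 | X_n = x] for the squared walk."""
    return 0.5 * (x + 1) ** 2 + 0.5 * (x - 1) ** 2

for x in [-3, 0, 5]:
    assert cond_exp_next_walk(x) == x            # martingale property holds
    assert cond_exp_next_square(x) == x * x + 1  # upward drift: not a martingale

print("walk: Markov and a martingale; squared walk: Markov but not a martingale")
```

So (if I have this right) being Markov constrains which information the future distribution depends on, while being a martingale constrains what the conditional mean must be.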

Thanks

**Physics Forums - The Fusion of Science and Community**

# Difference between martingale and markov chain
