Thread: What's a Markov Chain?
mathman (Sci Advisor, P: 5,939) — #3, Mar31-07, 07:13 PM
A Markov chain is a sequence of random variables in which the conditional distribution of each state depends only on the immediately preceding state, not on anything that happened before it.
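To make the definition concrete, here is a minimal sketch in Python of a hypothetical two-state chain (the states and transition probabilities are made up for illustration). Note that the next state is sampled using only the current state; the earlier history is never consulted.

```python
import random

# Hypothetical two-state chain: 0 = sunny, 1 = rainy.
# P[i][j] = probability of moving to state j given the current state i.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state):
    """Sample the next state; it depends only on the current state."""
    return 0 if random.random() < P[state][0] else 1

def simulate(start, n):
    """Generate a trajectory of n states starting from `start`."""
    states = [start]
    for _ in range(n - 1):
        states.append(step(states[-1]))
    return states

print(simulate(0, 10))
```

The "memoryless" (Markov) property lives entirely in `step`: it takes the current state as its only argument, so the distribution of the next state cannot depend on anything further back.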