mathman · #3 · Mar31-07, 07:13 PM
A Markov chain is a sequence of random variables in which the conditional distribution of each state, given the entire past, depends only on the immediately preceding state and not on what happened before that. This "memoryless" condition is called the Markov property.
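To make the memoryless property concrete, here is a minimal sketch (not from the original post) of a two-state weather chain with a hypothetical transition matrix. Note that the sampling function receives only the current state; the rest of the history plays no role.

```python
import random

# Hypothetical transition probabilities: P[current] lists (next_state, prob).
P = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Sample the next state using only the current state (Markov property)."""
    r = random.random()
    cumulative = 0.0
    for nxt, prob in P[state]:
        cumulative += prob
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding at the boundary

def simulate(start, n):
    """Generate a chain of n states starting from `start`."""
    chain = [start]
    for _ in range(n - 1):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 10))
```

Each call to `step` conditions only on its argument, so the sequence produced by `simulate` satisfies the defining property above: the distribution of state k+1 given states 1..k equals its distribution given state k alone.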