# A Markov Chain as a function of dimensions

1. Aug 10, 2017

### FallenApple

Here is an animation I created in R.

I built this Markov chain of order 50 by correlating one of the coordinates over time while randomly varying the rest. Is there an explanation for the clustering and the flattening out as the dimension of the vector space increases? Is it because the data becomes more spread out in higher dimensions?

But that doesn't explain why the clusters themselves do not spread out, or why other clusters condense. I've run this for much larger dimensions, and it seems to reach a steady state.

The plot shows the incremental changes in the Euclidean metric versus the input, so I don't know whether viewing the data as extremely spread out in a higher-dimensional space would translate to this plot.

Does this mean that the correlation I induced in that coordinate is strong enough to keep the cluster together regardless of how high the dimension is?
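The original R code isn't shown, so here is a minimal Python sketch of one plausible reading of the setup (one coordinate given temporal correlation via an AR(1) update, the rest redrawn independently each step). It illustrates why increments of the Euclidean norm flatten out as the dimension grows: the norm of a vector of i.i.d. coordinates concentrates around a value of order sqrt(dim), while its step-to-step fluctuations stay of order one, so the changes shrink relative to the overall scale.

```python
import math
import random

def norm_increments(dim, steps=200, rho=0.9, seed=0):
    """Simulate a chain on `dim`-dimensional vectors: coordinate 0 follows
    an AR(1) process (correlated over time), the rest are redrawn i.i.d.
    each step. Return the successive changes in the Euclidean norm."""
    rng = random.Random(seed)
    state = [rng.gauss(0, 1) for _ in range(dim)]
    prev_norm = math.sqrt(sum(x * x for x in state))
    increments = []
    for _ in range(steps):
        # correlated coordinate: AR(1) update keeps memory of its past
        state[0] = rho * state[0] + math.sqrt(1 - rho ** 2) * rng.gauss(0, 1)
        # remaining coordinates: fresh independent draws every step
        for i in range(1, dim):
            state[i] = rng.gauss(0, 1)
        cur_norm = math.sqrt(sum(x * x for x in state))
        increments.append(cur_norm - prev_norm)
        prev_norm = cur_norm
    return increments

# Spread of the norm increments, relative to the typical norm sqrt(dim),
# shrinks as the dimension grows (concentration of measure).
for d in (2, 50, 500):
    inc = norm_increments(d)
    print(d, round((max(inc) - min(inc)) / math.sqrt(d), 3))
```

This is only a sketch under the stated assumptions (the function name, the AR(1) parameter `rho`, and the Gaussian coordinates are all choices of this example, not taken from the original post), but the relative-flattening effect it demonstrates is generic in high dimensions.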

Last edited: Aug 10, 2017

3. Aug 17, 2017

### StatGuy2000

I'm not sure if this would answer your question, but the statisticians Gareth Roberts (of Lancaster University, later the University of Warwick, UK) and Jeff Rosenthal (of the University of Toronto, and a former professor of mine when I was in grad school) wrote a paper surveying limit theorems for Markov chains in the context of MCMC algorithms.

https://arxiv.org/abs/math/0404033

I believe the paper will explain the convergence of Markov chains and the notion of "mixing" (i.e., how long it takes a chain to get "close" to its stationary distribution).
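To make the mixing idea concrete, here is a small generic Python illustration (a hypothetical 3-state chain of my own choosing, unrelated to the OP's simulation): repeatedly applying the transition matrix drives any starting distribution toward the stationary distribution, and the total-variation distance to it decays geometrically.

```python
# Hypothetical 3-state transition matrix (each row sums to 1). Any finite,
# irreducible, aperiodic chain converges to a unique stationary distribution.
P = [
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.1, 0.3, 0.6],
]

def step(dist, P):
    """One Markov step: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def tv_distance(p, q):
    """Total-variation distance between two probability distributions."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

# Approximate the stationary distribution by iterating many steps.
pi = [1.0, 0.0, 0.0]
for _ in range(200):
    pi = step(pi, P)

# From a fresh starting point, the distance to stationarity shrinks each step.
dist = [0.0, 0.0, 1.0]
for t in range(5):
    print(t, round(tv_distance(dist, pi), 4))
    dist = step(dist, P)
```

The matrix values here are arbitrary; the point is only that "mixing time" measures how many steps are needed before that total-variation distance becomes negligible, which is the sense of "close to the stationary distribution" discussed in the paper.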