# Markov chains

1. Aug 2, 2012

### woundedtiger4

Hi all,
I am trying to understand the concept of Markov Chain (a scan copy of 2 pages is attached), before this text I already studied the notes on Markov Chains at:

http://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/Chapter11.pdf

I am a little confused at the moment: is a persistent (or recurrent) Markov chain also known as an absorbing Markov chain, or as a regular Markov chain? My weak understanding tells me that persistent means absorbing, but in the English dictionary recurrent means regular.

2. Aug 2, 2012

### HallsofIvy

"Recurrent" means "happening over and over again". In general English, "regular" has many different meanings. There are, of course, a variety of meanings of "regular" in mathematics, but each is more limited than the general English meaning.

3. Aug 3, 2012

### woundedtiger4

Do you mean that a persistent chain is not an absorbing chain?

4. Aug 3, 2012

### chiro

You can have recurrent systems that have absorbing states, but a system is not recurrent if its graph representation contains a sub-graph that is acyclic with respect to some node.

To break this down, basically what you do is look at a state and then make sure that, for each path leaving that node, a path exists that cycles back to it. If you find a path for which this doesn't happen, the Markov chain is not recurrent.

To see how you can have absorbing states, picture an identity matrix as a transition matrix: every state loops back on itself at the first (and only) path for that node.

Similarly you could have a transition matrix with 0 0 1 in the third row and a 2x2 inner matrix that is recurrent.
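The two transition matrices described above can be written out explicitly. This is just an illustrative sketch: the state labels 0..2 and the particular probabilities in the 2x2 inner block are my own choices, not from the thread.

```python
# Identity matrix as a transition matrix: every state loops back to
# itself with probability 1, so every state is absorbing (and trivially
# recurrent, returning in one step).
identity = [
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
]

# Third row (0, 0, 1): state 2 is absorbing, while the 2x2 inner block
# over states 0 and 1 is recurrent (they keep swapping forever).
P = [
    [0.0, 1.0, 0.0],
    [1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0],
]

# Each row of a transition matrix must sum to 1 (row-stochastic).
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12
```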

Essentially, the easiest way to check whether a system is recurrent is to turn it into a graph and then check that every path from a state has a path back to the node that it started from.

The graph should look something like this:

http://en.wikipedia.org/wiki/File:MarkovChain1.png

If you ever find that, starting from some node, there is no path back to that node, then the Markov chain is not recurrent (persistent).
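That graph check can be sketched in Python. This is a rough sketch for finite chains only (where a state is recurrent exactly when every state reachable from it can reach back); the function names and the example matrix are illustrative assumptions, not from the thread.

```python
from collections import deque

def reachable(adj, start):
    """Set of nodes reachable from `start` (including start itself), by BFS."""
    seen = {start}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def is_recurrent(P, i):
    """Finite chains: state i is recurrent iff every state j reachable
    from i has a path back to i (i.e. i's communicating class is closed)."""
    n = len(P)
    # Edge u -> j whenever the one-step transition probability is non-zero.
    adj = [[j for j in range(n) if P[u][j] > 0] for u in range(n)]
    return all(i in reachable(adj, j) for j in reachable(adj, i))

# Example: state 0 can leak into the absorbing state 2 and never return.
P = [
    [0.5, 0.25, 0.25],
    [0.5, 0.5,  0.0],
    [0.0, 0.0,  1.0],
]
print(is_recurrent(P, 0))  # False: a path from 0 reaches 2, which can't return
print(is_recurrent(P, 2))  # True: state 2 only loops back on itself
```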

5. Aug 3, 2012

### woundedtiger4

Thanks a tonne for the reply. So a persistent chain is ergodic? If yes, do we also call it an irreducible chain as well?
Also, can you please teach me what periodic & aperiodic chains are in a similar way?

6. Aug 3, 2012

### chiro

The periodicity can be seen in terms of the lengths of the actual paths.

You have a return after n steps if there exists a path of n transitions from the node back to itself; the period is the greatest common divisor of all such return lengths (when every return length is a multiple of the shortest one, that minimum is the period). You can think of it like a wave where the period is usually 2pi, but the wave returns to the same point at 2pi, 4pi, 6pi, etc.: it is the same with the transition matrix, and we pick the value that plays the role of 2pi in the wave situation.

The aperiodic part means that there is a path from the node back to itself that is 'out of phase' with the period. So basically, if you can find a return path whose length isn't a multiple of the other return lengths, then you have an aperiodic situation.

So, for example, if you had returns of length 2 and there also existed a path (i.e. one with non-zero probability) from the node back to itself of 3 transitions, this node would be aperiodic.
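The return-length idea can be computed directly for small examples. This is a minimal sketch, assuming the chain is given as a nested-list transition matrix; the scan bound of twice the state count is an assumption that is enough for toy examples like the ones in this thread, not a general guarantee.

```python
from math import gcd

def mat_mul(A, B):
    """Plain matrix product of two square nested-list matrices."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def period(P, i, max_len=None):
    """gcd of the return lengths n with P^n[i][i] > 0, scanned up to max_len."""
    n_states = len(P)
    if max_len is None:
        max_len = 2 * n_states  # assumption: enough for small examples
    d = 0
    power = P  # P^1
    for n in range(1, max_len + 1):
        if power[i][i] > 0:  # a return path of length n exists
            d = gcd(d, n)
        power = mat_mul(power, P)
    return d

# Two states that simply swap: returns at lengths 2, 4, 6, ... -> period 2.
P = [[0.0, 1.0],
     [1.0, 0.0]]
print(period(P, 0))  # 2
```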

7. Aug 4, 2012

### woundedtiger4

THANKS...
How do I find the period of the chain?

8. Aug 4, 2012

### chiro

Hint: Draw the graph of the transition matrix (like the diagram I included above) and then use the advice I posted previously.

9. Aug 5, 2012

### woundedtiger4

Thanks a tonne, I think I got it now... the GCD is the period, isn't it? In the wave example, 2pi is the period (GCD), am I right?

10. Aug 5, 2012

### chiro

Yes, that looks right: remember that the period must divide evenly into both the minimum return-path length and the length of any other valid path cycle.

So if we have, say, a minimum path length of 2 and every path can only be a multiple of 2, then the chain is not aperiodic, because the minimum path length divides evenly into every path length.

But if there were a possible path 5 transitions long, then gcd(5, 2) = 1, which means there is an aperiodic situation and the Markov chain is aperiodic.
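That gcd reasoning in two lines of Python, using the same numbers as above (the lists of return lengths are just illustrative):

```python
from math import gcd
from functools import reduce

# Returns possible at lengths 2 and 5: gcd(2, 5) = 1, so aperiodic.
print(gcd(2, 5))  # 1

# Every return a multiple of 2 (2, 4, 6, ...): gcd stays 2, so period 2.
print(reduce(gcd, [2, 4, 6]))  # 2
```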

11. Aug 6, 2012

### woundedtiger4

Sir, thank you so much