Learning Markov Chain: Clarifying Persistent vs Regular

AI Thread Summary
The discussion clarifies the distinction between persistent (or recurrent) Markov chains and absorbing or regular Markov chains. A persistent chain is not the same thing as an absorbing chain: a recurrent system can contain absorbing states, but a Markov chain is not recurrent if there is a state from which some path never returns. The thread also explains periodicity: a state's period is the greatest common divisor (GCD) of the lengths of all paths that return to it, and a state is aperiodic when return paths of "out of phase" lengths exist, making that GCD equal to 1. Understanding these distinctions is essential for analyzing Markov chains effectively.
woundedtiger4
Hi all,
I am trying to understand the concept of a Markov chain (a scanned copy of 2 pages is attached). Before this text, I had already studied the notes on Markov chains at:

http://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/Chapter11.pdf

I am a little confused at the moment: is a persistent (or recurrent) Markov chain also known as an absorbing Markov chain or a regular Markov chain? My weak understanding tells me that persistent means absorbing, but in the English dictionary recurrent means regular.

Please someone correct me.
 

woundedtiger4 said:
Is a persistent (or recurrent) Markov chain also known as an absorbing Markov chain or a regular Markov chain? My weak understanding tells me that persistent means absorbing, but in the English dictionary recurrent means regular.
"Recurrent" means "happening over and over again". In general English, "regular" has many different meanings. There are, of course, a variety of meanings of "regular" in mathematics, but each is more limited than the general English meaning.
 
HallsofIvy said:
"Recurrent" means "happening over and over again"...
Do you mean that a persistent chain is not an absorbing chain?
 
woundedtiger4 said:
Do you mean that a persistent chain is not an absorbing chain?

You can have recurrent systems that contain absorbing states, but a chain is not recurrent if its graph representation has a state from which some path leads away and never returns.

To break this down: look at a state and make sure that, for every path leaving that node, there is a path that cycles back to it. If you find a path for which this doesn't happen, the Markov chain is not recurrent.

To see how you can have absorbing states, picture an identity matrix as a transition matrix: every state loops back to itself on its one and only path, so every state is absorbing and recurrent.

Similarly, you could have a transition matrix whose third row is (0 0 1), making the third state absorbing, alongside a recurrent 2x2 block for the other states.

Essentially, the easiest way to check whether a system is recurrent is to turn it into a graph and then check that every path from a state has a path back to the node it started from.

The graph should look something like this:

http://en.wikipedia.org/wiki/File:MarkovChain1.png

If you ever find that, starting from some node, there is no path back to that node, then the Markov chain is not recurrent (persistent).
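To make this path-back check concrete, here is a minimal Python sketch (the names `reachable` and `is_recurrent` are mine, not from the thread; for a finite chain, a state is recurrent exactly when every state reachable from it can reach it back):

```python
def reachable(P, i):
    """All states reachable from state i (including i itself),
    treating every positive transition probability as a directed edge."""
    seen, stack = {i}, [i]
    while stack:
        u = stack.pop()
        for v, p in enumerate(P[u]):
            if p > 0 and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def is_recurrent(P, i):
    """For a finite chain: state i is recurrent iff every state you can
    reach from i can also reach i back (the path-back check above)."""
    return all(i in reachable(P, j) for j in reachable(P, i))

# Identity transition matrix: every state loops back to itself,
# so every state is absorbing and (trivially) recurrent.
I = [[1.0, 0.0],
     [0.0, 1.0]]

# State 0 can leak into the absorbing state 1 and never return,
# so state 0 is transient while state 1 is recurrent.
P = [[0.5, 0.5],
     [0.0, 1.0]]
```

This matches the examples above: the identity matrix is recurrent everywhere, while a chain that can fall into an absorbing state it cannot leave is not recurrent at the leaking state.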
 
chiro said:
If you ever find that, starting from some node, there is no path back to that node, then the Markov chain is not recurrent (persistent).

Thanks a tonne for the reply. So a persistent chain is ergodic? If yes, do we also call it an irreducible chain?
Also, can you please explain periodic and aperiodic chains in a similar way?
 
The periodicity can be seen in terms of the lengths of the return paths.

A state has period d if every path from the node back to itself has a length that is a multiple of d; formally, d is the greatest common divisor of all such return lengths. You can think of it like a wave whose period is 2pi: you return to the same point at 2pi, 4pi, 6pi, and so on, and every return time is a multiple of the period. The same holds with the transition matrix: the period is the largest value that divides every return time.

Aperiodic means the period is 1: there exist return paths whose lengths are "out of phase" with each other, so no common period greater than 1 divides them all.

So, for example, if a node had return paths of length 2 and there also existed a path (i.e. one with non-zero probability) of 3 transitions from the node back to itself, then gcd(2, 3) = 1 and the node would be aperiodic.
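This gcd-of-return-lengths picture can be sketched in Python (a toy illustration assuming a finite chain and a fixed scan bound; `matmul` and `period` are illustrative names, not standard library functions):

```python
from math import gcd

def matmul(A, B):
    """Multiply two square transition matrices."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def period(P, i, max_len=None):
    """Period of state i: the gcd of every n with (P^n)[i][i] > 0.
    Scanning n up to a small bound is a sketch, not a full algorithm."""
    max_len = max_len or 2 * len(P)
    d, Pn = 0, P
    for n in range(1, max_len + 1):
        if Pn[i][i] > 0:
            d = gcd(d, n)
        Pn = matmul(Pn, P)
    return d

# Two-state flip-flop: returns are only possible at 2, 4, 6, ... steps.
flip = [[0.0, 1.0],
        [1.0, 0.0]]

# Adding a self-loop gives a length-1 return path, so the period drops to 1.
lazy = [[0.5, 0.5],
        [1.0, 0.0]]
```

Here `period(flip, 0)` is 2, while the lazy variant is aperiodic because its length-1 self-loop is out of phase with the length-2 cycle.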
 

chiro said:
The periodicity can be seen in terms of the length of the actual paths.


Thanks a tonne!
How do I find the period of the chain?
 
woundedtiger4 said:
How do I find the period of the chain?

Hint: Draw the graph of the transition matrix (like the diagram I included above) and then use the advice I posted previously.
 
chiro said:
Hint: Draw the graph of the transition matrix (like the diagram I included above) and then use the advice I posted previously.

Thanks a tonne, I think I get it now... the GCD is the period, isn't it? In the wave example, 2pi is the period (the GCD), am I right?
 
Yes, that looks right. Remember that the period must divide evenly into the length of every valid return cycle, including the shortest one.

So if the shortest return path has length 2 and every return path can only be a multiple of 2, then the chain is periodic (not aperiodic), because 2 divides every return length.

But if there were also a possible return path 5 transitions long, then gcd(5, 2) = 1, which means that state, and hence the Markov chain, is aperiodic.
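The same gcd check can be run directly over a list of return-path lengths (a quick sketch; `period_from_paths` is a made-up name for illustration):

```python
from functools import reduce
from math import gcd

def period_from_paths(lengths):
    """Period of a state, given the lengths of all its return paths:
    the greatest common divisor of those lengths."""
    return reduce(gcd, lengths)

# Every return path a multiple of 2: the state has period 2 (periodic).
even_only = [2, 4, 6, 8]

# One extra length-5 return path gives gcd(2, 5) = 1: aperiodic.
with_five = [2, 4, 5]
```

This reproduces the example above: adding a single length-5 cycle to a chain whose returns were all multiples of 2 collapses the period to 1.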
 
chiro said:
Yes that looks right...

Sir, thank you so much
 
