Learning Markov Chain: Clarifying Persistent vs Regular

In summary: the poster is trying to understand Markov chains and is confused by the terms "persistent" and "recurrent". They are told that "recurrent" means "happening over and over again" and that "regular" has many meanings in general English. They then ask about the relationship between persistent and absorbing chains and are given an explanation using graph representations, and finally ask about periodic and aperiodic chains and are told how periodicity is determined from the lengths of return paths in the transition graph.
  • #1
woundedtiger4
Hi all,
I am trying to understand the concept of a Markov chain (a scanned copy of 2 pages is attached). Before this text, I had already studied the notes on Markov chains at:

http://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/Chapter11.pdf

I am a little confused at the moment: is a persistent (or recurrent) Markov chain also known as an absorbing Markov chain or a regular Markov chain? My weak understanding tells me that persistent means absorbing, but in the English dictionary recurrent means regular.

Please someone correct me.
 

Attachments

  • Markovian.pdf (1.4 MB)
  • #2
woundedtiger4 said:
Hi all,
I am trying to understand the concept of a Markov chain (a scanned copy of 2 pages is attached). Before this text, I had already studied the notes on Markov chains at:

http://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/Chapter11.pdf

I am a little confused at the moment: is a persistent (or recurrent) Markov chain also known as an absorbing Markov chain or a regular Markov chain? My weak understanding tells me that persistent means absorbing, but in the English dictionary recurrent means regular.

Please someone correct me.
"Recurrent" means "happening over and over again". In general English, "regular" has many different meanings. There are, of course, a variety of meanings of "regular" in mathematics, but they are each more limited than the general English meaning.
 
  • #3
HallsofIvy said:
"Recurrent" means "happening over and over again". In general English, "regular" has many different meanings. There are, of course, a variety of meanings of "regular" in mathematics, but they are each more limited than the general English meaning.
Do you mean that a persistent chain is not an absorbing chain?
 
  • #4
woundedtiger4 said:
Do you mean that a persistent chain is not an absorbing chain?

You can have recurrent systems that have absorbing states, but something is not recurrent if there exists a graph representation of the system in which some sub-graph is acyclic with respect to a node.

To break this down: you look at a state and make sure that for each path leaving that node, a path exists that cycles back to it. If you find a path for which this doesn't happen, the Markov chain is not recurrent.

To see how you can have absorbing states, picture an identity matrix as a transition matrix: every state loops back to itself, and that self-loop is the first (and only) path for that node.

Similarly, you could have a 3x3 transition matrix with 0 0 1 in the third row and a 2x2 sub-matrix (for the first two states) that is recurrent.

Essentially, the easiest way to check whether a system is recurrent is to turn it into a graph and then check that every path from a state leads back to the node the path started from.

The graph should look something like this:

http://en.wikipedia.org/wiki/File:MarkovChain1.png

If you ever find that when you start from one node there is no path back to that node, then the Markov chain is not recurrent (persistent).
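A small sketch of this check (my own illustration, not from the thread; it assumes the chain is given as a transition matrix over states labelled 0..n-1). For a finite chain, a state is recurrent exactly when every state reachable from it can also reach it back:

```python
def reachable(P, start):
    """Set of states reachable from `start` in the directed graph
    whose edges are the non-zero entries of the transition matrix P."""
    seen = {start}
    stack = [start]
    while stack:
        i = stack.pop()
        for j, p in enumerate(P[i]):
            if p > 0 and j not in seen:
                seen.add(j)
                stack.append(j)
    return seen

def is_recurrent(P):
    """True if every state is recurrent: for each state i, every
    state j reachable from i can also reach i back."""
    n = len(P)
    reach = [reachable(P, i) for i in range(n)]
    return all(i in reach[j] for i in range(n) for j in reach[i])

# Identity matrix: every state is absorbing and trivially returns
# to itself, so the chain is recurrent.
I = [[1, 0], [0, 1]]

# A chain with a transient state: from state 0 you can leak into
# the absorbing state 1 and never come back.
T = [[0.5, 0.5], [0.0, 1.0]]

print(is_recurrent(I))  # True
print(is_recurrent(T))  # False
```

This is the graph check from the post made mechanical: instead of eyeballing the diagram, we compute reachability sets and verify every outgoing path can be closed into a cycle.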
 
  • #5
chiro said:
If you ever find that when you start from one node there is no path back to that node, then the Markov chain is not recurrent (persistent).

Thanks a tonne for the reply. So a persistent chain is ergodic? If yes, do we also call it an irreducible chain?
Also, can you please explain periodic and aperiodic chains in a similar way?
 
  • #6
The period can be seen in terms of the lengths of the return paths.

A state has period d if d is the greatest common divisor of the lengths of all paths (with non-zero probability) from the node back to itself. You can think of it like a wave: the period is 2pi, but you return to the same point at 2pi, 4pi, 6pi, and so on, and we single out 2pi as the period. The same happens with the transition matrix: every return length is a multiple of the period d.

A state is aperiodic when its period is 1, i.e. when there are return paths that are 'out of phase' with each other. So if you can find return paths whose lengths have no common divisor greater than 1, you have an aperiodic state.

So, for example, if a node had return paths of length 2 and there also existed a path (i.e. a non-zero probability) from the node back to itself of 3 transitions, that node would be aperiodic, since gcd(2, 3) = 1.
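One way to sketch the computation (my own illustration, not from the thread): take powers of the transition matrix, collect the lengths n for which the return probability P^n[i][i] is non-zero, and take the gcd of those lengths.

```python
from math import gcd

def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def period(P, state, max_len=20):
    """Period of `state`: gcd of all lengths n <= max_len with a
    non-zero probability of returning to `state` in exactly n steps.
    (max_len is a practical cutoff for this sketch.)"""
    d = 0  # gcd(0, n) == n, so d picks up the first return length
    Pn = P
    for n in range(1, max_len + 1):
        if Pn[state][state] > 0:
            d = gcd(d, n)
        Pn = mat_mul(Pn, P)
    return d

# Two states that strictly alternate: returns only at 2, 4, 6, ...
flip = [[0, 1], [1, 0]]
print(period(flip, 0))  # 2

# Add a self-loop: returns at lengths 1, 2, 3, ..., so the gcd is 1.
lazy = [[0.5, 0.5], [1.0, 0.0]]
print(period(lazy, 0))  # 1
```

The `flip` chain matches the wave picture: you come back only on even beats, so the period is 2; the self-loop in `lazy` puts a length-1 return "out of phase" with everything else, forcing the gcd down to 1.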
 
  • #7
chiro said:
If you ever find that when you start from one node there is no path back to that node, then the Markov chain is not recurrent (persistent).

chiro said:
So, for example, if a node had return paths of length 2 and there also existed a return path of 3 transitions, that node would be aperiodic.

Thanks...
How do I find the period of the chain?
 
  • #8
woundedtiger4 said:
Thanks...
How do I find the period of the chain?

Hint: Draw the graph of the transition matrix (like the diagram I included above) and then use the advice I posted previously.
 
  • #9
chiro said:
Hint: Draw the graph of the transition matrix (like the diagram I included above) and then use the advice I posted previously.

Thanks a tonne, I think I got it now... the GCD is the period, isn't it? In the wave example, 2pi is the period (the GCD), am I right?
 
  • #10
Yes, that looks right. Remember that the period must divide evenly into the length of every valid return path, including the minimum path length.

So if we have, say, a minimum path length of 2 and every return path can only be a multiple of 2, then the chain is not aperiodic, because every path length is divisible by 2.

But if there were also a possible return path 5 transitions long, then gcd(5, 2) = 1, which means that state is aperiodic, and (if this holds for every state) the Markov chain is aperiodic.
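The gcd check in this example can be written out directly (my own sketch; `period_from_return_lengths` is a hypothetical helper, not from the thread):

```python
from math import gcd
from functools import reduce

def period_from_return_lengths(lengths):
    """Period of a state, given the lengths of its return paths:
    the greatest common divisor of all observed lengths."""
    return reduce(gcd, lengths)

print(period_from_return_lengths([2, 4, 6]))  # 2  (periodic)
print(period_from_return_lengths([2, 5]))     # 1  (aperiodic)
```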
 
  • #11
chiro said:
But if there were also a possible return path 5 transitions long, then gcd(5, 2) = 1, which means the Markov chain is aperiodic.

Sir, thank you so much
 

1. What is a Markov chain?

A Markov chain is a mathematical model used to describe a sequence of events where the probability of transitioning from one event to another depends only on the current state of the system, and not on any previous states. It is typically represented as a graph or matrix, with each node representing a state and each edge representing a transition between states.
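As a minimal illustration of this definition (a made-up two-state weather chain, not from the thread):

```python
import random

# Transition matrix for a hypothetical two-state chain:
# state 0 = "sunny", state 1 = "rainy".
P = [[0.9, 0.1],
     [0.5, 0.5]]

def simulate(P, start, steps, seed=0):
    """Walk the chain: the next state depends only on the current
    state, never on how we got there (the Markov property)."""
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(steps):
        state = rng.choices(range(len(P)), weights=P[state])[0]
        path.append(state)
    return path

print(simulate(P, 0, 10))
```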

2. What is the difference between persistent and regular Markov chains?

A state of a Markov chain is persistent (recurrent) if, starting from that state, the chain returns to it with probability 1; a persistent chain is one in which every state is persistent. A regular Markov chain, on the other hand, is one for which some power of the transition matrix has all entries strictly positive; a regular chain has a limiting distribution in which the probability of being in each state approaches a fixed value as the number of transitions increases.
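The regularity condition can be checked mechanically (my own sketch): raise the transition matrix to successive powers and look for one with no zero entries.

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def is_regular(P, max_power=None):
    """Regular: some power P^n has all entries > 0.  For an n-state
    chain it suffices to check powers up to (n-1)**2 + 1
    (Wielandt's bound for primitive matrices)."""
    n = len(P)
    if max_power is None:
        max_power = (n - 1) ** 2 + 1
    Pn = P
    for _ in range(max_power):
        if all(x > 0 for row in Pn for x in row):
            return True
        Pn = mat_mul(Pn, P)
    return False

print(is_regular([[0.9, 0.1], [0.5, 0.5]]))  # True
print(is_regular([[0, 1], [1, 0]]))          # False (period 2)
```

The second example is recurrent but not regular: every power of the flip matrix keeps zeros, because the chain is periodic.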

3. How are Markov chains used in learning?

Markov chains can be used in machine learning as a way to model the behavior of a system and make predictions about future states. They are particularly useful in reinforcement learning, where the goal is to learn the optimal actions to take in a given environment based on a sequence of rewards.

4. What are some real-world examples of Markov chains?

Markov chains have many applications in various fields, such as finance, biology, and natural language processing. Some common examples include predicting stock market prices, analyzing DNA sequences, and generating text using a language model.

5. How can I learn more about Markov chains?

There are many online resources available for learning about Markov chains, including tutorials, videos, and courses. It is also helpful to have a strong understanding of probability and linear algebra. Practicing with coding exercises and implementing Markov chain models can also aid in understanding the concept better.
