Markov chains: period of a state

azay
Hello,

I am trying to understand the intuition of the definition of the period of a state in a Markov chain.

Say, for example, we can go from state i back to state i in either 4 or 18 steps.
gcd(4, 18) = 2, where gcd is the greatest common divisor. So the period of state i is 2.
I find this strange, because we cannot revisit state i at every multiple of 2. What's the intuition behind this?
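In Python, the period-as-gcd computation can be sketched like this (a minimal illustration; the list of return times is just the one from the example above):

```python
from functools import reduce
from math import gcd

# Return times for state i from the example above: 4 steps or 18 steps.
return_times = [4, 18]

# The period of a state is defined as the gcd of all n for which
# an n-step return to the state has positive probability.
period = reduce(gcd, return_times)
print(period)  # 2
```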

Thanks
 
azay said:
Hello,

I am trying to understand the intuition of the definition of the period of a state in a Markov chain.

Say, for example, we can go from state i back to state i in either 4 or 18 steps.
gcd(4, 18) = 2, where gcd is the greatest common divisor. So the period of state i is 2.
I find this strange, because we cannot revisit state i at every multiple of 2. What's the intuition behind this?

Thanks

By definition, a Markov process is one in which the next state depends only on the current state. Given available states 1, 2, ..., i, ..., n-1, n, the probability of state S_i depends only on state S_{i-1}. If reflexivity is allowed, then S_i can equal S_{i-1}. The duration of a state is usually irrelevant in the mathematical context. The path taken depends on the probabilities of each transition.

If the chain is structured in such a way that k ≥ 2, then periods may be defined for some gcd k. Here k is the minimum interval of steps before the process can return to a previous state. However, intervals that are multiples of k are allowed.
 
SW VandeCarr said:
If the chain is structured in such a way that k ≥ 2, then periods may be defined for some gcd k. Here k is the minimum interval of steps before the process can return to a previous state. However, intervals that are multiples of k are allowed.

But in my example it is not possible to get back to state i in 2 steps, even though 2 is the gcd. How does that make 2 the minimum? The minimum is 4.
 
azay said:
But in my example it is not possible to get back to state i in 2 steps, even though 2 is the gcd. How does that make 2 the minimum? The minimum is 4.

A -> B -> A. Two steps (arrows).
 
SW VandeCarr said:
A -> B -> A. Two steps (arrows).

It's not possible to go from A to B to A in my example.

Say: A->B->C->D->A (4 steps) is possible and A->E->F->...->A (18 steps) is possible. I don't see how it is possible to get back in 2 steps...

What am I missing?
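For what it's worth, the example can be checked directly by brute force (a minimal sketch; the state numbering is my own, with 0 = A, 1-3 = B, C, D, and 4-20 = the seventeen intermediate states of the 18-step loop):

```python
# succ[s] = list of states reachable from s in one step
succ = {0: [1, 4], 1: [2], 2: [3], 3: [0]}  # A -> B or E; B -> C -> D -> A
for k in range(4, 20):
    succ[k] = [k + 1]   # the 18-step loop: E -> F -> ...
succ[20] = [0]          # last intermediate state returns to A

# Which step counts n (up to 40) allow a return A -> A?
frontier = {0}          # states reachable from A in exactly n steps
returns = []
for n in range(1, 41):
    frontier = {t for s in frontier for t in succ[s]}
    if 0 in frontier:
        returns.append(n)

print(returns)
# [4, 8, 12, 16, 18, 20, 22, 24, 26, 28, 30, 32, 34, 36, 38, 40]
```

So the possible return times are exactly the sums 4a + 18b: all have gcd 2, but 2 itself (and 6, 10, 14) never occurs.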
 
azay said:
It's not possible to go from A to B to A in my example.

Say: A->B->C->D->A (4 steps) is possible and A->E->F->...->A (18 steps) is possible. I don't see how it is possible to get back in 2 steps...

What am I missing?

What probabilities are shown in your transition matrix for A->B and A->E? Have you accounted for all other possible transitions in the above scheme? What's the gcd for periods of 4 and 18?
 
azay said:
Hello,

I am trying to understand the intuition of the definition of the period of a state in a Markov chain.

Say, for example, we can go from state i back to state i in either 4 or 18 steps.
gcd(4, 18) = 2, where gcd is the greatest common divisor. So the period of state i is 2.
I find this strange, because we cannot revisit state i at every multiple of 2. What's the intuition behind this?

Thanks

You can't visit it for every multiple of 2, but it's pretty close. For example:

You can visit it in 36 steps by doing the 18-step loop twice.
You can also visit it in 38 steps: do an 18-step loop, and then a 4-step loop 5 times.

Once you can visit it in 36 steps, you can visit it in 40 steps by doing the 36-step loop and then a 4-step loop, and in 42 steps by doing a 38-step loop and then a 4-step loop. Similarly you can visit it in 44 steps, 46 steps, 48 steps, 50 steps, etc.
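This argument can be checked by brute force (a minimal sketch; `representable` is a hypothetical helper testing whether n = 4a + 18b has a solution in nonnegative integers):

```python
def representable(n):
    """Can n be written as 4a + 18b with nonnegative integers a, b?"""
    return any((n - 18 * b) % 4 == 0 for b in range(n // 18 + 1))

# Every even n from 36 up is representable; the only even gaps are small.
even_gaps = [n for n in range(2, 201, 2) if not representable(n)]
print(even_gaps)  # [2, 6, 10, 14]
```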
 
Office_Shredder said:
You can't visit it for every multiple of 2, but it's pretty close. For example:

You can visit it in 36 steps by doing the 18-step loop twice.
You can also visit it in 38 steps: do an 18-step loop, and then a 4-step loop 5 times.

Once you can visit it in 36 steps, you can visit it in 40 steps by doing the 36-step loop and then a 4-step loop, and in 42 steps by doing a 38-step loop and then a 4-step loop. Similarly you can visit it in 44 steps, 46 steps, 48 steps, 50 steps, etc.

Yes, I had seen this pattern. I just found it kind of quirky, because in many places where they give intuition for this definition, they make it sound as if you can revisit the state at every multiple of 2.

But thanks for confirming :).
 
SW VandeCarr said:
What probabilities are shown in your transition matrix for A->B and A->E? Have you accounted for all other possible transitions in the above scheme? What's the gcd for periods of 4 and 18?

There are no other possible transitions. Only A->B and A->E are possible. The gcd is 2...
 
azay said:
There are no other possible transitions. Only A->B and A->E are possible. The gcd is 2...

Yes, but this is not the typical kind of process usually modeled by a Markov chain. Once the system goes from state A to B, it returns to A with probability one. Likewise, once it goes to E, it returns to A with probability one. The only divergence (outbranching) occurs at A, between B and E. Here you imply probabilities that are neither 0 nor 1, but don't state them. They only matter in terms of which loop the system will inhabit most often. So effectively, if not technically, you have transitions that are reducible to A->B->A and A->E->A, unless you introduce step-dependent time factors. Even then, you could load these factors onto B and E. In this example, the factors would be 2 and 9, assuming each step gets the same value.
 
SW VandeCarr said:
Yes, but this is not the typical kind of process usually modeled by a Markov chain. Once the system goes from state A to B, it returns to A with probability one. Likewise, once it goes to E, it returns to A with probability one. The only divergence (outbranching) occurs at A, between B and E. Here you imply probabilities that are neither 0 nor 1, but don't state them. They only matter in terms of which loop the system will inhabit most often. So effectively, if not technically, you have transitions that are reducible to A->B->A and A->E->A, unless you introduce step-dependent time factors. Even then, you could load these factors onto B and E. In this example, the factors would be 2 and 9, assuming each step gets the same value.

It was just a theoretical example. Why is it not possible to have probability 1 for all transitions except A->B and A->E? (The sum of the probabilities of A->B and A->E is obviously 1, with neither equal to 0.) There are no other transitions except the ones I listed. It is a valid Markov chain, right?
 
azay said:
It was just a theoretical example. Why is it not possible to have probability 1 for all transitions except A->B and A->E? (The sum of the probabilities of A->B and A->E is obviously 1, with neither equal to 0.) There are no other transitions except the ones I listed. It is a valid Markov chain, right?

Yes.
 
SW VandeCarr said:
Yes.

Okay. Thanks for helping me out!
 