Concept question on irreducibility (Markov chain)

SUMMARY

This discussion concerns the irreducibility of Markov chains on the state space S = {0, 1, 2, ...} with 0 < q < 1, examining two transition structures: p(i, i+1) = q and p(i, i-1) = 1 - q in the first case, and p(i, i+1) = q and p(i, 0) = 1 - q in the second. The first chain is confirmed to be irreducible, since every state can be reached from every other state. In the second case, the 1 - q transition sends the chain directly back to state 0 from any state, and the question is whether this reset structure leaves the chain irreducible.

PREREQUISITES
  • Understanding of Markov chains and their properties
  • Familiarity with transition probability notation
  • Knowledge of irreducibility in stochastic processes
  • Basic concepts of state space in probability theory
NEXT STEPS
  • Study the concept of irreducibility in Markov chains
  • Learn about transition matrices and their implications
  • Explore the concept of recurrent and transient states
  • Investigate examples of Markov chains with absorbing states
USEFUL FOR

Students and researchers in probability theory, mathematicians studying stochastic processes, and anyone interested in the behavior of Markov chains in various applications.

hsinyihhsu

Homework Statement


This is not really a homework question but just something that I'm confused about.

I'm having trouble understanding the movement of a Markov chain.
Say S = {0, 1, 2, ...} and 0 < q < 1, with p(i, i+1) = q and p(i, i-1) = 1 - q. I can see how this Markov chain moves (back and forth, basically, going one step at a time from time t to time t+1, up with probability q or down with probability 1 - q). I can also see that this Markov chain is irreducible: if i < j, then the (j - i)-step probabilities satisfy p^(j-i)(i, j) = q^(j-i) > 0 and p^(j-i)(j, i) = (1-q)^(j-i) > 0. (By the way, is this explanation right?)

However, what about a Markov chain that looks like this:
S = {0, 1, 2, ...} and 0 < q < 1, with p(i, i+1) = q and p(i, 0) = 1 - q? How does it move? Under p(i, i+1) = q it climbs one step at a time, but what about p(i, 0) = 1 - q? Does it jump all the way back to 0? And how can I then tell whether this Markov chain is irreducible or not?

Any help would be appreciated. Thanks in advance!
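(Editorial sketch, not part of the original post: the movement of the second chain is easy to see in a quick simulation; the helper `simulate` below is made up for illustration. Whenever the 1 - q branch fires, the chain jumps straight back to 0, however high it is.)

```python
import random

def simulate(q, steps, seed=0):
    """Simulate the chain p(i, i+1) = q, p(i, 0) = 1 - q, from state 0."""
    rng = random.Random(seed)
    state, path = 0, [0]
    for _ in range(steps):
        if rng.random() < q:
            state += 1   # climb one step with probability q
        else:
            state = 0    # reset all the way back to 0 with probability 1 - q
        path.append(state)
    return path

print(simulate(0.5, 20))
```

A typical path climbs for a while, then collapses to 0, then climbs again: runs of up-steps separated by geometric resets.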
 
hsinyihhsu said:
[...] How can I then tell whether this Markov chain is irreducible or not?

Starting from state 0, is it possible to reach state i for any given i ≥ 1 (that is, is there a non-zero probability of eventually reaching state i)? Starting from state 0, is it possible to return to state 0 (at some finite time in the future)? If your answer is YES to both of these questions, what would that tell you about the Markov chain?

RGV
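
(Editorial sketch, not part of the original reply: both of RGV's questions can be checked numerically. The helper `n_step_prob` below is made up for illustration; it computes exact n-step transition probabilities for the chain p(i, i+1) = q, p(i, 0) = 1 - q by forward recursion on the distribution.)

```python
def n_step_prob(q, start, target, n):
    """Exact n-step probability P(X_n = target | X_0 = start)
    for the chain p(i, i+1) = q, p(i, 0) = 1 - q."""
    dist = {start: 1.0}
    for _ in range(n):
        new = {}
        for s, p in dist.items():
            new[s + 1] = new.get(s + 1, 0.0) + p * q        # up-step
            new[0] = new.get(0, 0.0) + p * (1 - q)          # reset to 0
        dist = new
    return dist.get(target, 0.0)

q = 0.5
# Question 1: from 0, can we reach state i?  i straight up-moves give q**i > 0.
print(n_step_prob(q, 0, 3, 3))   # 0.125 == q**3
# Question 2: from 0, can we return to 0?  p(0, 0) = 1 - q > 0 in one step.
print(n_step_prob(q, 0, 0, 1))   # 0.5 == 1 - q
```

Since both probabilities are positive, state 0 reaches every state and every state reaches 0, so all states communicate.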
 
