Persistence of State i in a Markov Chain

SUMMARY

A state i in a Markov chain is persistent (recurrent) if and only if the expected number of visits to i, given that the chain starts in i, is infinite. The equivalence comes from the first-return probability f_ii: the number of returns to i is geometric with parameter f_ii, so its expectation is finite exactly when f_ii < 1 (transience) and infinite exactly when f_ii = 1 (persistence). Mean recurrence time is a separate matter: it distinguishes positive from null recurrence among persistent states, and a persistent state can have either a finite or an infinite mean recurrence time. Keeping these definitions distinct is important for analyzing Markov chains correctly.
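A sketch of the standard argument, using the usual notation f_ii for the first-return probability and N_i for the total number of visits to i (counting time 0); this notation is not in the thread itself:

```latex
% N_i >= 1 always, since the chain starts in i. Each further return
% happens independently with probability f_{ii}, so
\[
P(N_i \ge k \mid X_0 = i) = f_{ii}^{\,k-1}, \qquad k \ge 1,
\]
\[
E[N_i \mid X_0 = i] = \sum_{k \ge 1} P(N_i \ge k \mid X_0 = i)
= \sum_{k \ge 1} f_{ii}^{\,k-1}
= \begin{cases}
\dfrac{1}{1 - f_{ii}} < \infty, & f_{ii} < 1 \ \text{(transient)},\\[6pt]
\infty, & f_{ii} = 1 \ \text{(persistent)}.
\end{cases}
\]
```

So the mean number of visits is infinite precisely when the return probability is 1, which is the definition of persistence.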

PREREQUISITES
  • Understanding of Markov chains and their properties
  • Familiarity with the concept of mean recurrence time
  • Knowledge of probability theory and stochastic processes
  • Basic mathematical skills for analyzing sequences and limits
NEXT STEPS
  • Study the definition and properties of persistent states in Markov chains
  • Explore the concept of mean recurrence time in detail
  • Learn about the implications of infinite mean visits in stochastic processes
  • Investigate examples of Markov chains with persistent and transient states
USEFUL FOR

Mathematicians, statisticians, and data scientists interested in stochastic processes, particularly those analyzing Markov chains and their long-term behavior.

xentity1x
In a Markov chain, show that a state i is persistent if and only if the mean number of visits to the state i is infinite given the chain started in state i.

I thought about looking at the mean recurrence time, but that's all I have so far.
 
Start with the definition of a persistent state: the probability of eventually returning to i, starting from i, equals 1.
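The equivalence can also be sanity-checked numerically via the related characterization E[visits to i] = sum over n of p_ii^(n): the partial sums diverge for a persistent state and converge for a transient one. A minimal sketch with two hypothetical 2-state chains (not from the thread):

```python
# Sketch: partial sums of the n-step return probabilities p_ii^(n),
# computed with plain 2x2 matrix powers. Toy chains for illustration.

def mat_mul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[r][k] * B[k][c] for k in range(2)) for c in range(2)]
            for r in range(2)]

def expected_visits(P, i, n_steps):
    """Partial sum of p_ii^(n) for n = 0..n_steps (counts the start as a visit)."""
    total = 1.0  # n = 0 term: p_ii^(0) = 1
    Pn = P
    for _ in range(n_steps):
        total += Pn[i][i]
        Pn = mat_mul(Pn, P)
    return total

# Persistent case: symmetric two-state chain, p_00^(n) = 1/2 for n >= 1,
# so the partial sums grow without bound (here 1 + n/2).
persistent = [[0.5, 0.5], [0.5, 0.5]]

# Transient case: from state 0 the chain jumps to state 1 and stays,
# so p_00^(n) = 0 for n >= 1 and the sum is stuck at 1.
leaves_forever = [[0.0, 1.0], [0.0, 1.0]]

print(expected_visits(persistent, 0, 1000))      # -> 501.0
print(expected_visits(leaves_forever, 0, 1000))  # -> 1.0
```

This is only a finite truncation, of course; the proof itself goes through the geometric distribution of the number of returns, as the hint suggests.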
 
