Stationary distribution of Markov chain


Discussion Overview

The discussion revolves around the asymptotic behavior of a Markov chain with an uncountable state space, specifically focusing on the conditions necessary for the chain to converge in distribution to a stationary random variable. Participants explore concepts such as irreducibility, ergodicity, and properties related to continuous state space Markov chains.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • Pere seeks to determine if an upper bound for the probability of the Markov chain's state exceeding a certain value is sufficient to conclude convergence to a stationary distribution.
  • Quadrophonics notes that irreducibility typically implies uniqueness of the stationary distribution and mentions the need for ergodicity to ensure convergence to that distribution.
  • Participants discuss the concept of return time in countable chains and speculate on how this might extend to uncountable state spaces, questioning the feasibility of returning to an exact state.
  • Pere expresses uncertainty about the relationship between ergodicity and non-periodicity, indicating a need for further reading.
  • Another participant clarifies that while irreducibility ensures uniqueness of a stationary distribution, it does not guarantee its existence, and they reflect on the implications of ergodicity.
  • Discussion includes properties of continuous state space Markov chains, such as φ-irreducibility and aperiodicity, which are suggested as conditions for asymptotic stationarity.

Areas of Agreement / Disagreement

Participants generally agree on the definitions and implications of irreducibility and ergodicity, but there is no consensus on the existence of a stationary distribution for uncountable state spaces or the specifics of how return times apply in this context.

Contextual Notes

Participants acknowledge limitations in their understanding of uncountable state spaces and the need for further exploration of ergodicity and related properties in this framework.

Pere Callahan
Hi all,

I'm given a Markov chain Q_k, k>0, with stationary transition probabilities. The state space is uncountable.
What I want to show is that the chain is asymptotically stationary, that is, it converges in distribution to some random variable Q.

All I have at hand is a k-independent upper bound for P(|Q_k|>x) for all x in the state space (and some indecomposability assumptions on the state space).

Is this enough to conclude convergence of the chain?

Thanks for any help.

-Pere
 
I don't have much experience with uncountable state-space Markov chains, but the irreducibility conditions will typically imply that any stationary distribution must be unique. Also, you'll need some ergodicity condition to ensure that the chain actually converges to the stationary distribution, even if it isn't necessary to show that the stationary distribution exists.
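To make the uniqueness-versus-convergence distinction concrete, here is a toy example (my own, not from the thread): a deterministic two-state flip chain is irreducible, so its stationary distribution is unique, but it is periodic and never converges to that distribution.

```python
import numpy as np

# Toy example: a deterministic 2-state flip chain.
# It is irreducible, so its stationary distribution is unique,
# namely pi = (1/2, 1/2) -- but the chain has period 2, so it
# never converges in distribution to pi.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

pi = np.array([0.5, 0.5])
assert np.allclose(pi @ P, pi)  # pi is stationary

# Starting from state 0, the marginal distribution oscillates forever:
mu = np.array([1.0, 0.0])
for _ in range(4):
    mu = mu @ P
    print(mu)  # alternates between [0. 1.] and [1. 0.]
```

This is exactly the gap quadrophonics points at: irreducibility pins down the candidate limit, but without aperiodicity (part of ergodicity) the chain need not approach it.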
 
Thanks for your reply quadrophonics.

Any more insights on the topic?
Thanks:smile:
 
Pere Callahan said:
Thanks for your reply quadrophonics.

Any more insights on the topic?
Thanks:smile:

Well, it seems that the basic result should be about checking the "return time" for a given state; in countable chains, there's a necessary and sufficient condition for the existence of a stationary distribution, where the expected return time for every state needs to be finite (and then the value of the stationary distribution for that state is the inverse of the return time). I'm not sure what the uncountable extension of that result would be: it seems that you can't ever expect to get back to *exactly* the same point in the state space (since it's a set of measure 0). Presumably some condition relating to the probability of returning within some \delta of a given point in the state space?
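The countable-state result quoted above can be checked numerically. The sketch below uses a made-up 3-state transition matrix (all numbers are assumptions for illustration): it computes the stationary distribution as a left eigenvector, independently computes the expected return time to state 0 via mean hitting times, and confirms pi(0) = 1/E[return time to 0].

```python
import numpy as np

# Made-up 3-state transition matrix, purely for illustration.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

# Expected return time to state 0 via mean hitting times:
# for j != 0, k[j] = 1 + sum_{l != 0} P[j, l] * k[l], and
# m0 = 1 + sum_j P[0, j] * k[j]  (with k[0] = 0).
n = P.shape[0]
k = np.linalg.solve(np.eye(n - 1) - P[1:, 1:], np.ones(n - 1))
m0 = 1.0 + P[0, 1:] @ k

print(pi[0], 1.0 / m0)  # these coincide: pi(0) = 1 / E[return time to 0]
```

In the uncountable setting this pointwise identity breaks down for exactly the measure-zero reason mentioned above, which is what motivates return conditions on sets of positive measure rather than individual points.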
 
Thanks again.

So irreducibility ensures existence of a stationary distribution and ergodicity convergence to this stationary distribution?

Is ergodicity the same as non-periodicity...? I should read some more on the topic.

The theory of Harris chains seems to be a good starting point.

So I found two properties related to continuous state space Markov chains:

The first is \phi-irreducibility:

A chain is \phi-irreducible if there exists a non-zero \sigma-finite measure \phi on X such that \forall A \subset X with \phi(A) > 0 and \forall x \in X \exists n(x;A) \in N such that P_n(x;A) > 0.

Here P_n are the n-step transition probabilities.

The second one is aperiodicity, that is, there do not exist subsets X_1, ..., X_d of the state space such that the chain jumps with probability one from X_1 to X_2, from X_2 to X_3, ..., and from X_d back to X_1.

Under these two conditions, the Markov chain is (reportedly) asymptotically stationary.
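These two conditions can be seen at work in a standard continuous-state example (my own sketch, with assumed parameters a and sigma^2): a Gaussian AR(1) chain Q_{k+1} = a*Q_k + eps_k with |a| < 1 is \phi-irreducible with respect to Lebesgue measure and aperiodic, and starting from Q_0 = 0 its variance recursion converges to the stationary value sigma^2/(1 - a^2).

```python
# Sketch of a phi-irreducible, aperiodic continuous-state chain:
#   Q_{k+1} = a * Q_k + eps_k,   eps_k ~ N(0, sigma^2),   |a| < 1.
# Starting from Q_0 = 0, Q_k stays Gaussian with mean 0, so tracking
# the variance recursion v_{k+1} = a^2 * v_k + sigma^2 exhibits the
# convergence in distribution to N(0, sigma^2 / (1 - a^2)) explicitly.
a, sigma2 = 0.8, 1.0  # assumed parameters, for illustration only

v = 0.0
for _ in range(200):
    v = a * a * v + sigma2

print(v, sigma2 / (1 - a * a))  # both are approximately 2.7778
```

The Gaussian noise is what delivers \phi-irreducibility here: from any point x, every Lebesgue-positive set is reached in one step with positive probability.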

So I have to think about it ..:smile:
 
Pere Callahan said:
So irreducibility ensures existence of a stationary distribution and ergodicity convergence to this stationary distribution?

Well, I think all that irreducibility buys you is the assurance that IF there is a stationary distribution, it's unique. If you have a reducible chain, there are non-communicating subsets of the state space, each of which can have its own stationary distribution. Irreducibility ensures that this can't happen, but doesn't guarantee that there is a stationary distribution. I'm trying to think of a counterexample, but having trouble... it may be that for finite state spaces you also get the existence of the stationary distribution, but I have some recollection it's not generally the case.
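A minimal sketch of the reducible case, with made-up numbers: two non-communicating 2-state blocks, each carrying its own stationary distribution, so any convex combination of the two is also stationary and uniqueness fails.

```python
import numpy as np

# Made-up reducible chain: states {0,1} and {2,3} do not communicate.
P = np.array([[0.7, 0.3, 0.0, 0.0],
              [0.4, 0.6, 0.0, 0.0],
              [0.0, 0.0, 0.5, 0.5],
              [0.0, 0.0, 0.2, 0.8]])

pi_a = np.array([4/7, 3/7, 0.0, 0.0])  # stationary on the first block
pi_b = np.array([0.0, 0.0, 2/7, 5/7])  # stationary on the second block

assert np.allclose(pi_a @ P, pi_a)
assert np.allclose(pi_b @ P, pi_b)
# Any convex combination t*pi_a + (1-t)*pi_b is stationary as well.
```

(For finite state spaces a stationary distribution always exists; the standard counterexample to existence needs a countable space, e.g. the simple random walk on the integers.)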

But, yeah, ergodicity guarantees that you eventually visit every state, and so will eventually fall into the stationary distribution.

Pere Callahan said:
Is ergodicity the same as non-periodicity...?

I believe it implies both aperiodicity and "positive recurrence," which means that the expected time to return to any state after you visit it is finite. Contrast this with a "transient" state, which you might visit once but never return to.

And, yeah, I believe that Harris Chains is the right term to search for. If you can find any good references, let me know.
 
