Finding Stationary Distribution for a Stochastic Process

  • Context: Graduate 
  • Thread starter: Ardla
  • Tags: Distribution

Discussion Overview

The discussion revolves around finding the stationary distribution of a stochastic process defined by the equation X_t = ρX_{t-1} + ε_t, with initial condition X_0 = 0 and the constraint |ρ| < 1. Participants explore the variance of the process and its implications for the stationary distribution, as well as questions regarding the aperiodicity of the associated Markov chain.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant expresses uncertainty about how to find the stationary distribution and presents an attempt involving variance calculations, which they believe is incorrect.
  • Another participant corrects the variance calculations, stating that V(X1) = 1, V(X2) = ρ² + 1, and V(X3) = ρ²(V(X2)) + 1, leading to a limit of V(Xn) as n approaches infinity being 1/(1-ρ²).
  • Questions arise about the aperiodicity of the Markov chain, with one participant asking if the condition V(X1) = 1 implies aperiodicity and seeking clarification on the terms used.
  • A later reply clarifies that a Markov chain is aperiodic if all states are in one class and there is a non-zero probability of returning to the same state, linking this to the concept of irreducibility.
  • Participants discuss the definition of aperiodicity in the context of continuous state spaces and suggest that the Gaussian nature of ε implies aperiodicity.
  • One participant tentatively agrees with the assessment of aperiodicity but expresses uncertainty about the definitions and seeks further confirmation from their lecturer.

Areas of Agreement / Disagreement

Participants generally agree on the form of the stationary distribution as N(0, 1/(1-ρ²)), but there is uncertainty regarding the aperiodicity of the Markov chain, with multiple interpretations and no consensus reached.
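The agreed form can be checked by taking variances on both sides of the recursion at stationarity (a standard fixed-point argument, not spelled out in the thread):

```latex
\operatorname{Var}(X) = \rho^2 \operatorname{Var}(X) + 1
\;\Longrightarrow\;
\operatorname{Var}(X) = \frac{1}{1-\rho^2},
\qquad\text{so the stationary law is } N\!\left(0,\ \tfrac{1}{1-\rho^2}\right).
```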

Contextual Notes

Participants express limitations in their understanding of the definitions related to Markov chains and aperiodicity, indicating a need for further clarification on these concepts.

Who May Find This Useful

Readers interested in stochastic processes, Markov chains, and the mathematical foundations of stationary distributions may find this discussion relevant.

Ardla
Hi, can someone please provide some guidance on how I should go about finding the stationary distribution of:

[tex]X_t = \rho X_{t-1} + \epsilon_t[/tex], [tex]X_0 = 0[/tex] and [tex]|\rho| < 1[/tex],
where [tex]\epsilon_1, \epsilon_2, \cdots[/tex] are all independent N(0,1).

I have no idea what to do, so here's my attempt, which I know to be completely wrong:
suppose,
[tex]Var(X_1) = \rho \sigma^2 < \infty[/tex]
[tex]Var(X_2) = \rho\sigma^2 + 1[/tex]
[tex]\vdots[/tex]
[tex]Var(X_{n+1}) = \rho\sigma^2 + t[/tex]
As [tex]t \rightarrow \infty, Var(X_{n+1}) = \rho \sigma^2 + t[/tex]?

Yeah, I'm very sure I'm not doing it right... Can someone please help me out?
 
the first line of your post is garbled. You need to fix it to get a response.
 
mathman said:
the first line of your post is garbled. You need to fix it to get a response.


Sorry i didn't realize, I fixed it now, can you please help?
 
Since [tex]X_0 = 0[/tex], [tex]V(X_1) = 1[/tex].
Next, [tex]V(X_2) = \rho^2 + 1[/tex].
[tex]V(X_3) = \rho^2(\rho^2 + 1) + 1[/tex].
etc.

The limit as [tex]n \rightarrow \infty[/tex] of [tex]V(X_n)[/tex] is [tex]\frac{1}{1-\rho^2}[/tex].
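The recursion above can be checked numerically (a quick sketch, assuming the thread's setup; the value rho = 0.7 is just an example):

```python
# Iterate V(X_{n+1}) = rho^2 * V(X_n) + 1 starting from V(X_0) = 0
# and compare with the claimed limit 1/(1 - rho^2).
rho = 0.7                      # example value with |rho| < 1
v = 0.0                        # V(X_0) = 0 since X_0 = 0
for _ in range(200):
    v = rho ** 2 * v + 1.0     # one step of the variance recursion
limit = 1.0 / (1.0 - rho ** 2)
print(v, limit)                # both are about 1.9608
```

The recursion converges geometrically (the error shrinks by a factor of rho² each step), which is why 200 iterations are far more than enough.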
 
Thank you!

Sorry, can I ask another question in this same post? Is that MC aperiodic too because [tex]V(X_1) = 1[/tex]? Is that the value of d(i)?
 
Ardla said:
Thank you!

Sorry, can I ask another question in this same post? Is that MC aperiodic too because [tex]V(X_1) = 1[/tex]? Is that the value of d(i)?
I need to know what your terms mean. "MC aperiodic" - what is MC? "d(i)" - what is d and what is i?
 
Sorry, I meant Markov chain. The d(i) is the period, defined as:
[tex]d(i) = \gcd\{n : p_{ii}(n) > 0\}[/tex]
i.e. the greatest common divisor, and a state is aperiodic if d(i) = 1.

In my notes it says a Markov chain is aperiodic if it can access all states, but I am not sure how to show it.
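For a discrete chain, the gcd definition can be tried out directly (a hypothetical two-state example for illustration only; the thread's chain has a continuous state space, where this definition does not directly apply):

```python
from functools import reduce
from math import gcd

def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def period(P, i, n_max=20):
    """d(i) = gcd{n : p_ii(n) > 0}, scanning return times n up to n_max."""
    returns = []
    Pn = [row[:] for row in P]          # P^1
    for n in range(1, n_max + 1):
        if Pn[i][i] > 0:
            returns.append(n)
        Pn = mat_mul(Pn, P)             # advance to P^(n+1)
    return reduce(gcd, returns)

cycle = [[0.0, 1.0], [1.0, 0.0]]        # deterministic 2-cycle
lazy  = [[0.5, 0.5], [0.5, 0.5]]        # self-loop possible at each state
print(period(cycle, 0))                 # 2 -> periodic
print(period(lazy, 0))                  # 1 -> aperiodic
```

The 2-cycle only returns to state 0 at even times, so d(0) = 2; the lazy chain can return at every n, so d(0) = 1.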
 
Ardla said:
Sorry, I meant Markov chain. The d(i) is the period, defined as:
[tex]d(i) = \gcd\{n : p_{ii}(n) > 0\}[/tex]
i.e. the greatest common divisor, and a state is aperiodic if d(i) = 1.

In my notes it says a Markov chain is aperiodic if it can access all states, but I am not sure how to show it.

Hi,
A Markov chain is aperiodic if all the states are in one class (as periodicity is a class property and the chain itself is called aperiodic in your case) and starting from state i, there is a non-zero probability of transition to state i (this is of course given by your definition of d(i)). I think if it can access all states the chain would be called irreducible (because it is a single communicating class and there are no other states so that it is also closed).

About your first question, I just want to confirm whether you are also getting the stationary distribution as N(0, 1/(1-rho^2)).

Now, as your state space is continuous, I am not really sure about the definition of periodicity. I guess a reasonable definition of aperiodicity would be [tex]P(X(1) \in A \mid X(0) \in A) > 0[/tex] for A a subset of the state space (there would also be some restriction on what A could be). Because epsilon is Gaussian, I think the chain is aperiodic.
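That reasoning can be made concrete with the normal CDF (my own sketch of the suggestion above; rho = 0.7 and the example interval are assumptions, not from the thread). Starting from any x, X(1) = rho·x + eps is N(rho·x, 1), so the one-step probability of landing in any interval (a, b) is strictly positive:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def one_step_prob(x, a, b, rho=0.7):
    """P(X(1) in (a, b) | X(0) = x) when X(1) = rho*x + eps, eps ~ N(0,1)."""
    return phi(b - rho * x) - phi(a - rho * x)

# Positive even for a far-away start and a narrow interval:
print(one_step_prob(x=5.0, a=-0.1, b=0.1) > 0)   # True
```

Since the Gaussian density is positive everywhere, this probability is never zero, which is the informal sense in which the chain "can reach any subspace in one step".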
 
Hmm, well I think the stationary distribution is right.

I'm not sure about the aperiodicity either, but I agree, I think it's aperiodic. I think I'll just say it's because it can reach any subspace in one step? It's OK, I think I'll get the answer from my lecturer when uni starts again.

But thank you guys!
 
