# Stationary probabilities (Markov chain)

1. May 21, 2009

### MathematicalPhysicist

We are given two states 1 and 2 in an irreducible, positive recurrent Markov chain, with stationary probabilities $$\pi_1$$ and $$\pi_2$$ respectively. The task is to characterise, in general, the distribution of the number of visits to state 2 after two consecutive visits to state 1.

Any hints?

2. May 21, 2009

### Enuma_Elish

Write out the ways this can happen, then turn it into a formula:

1,1,2
2,1,1,2
2,2,1,1,2
1,2,1,1,2
...
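One way to make the enumeration above concrete (this is my sketch, not part of the thread) is to generate all sequences over {1, 2} that end with the pattern 1,1,2, where that final pair of 1s is the first time two consecutive visits to state 1 occur:

```python
from itertools import product

def paths_to_first_11_then_2(max_len):
    """Enumerate state sequences over {1, 2} that end in the pattern
    (..., 1, 1, 2), where the final (1, 1) is the FIRST pair of
    consecutive visits to state 1 in the sequence."""
    out = []
    for n in range(3, max_len + 1):
        for pre in product((1, 2), repeat=n - 3):
            seq = pre + (1, 1, 2)
            # indices i where (seq[i], seq[i+1]) == (1, 1)
            pairs = [i for i in range(n - 1)
                     if seq[i] == 1 and seq[i + 1] == 1]
            # keep the path only if the sole (1, 1) pair is the final one
            if pairs == [n - 3]:
                out.append(seq)
    return out
```

For `max_len = 5` this reproduces exactly the four paths listed above; sequences such as 2,1,1,1,2 are excluded because they contain an earlier pair of consecutive 1s.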

3. May 22, 2009

### MathematicalPhysicist

Yes, I thought in this direction, but I'm not sure how to get to the formula.
For the path 1,1,2 the probability is $$\frac{\pi_1}{\pi_1+\pi_2}P_{1,1}P_{1,2}$$;
for 2,1,1,2 it is $$\frac{\pi_2}{\pi_1+\pi_2}P_{2,1}P_{1,1}P_{1,2}$$;
for 1,2,1,1,2 it is $$\frac{\pi_1}{\pi_1+\pi_2}P_{1,2}P_{2,1}P_{1,1}P_{1,2}$$.

So my hunch is that, depending on whether the chain starts at state 1 or state 2, we multiply by $$\frac{\pi_1}{\pi_1+\pi_2}$$ or $$\frac{\pi_2}{\pi_1+\pi_2}$$, and we always pick up a factor of $$P_{1,1}$$; but beyond that I don't see a general equation covering all cases.
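The case-by-case products above can be checked numerically. The sketch below (my addition, under the hypothetical reading that the chain starts from the stationary distribution restricted to {1, 2}) computes the probability of any given path as the starting weight $$\pi_{s_0}/(\pi_1+\pi_2)$$ times the product of one-step transition probabilities along the path:

```python
def path_probability(seq, P, pi):
    """Probability of a fixed path over states {1, 2}, assuming the
    chain starts from the stationary distribution restricted to {1, 2}
    (start weight pi[s0] / (pi[1] + pi[2]), as in the thread).

    P is a nested dict of transition probabilities, pi a dict of
    stationary probabilities."""
    s0 = seq[0]
    prob = pi[s0] / (pi[1] + pi[2])
    # multiply the one-step transition probabilities along the path
    for a, b in zip(seq, seq[1:]):
        prob *= P[a][b]
    return prob
```

For example, with a two-state chain this gives $$\frac{\pi_1}{\pi_1+\pi_2}P_{1,1}P_{1,2}$$ for the path `(1, 1, 2)`, matching the first case above.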

4. May 22, 2009

### Enuma_Elish

I don't have an immediate answer, but you might find the approach in this paper to be helpful:

http://smu.edu/statistics/TechReports/TR211.pdf [Broken]

The authors derive the unconditional distribution of the number of successes (e.g., state 2) in n+1 trials. It seems to me that you need to derive a similar distribution, conditional on having obtained (exactly? or at least?) two consecutive failures (1,1).
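Before attempting a closed form, it may help to see the target distribution empirically. The Monte Carlo sketch below is my own addition and commits to one hypothetical reading of the problem: run the chain until the first pair of consecutive visits to state 1, then count visits to state 2 over a fixed number of subsequent steps. The function name and the `n_after` parameter are illustrative, not from the thread or the paper:

```python
import random

def simulate_conditional_counts(P, n_after, n_runs, seed=0):
    """Monte Carlo estimate (one hypothetical reading of the problem):
    start the two-state chain in state 1, run until the first pair of
    consecutive visits to state 1, then count visits to state 2 over
    the next n_after steps. Returns the empirical distribution of
    that count."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(n_runs):
        s, prev = 1, None
        # run until two consecutive visits to state 1
        while not (prev == 1 and s == 1):
            prev = s
            s = 1 if rng.random() < P[s][1] else 2
        # count visits to state 2 in the next n_after steps
        visits = 0
        for _ in range(n_after):
            s = 1 if rng.random() < P[s][1] else 2
            visits += s == 2
        counts[visits] = counts.get(visits, 0) + 1
    return {k: v / n_runs for k, v in sorted(counts.items())}
```

Comparing this empirical distribution against any candidate formula for small chains is a cheap sanity check before deriving the conditional distribution analytically.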

Last edited by a moderator: May 4, 2017