Find all Invariant Probability Measures for P (Markov Chain)

In summary: The invariant probability measures form the one-parameter family [pi_1, 0, 1 - 2*pi_1, 0, pi_1] with 0 <= pi_1 <= 1/2. The cases discussed in the thread are [1/2, 0, 0, 0, 1/2] (starting in 1 or 5), [0, 0, 1, 0, 0] (starting in 3), and [1/4, 0, 1/2, 0, 1/4] (starting in 2 or 4).
  • #1
RoshanBBQ

Homework Statement


Find all Invariant Probability Measures for P (Markov Chain)
E = {1,2,3,4,5}

The screenshot below has P and my attempted solution. I am wondering if it is acceptable to have infinitely many answers ("all" seems to indicate that it is). Basically, I had too many unknowns and too few equations. Does my work look right?
http://i.minus.com/ioEfrJKWamvpR.JPG

edit: I should also add that pi_1 >= 0 and pi_1 <= 1/2, to force the probabilities to be between 0 and 1. And after a bit of thought, I feel more comfortable with my answer. It seems to embody the fact that if I start at state 3, I stay there forever. Hence pi_1 = 0 => steady-state probabilities are [0,0,1,0,0]. I can also be stuck in the recurrent class {1,5}, in which case pi_1 = 1/2 and I never reach 3, i.e. pi = [1/2,0,0,0,1/2]. And there are many possibilities in between if I start in state 4 etc., where I have a positive probability of being sucked into the recurrent class or into the absorbing class. So pi = [nonzero, 0, nonzero, 0, nonzero], which the equations can handle.
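Here is a minimal numpy sketch of that conclusion. The actual P is only in the linked screenshot, so the matrix below is a hypothetical one built from the structure described above (3 absorbing, {1,5} a closed recurrent class, 2 going to 4, and 4 splitting its mass between state 3 and the class {1,5}); under that assumption, every vector [pi_1, 0, 1 - 2*pi_1, 0, pi_1] with 0 <= pi_1 <= 1/2 is invariant:

import numpy as np

# Hypothetical transition matrix over states 1..5 (rows sum to 1); the real P
# is only in the screenshot, so this is an assumed example with that structure.
P = np.array([
    [0.0, 0.0, 0.0, 0.0, 1.0],  # 1 -> 5
    [0.0, 0.0, 0.0, 1.0, 0.0],  # 2 -> 4
    [0.0, 0.0, 1.0, 0.0, 0.0],  # 3 -> 3 (absorbing)
    [0.0, 0.0, 0.5, 0.0, 0.5],  # 4 -> 3 or 5, each with probability 1/2
    [1.0, 0.0, 0.0, 0.0, 0.0],  # 5 -> 1
])

# Invariant measures satisfy pi P = pi, i.e. (P - I)^T pi^T = 0.
A = (P - np.eye(5)).T
_, s, vt = np.linalg.svd(A)
null_dim = int(np.sum(s < 1e-10))
print("dimension of the solution space:", null_dim)  # 2: one per closed class

# Every pi = [p1, 0, 1 - 2*p1, 0, p1] with 0 <= p1 <= 1/2 solves pi P = pi.
for p1 in (0.5, 0.0, 0.25):
    pi = np.array([p1, 0.0, 1.0 - 2.0 * p1, 0.0, p1])
    print(p1, np.allclose(pi @ P, pi))  # True in each case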

edit2: Upon further thought, though, it seems there should only be 3 invariant probability measures: one if I start in state 3, one if I start in state 1 or 5, and one if I start in state 2 or 4 (since 2 a.s. goes to 4).

I believe the answers are:
start in 1 or 5: [1/2 0 0 0 1/2] (so pi_1 = 1/2)
start in 3: [0 0 1 0 0] (so pi_1 = 0)
start in 2 or 4: [1/4 0 1/2 0 1/4] (so pi_1 = 1/4)
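Note that each of these vectors has the form [pi_1, 0, 1 - 2*pi_1, 0, pi_1] from the constraint above, and the third is a convex combination of the first two,

$$[1/4,\, 0,\, 1/2,\, 0,\, 1/4] = \tfrac{1}{2}\,[1/2,\, 0,\, 0,\, 0,\, 1/2] + \tfrac{1}{2}\,[0,\, 0,\, 1,\, 0,\, 0],$$

so it is automatically invariant whenever the first two are.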
 
  • #2
The answer is correct. For a Markov chain there can be multiple invariant probability measures, and in this case there are infinitely many: any convex combination of invariant measures is again invariant. Note that the conditions $\pi_1 \ge 0$ and $\pi_1 \le 1/2$ should also be imposed, so that every entry stays between 0 and 1.
 

FAQ: Find all Invariant Probability Measures for P (Markov Chain)

What is a Markov chain?

A Markov chain is a mathematical model that describes a system where the future state of the system depends only on the current state and not on the previous states.
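In symbols, the Markov property says that for all times $n$ and states $i_0, \dots, i_{n-1}, i, j$,

$$\Pr(X_{n+1} = j \mid X_n = i,\, X_{n-1} = i_{n-1}, \dots, X_0 = i_0) = \Pr(X_{n+1} = j \mid X_n = i).$$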

What are invariant probability measures?

Invariant probability measures are probability distributions that remain unchanged as the Markov chain evolves: if the chain is started from such a distribution, then the state has that same distribution at every later time.
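Concretely, for a chain with transition matrix $P$, a row vector $\pi$ is an invariant probability measure if

$$\pi P = \pi, \qquad \text{i.e.} \quad \sum_i \pi_i P_{ij} = \pi_j \ \text{for every state } j,$$

together with $\pi_j \ge 0$ for all $j$ and $\sum_j \pi_j = 1$.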

How do you find all invariant probability measures for a Markov chain?

To find all invariant probability measures for a Markov chain, you solve the linear system $\pi P = \pi$ together with the constraints $\sum_j \pi_j = 1$ and $\pi_j \ge 0$. This can be done with matrix algebra, for example by finding the left eigenvectors of $P$ with eigenvalue 1 (equivalently, the null space of $P^\top - I$) and normalizing.
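As a minimal sketch of this in practice (assuming numpy; the function name is only illustrative), one can compute the null space of $(P - I)^\top$ and normalize:

import numpy as np

def invariant_basis(P, tol=1e-10):
    """Rows spanning all solutions of pi P = pi (unnormalized)."""
    P = np.asarray(P, dtype=float)
    A = (P - np.eye(P.shape[0])).T  # pi P = pi  <=>  A pi^T = 0
    _, s, vt = np.linalg.svd(A)
    return vt[int(np.sum(s >= tol)):]  # rows for the (near-)zero singular values

# Example: an irreducible two-state chain whose unique invariant distribution is [0.6, 0.4].
P = np.array([[0.90, 0.10],
              [0.15, 0.85]])
basis = invariant_basis(P)
pi = basis[0] / basis[0].sum()   # normalize to a probability vector
print(pi)                        # approximately [0.6, 0.4]
print(np.allclose(pi @ P, pi))   # True

For a reducible chain, as in this problem, the basis has more than one row, and the invariant probability measures are the convex combinations of the normalized invariant measures of the closed classes.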

What is the importance of finding invariant probability measures?

Finding invariant probability measures is important because they characterize the long-term behavior of a Markov chain: they describe the distributions the chain can settle into over time, which is useful for predicting the system's behavior over long horizons and for making informed decisions.

Are there any real-world applications of finding invariant probability measures for Markov chains?

Yes, there are many real-world applications of Markov chains and their invariant probability measures. Some examples include predicting stock prices, analyzing weather patterns, and understanding the spread of diseases.
