Markov chain: chance one state is reached before another

SUMMARY

The discussion centers on calculating the probability of reaching state 1 before state 4 in a Markov chain with four states and a specified transition matrix. The transition matrix P is given as P = \begin{pmatrix} 0 & \frac{1}{2} & \frac{1}{2} & 0 \\ 0 & 0 & \frac{1}{2} & \frac{1}{2} \\ \frac{1}{2} & 0 & 0 & \frac{1}{2} \\ \frac{1}{2} & \frac{1}{2} & 0 & 0 \end{pmatrix}. The solution defines a vector f satisfying P'f = f, where P' is obtained from P by treating states 1 and 4 as absorbing. Solving the system yields the values 1/4 and 1/2; the thread also discusses an apparent labelling slip in the posted solution, since the first-principles derivation gives probability 1/2 when starting from state 3 and 1/4 when starting from state 2.

PREREQUISITES
  • Understanding of Markov chains and their transition matrices.
  • Familiarity with absorbing states in Markov processes.
  • Knowledge of linear algebra, specifically solving systems of equations.
  • Ability to interpret and manipulate probability vectors.
NEXT STEPS
  • Study the concept of absorbing Markov chains and their properties.
  • Learn how to derive transition matrices from given Markov chain scenarios.
  • Explore the method of solving linear equations in the context of Markov processes.
  • Investigate the application of Markov chains in real-world scenarios, such as queueing theory.
USEFUL FOR

Students and professionals in mathematics, statistics, and data science who are working with Markov chains, particularly those interested in probability theory and stochastic processes.

kasperrepsak
Hey, could someone explain why this is true? I am trying to understand how to solve such a problem, but I don't understand the solution.

Problem:
Given a Markov chain \left\{X_{n}: n\in\ \mathbb{N}\right\} with four states 1,2,3,4 and transition matrix
P = \begin{pmatrix} 0 & \frac{1}{2} & \frac{1}{2} & 0 \\ 0 & 0 & \frac{1}{2} & \frac{1}{2} \\ \frac{1}{2} & 0 & 0 & \frac{1}{2} \\ \frac{1}{2} & \frac{1}{2} & 0 & 0 \end{pmatrix}
We leave from state 3. What is the chance that state 1 will be reached before state 4?

Solution:
Let's call this chance x, and let y be the same chance but leaving from state 2 instead. Define:
f = \begin{pmatrix} 1 \\ x \\ y \\ 0 \end{pmatrix}
Then f satisfies P'f = f, with P' the transition matrix derived from P by making states 1 and 4 absorbing, i.e.,

\begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & \frac{1}{2} & \frac{1}{2} \\ \frac{1}{2} & 0 & 0 & \frac{1}{2} \\ 0 & 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} 1 \\ x \\ y \\ 0 \end{pmatrix} = \begin{pmatrix} 1 \\ x \\ y \\ 0 \end{pmatrix}
Solving gives y = 1/2, x = 1/4.
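
As a quick sanity check of these numbers, here is a minimal Monte Carlo sketch (my own, assuming Python with only the standard library; the function and variable names are illustrative). It simulates the chain and estimates, for each starting state, how often state 1 is hit before state 4.

```python
# Monte Carlo estimate of P(reach state 1 before state 4) from a given start state.
import random

P = [
    [0.0, 0.5, 0.5, 0.0],   # state 1
    [0.0, 0.0, 0.5, 0.5],   # state 2
    [0.5, 0.0, 0.0, 0.5],   # state 3
    [0.5, 0.5, 0.0, 0.0],   # state 4
]

def reaches_1_before_4(start, rng):
    """Run the chain from `start` (1-based) until it hits state 1 or state 4."""
    state = start
    while True:
        state = rng.choices([1, 2, 3, 4], weights=P[state - 1])[0]
        if state == 1:
            return True
        if state == 4:
            return False

rng = random.Random(0)
trials = 100_000
for start in (2, 3):
    hits = sum(reaches_1_before_4(start, rng) for _ in range(trials))
    print(f"from state {start}: ~{hits / trials:.3f}")
# Estimates come out near 0.25 from state 2 and 0.50 from state 3,
# which is worth comparing against the labelling of x and y above.
```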
 
kasperrepsak said:
Hey, could someone explain why this is true? I am trying to understand how to solve such a problem, but I don't understand the solution. [Problem and solution quoted in full above.]

You need to be more specific: what part do you not understand?

What you presented above is pretty standard, and is almost always part of the course material regarding Markov chains.

Also, for context: are you using a textbook and/or course notes? Do these sources not have such material in them?
 
If you could direct me to a place where this is explained, I would be grateful. The material my teacher provided doesn't address this problem, or at least not directly, so I find it hard to understand this solution from the theory provided. I don't understand why P'f = f must be true, and I don't understand why those states are made absorbing.

Edit: I don't understand why it's necessary to multiply by such a vector either.
 
kasperrepsak said:
If you could direct me to a place where this is explained, I would be grateful. The material my teacher provided doesn't address this problem, or at least not directly, so I find it hard to understand this solution from the theory provided. I don't understand why P'f = f must be true, and I don't understand why those states are made absorbing.

Edit: I don't understand why it's necessary to multiply by such a vector either.

When you don't understand something, it is a good idea to work it out from first principles. Let's do that.

Define ##f_i(n)## to be the probability that, starting from state ##i##, we reach state 1 for the first time at time ##n## without having reached state 4 before time ##n##; let ##f_i = \sum_{n=1}^{\infty} f_i(n)##. This is the probability that we reach state 1 before reaching state 4, starting from state ##i##.

Note that
f_i(n) = P\{X_1 \neq 1,4, \: X_2 \neq 1,4, \: \ldots, X_{n-1} \neq 1,4, \: X_n = 1 \mid X_0 = i\}
Conditioning on ##X_1## we have
f_i(n) = \sum_{j \neq 1,4} p_{ij} P\{X_2 \neq 1,4, \: \ldots, X_{n-1} \neq 1,4, \: X_n = 1 \mid X_1 = j, X_0 = i\} \\ = \sum_{j \neq 1,4} p_{ij} f_j(n-1).
Also: ##f_i(1) = p_{i1}##. Summing over n we get
f_i = f_i(1) + \sum_{n=2}^{\infty} f_i(n) = p_{i1} + \sum_{j \neq 1,4} p_{ij} \sum_{n=2}^{\infty} f_j(n-1) = p_{i1} + \sum_{j \neq 1,4} p_{ij} f_j .

In other words:
f_2 = p_{21} + p_{22} f_2 + p_{23} f_3 \\ f_3 = p_{31} + p_{32} f_2 + p_{33} f_3.
Note that this is a simple 2x2 linear system that we can solve easily.
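
For concreteness, substituting the entries of the given ##P## (##p_{21} = 0##, ##p_{23} = \tfrac{1}{2}##, ##p_{31} = \tfrac{1}{2}##, ##p_{22} = p_{32} = p_{33} = 0##), the system reduces to
f_2 = \tfrac{1}{2} f_3, \qquad f_3 = \tfrac{1}{2},
so ##f_3 = \tfrac{1}{2}## and ##f_2 = \tfrac{1}{4}##.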

Of course, since (in this example, but not all examples) we either reach state 1 before state 4 or reach state 4 before state 1, the probability that, starting from state i we reach 4 before 1 is just ## 1 - f_i##.

Note that if we set ##f_1 = 1## and ##f_4 = 0## we can write the equations above as
f_1 = 1 f_1 + 0 f_2 + 0 f_3 + 0 f_4 \\ f_2 = p_{21} f_1 + p_{22} f_2 + p_{23} f_3 + p_{24} f_4 \\ f_3 = p_{31} f_1 + p_{32} f_2 + p_{33} f_3 + p_{34} f_4 \\ f_4 = 0 f_1 + 0 f_2 + 0 f_3 + 1 f_4
This is just the set of equations ##f = \tilde{P} f## for a modified Markov chain with transition matrix
\tilde{P} = \begin{pmatrix} 1 & 0 & 0 & 0 \\ p_{21} & p_{22} & p_{23} & p_{24} \\ p_{31} & p_{32} & p_{33} & p_{34} \\ 0 & 0 & 0 & 1 \end{pmatrix}
and with
f = \begin{pmatrix} 1 \\ f_2 \\ f_3 \\ 0 \end{pmatrix}

As I said: work it out in detail from first principles.
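
A minimal numerical sketch of the same computation (assuming numpy is available; the variable names are illustrative): fix ##f_1 = 1##, ##f_4 = 0## and solve the 2x2 system ##f_i = p_{i1} + \sum_{j \neq 1,4} p_{ij} f_j## for the transient states 2 and 3.

```python
# Sketch: solve f_i = p_{i1} + sum_{j != 1,4} p_{ij} f_j for i in {2, 3}.
import numpy as np

P = np.array([
    [0.0, 0.5, 0.5, 0.0],   # state 1
    [0.0, 0.0, 0.5, 0.5],   # state 2
    [0.5, 0.0, 0.0, 0.5],   # state 3
    [0.5, 0.5, 0.0, 0.0],   # state 4
])

transient = [1, 2]                    # 0-based indices of states 2 and 3
Q = P[np.ix_(transient, transient)]   # transitions among the transient states
r = P[transient, 0]                   # one-step probabilities of hitting state 1

# (I - Q) f = r is exactly the 2x2 system derived above.
f = np.linalg.solve(np.eye(2) - Q, r)
print(f"f_2 = {f[0]}, f_3 = {f[1]}")  # f_2 = 0.25, f_3 = 0.5
```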
 
Thank you very much for taking the time to explain this neatly and thoroughly. I'm sure this will be helpful to many others too, as I haven't found an explanation of this particular problem through Google.
 
kasperrepsak said:
Thank you very much for taking the time to explain this neatly and thoroughly. I'm sure this will be helpful to many others too, as I haven't found an explanation of this particular problem through Google.

There was a typo in my post, which I corrected in an "edit" and which was seemingly accepted, but was later not implemented. I should have written ##X_n = 1## instead of ##X_n = 4## in the definition of ##f_i(n)##.
 
Ray Vickson said:
There was a typo in my post, which I corrected in an "edit" and which was seemingly accepted, but was later not implemented. I should have written ##X_n = 1## instead of ##X_n = 4## in the definition of ##f_i(n)##.
Yes, I realized that. What about the solution? Doesn't it also contain a mistake? If x is the chance leaving from state 3, shouldn't it come in the third place of the vector instead of the second?
 
Ray Vickson said:
There was a typo in my post, which I corrected in an "edit" and which was seemingly accepted, but was later not implemented. I should have written ##X_n = 1## instead of ##X_n = 4## in the definition of ##f_i(n)##.
I'm sorry, but since replies are sometimes overlooked, I'm sending another reply in the hope that you will respond.
 
kasperrepsak said:
I'm sorry, but since replies are sometimes overlooked, I'm sending another reply in the hope that you will respond.

If you are asking me to check whether a given pair of numbers is the solution of a given pair of two simple linear equations in two unknowns, then I refuse to answer. You can check it yourself---and you always should, as a matter of habit, in such cases. Just plug in the numbers to see if they work. If they don't work, somebody has made an error---typographical or otherwise---and you need to re-solve those equations.
 
Ray Vickson said:
If you are asking me to check whether a given pair of numbers is the solution of a given pair of two simple linear equations in two unknowns, then I refuse to answer. You can check it yourself---and you always should, as a matter of habit, in such cases. Just plug in the numbers to see if they work. If they don't work, somebody has made an error---typographical or otherwise---and you need to re-solve those equations.

Well, the pair is a solution to that matrix-vector multiplication. But I'm pretty sure the teacher made a mistake in putting the chance x in the second place of the vector, since the way you explained it, it should be in the third place. This is because x is the chance leaving from state 3. I just wanted to make sure I understood that correctly.
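
For completeness, one can check directly that this reading is consistent: with the components ordered by starting state,
\begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & \frac{1}{2} & \frac{1}{2} \\ \frac{1}{2} & 0 & 0 & \frac{1}{2} \\ 0 & 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} 1 \\ \frac{1}{4} \\ \frac{1}{2} \\ 0 \end{pmatrix} = \begin{pmatrix} 1 \\ \frac{1}{4} \\ \frac{1}{2} \\ 0 \end{pmatrix},
so the chance of reaching state 1 before state 4 is ##\tfrac{1}{4}## starting from state 2 and ##\tfrac{1}{2}## starting from state 3.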
 
