
Probability Transition Matrix and Markov Chains

  • Thread starter cse63146
  • Start date
  • #1

Homework Statement



Given a probability transition matrix, starting in X0 = 1, determine the probability that the process never reaches state 2.

Homework Equations





The Attempt at a Solution


State 2 is not an absorbing state, so I'm not sure how to find this probability.

Any help would be greatly appreciated.
 

Answers and Replies

  • #2
lanedance
Homework Helper
is this for a specific transition matrix?
 
  • #3
chiro
Science Advisor
Have you considered calculating the steady-state version of the matrix and using that to calculate your final probabilities?
 
  • #4
lanedance said: is this for a specific transition matrix?
Yes, it is. S = {0,1,2,3}, and the matrix is

[tex]\begin{bmatrix}1 & 0 & 0 & 0 \\ 0.1 & 0.2 & 0.5& 0.2 \\ 0.1 & 0.2 & 0.6 & 0.1 \\ 0.2 & 0.2 & 0.3 & 0.3\end{bmatrix}[/tex]
where the columns are (0,1,2,3) and the rows are (0,1,2,3)'.

Do you know how to find the probability that it never reaches state 2?
 
  • #5
lanedance
Homework Helper
first note that state 0 is absorbing (its row is [1, 0, 0, 0]), so there is no chance of transitioning out of that state

as we are only interested in whether the chain ever passes through state 2, why not change the matrix so that whenever the chain is in S=2 it is trapped there:
[tex]\begin{bmatrix}1 & 0 & 0 & 0 \\ 0.1 & 0.2 & 0.5& 0.2 \\ 0 & 0 & 1 & 0 \\ 0.2 & 0.2 & 0.3 & 0.3\end{bmatrix}[/tex]

now consider what happens to the vector [0, 1, 0, 0] under repeated application of the above matrix... it will either eventually transition to S=0 and be trapped there, having never been through S=2, or it will pass through S=2 and be trapped there, which simulates ending the chain as soon as it reaches S=2
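The trapping idea above can be checked numerically. Here is a minimal sketch in Python with NumPy, assuming the matrix and start state from this thread; raising the modified matrix to a high power and reading off the mass at state 0 gives the probability of being absorbed at 0 without ever visiting 2:

```python
import numpy as np

# Transition matrix from post #4, rows indexed by current state 0..3
P = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.1, 0.2, 0.5, 0.2],
    [0.1, 0.2, 0.6, 0.1],
    [0.2, 0.2, 0.3, 0.3],
])

# Make state 2 absorbing, so any path that touches it is trapped there
Q = P.copy()
Q[2] = [0.0, 0.0, 1.0, 0.0]

# Start in state 1 and apply the modified matrix many times;
# states 1 and 3 are transient in Q, so all mass ends up in 0 or 2
v = np.array([0.0, 1.0, 0.0, 0.0])
dist = v @ np.linalg.matrix_power(Q, 200)

# Mass at state 0 = P(never reach state 2 | X0 = 1)
print(dist[0])  # ≈ 0.2115 (= 11/52)
```

The power 200 is overkill here: the transient submatrix for states {1, 3} contracts quickly, so the iteration converges long before that.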
 
  • #6
I did it somewhat differently. Let T be the event that the process never reaches state 2, and define

[tex]U_i = P(T|X_0 = i)[/tex] for i = 0, 1, 2, 3.

Then U_0 = 1 (state 0 is absorbing) and U_2 = 0. That leaves U_1 and U_3 as two unknowns with two first-step equations. I would solve for both of them, and U_1 would give me the desired probability.

Would that approach work as well?
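That first-step approach reduces to a 2x2 linear system. A quick sketch, assuming the matrix from post #4 with u0 = 1 and u2 = 0 already substituted in:

```python
import numpy as np

# First-step analysis: u_i = P(never reach state 2 | X_0 = i),
# with u0 = 1 and u2 = 0 plugged in:
#   u1 = 0.1*1 + 0.2*u1 + 0.5*0 + 0.2*u3
#   u3 = 0.2*1 + 0.2*u1 + 0.3*0 + 0.3*u3
# Rearranged into the form A @ [u1, u3] = b:
A = np.array([[0.8, -0.2],
              [-0.2, 0.7]])
b = np.array([0.1, 0.2])
u1, u3 = np.linalg.solve(A, b)
print(u1)  # 11/52 ≈ 0.2115
```

Solving by hand gives u1 = 11/52 and u3 = 9/26, so both routes lead to the same answer.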
 
