Show all other states are transient in Markov chain

In summary, the thread discusses proving that in a Markov chain with an absorbing state s that every other state communicates with, all states except s are transient. The idea is to raise the transition matrix to the nth power and observe that the probability of being in any state other than s tends to 0 as n approaches infinity. The discussion suggests examining the form of P^2, P^3, and so on to generalize, keeping in mind that each row of these matrices must sum to one.
  • #1
xentity1x
Let X be a Markov chain with a state s that is absorbing, i.e. pss(1) = 1. All other states
communicate with s i.e. i → s for all states i ∈ S. Show that all states in S except s are
transient.

I understand this intuitively, but I'm not really sure how to start the proof.
 
  • #2
Write down a transition matrix (it can be fairly arbitrary; make the absorbing state the first state for simplicity), consider applying it again and again, and think about what happens in the limit...
 
  • #3
Ok so I wrote down the transition matrix with s=1.
\begin{bmatrix}
1 & 0 & \cdots & 0 \\
p_{21} & p_{22} & \cdots & p_{2N} \\
\vdots & \vdots & \ddots & \vdots \\
p_{N1} & p_{N2} & \cdots & p_{NN} \\
\end{bmatrix}

I then raised it to the nth power
\begin{bmatrix}
1 & 0 & \cdots & 0 \\
p_{21}^{(n)} & p_{22}^{(n)} & \cdots & p_{2N}^{(n)} \\
\vdots & \vdots & \ddots & \vdots \\
p_{N1}^{(n)} & p_{N2}^{(n)} & \cdots & p_{NN}^{(n)} \\
\end{bmatrix}

So I guess I have to show [tex] \lim_{n\rightarrow \infty} p_{ij}^{(n)}=0 \hspace{3mm} \forall i,j>1[/tex]

I'm not really sure where to go from there.
 
  • #4
Not quite. [tex] p_{ij}^{(n)} [/tex] represents the probability of transitioning from state i at time 0 to state j at time n. So you would expect
[tex] \lim_{n\rightarrow \infty} p_{ij}^{(n)} = 1 \hspace{3mm} j=1[/tex]
[tex] \lim_{n\rightarrow \infty} p_{ij}^{(n)} = 0 \hspace{3mm} j \neq 1[/tex]

I would start by looking at P^2 and P^3 to get a feel for the form and what happens, and try to generalise from there.

keep in mind each row of P, P^2 and so forth must sum to one...
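The suggestion above can be checked numerically. Here is a minimal sketch with a hypothetical 3-state chain (the matrix entries are made up for illustration; state 0 plays the role of the absorbing state s):

```python
# Hypothetical 3-state chain; state 0 is absorbing (p_ss = 1).
P = [
    [1.0, 0.0, 0.0],
    [0.3, 0.4, 0.3],
    [0.2, 0.5, 0.3],
]

def mat_mul(a, b):
    """Multiply two square matrices given as lists of rows."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(p, n):
    """Compute the n-th power of a square matrix (n >= 1)."""
    out = p
    for _ in range(n - 1):
        out = mat_mul(out, p)
    return out

for n in (2, 3, 50):
    Pn = mat_pow(P, n)
    # each row of P^n is still a probability distribution: it sums to one
    assert all(abs(sum(row) - 1.0) < 1e-9 for row in Pn)
    print(f"P^{n}, row for state 1:", [round(x, 4) for x in Pn[1]])
```

As n grows, the first column of each row climbs toward 1 while every other entry shrinks toward 0, which is exactly the limit conjectured above.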
 

FAQ: Show all other states are transient in Markov chain

1. What is a Markov chain?

A Markov chain is a mathematical model that describes a sequence of events in which the probability of each event depends only on the state of the previous event. It is commonly used to model systems in which the future state is dependent only on the current state and not on the previous history of states.
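The Markov property can be sketched in a few lines of Python. This is an illustrative example only: the 3-state transition matrix (with an absorbing first state) is made up, and the point is that the next state is sampled using nothing but the current state's row.

```python
import random

# Hypothetical transition matrix; state 0 is absorbing.
P = [
    [1.0, 0.0, 0.0],
    [0.3, 0.4, 0.3],
    [0.2, 0.5, 0.3],
]

def step(state, rng):
    """Sample the next state from P[state]; past history plays no role."""
    u, cum = rng.random(), 0.0
    for j, p in enumerate(P[state]):
        cum += p
        if u < cum:
            return j
    return len(P) - 1

rng = random.Random(0)
path = [1]
for _ in range(20):
    path.append(step(path[-1], rng))
print(path)  # once the walk reaches state 0 it stays there forever
```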

2. What does it mean for a state to be transient in a Markov chain?

A transient state in a Markov chain is a state the chain may visit but is not guaranteed to revisit: starting from a transient state, the probability of ever returning to it is strictly less than one. As a consequence, the chain visits a transient state only finitely many times before leaving it forever.

3. How can we show that all other states are transient in a Markov chain?

We can show that all other states in a Markov chain are transient by analyzing the transition probabilities between states and determining, for each state, whether the probability of ever returning to it equals one (recurrent: the state is visited infinitely many times) or is strictly less than one (transient: the state is visited only finitely many times). In the absorbing-state setting above, each state i communicates with s, so the chain started from i reaches s with positive probability and can then never return to i, forcing the return probability of i below one.
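As a complement to a proof, the defining property can be estimated by Monte Carlo simulation. This sketch uses a hypothetical 3-state chain with state 0 absorbing (the entries are illustrative); the estimated probability of ever returning to a non-absorbing state comes out strictly below one.

```python
import random

# Hypothetical 3-state chain; state 0 is absorbing.
P = [
    [1.0, 0.0, 0.0],
    [0.3, 0.4, 0.3],
    [0.2, 0.5, 0.3],
]

def next_state(i, rng):
    """Sample the next state using row i of the transition matrix."""
    return rng.choices(range(len(P[i])), weights=P[i])[0]

def ever_returns(start, rng, max_steps=10_000):
    """One trial: leave `start` once, then see if the walk comes back."""
    state = next_state(start, rng)
    for _ in range(max_steps):
        if state == start:
            return True
        if state == 0:        # absorbed: a return is now impossible
            return False
        state = next_state(state, rng)
    return False

rng = random.Random(1)
trials = 20_000
estimate = sum(ever_returns(1, rng) for _ in range(trials)) / trials
print(estimate)  # strictly less than 1: state 1 is transient
```

The gap between the estimate and 1 is exactly the probability of being absorbed at state 0 before returning, which is what makes state 1 transient.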

4. Why is it important to know which states are transient in a Markov chain?

Knowing which states are transient in a Markov chain is important because it helps us understand the behavior and long-term outcomes of the system being modeled. It can also help us make predictions and decisions based on the probabilities of transitioning between states.

5. Can a state be both transient and recurrent in a Markov chain?

No, a state cannot be both transient and recurrent in a Markov chain. Every state is exactly one or the other, determined by the probability of ever returning to it: if that probability equals one, the state is recurrent; if it is strictly less than one, the state is transient. The two classes are therefore mutually exclusive and together cover all states.
