user366312
 Problem Statement

If ##(X_n)_{n≥0}## is a Markov chain on ##S = \{1, 2, 3\}## with initial distribution ##α = (1/2, 1/2, 0)## and transition matrix
## \begin{bmatrix} 1/2&0&1/2\\ 0&1/2&1/2\\ 1/2&1/2&0 \end{bmatrix},##
then ##P(X_3 = 1 \mid X_1 = 2) = ?##.
 Relevant Equations
 Markov Chain
##P^2=\begin{bmatrix} 1/2&0&1/2\\ 0&1/2&1/2\\ 1/2&1/2&0 \end{bmatrix} \begin{bmatrix} 1/2&0&1/2\\ 0&1/2&1/2\\ 1/2&1/2&0 \end{bmatrix}=
\begin{bmatrix}
1/2 & 1/4 & 1/4\\
1/4 & 1/2 & 1/4\\
1/4 & 1/4 & 1/2
\end{bmatrix}##
By the Markov property and time-homogeneity, ##P(X_3 = 1 \mid X_1 = 2)## is the two-step transition probability from state 2 to state 1, i.e. the ##(2,1)## entry of ##P^2##. So ##P(X_3 = 1 \mid X_1 = 2) = (P^2)_{21} = 1/4##.
Is this solution correct?
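As a sanity check, here is a small numerical verification with NumPy (not part of the original solution, just a sketch confirming the matrix product and the entry being read off):

```python
import numpy as np

# Transition matrix from the problem statement
P = np.array([[0.5, 0.0, 0.5],
              [0.0, 0.5, 0.5],
              [0.5, 0.5, 0.0]])

# Two-step transition probabilities
P2 = P @ P

# P(X_3 = 1 | X_1 = 2) is the (2,1) entry of P^2,
# i.e. row index 1, column index 0 with 0-based indexing
print(P2[1, 0])  # 0.25
```

Each row of ##P^2## should still sum to 1, which is a quick way to catch arithmetic slips when squaring a transition matrix by hand.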