1. The problem statement, all variables and given/known data

Prove the following theorem by induction:

Let P be the transition matrix of a Markov chain. The ij-th entry p^{(n)}_{ij} of the matrix P^{n} gives the probability that the Markov chain, starting in state s_{i}, will be in state s_{j} after n steps.

2. Relevant equations

[itex]p^{(2)}_{ij} = \sum^{r}_{k=1} p_{ik}p_{kj}[/itex]

(where r is the number of states in the Markov chain and P is the square transition matrix whose ik-th entry p_{ik} is the probability of transitioning from state s_{i} to state s_{k} in one step)
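As a quick sanity check (not part of the proof itself), the two-step formula can be verified numerically for a small hypothetical 2-state chain; the matrix values below are made up for illustration:

```python
# Hypothetical 2-state transition matrix (each row sums to 1).
P = [[0.9, 0.1],
     [0.5, 0.5]]
r = len(P)

# Two-step probabilities via the summation formula:
# p2[i][j] = sum over k of p_{ik} * p_{kj}
p2 = [[sum(P[i][k] * P[k][j] for k in range(r)) for j in range(r)]
      for i in range(r)]

print(p2)
```

This is exactly the definition of the matrix product, so p2 agrees entry-by-entry with P squared.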

3. The attempt at a solution

Assume that

[itex]p^{(n)}_{ij} = \sum^{r}_{k=1} p^{(n-1)}_{ik}p_{kj}[/itex]

Then p^{(n+1)}_{ij} must be:

[itex]p^{(n+1)}_{ij} = \sum^{r}_{k=1} p^{(n)}_{ik}p_{kj}[/itex]

That's all I've come up with, but it doesn't fully convince me.
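A numerical sketch of the induction step may help build confidence (assuming the recursion p^{(n)}_{ij} = Σ_k p^{(n-1)}_{ik} p_{kj}): applying the recursion repeatedly should agree with computing the matrix power any other way. The 3-state matrix below is made up for illustration:

```python
def step(A, P):
    """One application of the recursion: result[i][j] = sum_k A[i][k] * P[k][j]."""
    r = len(P)
    return [[sum(A[i][k] * P[k][j] for k in range(r)) for j in range(r)]
            for i in range(r)]

# Hypothetical 3-state transition matrix (each row sums to 1).
P = [[0.2, 0.5, 0.3],
     [0.1, 0.6, 0.3],
     [0.4, 0.4, 0.2]]

# Build P^5 by applying the recursion four times to P.
Pn = P
for _ in range(4):
    Pn = step(Pn, P)

# Each row of P^n is still a probability distribution.
for row in Pn:
    assert abs(sum(row) - 1.0) < 1e-9
```

Because the recursion is just matrix multiplication, grouping the products differently (e.g. (P^2)(P^2)P) gives the same P^5, which is the associativity fact the induction proof rests on.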

**Physics Forums | Science Articles, Homework Help, Discussion**


# Homework Help: Powers of Markov transition matrices
