# [markov chain] reading expected value from the transition matrix

1. Dec 29, 2009

### rahl___

Hello there,

yet another trivial problem:
I attended a 'stochastic processes' course some time ago, but the only thing I remember is that this kind of problem is really easy to compute; I presume there is some simple pattern for it.

thanks for your help,
rahl.

2. Dec 29, 2009

### willem2

I don't think there's an easy answer to that. You can modify the matrix so that the chain remains in state $e_n$ once it gets there, and compute $$\sum_{k=1}^\infty k \,(i M^k - i M^{k-1})$$

where $i$ is the initial state vector $(1, 0, \ldots, 0)$ and $M$ is the transition matrix. You need the last component of this, of course: the last component of $iM^k$ is the probability that the chain has reached $e_n$ within $k$ steps, so each term is $k$ times the probability of arriving there at exactly step $k$.
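A minimal sketch of this computation, assuming a small row-stochastic matrix and truncating the infinite sum once the absorption probability is essentially 1. The function name, the default arguments, and the example matrix are illustrative, not from the thread.

```python
import numpy as np

def expected_absorption_time(M, start=0, target=-1, max_steps=10_000, tol=1e-12):
    """Expected number of steps to first reach `target` from `start`.

    M is a row-stochastic transition matrix. The target state is made
    absorbing, so "being in target at step k" means "reached it by step k".
    """
    M = np.array(M, dtype=float)
    M[target, :] = 0.0
    M[target, target] = 1.0          # chain stays in e_n once it gets there

    i = np.zeros(M.shape[0])
    i[start] = 1.0                   # initial state vector (1, 0, ..., 0)

    expected = 0.0
    prev_absorbed = 0.0              # last component of i M^(k-1)
    dist = i
    for k in range(1, max_steps + 1):
        dist = dist @ M              # distribution after k steps, i M^k
        absorbed = dist[target]      # P(reached target within k steps)
        expected += k * (absorbed - prev_absorbed)
        if absorbed > 1 - tol and absorbed - prev_absorbed < tol:
            break                    # series has effectively converged
        prev_absorbed = absorbed
    return expected

# Hypothetical example: expected time to reach state 2 starting from state 0.
M = [[0.5, 0.5, 0.0],
     [0.2, 0.3, 0.5],
     [0.1, 0.4, 0.5]]
print(expected_absorption_time(M, start=0, target=2))
```

The truncation means the result slightly underestimates the expectation if absorption has not essentially completed within `max_steps`, so for slowly mixing chains the cap may need to be raised.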
