[Markov chain] reading expected value from the transition matrix

Summary
The discussion revolves around calculating the expected time T for a Markov chain to reach state e_n for the first time, starting from state e_1. A suggestion is made to modify the transition matrix M so that the chain remains in state e_n once it reaches it, and then to evaluate the sum over k of k times the difference between the last components of i M^k and i M^{k-1}, where i = (1, 0, ..., 0) is the initial state vector. The conversation highlights that the problem is less trivial than it first appears.
rahl___
Hello there,

yet another trivial problem:
We have a transition matrix of some Markov chain: \left[\begin{array}{ccc}e_{11}&\cdots&e_{1n}\\ \vdots&\ddots&\vdots\\ e_{n1}&\cdots&e_{nn}\end{array}\right].
At the beginning our chain is in state e_1. Let T be the moment when the chain reaches e_n for the first time. What is the expected value of T?

I attended a 'stochastic processes' course some time ago, but the only thing I remember is that this kind of problem is really easy to compute; I presume there is some simple pattern for it.

thanks for your help,
rahl.
 
I don't think there's an easy answer to that. You can modify the matrix so that the chain remains in state e_n once it gets there (i.e. make e_n absorbing), and compute \sum_{k=1}^\infty k \left((i M^k)_n - (i M^{k-1})_n\right),

where i = (1, 0, \dots, 0) is the initial state vector, M is the modified transition matrix, and (\cdot)_n denotes the last component. With e_n absorbing, (i M^k)_n = P(T \le k), so the difference (i M^k)_n - (i M^{k-1})_n is exactly P(T = k), and the sum is E[T].
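
A minimal numerical sketch of that computation in Python, assuming a concrete 3-state chain (the matrix M below is a made-up example, and the 1e-12 tolerance and 100,000-step cap are arbitrary safety choices):

```python
import numpy as np

# Made-up 3-state transition matrix; the last row already makes
# state e_n (here state 3) absorbing, as suggested above.
M = np.array([[0.5, 0.4, 0.1],
              [0.2, 0.5, 0.3],
              [0.0, 0.0, 1.0]])

i = np.array([1.0, 0.0, 0.0])   # start in state e_1

expected_T = 0.0
dist = i.copy()                 # i M^0
prev = dist[-1]                 # P(T <= 0), which is 0 here
for k in range(1, 100_000):
    dist = dist @ M             # i M^k
    p_k = dist[-1] - prev       # P(T = k): absorbed at exactly step k
    expected_T += k * p_k
    prev = dist[-1]
    if 1.0 - prev < 1e-12:      # essentially all mass absorbed; tail is negligible
        break

print(expected_T)               # about 5.294 for this example matrix
```

When e_n is reachable from e_1, the absorption probability approaches 1 geometrically, so the truncated sum converges quickly; if e_n were unreachable, E[T] would be infinite and the loop would only stop at the step cap.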
 
