
Homework Help: [markov chain] reading expected value from the transition matrix

  1. Dec 29, 2009 #1
    Hello there,

    yet another trivial problem:
    I attended a 'stochastic processes' course some time ago, but the only thing I remember is that this kind of problem is really easy to compute; I presume there is some simple pattern for it.

    thanks for your help,
    rahl.
     
  3. Dec 29, 2009 #2
    I don't think there's an easy closed form for that. You can modify the matrix so that the chain remains in state [itex] e_n [/itex] once it gets there, and compute [tex] \sum_{k=1}^\infty k \left( i M^k - i M^{k-1} \right) [/tex]

    where i is the initial-state row vector (1, 0, ..., 0) and M is the modified transition matrix. You need the last component of this sum, of course: the last component of [itex] i M^k [/itex] is the probability of having reached [itex] e_n [/itex] by step k, so the difference of consecutive terms is the probability of arriving exactly at step k.
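    The recipe above can be sketched numerically. The 3-state transition matrix below is a made-up example (not from the thread); the target state is made absorbing and the series is truncated once the tail is negligible:

    ```python
    import numpy as np

    # Hypothetical 3-state chain; we want the expected number of steps
    # to first reach the last state, starting from the first state.
    M = np.array([[0.5, 0.3, 0.2],
                  [0.2, 0.6, 0.2],
                  [0.1, 0.2, 0.7]])

    # Modify the matrix so the chain stays in the target state forever.
    A = M.copy()
    A[2] = [0.0, 0.0, 1.0]

    i = np.array([1.0, 0.0, 0.0])  # initial state (1, 0, ..., 0)

    # E[T] = sum_k k * ( (i A^k)_n - (i A^{k-1})_n ):
    # the last component of i A^k is P(reached by step k), so the
    # difference is P(first arrival exactly at step k).
    expected = 0.0
    prev = i.copy()
    for k in range(1, 10000):
        v = prev @ A
        expected += k * (v[-1] - prev[-1])
        prev = v

    print(expected)  # ≈ 5.0 for this example matrix
    ```

    For a cross-check (and the "simple pattern" the question seems to recall): the expected hitting times also solve the linear system [itex] (I - Q)\,t = \mathbf{1} [/itex], where Q is the submatrix of M restricted to the non-target states, which avoids the infinite sum entirely. For the matrix above, `np.linalg.solve(np.eye(2) - M[:2, :2], np.ones(2))` gives the same answer.
    
    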
     