
Markov chain

  1. Apr 7, 2005 #1
    Let [tex]\left( {X_n } \right)_{n \ge 0}[/tex] be a Markov chain (discrete time).

    I have

    [tex]{\bf{P}} = \left[ {p_{ij}} \right]_{i,j} = \left[ {P\left( {X_1 = j|X_0 = i} \right)} \right]_{i,j}[/tex],

    and the initial probability distribution [tex]{\bf{p}}^{\left( 0 \right)}[/tex].

    I need to calculate

    [tex]P\left( {\bigcup\limits_{i = 0}^8 {X_i = 3} } \right)[/tex]

    I can use Matlab for the numerical calculations but I need to find an expression for this probability. The only expression I have been able to find consists of too many terms to be considered reasonable.
    Any suggestions on how to find an expression with a repetitive structure, so that the numerical calculations can be handled by the computer, would be appreciated.
  3. Apr 7, 2005 #2
    Hehe, never mind, I figured it out. If state 3 is turned into an absorbing state, then:

    [tex]P\left( {\bigcup\limits_{i = 0}^8 {X_i = 3} } \right) = 1 - P\left( {\bigcap\limits_{i = 0}^8 {X_i \ne 3} } \right) = 1 - P\left( {X_8 \ne 3} \right) = P\left( {X_8 = 3} \right)[/tex]

    since, with 3 absorbing, the chain avoids state 3 through step 8 exactly when [tex]X_8 \ne 3[/tex].
    Last edited: Apr 7, 2005
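    The absorbing-state trick above is easy to check numerically. Here is a minimal sketch in Python/NumPy (rather than Matlab, which the thread mentions); the 4-state transition matrix and uniform initial distribution are made-up examples, not from the original post:

    ```python
    import numpy as np

    # Hypothetical 4-state chain (states 0..3); any row-stochastic P works.
    P = np.array([
        [0.5, 0.2, 0.2, 0.1],
        [0.1, 0.6, 0.2, 0.1],
        [0.3, 0.3, 0.2, 0.2],
        [0.1, 0.1, 0.4, 0.4],
    ])
    p0 = np.array([0.25, 0.25, 0.25, 0.25])  # initial distribution p^(0)

    target = 3  # the state whose first visit by step 8 we care about

    # Make the target state absorbing: once entered, never left.
    P_abs = P.copy()
    P_abs[target] = 0.0
    P_abs[target, target] = 1.0

    # In the modified chain, P(union_{i=0}^{8} X_i = target)
    # collapses to P(X_8 = target) = (p0 * P_abs^8)[target].
    dist8 = p0 @ np.linalg.matrix_power(P_abs, 8)
    answer = dist8[target]
    print(answer)
    ```

    The same quantity can be cross-checked via the complement: restrict P to the non-target states and compute the probability of staying in that sub-chain for 8 steps.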