Birth-Death Markov Chain

  1. Aug 30, 2010 #1
    Hi

    I am trying to model the behaviour of 2 independent ON-OFF sources. My state diagram is as follows

    state 0 = both sources are OFF
    state 1 = exactly one of the sources is ON
    state 2 = both sources are ON

    The transition rates are given as

    BIRTH RATE: lambda(i) = (N - i)*lambda
    DEATH RATE: alpha(i) = i*alpha

    So in my case N = 2.

    I understand how to obtain the steady-state distribution and the infinitesimal generator matrix. But I don't know how to obtain the transition probability matrix.
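    For concreteness, a minimal sketch of that setup (lam and alpha below are placeholder rates, not actual values from this model):

    Code (Python):
        import numpy as np

        # Sketch only: generator matrix for the 2-source ON-OFF model.
        # lam and alpha are placeholder rates, not values from this post.
        lam, alpha, N = 1.0, 2.0, 2

        # Q[i, j] = rate of jumping from state i to state j; each row sums to zero.
        Q = np.zeros((N + 1, N + 1))
        for i in range(N + 1):
            if i < N:
                Q[i, i + 1] = (N - i) * lam   # birth rate lambda(i) = (N - i)*lambda
            if i > 0:
                Q[i, i - 1] = i * alpha       # death rate alpha(i) = i*alpha
            Q[i, i] = -Q[i].sum()             # diagonal balances the row

        # Steady-state distribution pi solves pi Q = 0 with the entries summing to 1.
        A = np.vstack([Q.T, np.ones(N + 1)])
        b = np.zeros(N + 2); b[-1] = 1.0
        pi = np.linalg.lstsq(A, b, rcond=None)[0]
        print("steady state:", pi)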

    Reference taken from (see attached file):
    netsys.kaist.ac.kr/~lectures/EE627_2009/material/EE627_1.ppt

    Any help will be appreciated
     


  3. Aug 31, 2010 #2
    There are just two requirements for a square state transition probability matrix.

    1. Every entry must be a probability, i.e. a value between 0 and 1.

    2. The sum of each row must equal unity.

    The probability that the system stays in the same state can be non-zero.
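    For example, both requirements are easy to check numerically (the matrix below is made up purely for illustration):

    Code (Python):
        import numpy as np

        # Illustrative 3x3 candidate transition matrix (values invented for this example).
        P = np.array([[0.7, 0.3, 0.0],
                      [0.2, 0.5, 0.3],
                      [0.0, 0.4, 0.6]])

        # Requirement 1: every entry is a probability (between 0 and 1).
        assert np.all((P >= 0) & (P <= 1))
        # Requirement 2: each row sums to unity.
        assert np.allclose(P.sum(axis=1), 1.0)
        # Note the non-zero diagonal: staying in the same state is allowed.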
     
    Last edited: Aug 31, 2010
  4. Sep 5, 2010 #3

    fi1

    If I understood correctly, you have a 3-state continuous-time Markov chain. You know the transition rates (birth/death rates), so you know the infinitesimal generator matrix Q, which gives you the probabilities of a transition in a short time interval. I guess you are trying to find the transition probabilities P["State at time t" = j | "State at time 0" = i]. The matrix of these probabilities is given by e^(Qt) (the matrix exponential). Check out the book Markov Chains by Norris, which explains the relation between the infinitesimal generator and the transition probability matrix in the first few pages of Chapter 2.
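    A minimal sketch of that computation, assuming placeholder rates for the generator from post #1 (scipy.linalg.expm does the matrix exponential):

    Code (Python):
        import numpy as np
        from scipy.linalg import expm

        # Placeholder rates (not from the thread); generator for the 3-state chain.
        lam, alpha = 1.0, 2.0
        Q = np.array([[-2*lam,        2*lam,        0.0    ],
                      [ alpha,  -(alpha + lam),     lam    ],
                      [ 0.0,          2*alpha,   -2*alpha  ]])

        t = 0.5                 # time horizon
        P_t = expm(Q * t)       # P_t[i, j] = P(state j at time t | state i at time 0)
        print(P_t)
        print(P_t.sum(axis=1))  # each row should sum to 1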

    It might also help to look at the discrete-time version. I did some searching on the web and found this page, http://www.utdallas.edu/~jjue/cs6352/markov/node3.html , which explains the discrete-time version. In summary, if you had a discrete-time Markov chain, those birth/death rates would correspond to "single-step" transition probabilities. To find the transition probabilities from one state to another within some number of steps, say m, you would look at the "m-step transition matrix", which is the m-th power of the single-step transition probability matrix.
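    In code, that discrete-time statement is just a matrix power (again with made-up numbers):

    Code (Python):
        import numpy as np

        # Single-step transition probability matrix (illustrative values only).
        P = np.array([[0.8, 0.2, 0.0],
                      [0.1, 0.7, 0.2],
                      [0.0, 0.3, 0.7]])

        m = 5
        P_m = np.linalg.matrix_power(P, m)   # m-step transition probabilities
        print(P_m[0, 2])                     # P(state 2 at step m | state 0 at step 0)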

    I guess the difference between the discrete-time and continuous-time version is that the "m-step transition probability matrix" is the solution of a difference equation in the discrete case, and it is the solution of a differential equation in the continuous case.
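    One way to see the continuous-time statement concretely: integrate the forward equation dP/dt = P(t)Q with a crude Euler scheme and compare the result to the matrix exponential (same placeholder generator as above):

    Code (Python):
        import numpy as np
        from scipy.linalg import expm

        # Same placeholder generator as in the earlier sketch.
        lam, alpha = 1.0, 2.0
        Q = np.array([[-2*lam,        2*lam,        0.0    ],
                      [ alpha,  -(alpha + lam),     lam    ],
                      [ 0.0,          2*alpha,   -2*alpha  ]])

        t, steps = 0.5, 10000
        dt = t / steps
        P = np.eye(3)                    # P(0) = identity
        for _ in range(steps):
            P = P + dt * (P @ Q)         # forward Kolmogorov equation: dP/dt = P(t) Q

        print(np.max(np.abs(P - expm(Q * t))))   # difference should be small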

    - http://www.utdallas.edu/~jjue/cs6352/markov/node3.html
    - Markov Chains, by J.R. Norris
     
    Last edited: Sep 5, 2010