How to Obtain the Transition Probability Matrix in a Birth-Death Markov Chain?

SUMMARY

This discussion focuses on obtaining the transition probability matrix for a continuous-time Markov chain modeling two independent ON-OFF sources. The transition rates are the birth rate λ(i) = (N-i)λ and the death rate α(i) = iα, with N = 2. The key takeaway is that the transition probabilities can be derived from the infinitesimal generator matrix Q via the matrix exponential e^(Qt). For further reading, the book "Markov Chains" by J.R. Norris is recommended for its clear explanation of the relationship between the infinitesimal generator and the transition probability matrix.

PREREQUISITES
  • Understanding of continuous-time Markov chains
  • Familiarity with infinitesimal generator matrices
  • Knowledge of matrix exponentiation
  • Basic concepts of birth-death processes
NEXT STEPS
  • Study the matrix exponential e^(Qt) in the context of Markov chains
  • Read "Markov Chains" by J.R. Norris, focusing on Chapter 2
  • Explore discrete-time Markov chains and their transition probability matrices
  • Investigate applications of birth-death processes in real-world scenarios
USEFUL FOR

Researchers, mathematicians, and engineers working with stochastic processes, particularly those modeling systems using continuous-time Markov chains.

smoodliar
Hi

I am trying to model the behaviour of 2 independent ON-OFF sources. My state diagram is as follows

state 0 = both sources are OFF
state 1 = one of the sources is ON
state 2 = both sources are ON

The transition rates are given as

BIRTH RATE: lambda(i) = (N - i)*lambda
DEATH RATE: alpha(i) = i*alpha

So in my case N = 2.
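
Written out, I believe these rates (with N = 2) give the generator

        [ -2*lambda       2*lambda          0        ]
  Q  =  [  alpha         -(lambda+alpha)    lambda   ]
        [  0               2*alpha         -2*alpha  ]

where the rows correspond to states 0, 1, 2 and each row sums to zero.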

I understand how to obtain the steady-state distribution and the infinitesimal generator matrix. But I don't know how to obtain the transition probability matrix.

Reference taken from:
netsys.kaist.ac.kr/~lectures/EE627_2009/material/EE627_1.ppt

Any help will be appreciated
 

smoodliar said:
Hi
I understand how to obtain the steady-state distribution and the infinitesimal generator matrix. But I don't know how to obtain the transition probability matrix.

There are just two requirements for a square state transition probability matrix.

1. Every entry must be a probability (a number between 0 and 1).

2. Each row must sum to one.

Note that the probability of staying in the same state (the diagonal entries) can be non-zero.
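
For example, a matrix like the following (numbers invented purely for illustration) satisfies both requirements and has non-zero diagonal entries:

        [ 0.50  0.50  0.00 ]
  P  =  [ 0.25  0.50  0.25 ]
        [ 0.00  0.50  0.50 ]

Every entry is a probability and every row sums to one.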
 
If I understood correctly, you have a 3-state continuous-time Markov chain. You know the transition rates (birth/death rates), so you know the infinitesimal generator matrix Q, which gives the probabilities of a transition in a short time interval. I guess you are trying to find the transition probabilities P["state at time t" = j | "state at time 0" = i]. The matrix formed by these probabilities is e^(Qt), the matrix exponential of Qt. Check out the book Markov Chains by Norris, which explains the relation between the infinitesimal generator and the transition probability matrix in the first few pages of Chapter 2.
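
To make that concrete, here is a rough Python sketch (the values lambda = 1.0, alpha = 0.5 and t = 0.3 are just placeholders, not from this thread) that builds Q from the birth/death rates given above and evaluates P(t) = e^(Qt) with scipy.linalg.expm:

Code:
import numpy as np
from scipy.linalg import expm

lam, alpha, N = 1.0, 0.5, 2          # placeholder rates; substitute your own
Q = np.zeros((N + 1, N + 1))
for i in range(N + 1):
    if i < N:
        Q[i, i + 1] = (N - i) * lam  # birth rate lambda(i) = (N - i)*lambda
    if i > 0:
        Q[i, i - 1] = i * alpha      # death rate alpha(i) = i*alpha
    Q[i, i] = -Q[i].sum()            # diagonal chosen so each row sums to zero

t = 0.3
P = expm(Q * t)                      # transition probability matrix P(t)
print(P)
print(P.sum(axis=1))                 # each row should sum to 1

Each row of P(t) is a probability distribution over the three states, and as t grows the rows converge to the steady-state distribution.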

It might also help to look at the discrete-time version. I did some searching on the web and found this site, http://www.utdallas.edu/~jjue/cs6352/markov/node3.html , which explains the discrete-time case. In summary, if you had a discrete-time Markov chain, those birth/death rates would correspond to "single-step" transition probabilities. To find the transition probabilities from one state to another over some number of steps, say m, you would look at the "m-step transition matrix", which is the m-th power of the single-step transition probability matrix.
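
A minimal discrete-time sketch along the same lines (the single-step probabilities below are made up just to illustrate the idea):

Code:
import numpy as np

# hypothetical single-step transition matrix for a 3-state chain
P1 = np.array([[0.6, 0.4, 0.0],
               [0.2, 0.5, 0.3],
               [0.0, 0.4, 0.6]])

m = 5
Pm = np.linalg.matrix_power(P1, m)   # m-step transition probabilities
print(Pm)                            # rows still sum to 1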

I guess the difference between the discrete-time and continuous-time version is that the "m-step transition probability matrix" is the solution of a difference equation in the discrete case, and it is the solution of a differential equation in the continuous case.
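
If it helps to see this written down: in continuous time P(t) satisfies the forward equation P'(t) = P(t)Q with P(0) = I, whose solution is P(t) = e^(Qt); in discrete time P(m+1) = P(m)*P1 with P(0) = I, whose solution is P(m) = P1^m.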

- http://www.utdallas.edu/~jjue/cs6352/markov/node3.html
- Markov Chains, by J.R. Norris
 