Question about transition matrix of Markov chain

In summary, two conventions exist for the transition matrix of a Markov chain: some sources let the rows represent the current state and the columns the future state, while others do the opposite. The choice does not affect the final result as long as it is applied consistently throughout the work. A related question is raised about adjoint methods in Markov processes, where the measurement vector is advanced backwards by the transpose of the transition matrix and the inner product is then taken at the starting time; whether such methods have been used for Markov processes is left open in the thread.
  • #1
songoku
TL;DR Summary
A transition matrix is a matrix that gives the probability of moving to a future state from a given current state.
The notes I got from my teacher state that in a transition matrix the columns represent the current state and the rows the future state (call this matrix A), so the sum of each column must equal 1. But another source I read says the rows represent the current state and the columns the future state (call this matrix B), so the sum of each row equals 1. Matrix B is the transpose of matrix A, but when I multiply each of them with another matrix (the matrix of current observed values), I get different results.

https://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/Chapter11.pdf

The first link states that the rows represent the current state and the columns the future state.

https://www.math.ucdavis.edu/~dadde...tions/MarkovChain/MarkovChain_9_18/node1.html

The second link states that the columns represent the current state and the rows the future state.

So which one is correct, matrix A or matrix B? Or maybe I am missing something?

Thanks
 
  • #2
It's a matter of convention. Do you prefer to multiply the transition matrix with a column vector on the right or with a row vector on the left?
It's unfortunate that there are places where people didn't agree on a single convention, but as long as you keep the convention consistent within your work it will give the right result.
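To make the equivalence concrete, here is a small numpy sketch (the 2-state "weather" chain and all its numbers are made up for illustration) showing that the column-stochastic convention with a column vector on the right and the row-stochastic convention with a row vector on the left give the same next-step distribution:

```python
import numpy as np

# Hypothetical 2-state chain: states 0 = sunny, 1 = rainy.
# Convention A (column-stochastic): A[i, j] = P(next = i | current = j),
# so each COLUMN sums to 1 and we advance with A @ p (p a column vector).
A = np.array([[0.9, 0.5],
              [0.1, 0.5]])

# Convention B (row-stochastic): B[i, j] = P(next = j | current = i),
# so each ROW sums to 1 and we advance with p @ B (p a row vector).
B = A.T

p0 = np.array([0.6, 0.4])   # current distribution over the states

next_via_A = A @ p0         # column-vector convention
next_via_B = p0 @ B         # row-vector convention

print(next_via_A)                            # [0.74 0.26]
print(np.allclose(next_via_A, next_via_B))   # True
```

The point is that "matrix A" and "matrix B" encode the same chain; mixing the conventions (e.g. multiplying the column-stochastic matrix by a row vector on the left) is what produces different, wrong results.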
 
  • #3
mfb said:
It's a matter of convention. Do you prefer to multiply the transition matrix with a column vector on the right or with a row vector on the left?
Oh I see. I prefer to multiply the transition matrix with a column vector on its right side, so the one I should use is matrix A, correct?

Thanks
 
  • #5
Thank you very much mfb
 
  • #6
This brings up a related issue. One can iterate a Markov chain $p(i,t+1)=\sum_j T_{i,j}\,p(j,t)$ from $t=0$ to $t=N$, i.e. in vector form $p(t+1)=T p(t)$, and then make the measurement $c=(q(N),p(N))$, where $(\cdot,\cdot)$ is the $l^2$ inner product. Or one could advance the measurement vector $q(N)$ *backwards* by the transpose $T^T$ of the transition matrix $T$, and then take the inner product at $t=0$. This is a basic adjoint method. Have such adjoint methods been used in Markov processes?
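As a quick sanity check of the identity behind this, here is a small numpy sketch (the 3-state chain, initial distribution, and measurement vector are arbitrary choices) verifying that the forward sweep and the adjoint (backward) sweep yield the same measurement, since $\langle q, T^N p_0\rangle = \langle (T^T)^N q, p_0\rangle$:

```python
import numpy as np

# Toy adjoint identity for a Markov chain.
# Forward: iterate p(t+1) = T p(t) for N steps, then measure c = <q, p(N)>.
# Adjoint: advance q backwards with T^T for N steps, then measure at t = 0.
rng = np.random.default_rng(0)

T = rng.random((3, 3))
T /= T.sum(axis=0)               # column-stochastic: columns sum to 1

p0 = np.array([0.2, 0.3, 0.5])   # initial distribution
q = np.array([1.0, 0.0, 2.0])    # measurement vector at t = N
N = 4

# Forward sweep
p = p0.copy()
for _ in range(N):
    p = T @ p
c_forward = q @ p

# Adjoint (backward) sweep with the transpose
qa = q.copy()
for _ in range(N):
    qa = T.T @ qa
c_adjoint = qa @ p0

print(np.isclose(c_forward, c_adjoint))   # True
```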
 

1. What is the transition matrix of a Markov chain?

The transition matrix of a Markov chain is a square matrix containing the probabilities of transitioning from one state to another. It is used to model and analyze systems with a finite set of possible states in which the future state depends only on the current state (the Markov property).

2. How is the transition matrix of a Markov chain constructed?

The matrix is constructed by placing in entry (i, j) the probability of transitioning from state i to state j (using the row convention). Each row must sum to 1, since the system must transition to one of the possible states.
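A minimal numpy sketch of the construction (the 3-state "market" chain and its probabilities are invented for illustration), with a check that every row sums to 1:

```python
import numpy as np

# Row-stochastic transition matrix for a made-up 3-state chain
# (states: 0 = bull, 1 = bear, 2 = flat market). Row i lists the
# probabilities of moving from state i to each state.
P = np.array([
    [0.70, 0.20, 0.10],   # from state 0
    [0.30, 0.50, 0.20],   # from state 1
    [0.25, 0.25, 0.50],   # from state 2
])

# Sanity check: every row must sum to 1.
print(np.allclose(P.sum(axis=1), 1.0))   # True
```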

3. What is the significance of a transition matrix in a Markov chain?

The transition matrix is a fundamental tool in analyzing and understanding the behavior of a Markov chain. It allows us to calculate the probability of being in a certain state after a certain number of steps, and to make predictions about the future behavior of the system.
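For example, with the row convention the distribution after n steps is given by the n-th matrix power: p_n = p_0 P^n. A short numpy sketch (the 2-state matrix values are chosen arbitrarily):

```python
import numpy as np

# Probability of being in each state after n steps via matrix powers.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
p0 = np.array([1.0, 0.0])        # start in state 0 with certainty

p3 = p0 @ np.linalg.matrix_power(P, 3)
print(p3)          # [0.825 0.175]
print(p3.sum())    # still a probability vector: sums to 1
```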

4. Can the transition matrix of a Markov chain be used for continuous-time systems?

No, a transition matrix applies only to discrete-time systems. For continuous-time systems a transition rate matrix (generator) is used instead; its off-diagonal entries are the rates of transitioning between states, and the transition probabilities over a time interval are obtained from its matrix exponential.
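A small numpy sketch of this distinction (the rate matrix Q is made up, and the matrix exponential is approximated here by a truncated Taylor series rather than a library routine):

```python
import numpy as np

# Continuous-time chain: rate matrix Q has rows summing to 0, with
# off-diagonal entries giving transition rates. The transition
# probabilities over time t form the matrix exponential P(t) = exp(Q t).
Q = np.array([[-0.5,  0.5],
              [ 0.2, -0.2]])

def expm_series(M, terms=30):
    """Matrix exponential via truncated Taylor series (fine for small M)."""
    result = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        result = result + term
    return result

P_t = expm_series(Q * 1.0)       # transition probabilities over t = 1
print(P_t.sum(axis=1))           # rows sum to 1: a valid stochastic matrix
```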

5. How is the transition matrix of a Markov chain used in real-world applications?

Transition matrices are widely used in fields such as finance, biology, and engineering to model systems with a finite set of possible states, to predict their future behavior, and to inform decision-making strategies.
