When are Markov processes non-reversible?

1. Feb 23, 2014

carllacan

I had to solve a problem in which I had a Markov process described by a transition matrix with elements $a _{ij} = P(X_t =x_j | X_{t-1} = x_i)$, where $X_t$ is the state at time t and $x_n$ are the possible states of the system.

I was asked to find, given the state of the system at a time t, the probability of the system having been in a certain state at time t-1. Using Bayes' rule I found how to write a matrix $B$ that represents this reversed process:
$b_{ij} = P(X_{t-1} = x_i \mid X_t = x_j) = \frac{P(X_t = x_j \mid X_{t-1} = x_i)\, P(X_{t-1} = x_i)}{P(X_t = x_j)} = \frac{a_{ij} \pi_i}{\pi_j}$, where $\vec{\pi}$ is the stationary distribution vector.
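To sanity-check this numerically, here is a small sketch (the 3-state chain and all numbers are made up for illustration, not from the original problem) that finds $\vec{\pi}$ by power iteration and builds $B$ from the formula above:

```python
# Sketch with a made-up 3-state chain: compute the stationary distribution
# of a row-stochastic matrix A by power iteration, then build the reversed
# matrix B via b_ij = a_ij * pi_i / pi_j.

def stationary(A, iters=1000):
    """Approximate the stationary distribution pi (with pi A = pi)
    by repeatedly pushing a distribution through the chain."""
    n = len(A)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * A[i][j] for i in range(n)) for j in range(n)]
    return pi

def reversed_matrix(A, pi):
    """b_ij = P(X_{t-1} = x_i | X_t = x_j) = a_ij * pi_i / pi_j."""
    n = len(A)
    return [[A[i][j] * pi[i] / pi[j] for j in range(n)] for i in range(n)]

# A simple birth-death chain; each row sums to 1.
A = [[0.5,  0.5,  0.0],
     [0.25, 0.5,  0.25],
     [0.0,  0.5,  0.5]]

pi = stationary(A)          # converges to (0.25, 0.5, 0.25) for this A
B = reversed_matrix(A, pi)

# Summing b_ij over i gives 1 for each j, since the b_ij for fixed j
# are the probabilities of all possible previous states.
col_sums = [sum(B[i][j] for i in range(3)) for j in range(3)]
```

Note that the columns of $B$ (not its rows) sum to 1, because $j$ indexes the conditioning state.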

Trying to check my result I searched on Google for "reversed Markov processes" and found that $a_{ij}\pi_{i} = a_{ji} \pi _j$ (detailed balance) is the condition for a Markov process to be reversible. Substituting it into my formula gives $b_{ij} = a_{ji}$, i.e. $B = A^{\top}$ with my index convention. Should I interpret this to mean that a Markov process is reversible only if the reversed process has the same transition probabilities as the original one?
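To see the condition in action, here is a quick sketch (both chains are made-up examples): detailed balance holds for a birth-death chain, and the reversed matrix then coincides with $A^{\top}$; it fails for a deterministic cycle, where probability flows only one way around even though the stationary distribution is uniform.

```python
# Sketch with made-up chains: test detailed balance pi_i a_ij == pi_j a_ji.

def detailed_balance_holds(A, pi, tol=1e-9):
    n = len(A)
    return all(abs(pi[i] * A[i][j] - pi[j] * A[j][i]) < tol
               for i in range(n) for j in range(n))

# Reversible example: birth-death chain, stationary pi = (0.25, 0.5, 0.25).
A_rev = [[0.5,  0.5,  0.0],
         [0.25, 0.5,  0.25],
         [0.0,  0.5,  0.5]]
pi_rev = [0.25, 0.5, 0.25]

# Non-reversible example: deterministic 3-cycle. Its stationary
# distribution is uniform, but the chain only ever moves 0 -> 1 -> 2 -> 0.
A_cyc = [[0.0, 1.0, 0.0],
         [0.0, 0.0, 1.0],
         [1.0, 0.0, 0.0]]
pi_cyc = [1/3, 1/3, 1/3]

# For the reversible chain, b_ij = a_ij pi_i / pi_j reduces to a_ji,
# so the reversed matrix B equals the transpose of A.
B = [[A_rev[i][j] * pi_rev[i] / pi_rev[j] for j in range(3)]
     for i in range(3)]
is_transpose = all(abs(B[i][j] - A_rev[j][i]) < 1e-9
                   for i in range(3) for j in range(3))
```

Running the cycle chain backwards gives 0 -> 2 -> 1 -> 0, a genuinely different process, which is what makes it non-reversible despite having a stationary distribution.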