
$$ p(x_1, t_1; x_2, t_2; ...; x_n, t_n) = p(x_1, t_1 + \epsilon; x_2, t_2 + \epsilon; ...; x_n, t_n + \epsilon) $$

and this means that the statistics of ## X(t) ## are equal to those of ## X(t + \epsilon) ##. Hence the probabilities are functions only of the time differences ## t_i - t_j ##.
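A quick numerical illustration of this (a toy two-state Markov chain of my own, not from the book): if the chain is started in its stationary distribution, the one-time distribution is the same at every ## t ##, which is why joint and conditional probabilities can only depend on time differences.

```python
import numpy as np

# Toy one-step transition matrix T[i, j] = p(state i at t+1 | state j at t).
# Columns sum to 1, so T is column-stochastic.
T = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# Its stationary distribution satisfies T @ p_s = p_s.
p_s = np.array([2/3, 1/3])

# Starting the chain in p_s, the one-time distribution never changes:
p = p_s.copy()
for t in range(10):
    assert np.allclose(p, p_s)  # same distribution at every time t
    p = T @ p                   # evolve one step
print("one-time distribution is time-invariant")
```

Likewise the two-time joint at times ## (t+n, t) ## is ## (T^n)[x, x'] \, p_s[x'] ##, with no dependence on ## t ## itself, only on the lag ## n ##.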

Then the book says that if the process is also Markovian, the only things I need to know are conditional probabilities like:

$$ p_s(x_1, t_1 - t_2| x_2, 0) $$

because all joint probabilities can be written as a product of conditional probabilities.
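As a sanity check of this factorization claim, here is the same toy two-state chain (my own example, not from the book), with three observation times ## t_3 = 0 ##, ## t_2 = 2 ##, ## t_1 = 3 ##. The joint probability computed the long way, by summing over the unobserved intermediate state, equals the product of conditionals that depend only on the time differences (1 step and 2 steps here), times the stationary one-time distribution.

```python
import numpy as np

# Toy one-step transition matrix T[i, j] = p(state i at t+1 | state j at t).
T = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# Stationary distribution p_s: the eigenvector of T with eigenvalue 1.
w, v = np.linalg.eig(T)
p_s = np.real(v[:, np.argmin(np.abs(w - 1.0))])
p_s /= p_s.sum()

# Conditional over n steps: p_s(x, n | x', 0) = (T^n)[x, x'].
def cond(n):
    return np.linalg.matrix_power(T, n)

# Joint probability of (x1 at time 3, x2 at time 2, x3 at time 0),
# computed by enumerating the full path and summing over the
# unobserved state y at time 1.
def joint(x1, x2, x3):
    return sum(T[x1, x2] * T[x2, y] * T[y, x3] * p_s[x3] for y in range(2))

# The Markov factorization: the same joint is the product of conditionals
# depending only on the time differences, times p_s.
for x1 in range(2):
    for x2 in range(2):
        for x3 in range(2):
            assert np.isclose(joint(x1, x2, x3),
                              cond(1)[x1, x2] * cond(2)[x2, x3] * p_s[x3])
print("factorization verified")
```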

Here comes my question: is it then correct, for a stationary Markov process, to write for three values of ## X(t) ##, for instance:

$$ p_s(x_1, t_1; x_2, t_2; x_3, t_3) = p_s(x_1, t_1 - (t_2 - t_3)| x_2, t_2 - t_3 ) \ p_s(x_2, t_2 - t_3| x_3, 0) \ p_s(x_3) $$

or should the probability be written as:

$$ p_s(x_1, t_1; x_2, t_2; x_3, t_3) = p_s(x_1, t_1 - t_2| x_2, 0) \ p_s(x_2, t_2 - t_3| x_3, 0) \ p_s(x_3) \, ? $$

To me, the first equation is the more understandable one.