Probability of a Stochastic Markov process

SUMMARY

The discussion centers on the properties of stationary Markov processes as defined in Gardiner's "Handbook of Stochastic Methods". For a stationary process, joint probabilities depend only on time differences, and for a Markov process every joint probability factors into conditional probabilities. The participants debate the correct factorization of the joint probability for three values of X(t): whether the conditional for x1 given x2 should carry the shifted arguments t1 - (t2 - t3) and t2 - t3, or simply the single time difference t1 - t2. The reply argues for the latter, since stationarity only licenses shifting both time arguments of a conditional probability by the same amount.

PREREQUISITES
  • Understanding of stochastic processes and their properties
  • Familiarity with Markov processes and conditional probabilities
  • Knowledge of probability theory and statistical mechanics
  • Experience with mathematical notation and expressions in physics
NEXT STEPS
  • Study Gardiner's "Handbook of Stochastic Methods" for in-depth knowledge
  • Explore the concept of conditional probabilities in Markov processes
  • Learn about the implications of stationarity in stochastic processes
  • Investigate practical applications of stochastic processes in physics and engineering
USEFUL FOR

Researchers, physicists, and students specializing in stochastic processes, probability theory, and statistical mechanics will benefit from this discussion.

Aslet
Hi everyone! I'm approaching the physics of stochastic processes. In particular I am studying from "Handbook of Stochastic Methods" by Gardiner. This book defines a stationary process by:
$$ p(x_1, t_1; x_2, t_2; \dots; x_n, t_n) = p(x_1, t_1 + \epsilon; x_2, t_2 + \epsilon; \dots; x_n, t_n + \epsilon), $$
which means that the statistics of ## X(t) ## are the same as those of ## X(t + \epsilon) ##. Hence the probabilities are functions only of the time differences ## t_i - t_j ##.
The book then says that if the process is also Markovian, the only things I need to know are conditional probabilities like:
$$ p_s(x_1, t_1 - t_2 | x_2, 0), $$
because all joint probabilities can be written as products of conditional probabilities.
Here comes my question. Is it then correct, for a stationary Markov process and three values of ## X(t) ##, to write, for instance:
$$ p_s(x_1, t_1; x_2, t_2; x_3, t_3) = p_s(x_1, t_1 - (t_2 - t_3) | x_2, t_2 - t_3) \, p_s(x_2, t_2 - t_3 | x_3, 0) \, p_s(x_3), $$
or should the probability be written as:
$$ p_s(x_1, t_1; x_2, t_2; x_3, t_3) = p_s(x_1, t_1 - t_2 | x_2, 0) \, p_s(x_2, t_2 - t_3 | x_3, 0) \, p_s(x_3)? $$
To me the first equation is the more understandable one.
 
Aslet said:
Hi everyone! I'm approaching the physics of stochastic processes. In particular I am studying from "Handbook of Stochastic Methods" by Gardiner.

(The second edition has a lot of corrections to the first edition and I think there is a 3rd edition.)

Is it then correct, for a stationary Markov process and three values of ## X(t) ##, to write, for instance:
$$ p_s(x_1, t_1; x_2, t_2; x_3, t_3) = p_s(x_1, t_1 - (t_2 - t_3) | x_2, t_2 - t_3) \, p_s(x_2, t_2 - t_3 | x_3, 0) \, p_s(x_3) $$

What justifies the factor ##p_s(x_1, t_1 - (t_2 - t_3) | x_2, t_2 - t_3)##?

If we seek some quantity equivalent to ##p_s(x_1,t_1 | x_2,t_2)## we can translate by ##-t_2## giving ##p_s(x_1, t_1-t_2| x_2,0)##.

Suppose ##t_1 = 100, t_2 = 20, t_3 = 3##. Can you justify saying ##p_s(x_1, 100| x_2,20) = p_s(x_1, 83| x_2,17) ## ?
 
