Probability of a Stochastic Markov process

In summary, the conversation discusses the definition of a stationary process in the context of studying stochastic processes. The book "Handbook of stochastic processes - Gardiner" defines a stationary process as one whose statistics are invariant under time translation. It also notes that for a process that is both stationary and Markovian, only conditional probabilities need to be known, since all joint probabilities can be built from them. The question raised is how the time arguments should be translated when writing a three-time joint probability as a product of conditional probabilities, and the reply examines the justification for each choice with a concrete numerical example.
  • #1
Aslet
Hi everyone! I'm approaching the physics of stochastic processes. In particular, I am studying from "Handbook of stochastic processes - Gardiner". This book defines a stationary process as one satisfying:
$$ p(x_1, t_1; x_2, t_2; ...; x_n, t_n) = p(x_1, t_1 + \epsilon; x_2, t_2 + \epsilon; ...; x_n, t_n + \epsilon) $$
and this means that the statistics of ## X(t) ## are equal to those of ## X(t + \epsilon) ##. Hence the probabilities are functions only of the differences ## t_i - t_j ##.
Then the book says that if the process is also Markovian, the only things I need to know are the conditional probabilities of the form:
$$ p_s(x_1, t_1 - t_2| x_2, 0) $$
because all joint probabilities can be written as a product of conditional probabilities.
Here comes my question. For a stationary stochastic Markov process, is it then correct to write, for three values of ## X(t) ##, for instance:
$$ p_s(x_1, t_1; x_2, t_2; x_3, t_3) = p_s(x_1, t_1 - (t_2 - t_3)| x_2, t_2 - t_3 ) \ p_s(x_2, t_2 - t_3| x_3, 0) \ p_s(x_3) \ $$
or should the probability be written as:
$$ p_s(x_1, t_1; x_2, t_2; x_3, t_3) = p_s(x_1, t_1 - t_2| x_2, 0) \ p_s(x_2, t_2 - t_3| x_3, 0) \ p_s(x_3) \ ?$$
To me the first equation is more understandable.
 
  • #2
Aslet said:
Hi everyone! I'm approaching the physics of stochastic processes. In particular I am studying from "Handbook of stochastic processes - Gardiner".

(The second edition has a lot of corrections to the first edition and I think there is a 3rd edition.)

Is it hence correct for a stationary stochastic Markov process to write for 3 values of ## X(t) ##, for instance:
$$ p_s(x_1, t_1; x_2, t_2; x_3, t_3) = p_s(x_1, t_1 - (t_2 - t_3)| x_2, t_2 - t_3 ) \ p_s(x_2, t_2 - t_3| x_3, 0) \ p_s(x_3) \ $$

What justifies the factor ##p_s(x_1, t_1 - (t_2- t_3)| x_2,t_2 - t_3)##?

If we seek some quantity equivalent to ##p_s(x_1,t_1 | x_2,t_2)## we can translate by ##-t_2## giving ##p_s(x_1, t_1-t_2| x_2,0)##.

Suppose ##t_1 = 100, t_2 = 20, t_3 = 3##. Can you justify saying ##p_s(x_1, 100| x_2,20) = p_s(x_1, 83| x_2,17) ## ?
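For reference, a minimal sketch (my notation, taking the thread's ordering ##t_1 \geq t_2 \geq t_3##) of how the Markov chain rule combines with stationarity. The chain rule alone gives
$$ p(x_1, t_1; x_2, t_2; x_3, t_3) = p(x_1, t_1 | x_2, t_2) \, p(x_2, t_2 | x_3, t_3) \, p(x_3, t_3), $$
and since each conditional factor involves only two times, stationarity allows each one to be translated by its own shift, e.g. ##p(x_1, t_1 | x_2, t_2) = p_s(x_1, t_1 - t_2 | x_2, 0)##.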
 

What is a stochastic Markov process?

A stochastic Markov process is a random process that models the evolution of a system through probabilistic transitions between states. It satisfies the Markov property: the probability distribution of the future state depends only on the current state, not on the sequence of states that preceded it.
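The Markov property above can be sketched in a few lines of Python (a minimal illustration with made-up transition probabilities, not code from the thread): the function that samples the next state receives only the current state, so the past cannot influence the future.

```python
import random

# A two-state Markov chain. P[i][j] is the probability of moving
# from state i to state j; each row sums to 1 (row-stochastic).
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, rng):
    """Sample the next state given only the current one (Markov property)."""
    return 0 if rng.random() < P[state][0] else 1

rng = random.Random(0)      # seeded for reproducibility
path = [0]                  # start in state 0
for _ in range(10):
    path.append(step(path[-1], rng))
```

Note that `step` never sees `path`, only its last element: that restriction is exactly the Markov property.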

What is the difference between a stochastic Markov process and a deterministic Markov process?

The main difference is that in a stochastic Markov process the transitions between states are random and follow a probability distribution, while in a deterministic process each state has a fixed, known successor. This means that the future state of a stochastic process is uncertain, while the future state of a deterministic process is fully determined by the current state.

What is the importance of the probability in a stochastic Markov process?

The probability in a stochastic Markov process represents the likelihood of a system transitioning from one state to another. It is a crucial factor in predicting the future state of the system and can be influenced by various factors such as external events or the current state of the system. The accuracy of the probability values is essential in accurately modeling and predicting the behavior of the system.

How is the probability of a stochastic Markov process calculated?

The probabilities in a stochastic Markov process are calculated using a transition matrix, which contains the probabilities of moving from each state to every other state. Multiplying the vector of current state probabilities by the transition matrix yields the distribution over states at the next time step, and repeating the multiplication propagates the distribution over multiple time steps.
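The matrix-vector recipe above can be sketched as follows (an illustrative example with invented numbers, kept dependency-free): the next distribution is ##d'_j = \sum_i d_i P_{ij}##, applied repeatedly.

```python
# Row-stochastic transition matrix: P[i][j] = probability of i -> j.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def propagate(dist, P):
    """One time step: dist'[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0]           # start with certainty in state 0
for _ in range(3):          # propagate three time steps
    dist = propagate(dist, P)
# dist is now the distribution over states after three steps;
# it still sums to 1, since P conserves total probability.
```

Iterating further drives `dist` toward the chain's stationary distribution, which ties back to the stationary processes discussed in the thread.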

What are some real-world applications of stochastic Markov processes?

Stochastic Markov processes have various applications, including finance, economics, biology, and engineering. In finance, they are used to model stock prices and market trends. In biology, they can be used to model population growth and disease spread. In engineering, they are used to model reliability and maintenance of systems. They are also commonly used in weather forecasting and speech recognition.
