Understanding Markov Processes: Steven's Questions

  • Context: Graduate 
  • Thread starter: steven187

Discussion Overview

The discussion revolves around understanding Markov processes, particularly in relation to stochastic processes and their graphical representation. Participants explore the differences between Markov and Gaussian processes, the role of initial distributions, transition probabilities, and the implications of time in these processes.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Steven expresses confusion about how Markov processes relate to stochastic processes and seeks a graphical understanding of the family of random variables involved.
  • He questions the necessity of knowing the distribution at a specific time if only one event can occur at that moment.
  • In a follow-up, Steven suggests that while the family of random variables is similar in both Markov and Gaussian processes, the calculation of densities differs, emphasizing the importance of transition probabilities.
  • Another participant explains the method for computing the probability of a Markov chain being in a given state after one step, detailing the use of initial states and transition probabilities.
  • There is a discussion about how to determine the initial distribution, with one participant noting that it can depend on what is being modeled and that sometimes the initial distribution may not significantly affect long-term probabilities.

Areas of Agreement / Disagreement

Participants express varying levels of understanding regarding the initial distribution and its calculation. While some agree on the method of determining probabilities, there is no consensus on how to realistically establish initial distributions or the implications of time on the processes.

Contextual Notes

Limitations include the potential subjectivity in determining initial distributions and the assumptions required for modeling Markov processes accurately. The discussion does not resolve how to realistically apply these concepts in practice.

steven187
Hi all,

I'm currently researching stochastic processes. The Gaussian process wasn't hard to tackle; however, I don't understand the Markov process. I understand that a stochastic process is a family of random variables indexed by (and dependent on) another variable. But with a Markov process I can't see this family of random variables the way I see it in the Gaussian process. How could I understand this graphically?

I also realize that we need two sets of information: is it the initial distribution (or the initial point) and the transition probabilities?

Another thing I don't understand: if these stochastic processes are related to time, how are we supposed to know the distribution at a particular point in time if only one thing can occur at that point in time?

Please help,

Regards

Steven
 
Hi all,

To answer my own question: the family of random variables is essentially the same in both the Markov and the Gaussian process; what differs is how the densities are calculated, and for the Markov process it's quite remarkable. In terms of understanding it graphically, we have an initial distribution, and after one time step we change from one state to another; it is the probability of this change of state which is used to find the density function of such a process.
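The graphical picture described above (a state changing at each time step) can be sketched by simulating a single sample path. The two-state chain and its transition probabilities below are hypothetical, chosen just for illustration:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Hypothetical two-state chain: stay[s] is the probability of remaining in state s
stay = {0: 0.9,   # from state 0: stay with 0.9, move to 1 with 0.1
        1: 0.5}   # from state 1: stay with 0.5, move to 0 with 0.5

state = 0          # the chain starts in a known state here
path = [state]
for _ in range(10):
    # draw the next state according to the current state's transition probabilities
    if random.random() < stay[state]:
        pass                # remain in the current state
    else:
        state = 1 - state   # jump to the other state
    path.append(state)

print(path)  # one realization of the process: a sequence of states over time
```

Each run of this loop is one "path" of the family of random variables X0, X1, X2, ...; plotting state against step number gives the graphical view asked about above.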

And yes, it isn't an initial point, it's an initial distribution.

Please correct me if I am wrong.

However, I still don't understand how we are supposed to know the distribution at a particular point in time if only one thing can occur at that point in time. I believe these stochastic processes are not that realistic, since gaining knowledge of the distribution at each point in time seems impossible unless we make a number of assumptions.

Regards

Steven
 
To compute the probability of the Markov chain being in a given state i after one step, we multiply the probability of being in each initial state j by the probability of going from j to i, and sum these values over all possible initial states j.

That is,

Pr(X1 = i | X0 has initial distribution d) = d(1)P(1,i) + d(2)P(2,i) + ... + d(n)P(n,i)

where d(j) is the probability of starting in state j and P(j,i) is the probability of going from state j to state i.
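The one-step sum can be sketched numerically with NumPy. The two-state transition matrix and initial distribution here are hypothetical, picked only to make the arithmetic concrete:

```python
import numpy as np

# Hypothetical transition matrix: P[j, i] = probability of going from state j to state i
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Hypothetical initial distribution: start in state 0 with prob 0.7, state 1 with 0.3
d = np.array([0.7, 0.3])

# Pr(X1 = i) = sum over j of d(j) * P(j, i), written out explicitly
after_one_step = np.array([sum(d[j] * P[j, i] for j in range(2))
                           for i in range(2)])
print(after_one_step)  # [0.78 0.22]
```

For state 0 this is 0.7*0.9 + 0.3*0.5 = 0.78, and the two entries sum to 1, as a distribution must.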

The same equation can also be written in matrix form:

Pr(X1 = i | X0 has initial distribution d) = (d^T P)_i

where P is the matrix of transition probabilities (the matrix with i,jth element equal to P(i,j)) and d is the initial distribution written as a column vector (the kth element of d is the probability of starting in state k), so d^T is a row vector and d^T P is the whole distribution after one step.

So, to find the distribution after k steps we just apply the same procedure k times. That is, we find the distribution after 1 step, then use it as the initial distribution to find the distribution after 2 steps and so on.

Now we can calculate the probability of the Markov chain being in state i after k steps as follows:

Pr(Xk = i | X0 has initial distribution d) = (d^T P^k)_i
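The k-step formula can be sketched in NumPy, where a 1-D array on the left of `@` acts as the row vector d^T. The chain below is the same kind of hypothetical two-state example, not anything from the thread:

```python
import numpy as np

# Hypothetical transition matrix: row j holds the probabilities of moving from state j
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
d = np.array([0.7, 0.3])  # hypothetical initial distribution

# Distribution after k steps: d^T P^k
k = 3
dist_k = d @ np.linalg.matrix_power(P, k)
print(dist_k)
```

Applying the one-step update k times in a row (the iterative procedure described above) gives exactly the same vector as multiplying by P^k once.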

Does this answer your question?
 
Hi there,

Thanks for your response, it makes a lot more sense now. It seems like a simple probability problem, just on a larger scale, and I now get how we obtain the distribution function for each step. However, in terms of the initial distribution: how do we work out such distributions? I mean, to be realistic and actually apply this process, we would need to know how to figure out the initial distribution. Is there a way, or is it subjective?
 
It depends on what you are trying to model. Often you know the state the process will start in, in which case you'd use the initial distribution vector that has a 1 as the element corresponding to that state and 0s everywhere else.

Also, a certain class of Markov processes turns out to have a long-term probability distribution (the limit of the distribution as the number of steps goes to infinity) that is independent of the initial distribution you use. So, depending on what you are doing, the initial distribution might not matter a whole lot.
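Both points above can be illustrated numerically: a one-hot vector for a known start state, and the insensitivity to the initial distribution after many steps. The chain here is a hypothetical two-state example (chosen to be regular, so the limit exists):

```python
import numpy as np

# Hypothetical regular transition matrix: row j = probabilities of moving from state j
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Two very different initial distributions, one of them the one-hot "known start" case
d1 = np.array([1.0, 0.0])   # process definitely starts in state 0
d2 = np.array([0.2, 0.8])   # process usually starts in state 1

Pk = np.linalg.matrix_power(P, 50)  # many steps
print(d1 @ Pk)
print(d2 @ Pk)  # both vectors approach the same limiting distribution
```

For this particular matrix the common limit is [5/6, 1/6], which one can check directly: it satisfies pi = pi P, so it is unchanged by further steps.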
 
