How Can Markov Chains Model State Changes in Probability Theory?

  • Thread starter pamparana
  • Tags
    Probability
In summary: the probability of ending up in a given state after several steps is a sum over all paths that lead there, with one term per path. Each term is the product of the one-step transition probabilities Pij along that path, multiplied in order. For example, to get from state 0 to state 2 in two steps the only path is 0 to 1 to 2, so P02=(P01)(P12); to reach state 3 in three steps, P03=(P01)(P12)(P23).
  • #1
pamparana
Hello everyone,

I am trying to figure out some overall possibilities of some states for a certain configuration and need a bit of help with probability theory.

So, assume that there is an object that can take 3 states {0, 1, 2} and that the object can only change state by one jump at a time: from 0 to 1, and then from 1 to 2 (I think it is called the Ising model)...

Initially the object is in state 0 and an event happens and the object can either stay in state 0 or move to state 1.

So, we have P_1(0) and P_1(1) and their sum is 1.

Now, consider the following scenario: the object stayed in state (0). We have another event:

Again, we have a new set of probabilities,
P_2(0) and P_2(1), and again their sum is 1.

Now, after these two events, what is the overall probability that the object stays in state 0 or state 1?

Now, consider another scenario: the object jumps to state (1) after the first event. Now, in the second event, we have probabilities

P_2(1) and P_2(2). So, it can either stay in state 1 or move to state 2.

So, overall what should be the probability of it being in state (0), (1) or (2)? Is there a simple way to think about this that can encapsulate all these scenarios?

The background is that I am trying to code a small program that can track all the probabilities of these state jumps for this object, and the problem is that there can be more than 3 states...more like 20! I want to figure out an algorithm and, of course, also understand how these probabilities evolve.

Would appreciate any help you can give me.

Thanks,
Luca
 
  • #2
So is this kind of like an electron model, where the electron can only be in one of a number of possible states at a time, with the caveat that it can only go up or down one state?
 
  • #3
Yes, I guess it is. However, I must confess that I am not a physicist and I barely know high-school-level math. I recently started learning this stuff on my own and have been toying around with a discrete optimization model; what I am trying to do is quantify some sort of confidence values for the estimated solutions from optimization algorithms.

Cheers,
Luca
 
  • #4
So you're in a given state n and you have three choices:

1) move up one state
2) stay where you are
3) move down one state

each with a 33% chance, right?

And when you're in state 0 or state 19 (i.e. the twentieth state) you only have
two choices:

- stay where you are
- move to a neighboring state
 
  • #5
Hi there,

Yes, except the object never even goes down a state. So it just either stays in its current state or goes up. However, the two options are not equally likely, so not 50-50.

So, at event 1, the probability of going from state 0 to state 1 could be low. However, at event 2, the same probability could be high.

So, after 2 events, what is the probability of seeing the object in state (0), state (1) or state (2)?

Thanks,
Luca
 
  • #6
Given a 50/50 probability of moving or staying in a given state I get
3x in state 0,
2x in state 1, and
1x in state 2,

and normalizing things I get
3:6 in state 0 for 50%,
2:6 in state 1 for 33%, and
1:6 in state 2 for 16.6%.

My state change diagram:

0
| \
0  1
|  \  \
0  1  2
 
  • #7
Hey pamparana.

What you are describing is a Markov chain. There is a very well developed theory for Markov processes that lets you find, among other things, steady-state probabilities, which tell you what the probability is of ending up in a particular state in the long run.

How much background do you have in mathematics and probability/statistics?
 
  • #8
Hello guys,

Thanks for replying. I do not have much background in any of these things unfortunately. But I am willing to learn!

So basically, what I was looking for was a sort of general formula/algorithm for this that can be coded up efficiently. I will look into this in a bit more detail now that you mention it might be a Markov chain.

Thanks,
Luca
 
  • #9
This is a Markov chain. The transition probabilities are described by the transition probability matrix. In your case each row of the matrix has only two non-zero entries: the probability of staying in the current state and the probability of moving to the next state in a single step. For multiple steps, the Chapman-Kolmogorov equations give the probability of finding yourself in a given state after n steps, by what is essentially matrix multiplication. The only trick here, if I understand your question, is that the transition probabilities can be different for each current state, but that poses no real problem. Try "Introduction to Probability Models" by Sheldon Ross.
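In case it helps with the coding side, here is a minimal sketch of that idea in Python with NumPy (my own illustration, not code from the thread; the 20-state size and the 0.7 stay probability are placeholder assumptions). Each row of the matrix holds the one-step probabilities out of that state, and the n-step probabilities come from raising the matrix to the n-th power:

```python
import numpy as np

n_states = 20          # assumption: roughly the size Luca mentioned
p_stay = 0.7           # placeholder: chance of staying put on any one step

# One-step transition matrix for a "stay or move up one" chain:
# each row sums to 1 and has at most two non-zero entries.
P = np.zeros((n_states, n_states))
for i in range(n_states - 1):
    P[i, i] = p_stay
    P[i, i + 1] = 1.0 - p_stay
P[-1, -1] = 1.0        # the top state has nowhere higher to go

# Probability of being in each state after n steps, starting from state 0:
n_steps = 5
P_n = np.linalg.matrix_power(P, n_steps)   # Chapman-Kolmogorov via matrix powers
print(P_n[0])                              # row 0 = distribution after n_steps from state 0
```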
 
  • #10
That is great. Thanks for the advice :)

Luca
 
  • #11
You are welcome. Maybe I can clarify with a simple example. Let Pij equal the probability of transitioning from state i to state j. So P01=probability of transitioning from 0 to 1. So suppose P00=0.6 and P01=0.4. Note that the sum is necessarily 1 because you only have 2 choices. Now after 2 steps you can be in any of 3 states, namely 0, 1, or 2. Let's suppose P11=0.75 and P12=0.25 since you said the transition probabilities can change from state to state.

So after 2 steps:

P02=(P01)(P12)=0.1 because there is only one way to get from 0 to 2 in two steps, namely step up on each.

P01=(P00)(P01)+(P01)(P11)=(0.6)(0.4)+(0.4)(0.75)=0.54 because there are two ways to get from state 0 to state 1 in two steps, each of which involves staying put on one of the two steps.

P00=(0.6)(0.6)=0.36 because you stay where you are on both steps.

Notice that 0.1+0.54+0.36=1 because you must be in state 0, 1, or 2 after 2 steps.

Now after 3 steps you could be in any of the states 0, 1, 2, or 3.

There is a quick visual trick for figuring out how many terms you must have and what they are. Suppose + means that you step up and - means that you don't. Then to move up k states in n time steps you need to know the number of ways to place k steps among the n time steps, i.e. "n choose k". For example, I want to find P02 after 3 time steps. That means I need ++-, +-+, or -++ to get 2 steps up in 3 time steps. (Usually the number of steps is written as a superscript and the positions ij as subscripts.)

From my little pictures I know that the transition probability P02 after 3 time steps contains 3 terms, one for each picture. So

P02=(P01)(P12)(P22)+(P01)(P11)(P12)+(P00)(P01)(P12).

Each term in the sum is represented by one of my pictures involving moving twice and staying once in 3 time steps.

Try it for larger n. You can check yourself by adding probabilities. For example, after 3 time steps you can be in any of states 0, 1, 2, or 3, so the sum of these 4 probabilities, P00, P01, P02, and P03, must equal 1. Using pictures, P00=(---), P01=(+--)+(-+-)+(--+),
P02=(++-)+(+-+)+(-++), and P03=(+++).

Hope this helps. Let me know if you have questions.
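For what it's worth, the two-step numbers above can be checked with a few lines of Python/NumPy (a sketch assuming the same one-step matrix applies on both steps; the last row is an arbitrary placeholder, since the two-step calculation never uses it):

```python
import numpy as np

# One-step matrix built from the numbers above: P00 = 0.6, P01 = 0.4, P11 = 0.75, P12 = 0.25.
# The last row is a placeholder (state 2 just stays put); it does not affect the two-step result.
P = np.array([
    [0.60, 0.40, 0.00],
    [0.00, 0.75, 0.25],
    [0.00, 0.00, 1.00],
])

two_step = np.linalg.matrix_power(P, 2)
print(two_step[0])   # -> [0.36 0.54 0.1 ], i.e. P00, P01, P02 after two steps
```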
 
  • #13
Thank you so much for this detailed explanation. It is so much clearer to me now. So, it is the sum over paths of the product of the probabilities along each path.

Just to clarify one thing: there is no need to normalize these probabilities at any point?

So,

P(0, 1) = (P00)(P01)+(P01)(P11), and there is no need to divide this by 2 or by the sum of the individual probabilities.

Thank you very much.

Luca
 
  • #14
You are welcome. No, you don't need any normalization, because each path is different and we overcount nothing. I should note that the number of paths gets big really fast, but if you track these indices you will find that you can do the whole thing numerically using matrix multiplication.
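Since the probabilities in this thread can change from one event to the next, one way to code that up (a sketch, assuming Python/NumPy and made-up numbers) is to keep one matrix per event and multiply them into the current distribution in order, instead of raising a single matrix to a power:

```python
import numpy as np

def step_matrix(stay0, stay1):
    """One-step matrix for the 3-state chain; the arguments are per-event stay probabilities."""
    return np.array([
        [stay0, 1.0 - stay0, 0.0],
        [0.0,   stay1,       1.0 - stay1],
        [0.0,   0.0,         1.0],
    ])

# A different matrix for each event (numbers are made up for illustration).
event_matrices = [step_matrix(0.9, 0.8), step_matrix(0.3, 0.5)]

dist = np.array([1.0, 0.0, 0.0])   # start in state 0 with certainty
for P in event_matrices:           # apply each event's matrix in order
    dist = dist @ P
print(dist)                        # distribution over states 0, 1, 2 after both events
```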
 
  • #15
Thank you. Yes, I have been reading up on these Markov chains and have a good idea of how to implement this on the computer.

Thanks again everybody :) Much appreciated. You are all awesome :)

Luca
 

1. What is probability?

Probability is a measure of the likelihood of an event occurring. It is expressed as a number between 0 and 1, where 0 indicates impossibility and 1 indicates certainty.

2. How is probability calculated?

When all outcomes are equally likely, the probability of an event is calculated by dividing the number of favorable outcomes by the total number of possible outcomes.

3. What is the difference between theoretical and experimental probability?

Theoretical probability is based on mathematical calculations and assumes that all outcomes are equally likely to occur. Experimental probability is based on actual observations and can vary depending on the number of trials.

4. How can probability be used in everyday life?

Probability can be used to make predictions and informed decisions. For example, it can be used to determine the chances of winning a game or the likelihood of a certain event happening.

5. What are some common misconceptions about probability?

One common misconception is that if an event has not occurred in a while, it is "due" to happen. In reality, for independent trials the probability of the event remains the same regardless of past outcomes. Another misconception is that probabilities can always be combined by simple addition; in fact, probabilities add only for mutually exclusive events and multiply only for independent events.
