How can I compute expected return time of a state in a Markov Chain?

In summary, the conversation discusses the calculation of the expected return time for a Markov chain. The equation ##m_{12}=1+p_{11}m_{12}## is derived by accounting for the mandatory first step and for the probability ##p_{11}## of not leaving node 1, in which case the mean time to reach node 2 from node 1 must be added again. The discussion also covers computing ##m_{11}## in two different ways and using the agreement of the two values as a consistency check, and clarifies the meaning of the general formula for the expected number of steps from node i to node j.
  • #1
user366312
Problem Statement
I was watching a YouTube video regarding the calculation of expected return time of a Markov Chain.
I haven't understood the calculation of ##m_{12}##.

How could he write ##m_{12}=1+p_{11}m_{12}##? I have given a screenshot of the video.
 
  • #2
In all cases, you need to take a time step - that is the 1. With probability ##p_{11}##, you will not leave node 1. So you need to add that probability multiplied by the mean time to get to 2 from 1, which is ##m_{12}##.
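Solving that recursion explicitly makes the step concrete. A minimal worked solve for the two-state case (assuming ##p_{12} = 1 - p_{11}##, i.e. from node 1 you either stay or move to node 2):
$$m_{12} = 1 + p_{11}m_{12} \;\Longrightarrow\; (1-p_{11})\,m_{12} = 1 \;\Longrightarrow\; m_{12} = \frac{1}{1-p_{11}} = \frac{1}{p_{12}}.$$
So ##m_{12}## appearing on both sides is not a problem: the equation is linear in ##m_{12}## and can simply be solved for it.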
 
  • #3
Orodruin said:
In all cases, you need to take a time step - that is the 1. With probability ##p_{11}##, you will not leave node 1. So you need to add that probability multiplied by the mean time to get to 2 from 1, which is ##m_{12}##.

Why didn't he add 1 in case of ##m_{11}##?

Why is ##m_{12}## on both sides of the equation?
 
  • #4
user366312 said:
Why didn't he add 1 in case of ##m_{11}##?
He did.

user366312 said:
Why is ##m_{12}## on both sides of the equation?
Because if you stay at 1, then the expected time to get to 2 will be the expected time to get to 2 from 1.
 
  • #5
Orodruin said:
He did.

I don't see it, because he calculated ##m_{11} = \frac{1}{\frac{2}{3}}##.

Orodruin said:
Because if you stay at 1, then the expected time to get to 2 will be the expected time to get to 2 from 1.

So, then he should write: ##m_{12} = 1 + p_{11}m_{21}##. But he didn't write that.
 
  • #6
user366312 said:
I don't see it, because he calculated ##m_{11} = \frac{1}{\frac{2}{3}}##.
So, then he should write: ##m_{12} = 1 + p_{11}m_{21}##. But he didn't write that.

Well, it is good he did not write that, because it is wrong.

Just look at the equations
$$m_{ij} = 1 + \sum_{k \neq j} p_{ik} m_{kj}.$$
You have all the ##p_{ij}## given right in the problem, so just going ahead and writing out the equations for the ##m_{ij}## is elementary. You seem to be over-thinking the problem, or in some other way, confusing yourself.

You should realize that there may be more than one way to compute some of the ##m_{ij}##. Using ##m_{11} = 1 + p_{12} m_{21}## is one way (after you have calculated ##m_{21}##), but using ##m_{11} = 1/\pi_1## is another. A useful calculation check is to see whether you get the same value both ways---if you did everything right, you should.
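As a sketch of both computations, here is a small Python example. The transition matrix is an assumed one chosen so that ##\pi_1 = 2/3## (not necessarily the matrix from the video). It solves the linear equations ##m_{ij} = 1 + \sum_{k \neq j} p_{ik} m_{kj}## for every target state and then checks ##m_{ii} = 1/\pi_i##:

```python
import numpy as np

# Hypothetical 2-state transition matrix (assumed values, for illustration only).
P = np.array([[0.5, 0.5],
              [1.0, 0.0]])
n = P.shape[0]

# Mean first-passage/return times from m_ij = 1 + sum_{k != j} p_ik m_kj:
# for a fixed target j, zero out column j of P and solve (I - Q) x = 1.
M = np.zeros((n, n))
for j in range(n):
    Q = P.copy()
    Q[:, j] = 0.0
    M[:, j] = np.linalg.solve(np.eye(n) - Q, np.ones(n))

# Stationary distribution pi: left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

print(M)                     # matrix of mean first-passage/return times
print(np.diag(M), 1 / pi)    # consistency check: these should agree
```

With this matrix, ##\pi = (2/3, 1/3)##, so the code reproduces ##m_{11} = \frac{1}{2/3} = 3/2## both from the linear system and from ##1/\pi_1##.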
 
  • #7
Thank you @Ray Vickson! That answers the question very well.
 
  • #8
The meaning of the
$$ m_{i,j} = 1+ \sum_{k\neq j} p_{i,k} m_{k,j} $$
formula, the expected number of steps to get from i to j, is this:

One step is always required.

If that first step does not reach j but instead lands in some other state k, then add the expected number of steps from k to j, weighted by the probability of first stepping into k.
 

1. How do I calculate the expected return time of a state in a Markov Chain?

To calculate the expected return time of a state in a Markov chain, you first need the transition probabilities, and from them the stationary distribution ##\pi##. For an irreducible, positive recurrent chain, the expected return time to state ##i## is
$$m_{ii} = \frac{1}{\pi_i},$$
the reciprocal of the stationary probability of that state. This gives the average number of steps it takes for the system to return to state ##i##.
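As a quick sanity check of that relation, the return time can also be estimated by simulation. This is a minimal sketch with an assumed two-state chain whose stationary distribution is ##(2/3, 1/3)##, so the expected return time to state 0 should come out near ##1/(2/3) = 1.5##:

```python
import random

random.seed(0)

# Hypothetical 2-state chain (assumed matrix, for illustration only).
P = [[0.5, 0.5],
     [1.0, 0.0]]

def simulate_return_time(start, steps=200_000):
    """Average number of steps between successive visits to `start`."""
    state, returns, total, t = start, 0, 0, 0
    for _ in range(steps):
        state = 0 if random.random() < P[state][0] else 1
        t += 1
        if state == start:
            returns += 1
            total += t
            t = 0
    return total / returns

print(simulate_return_time(0))  # should be close to 1.5
```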

2. Can I use any formula to compute the expected return time of a state in a Markov Chain?

No. The formula ##m_{ii} = 1/\pi_i## only applies when the chain has a stationary distribution, i.e. when it is irreducible and positive recurrent. For transient or null recurrent states the expected return time is infinite, and the formula does not give a meaningful result.

3. How can I determine the transition probabilities for each state in a Markov Chain?

The transition probabilities can be determined by analyzing the transition matrix for the Markov Chain. This matrix shows the probability of transitioning from one state to another. You can also use historical data or experimental data to estimate the transition probabilities.
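Estimating the transition probabilities from observed data amounts to counting transitions. A minimal sketch with a made-up state sequence (the data and states are assumptions, purely for illustration):

```python
from collections import Counter

# Hypothetical observed sequence of visits to states 0 and 1.
seq = [0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 1, 0]

# Estimate p_ij as (# transitions i -> j) / (# visits to i, excluding the last step).
pairs = Counter(zip(seq, seq[1:]))
visits = Counter(seq[:-1])
P_hat = {(i, j): pairs[(i, j)] / visits[i]
         for i in visits for j in set(seq)}

print(P_hat)  # estimated transition probabilities; each row sums to 1
```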

4. Is the expected return time of a state in a Markov Chain always a whole number?

No, the expected return time of a state in a Markov chain can be a fraction. The stationary probability of a state can be any number between 0 and 1, and the expected return time is its reciprocal, so non-integer values such as ##3/2## are typical.

5. Can I use the expected return time to predict the behavior of a Markov Chain?

The expected return time tells you the average number of steps for the system to return to a specific state, but it cannot predict the exact behavior of the Markov chain: individual return times are random, and the actual trajectory also depends on the initial state and the full set of transition probabilities.
