
Markov Chain Expected Value

  1. Sep 10, 2015 #1
    1. The problem statement, all variables and given/known data
    A Markov chain has transition probabilities [itex]p_{j,j+1} = 1-p_{j,0} = (\frac{j+1}{j+2})^k[/itex] for j = 0, 1, 2, ...
    Let [itex]T_j = \min\{n > 1 : X_n = j\}[/itex].
    What is [itex]E[T_j \mid X_0=j][/itex] for j = 0, 1, 2, ...?

    2. Relevant equations

    3. The attempt at a solution
    I figured that [itex]T(j)|X_0=j[/itex] is j+1, but I don't know how to work out the expectation of this.
    Last edited: Sep 10, 2015
  3. Sep 10, 2015 #2

    Ray Vickson

    User Avatar
    Science Advisor
    Homework Helper

    Your statement makes no sense: ##j+1## is an integer, while ##T(j)|X_0 = j## is a random variable. What do you really mean here?

    Also: get rid of that pesky "?", because the way you have written it, you are asking us whether or not you know how to do something.
  4. Sep 10, 2015 #3
    I think I don't quite understand the question. My interpretation of ##T(j)|X_0 = j## is that it is equal to the minimum number of steps needed to return to state ##j##. Looking at the transition probabilities I figured that the chain should immediately return to state 0 and then climb back up to state ##j## hence taking ##j+1## steps. My thinking might be entirely wrong though.
  5. Sep 10, 2015 #4

    Ray Vickson

    User Avatar
    Science Advisor
    Homework Helper

    It is entirely wrong.

    If ##X_0 = j##, where can the chain go next? It can go to state 0 or to state j+1. If it goes to 0, it can bounce back and forth numerous times between state 0 and states 1, 2, ..., j-1, and then go to j on the step after reaching j-1. If it goes to state j+1, it can next go to j+2 or to 0; once at 0, the behavior just described takes over. If at j+2, it next goes either to j+3 or down to 0, and so on.

    Getting the exact probability distribution of the first return-time to j is not easy, but there are (in your textbook, or notes, or on-line) some equations which, when solved, spit out the expected return time as part of their solution. I have not looked at the details in this case, so I don't know how easy it would be to solve the equations.

    However, there is a more serious issue: can we be sure that the chain does, indeed, return to state j with probability 1? If so, can we be sure the expected return time is finite? The point is that ##p_{j,j+1} \to 1 ## as ##j \to \infty##, so the farther out we are, the less chance there is of going back. There are test criteria available that allow you to deal with some such cases, but how easy they are to apply in this case is something I have not investigated.
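    The up-or-reset dynamics described above are easy to simulate. As a rough Monte Carlo sanity check only (the function names, the seed, and the choice of k are my own, and I am reading the stopping time as the first step n ≥ 1 at which X_n = j):

```python
import random

def sample_return_time(j, k, rng):
    """One draw of the first return time to state j, starting from X_0 = j."""
    state, steps = j, 0
    while True:
        steps += 1
        p_up = ((state + 1) / (state + 2)) ** k           # p_{state, state+1}
        state = state + 1 if rng.random() < p_up else 0   # step up, or reset to 0
        if state == j:
            return steps

def estimate_expected_return(j, k, trials=200_000, seed=1):
    """Monte Carlo estimate of E[T_j | X_0 = j]."""
    rng = random.Random(seed)
    return sum(sample_return_time(j, k, rng) for _ in range(trials)) / trials
```

    For small k the empirical mean may fail to stabilize, which is exactly the recurrence worry raised above: the expected return time can be infinite even when the chain does return with probability 1.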
  6. Sep 10, 2015 #5
    Thank you, I understand what the question is asking now. Would you happen to know what these equations are called, so that I can look them up?
  7. Sep 10, 2015 #6

    Ray Vickson

    User Avatar
    Science Advisor
    Homework Helper

    Are you telling me that your course notes and/or textbook do not have such material? I am surprised, since it is so fundamental to the subject. However, you can Google "expected first-passage time" or "expected first-passage time + markov chain".
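    For what it's worth, the first-passage equations can be solved numerically for this particular chain, because from each state the chain either steps up or resets to 0, so the expected hitting time h_i of state j from state i is linear in h_0. A sketch only, with a truncation at state N (states above N are treated as restarting from 0, a small approximation) and names and defaults of my own choosing:

```python
def expected_return_time(j, k, N=10_000):
    """Solve the first-passage equations h_i = 1 + p_i*h_{i+1} + (1 - p_i)*h_0,
    with h_j = 0, for the expected time h_i to reach j from state i; then
    E[T_j | X_0 = j] = 1 + p_j*h_{j+1} + (1 - p_j)*h_0."""

    def p(i):                               # p_{i,i+1} from the problem statement
        return ((i + 1) / (i + 2)) ** k

    # Each h_i is linear in h_0: h_i = a + b*h_0.  Sweep i = N .. j+1 downward,
    # with truncation boundary h_{N+1} ~= h_0 (i.e. a = 0, b = 1).
    a, b = 0.0, 1.0
    for i in range(N, j, -1):
        a, b = 1 + p(i) * a, p(i) * b + (1 - p(i))
    h_up_a, h_up_b = a, b                   # h_{j+1} = h_up_a + h_up_b*h_0

    if j == 0:
        h0 = 0.0                            # the start state is the target itself
    else:
        a, b = 0.0, 0.0                     # boundary h_j = 0
        for i in range(j - 1, -1, -1):
            a, b = 1 + p(i) * a, p(i) * b + (1 - p(i))
        h0 = a / (1 - b)                    # self-consistency: h_0 = a + b*h_0

    h_up = h_up_a + h_up_b * h0
    return 1 + p(j) * h_up + (1 - p(j)) * h0
```

    The downward sweep works because state j+1 is reachable only from j, so the system never couples back upward except through h_0.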