
Construct a Markov Chain

  1. Oct 3, 2016 #1
    1. The problem statement, all variables and given/known data
    Describe the construction of a Markov chain X0, X1, ... on Ω = (0, 1) with state space S = {1, 2, ..., s}, an S × S PTM (probability transition matrix) P, and initial state X0 ~ ν (i.e., X0 is distributed according to the vector ν). Use the sequence U0, U1, ... to generate the Xn's.

    2. Relevant equations
    U0, U1, ... is a sequence of iid Uniform(0, 1) random variables (numbers between 0 and 1)
    ν = (ν1, ν2, ..., νs)

    3. The attempt at a solution
    Step 1:
    let X0 = 1 if U0 ≤ ν1
    let X0 = 2 if ν1 < U0 ≤ ν2 ...
    more generally let X0 = s if νs-1 < U0 ≤ νs
    Set n = 1
    Step 2:
    If Xn-1 = 1 let Xn=1 if Un ≤ νP11...
    let Xn=s if νP1(s-1) < Un ≤ νP1s
    ... more generally
    If Xn-1 = s let Xn=1 if Un ≤ νPs1...
    let Xn=s if νPs(s-1) < Un ≤ νPss
    Step 3:
    Increase n by 1 and repeat 2)
    Please let me know if this makes sense.
     
  3. Oct 3, 2016 #2

    Ray Vickson


    Please clarify. You seem to be using the symbol ##v## in two distinct ways: (i) as a vector ##v = (v_1, v_2, \ldots, v_s)##; and (ii) as some mysterious, undefined parameter in the conditions ##X_n = 1## if ##U_n \leq v P_{11}##, etc.
     
  4. Oct 3, 2016 #3
    The vector is meant to symbolise the initial state distribution (the probability of being in state i at the beginning is νi, for i = 1, ..., s). I am using the fact that the distribution of any future state is the product of the initial distribution and a power of the PTM, νP
     
  5. Oct 3, 2016 #4

    Ray Vickson


    Yes, I understand that ##vP## is a row-vector with components ##(vP)_j = \sum_i v_i P_{ij}, j=1,2,\ldots, s##. However, ##vP_{11}## is the row-vector ##(v_1 P_{11}, v_2 P_{11}, \ldots, v_s P_{11})##, and saying that the scalar ##U_n## be ##\leq## that vector makes no sense.
     
  6. Oct 3, 2016 #5
    How would it be better written? How would you have written the steps?
     
  7. Oct 4, 2016 #6

    Ray Vickson


    I don't know, since I have no idea what you are actually trying to say in those steps.

    Step 1 is OK, and is written in more-or-less standard form; it is the later steps that are mysterious.

    Suppose ##X_t=1## and we generate a uniform U(0,1) random number ##U##. In plain, understandable terms, how do we use ##U## to decide whether ##X_{t+1} = 1## or ##X_{t+1} = 2## ##\ldots## or ##X_{t+1} = s##?

    After figuring out what is needed, then worry about how to represent it in appropriate notation.
     
  8. Oct 4, 2016 #7
    Ok trying to modify step 2:
    Step 2:
    If Xn-1 = 1 let Xn = 1 if Un ≤ ν1P111
    ...more generally
    If Xn-1 = s let Xn = 1 if Un ≤ νsPsss
     
  9. Oct 4, 2016 #8
    Ok trying to modify step 2:
    Step 2:
    If Xn-1 = 1 let Xn = 1 if Un ≤ ν1P111
    let Xn=s if ν1P1(s-1) < Un ≤ ν1P1s
    ...more generally
    If Xn-1 = s let Xn = 1 if Un ≤ νsPss1
    let Xn = s if νsPs(s-1) < Un ≤ νsPsss
     
  10. Oct 4, 2016 #9

    Ray Vickson


    Your construction fails to simulate a Markov chain. The fundamental property of a Markov process is that P{future | present & past} = P{future | present}.

    That is the "memoryless" property: given the present condition of the system, the past is irrelevant from now on. So at time t = 1 the past is wiped out: if you are at ##X_1 = 1## it does not matter how you arrived there, or how probable it was to arrive. You have arrived; you are there. So, the location of the next state ##X_2## is governed by the probability mass function ##\{ P_{11}, P_{12}, \ldots, P_{1s} \}##.

    Note that I am assuming you are using the common convention of having the rows of ##P## sum to 1 (rather than the columns, which is a much less common convention that I have also seen from some writers). So, you have a legitimate probability distribution, because ##P_{11} + P_{12} + \cdots + P_{1s} = 1##.

    At this point the values of ##v_1, v_2, \ldots, v_s## are no longer relevant; they just govern how likely it is that you start off in some state. They have no relevance to what happens once you are, indeed, at one of the states.
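    For concreteness, here is a quick Python check of that row convention, using a made-up 3 × 3 transition matrix (not the one from the problem). Each row of P must be a probability mass function, and the row picked out by the current state is the one that governs the next state.

    Code (Python):
    # Made-up transition matrix; under the row convention each row is a pmf.
    P = [[0.2, 0.5, 0.3],
         [0.1, 0.6, 0.3],
         [0.4, 0.4, 0.2]]

    # Check the rows-sum-to-1 convention (up to floating-point round-off).
    assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)

    i = 1                      # current state (states numbered 1, ..., s)
    next_state_pmf = P[i - 1]  # {P_{i1}, P_{i2}, ..., P_{is}}
    print(next_state_pmf)      # [0.2, 0.5, 0.3]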
     
  11. Oct 4, 2016 #10
    Ok I think I see what you're saying. So a revised version would be:

    Step 2:
    If Xn-1 = 1 let Xn = 1 if Un ≤ P11...
    let Xn=s if P1(s-1) < Un ≤ P1s
    ...more generally
    If Xn-1 = s let Xn = 1 if Un ≤ Ps1
    ...let Xn = s if Ps(s-1) < Un ≤ Pss

    Please let me know if this takes into account the memoryless property, or if it can be expressed in better notation.
     
  12. Oct 4, 2016 #11

    Ray Vickson


    The notation is OK now, but you still leave too much to the imagination. You need to expand a bit on your statement that, when ##X_{n-1}=1## you have ##X_n = 1 ## if ##U_n \leq P_{11} \; \ldots##. What are a couple of additional conditions? I mean, suppose ##U_n > P_{11}##; what then? When can you get ##X_n = 2## or ##X_n = 3##, etc.?
     
  13. Oct 4, 2016 #12
    I thought I generalised them after the "..." but I can be more specific. So a revised version would be:

    Step 2:
    If Xn-1 = 1 let Xn = 1 if Un ≤ P11
    let Xn=2 if P11 < Un ≤ P12
    let Xn=3 if P12 < Un ≤ P13
    ...
    let Xn=s if P1(s-1) < Un ≤ P1s
    ...more generally
    If Xn-1 = s let Xn = 1 if Un ≤ Ps1
    let Xn = 2 if Ps1 < Un ≤ Ps2
    let Xn = 3 if Ps2 < Un ≤ Ps3
    ...let Xn = s if Ps(s-1) < Un ≤ Pss

    Please let me know if this is what you mean by more specific conditions.
     
  14. Oct 4, 2016 #13

    Ray Vickson


    Yes, but the more specific conditions are incorrect. Google "generating discrete random variables" or something similar, to see what you must do.
     
  15. Oct 4, 2016 #14
    Ok, after some more research, I tried the following.

    Step 2:
    If Xn-1 = 1 let Xn = 1 if Un ≤ P11
    let Xn=2 if P11 < Un ≤ P11 + P12
    let Xn=3 if P11 + P12 < Un ≤ P11 + P12 + P13
    ...
    let Xn=s if P11 + P12 + ... + P1(s-1) < Un ≤ 1
    ...more generally
    If Xn-1 = s let Xn = 1 if Un ≤ Ps1
    let Xn = 2 if Ps1 < Un ≤ Ps1 + Ps2
    let Xn = 3 if Ps1 + Ps2 < Un ≤ Ps1 + Ps2 + Ps3
    ...
    let Xn = s if Ps1 + Ps2 + ... + Ps(s-1) < Un ≤ 1

    I think earlier my intervals didn't make sense since I didn't add up the probabilities. Does this make more sense now?
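    For concreteness, the cumulative intervals above can be sketched in Python like this, for a single transition step (the row below is a made-up stand-in for row i of P, not from the problem):

    Code (Python):
    import random

    # Made-up row i of P: the pmf {P_{i1}, ..., P_{is}} for the next state when X_{n-1} = i.
    row = [0.2, 0.5, 0.3]

    u = random.random()  # U_n ~ Uniform(0, 1)

    # X_n = k exactly when P_{i1} + ... + P_{i(k-1)} < U_n <= P_{i1} + ... + P_{ik}.
    cumulative = 0.0
    next_state = len(row)  # fallback guards against floating-point round-off
    for k, p in enumerate(row, start=1):
        cumulative += p
        if u <= cumulative:
            next_state = k
            break

    print(u, next_state)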
     
  16. Oct 4, 2016 #15

    Ray Vickson


    Yes, it is OK now. But, looking back I see you made the same error in Step 1, so that needs fixing as well.
     
  17. Oct 4, 2016 #16
    Ok fixed it.

    Step 1:
    let X0 = 1 if U0 ≤ ν1
    let X0 = 2 if ν1 < U0 ≤ ν1 + ν2
    ...
    more generally let X0 = s if ν1 + ν2 + ... + νs-1 < U0 ≤ ν1 + ν2 + ... + νs-1 + νs
    set n = 1

    Step 2:
    If Xn-1 = 1 let Xn = 1 if Un ≤ P11
    let Xn=2 if P11 < Un ≤ P11 + P12
    let Xn=3 if P11 + P12 < Un ≤ P11 + P12 + P13
    ...
    let Xn=s if P11 + P12 + ... + P1(s-1) < Un ≤ P11 + P12 + ... + P1(s-1) + P1s
    ...more generally
    If Xn-1 = s let Xn = 1 if Un ≤ Ps1
    let Xn = 2 if Ps1 < Un ≤ Ps1 + Ps2
    let Xn = 3 if Ps1 + Ps2 < Un ≤ Ps1 + Ps2 + Ps3
    ...
    let Xn = s if Ps1 + Ps2 + ... + Ps(s-1) < Un ≤ Ps1 + Ps2 + ... + Ps(s-1) + Pss

    Step 3:
    increase n by 1 and return to 2)
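    Putting the whole recipe together, here is a minimal Python sketch of the construction (the two-state ν and P below are made-up examples, not from the problem):

    Code (Python):
    import random

    def inverse_transform(pmf, u):
        # Return k (1-based) with p1 + ... + p(k-1) < u <= p1 + ... + pk.
        cumulative = 0.0
        for k, p in enumerate(pmf, start=1):
            cumulative += p
            if u <= cumulative:
                return k
        return len(pmf)  # guard against floating-point round-off

    def simulate_chain(nu, P, n_steps):
        # Step 1: draw X0 from nu using U0.
        X = [inverse_transform(nu, random.random())]
        # Steps 2 and 3: given X_{n-1} = i, draw X_n from row i of P using U_n, and repeat.
        for _ in range(n_steps):
            row = P[X[-1] - 1]
            X.append(inverse_transform(row, random.random()))
        return X

    # Made-up two-state example.
    nu = [0.5, 0.5]
    P = [[0.9, 0.1],
         [0.3, 0.7]]
    print(simulate_chain(nu, P, 10))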
     
  18. Oct 4, 2016 #17

    Ray Vickson


    Looks good now.
     