
Markov Chains

  1. May 6, 2009 #1
    1. The problem statement, all variables and given/known data

    In a lab experiment, a mouse can choose one of two food types each day, type I and type II. Records show that if a mouse chooses type I on a given day, then there is a 75% chance that it will choose type I the next day and if it chooses type II on one day, then there is a 50% chance that it will choose type II the next day.

    (a) If the mouse chooses type I today, what is the probability that it will choose type I two days from now?

    (b) If the mouse chooses type II today, what is the probability that it will choose type II three days from now?


    2. Relevant equations


    3. The attempt at a solution

    I think a suitable transition matrix for this phenomenon is:

[tex]P x_{t} = \left[\begin{array}{cc} 0.25 & 0.5 \\ 0.75 & 0.5 \end{array}\right] \left[\begin{array}{c} x_{1}(t) \\ x_{2}(t) \end{array}\right][/tex]

For part (a) I have the initial condition [tex]\left[\begin{array}{c} 1 \\ 0 \end{array}\right][/tex]

[tex]\left[\begin{array}{cc} 0.25 & 0.5 \\ 0.75 & 0.5 \end{array}\right] \left[\begin{array}{c} 2 \\ 0 \end{array}\right] = \left[\begin{array}{c} 0.5 \\ 1.5 \end{array}\right][/tex]

    So the probability is 0.5?

For part (b) the initial condition is (0, 1). This time we end up with:

[tex]\left[\begin{array}{c} 1.5 \\ 2.5 \end{array}\right][/tex] !!

    The probability of choosing type II in three days is 2.5 :confused:
     
  3. May 6, 2009 #2
By the way, the last part of the question asks:

    If there is 10% chance that the mouse will choose type I today, what is the probability that it will choose type I tomorrow?

I'm not sure how to use my matrix to find this.
I'd appreciate some guidance. Thanks :)
     
  4. May 6, 2009 #3

    Borek


    Staff: Mentor

    Isn't your matrix transposed?
     
  5. May 6, 2009 #4
    No, which matrix?

    [tex]\left[\begin{array}{ccccc} x_{1}(t) \\ x_{2}(t) \end{array}\right][/tex] is the state vector.
     
  6. May 6, 2009 #5

    Mark44

    Staff: Mentor

    I think Borek meant your transition matrix.
     
  7. May 6, 2009 #6

    Borek


    Staff: Mentor

Yes, that's what I meant. The rows should sum to 1.
     
  8. May 6, 2009 #7
I'm looking at an example in my textbook, and only the columns sum to 1, not the rows.
     
  9. May 6, 2009 #8

    Borek


    Staff: Mentor

So perhaps you should use a row vector for the state vector? That's a matter of convention.

Either way, the sum of the probabilities should be 1, so both of your state vectors (for a and b) are wrong.
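A quick numerical sketch of the point above (my own check, not from the thread): with the column convention the textbook uses, each column of the transition matrix holds the probabilities of moving *from* that state, so P(I→I) = 0.75 goes in column 1 and P(II→II) = 0.5 in column 2. With a valid starting state vector that sums to 1, every step keeps the probabilities normalized.

```python
def matvec(P, x):
    """Multiply a 2x2 matrix P by a length-2 column vector x."""
    return [P[0][0] * x[0] + P[0][1] * x[1],
            P[1][0] * x[0] + P[1][1] * x[1]]

# Column-stochastic transition matrix (columns sum to 1):
# column 1: today is type I  -> (P(I next), P(II next)) = (0.75, 0.25)
# column 2: today is type II -> (P(I next), P(II next)) = (0.50, 0.50)
P = [[0.75, 0.50],
     [0.25, 0.50]]

# (a) start at type I, apply P twice
x = [1.0, 0.0]
for _ in range(2):
    x = matvec(P, x)
print(x)   # x[0] = P(type I in two days) = 0.6875

# (b) start at type II, apply P three times
y = [0.0, 1.0]
for _ in range(3):
    y = matvec(P, y)
print(y)   # y[1] = P(type II in three days) = 0.34375

# (c) 10% chance of type I today, apply P once
z = matvec(P, [0.1, 0.9])
print(z)   # z[0] = P(type I tomorrow), approximately 0.525

# each state vector still sums to 1
for v in (x, y, z):
    assert abs(sum(v) - 1.0) < 1e-12
```

Note that the result vectors stay on the probability simplex after every multiplication, which is exactly the sanity check that fails for the (0.5, 1.5) and (1.5, 2.5) vectors earlier in the thread.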
     