
Markov Chain Statistics

  Apr 6, 2012 #1
    1. The problem statement, all variables and given/known data
    A taxicab moves between the airport, Hotel A, and Hotel B according to a Markov chain with transition probabilities:
    P(airport → A) = 0.7,
    P(airport → B) = 0.3,
    P(A → airport) = 0.9,
    P(A → B) = 0.1,
    P(B → airport) = 0.8,
    P(B → A) = 0.2.
    A. If the taxicab starts at the airport, what is the probability that it will be at Hotel A two moves later?
    B. Suppose the taxicab starts at the airport with probability 0.6 and starts at Hotel A and Hotel B with probability 0.2 each. What is the probability that it will be at Hotel B two moves later?
    C. In the long run, what fraction of visits will the taxicab make to each of the three locations?


    2. Relevant equations



    3. The attempt at a solution
    I have gotten the answers for parts A & C, but I don't understand at all how to set up the calculation for part B. My initial thought was that the matrix was

    0 .2 .8
    .2 0 .8
    .6 .4 0

    with the first row/column being hotel A, the second row/column being hotel B, and the third row/column being the airport

    Then, by squaring the matrix, I got
    .52 .32 .16
    .48 .36 .16
    .08 .12 .8

    and then, to get the answer for part B, I added .32 + .12 = .44, which was wrong. What did I do wrong?
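    For anyone who wants to double-check the arithmetic, here is a minimal sketch (assuming NumPy) that takes the matrix exactly as written in the attempt above, with the row/column order Hotel A, Hotel B, airport, and squares it:

    ```python
    import numpy as np

    # Transition matrix exactly as written in the attempt above
    # (row/column order: Hotel A, Hotel B, airport).
    M = np.array([
        [0.0, 0.2, 0.8],
        [0.2, 0.0, 0.8],
        [0.6, 0.4, 0.0],
    ])

    # Entry (i, j) of M @ M is the probability of being in state j
    # two moves after starting in state i.
    M2 = M @ M
    print(M2)  # reproduces the squared matrix quoted in the post
    ```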
     
  Apr 7, 2012 #2

    Ray Vickson

    Science Advisor
    Homework Helper

    For (b): after two moves, P{at B} = P{at B | start at airport}*P{start at airport} + P{at B | start at A}*P{start at A} + P{at B | start at B}*P{start at B}. The conditional probabilities are the entries of P*P, the two-step transition matrix.

    RGV
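
    As a worked sketch of this suggestion (assuming NumPy, and taking the transition probabilities from the problem statement at the top of the thread): build the one-step matrix P, square it, and weight the rows of P² by the given starting distribution (0.6 airport, 0.2 Hotel A, 0.2 Hotel B).

    ```python
    import numpy as np

    # One-step transition matrix from the problem statement,
    # with row/column order chosen here as: airport, Hotel A, Hotel B.
    P = np.array([
        [0.0, 0.7, 0.3],   # from the airport
        [0.9, 0.0, 0.1],   # from Hotel A
        [0.8, 0.2, 0.0],   # from Hotel B
    ])

    # Initial distribution from part B: airport 0.6, Hotel A 0.2, Hotel B 0.2.
    pi0 = np.array([0.6, 0.2, 0.2])

    # Two-step conditional probabilities P{at j after 2 moves | start at i}.
    P2 = np.linalg.matrix_power(P, 2)

    # Law of total probability: weight each row of P^2 by P{start at i}.
    pi2 = pi0 @ P2
    print("Distribution after two moves (airport, A, B):", pi2)
    print("P{at Hotel B after two moves} =", pi2[2])
    ```

    For part (c), the same matrix P can be reused: the long-run fractions of visits form the stationary distribution, i.e. the row vector pi satisfying pi = pi*P with entries summing to 1.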
     