Markov Chains

  • Thread starter roam
  • #1
roam

Homework Statement



In a lab experiment, a mouse can choose one of two food types each day, type I and type II. Records show that if a mouse chooses type I on a given day, then there is a 75% chance that it will choose type I the next day; if it chooses type II on one day, then there is a 50% chance that it will choose type II the next day.

(a) If the mouse chooses type I today, what is the probability that it will choose type I two days from now?

(b) If the mouse chooses type II today, what is the probability that it will choose type II three days from now?


Homework Equations




The Attempt at a Solution



I think a suitable transition matrix for this phenomenon is:

[tex]Px_{t} = \left[\begin{array}{ccccc} 0.25&0.5 \\ 0.75&0.5 \end{array}\right][/tex] [tex]\left[\begin{array}{ccccc} x_{1}(t) \\ x_{2}(t) \end{array}\right][/tex]

For part (a), I have the initial condition [tex]\left[\begin{array}{ccccc} 1 \\ 0 \end{array}\right][/tex]

[tex]\left[\begin{array}{ccccc} 0.25&0.5 \\ 0.75&0.5 \end{array}\right][/tex] [tex]\left[\begin{array}{ccccc} 2 \\ 0 \end{array}\right][/tex][tex]= \left[\begin{array}{ccccc} 0.5 \\ 1.5 \end{array}\right][/tex]

So the probability is 0.5?

For part (b), the initial condition is (0,1). This time we end up with:

[tex]= \left[\begin{array}{ccccc} 1.5 \\ 2.5 \end{array}\right][/tex] !!

The probability of choosing type II in three days is 2.5 :confused:
 

Answers and Replies

  • #2
roam
And by the way, the last part of the question asks:

If there is 10% chance that the mouse will choose type I today, what is the probability that it will choose type I tomorrow?

I'm not sure how to use my matrix to find this.
I'd appreciate some guidance. Thanks :)
 
  • #3
Borek
Mentor
Isn't your matrix transposed?
 
  • #4
roam
Borek said: "Isn't your matrix transposed?"

No, which matrix?

[tex]\left[\begin{array}{ccccc} x_{1}(t) \\ x_{2}(t) \end{array}\right][/tex] is the state vector.
 
  • #6
Borek
Mentor
That's what I thought. Rows should sum to 1.
 
  • #7
roam
I'm looking at an example in my textbook, and only the columns sum to 1, not the rows.
 
  • #8
Borek
Mentor
So perhaps you should use a row vector for a state vector? That's a matter of convention.

Sum of probabilities should be 1, so both your state vectors (for a and b) are wrong.
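
To make the column convention concrete, here is a minimal sketch in Python (assuming NumPy is available; labelling state 1 = type I and state 2 = type II is just a choice for illustration). Each column of the transition matrix sums to 1, the state vector holds probabilities that sum to 1, and moving forward one day is one matrix-vector multiplication:

[code]
import numpy as np

# Column-stochastic convention: column j holds the probabilities of
# moving FROM state j. State 1 = type I, state 2 = type II (my labels).
P = np.array([[0.75, 0.50],   # P(I tomorrow  | I today),  P(I tomorrow  | II today)
              [0.25, 0.50]])  # P(II tomorrow | I today),  P(II tomorrow | II today)

assert np.allclose(P.sum(axis=0), 1.0)   # each column sums to 1

# Part (a): certain to be in state I today, then step forward two days.
x = np.array([1.0, 0.0])                 # probabilities, so the entries sum to 1
for _ in range(2):
    x = P @ x
print("P(type I two days from now):", x[0])

# Part (b): certain to be in state II today, then step forward three days.
y = np.array([0.0, 1.0])
for _ in range(3):
    y = P @ y
print("P(type II three days from now):", y[1])
[/code]

Every state vector you produce along the way should still sum to 1; if one of them doesn't, the matrix or the starting vector is off.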
 
