Solving Markov Chain Problem for Proportions in Areas A, B & C


Homework Help Overview

The discussion revolves around a Markov chain problem where the original poster seeks to determine the proportion of time a fox spends in three territories (A, B, and C) based on a word problem. The poster has attempted to create a transition matrix but is struggling with its validity and the concept of stationary distributions.

Discussion Character

  • Exploratory, assumption checking, problem interpretation

Approaches and Questions Raised

  • Participants discuss the validity of the transition matrix, questioning whether the rows sum to one. There is an exploration of how to derive the initial transition matrix from the word problem provided.

Discussion Status

The discussion is ongoing, with participants providing feedback on the transition matrix and clarifying concepts related to stationary distributions. Some guidance has been offered regarding the formulation of the state probabilities and the conditions for steady-state.

Contextual Notes

There is mention of confusion regarding the transition matrix setup, particularly in relation to the textbook examples, which may differ in how they present the summation of probabilities. The original poster has indicated they do not need help with the final questions but rather with understanding the initial setup.

subopolois

Homework Statement


I have a scenario in which I have to find the proportion of time spent in each area using Markov chains. I was given a word problem, which I have put into a matrix, and the question asks what proportion of time is spent in each of the areas A, B, and C.


Homework Equations



N/A

The Attempt at a Solution


I have the final matrix as:
0 0.34 0.67
0 0 0.34
1 0.34 0

I'm stuck on how to proceed with this; can someone help?
 
Find the stationary distribution of the Markov chain (that is, an eigenvector of the transition matrix with eigenvalue 1).
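As an aside, that suggestion can be made concrete numerically. The following is a minimal sketch assuming NumPy, using a generic two-state column-stochastic matrix for illustration (not the matrix from this thread): the stationary distribution is the eigenvector for eigenvalue 1, rescaled so its entries sum to one.

```python
import numpy as np

# A generic column-stochastic matrix (each column sums to 1),
# purely for illustration -- not this thread's fox matrix.
S = np.array([[0.9, 0.2],
              [0.1, 0.8]])

vals, vecs = np.linalg.eig(S)
k = np.argmin(np.abs(vals - 1.0))  # index of the eigenvalue closest to 1
p = np.real(vecs[:, k])
p /= p.sum()                       # rescale into a probability vector
# Now S @ p == p, so p is the stationary distribution.
```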
 
Pere Callahan said:
Find the stationary distribution of the Markov chain (that is, an eigenvector of the transition matrix with eigenvalue 1).

We haven't been taught eigenvectors yet.
 
subopolois said:
i have the final matrix as:
0 0.34 0.67
0 0 0.34
1 0.34 0
That is not a valid transition matrix. Each row must sum to 1.
 
D H said:
That is not a valid transition matrix. Each row must sum to 1.

Sorry, I made a typo. It's:
0 0.67 0.67
0 0 0.34
1 0.33 0
 
That's still not valid. In fact, it's worse; now the first row is also invalid. The sum of each row must be identically one. What is the word problem that led to this matrix?
 
heres the word problem:
A fox hunts in three territories A, B and C. He
never hunts in the same territory on two successive days.
If he hunts in A, then he hunts in C the next day. If he
hunts in B or C, he is twice as likely to hunt in A the next
day as in the other territory.

I only gave the scenario; I don't need help with the actual questions, just with getting the initial transition matrix. It's confusing for me to read the problem and transfer it into a matrix. I am going by the examples in the textbook; regarding the rows adding up to 1, my textbook has the columns adding to 1.
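Reading the rules off territory by territory, and following the textbook's columns-sum-to-one convention (columns are the "from" states), the setup can be sketched with exact fractions like this (a sketch assuming NumPy; the thread's 0.67 and 0.33 are rounded versions of 2/3 and 1/3):

```python
import numpy as np

# Columns = territory hunted today (A, B, C); rows = territory tomorrow.
# From A the fox always moves to C.  From B or C it moves to A with
# probability 2/3 and to the remaining territory with probability 1/3.
S = np.array([
    [0.0, 2/3, 2/3],   # to A
    [0.0, 0.0, 1/3],   # to B
    [1.0, 1/3, 0.0],   # to C
])

assert np.allclose(S.sum(axis=0), 1.0)  # every column sums to 1
```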
 
Sorry for the misdirection. Your corrected matrix is fine.

Suppose the probabilities that the fox at the nth time step t_n is in state A, B, or C are P_A(t_n), P_B(t_n), and P_C(t_n). Let P(t_n) be the column vector formed from these individual state probabilities. The state probabilities at the next time step are given by the transition matrix S: P(t_{n+1}) = S·P(t_n).

The system is in steady state if P(t_{n+1}) = P(t_n). Try to find this steady-state probability vector (the components must add to one).
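Concretely, the steady-state condition S·P = P together with the normalization P_A + P_B + P_C = 1 forms a linear system. A minimal sketch of solving it (assumes NumPy, and uses the exact fractions 2/3 and 1/3 in place of the rounded 0.67 and 0.33):

```python
import numpy as np

# Column-stochastic transition matrix from the thread (exact fractions).
S = np.array([
    [0.0, 2/3, 2/3],
    [0.0, 0.0, 1/3],
    [1.0, 1/3, 0.0],
])

# Steady state: (S - I) p = 0.  The rows of (S - I) are linearly
# dependent, so replace the last equation with the normalization
# p_A + p_B + p_C = 1 to get a uniquely solvable system.
A = S - np.eye(3)
A[-1] = 1.0
b = np.array([0.0, 0.0, 1.0])
p = np.linalg.solve(A, b)
# p = [0.40, 0.15, 0.45]: the long-run proportions of time in A, B, C.
```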
 
