Markov chain: Definition and 87 Threads
-
Stationary distribution of a Markov chain
Homework Statement find the stationary distribution of ##\left( \begin{array}{ccccc} \frac{1}{2} & \frac{1}{2} & 0 & 0 & 0 \\ \frac{1}{3} & \frac{2}{3} & 0 & 0 & 0 \\ 0 & \frac{1}{3} & \frac{2}{3} & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 1 & 0 \end{array} \right)## Homework Equations...- Gregg
- Thread
- Chain Distribution Markov chain
- Replies: 2
- Forum: Calculus and Beyond Homework Help
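A minimal numerical sketch for this kind of problem, assuming the row-stochastic convention ##\pi P = \pi## and using the matrix from the post. Note this particular chain is reducible (states {1,2} and {4,5} form separate closed classes), so the stationary distribution is not unique and the null space below is two-dimensional:

```python
import numpy as np
from scipy.linalg import null_space

# Transition matrix from the post (row-stochastic: each row sums to 1).
P = np.array([
    [1/2, 1/2, 0,   0, 0],
    [1/3, 2/3, 0,   0, 0],
    [0,   1/3, 2/3, 0, 0],
    [0,   0,   0,   0, 1],
    [0,   0,   0,   1, 0],
])

# Stationary rows solve pi P = pi, i.e. they span the null space of P^T - I.
ns = null_space(P.T - np.eye(5))
for v in ns.T:
    # Each basis vector is normalised to sum to 1; because the chain is
    # reducible, any nonnegative mixture of these is also stationary.
    print(np.round(v / v.sum(), 4))
```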
-
What Are the Markov Chain Probabilities for a Taxicab's Moves?
Homework Statement A taxicab moves between the airport, Hotel A, and Hotel B according to a Markov chain with transition probabilities: P(airport → A) = 0.7, P(airport → B) = 0.3, P(A → airport) = 0.9, P(A → B) = 0.1, P(B → airport) = 0.8, P(B → A) = 0.2. A-If the taxicab starts...- lina29
- Thread
- Chain Markov chain Statistics
- Replies: 1
- Forum: Calculus and Beyond Homework Help
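The question is cut off, but parts like "if the taxicab starts..." reduce to powers of the transition matrix. A minimal sketch with the probabilities from the post, assuming the cab starts at the airport:

```python
import numpy as np

# States in order: airport, Hotel A, Hotel B (rows sum to 1).
P = np.array([
    [0.0, 0.7, 0.3],
    [0.9, 0.0, 0.1],
    [0.8, 0.2, 0.0],
])

start = np.array([1.0, 0.0, 0.0])  # start at the airport
for n in (1, 2, 3):
    # Distribution over locations after n trips.
    print(n, start @ np.linalg.matrix_power(P, n))
```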
-
Find Lambda for Simple Walk on Z (Markov Chain)
Homework Statement Find \lambda= \{\lambda_i\} with \lambda_0 = 1 if P(X_{n+1}=i+1|X_n=i) = p, P(X_{n+1} = i-1|X_n = i) = 1 - p, i \in \mathbb{Z}, 0<p<1 Homework Equations \lambda P = \lambda The Attempt at a Solution I used my relevant equation to write out: (1-p)\lambda_0 + p\lambda_{-2} =...- RoshanBBQ
- Thread
- Chain Lambda Markov chain
- Replies: 7
- Forum: Calculus and Beyond Homework Help
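For reference, the ansatz ##\lambda_i = r^i## solves the balance equation directly: ##(\lambda P)_i = p\,\lambda_{i-1} + (1-p)\,\lambda_{i+1}##, and with ##r = p/(1-p)## one gets $$p\,r^{i-1} + (1-p)\,r^{i+1} = r^i\left(\frac{p}{r} + (1-p)\,r\right) = r^i\big((1-p) + p\big) = r^i,$$ so ##\lambda_i = \left(\frac{p}{1-p}\right)^i## is an invariant measure with ##\lambda_0 = 1## (the constant measure ##\lambda_i \equiv 1## is another).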
-
Concept question on irreducibility (markov chain)
Homework Statement This is not really a homework question, just something I'm confused about. I'm having trouble understanding the movement of a Markov chain. Say S = {0,1,2...} and 0<q<1. Let p(i,i+1)=q and p(i,i-1)=1-q. I can see how this Markov chain moves (back and forth...- hsinyihhsu
- Thread
- Chain Concept Markov chain
- Replies: 1
- Forum: Calculus and Beyond Homework Help
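For reference, the standard criterion such questions turn on: a chain is irreducible when $$\forall\, i, j \in S \;\; \exists\, n \ge 0 : \; p^{(n)}(i,j) > 0,$$ i.e. every state is reachable from every other with positive probability in some number of steps. For the walk in the post one would exhibit an explicit path (e.g. ##|j-i|## steps in the right direction, each of positive probability since 0<q<1), taking care with how the chain behaves at the boundary state 0, which the post leaves unspecified.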
-
[Markov chain] Prove that probability equals 6/π²
Homework Statement Homework Equations N/A The Attempt at a Solution I'll briefly explain my reasoning so far, but please ignore it if it comes across as too jumbled: --- Let P denote the Markov matrix associated with this problem; then I think I was able to argue that the...- nonequilibrium
- Thread
- Chain Markov chain Probability
- Replies: 3
- Forum: Calculus and Beyond Homework Help
-
Markov chain, sum of N dice rolls
Question: Let X_n be the maximum score obtained after n throws of a fair die. a) Prove that X_n is a Markov chain and write down the transition matrix. I'm having a problem starting the transition matrix; I'm assuming the states are meant to be the sum. Then do you write out the transition...- simba_
- Thread
- Chain Dice Markov chain Sum
- Replies: 4
- Forum: Set Theory, Logic, Probability, Statistics
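The title says "sum", but the quoted problem tracks the running maximum, for which the transition matrix is easy to write down: from maximum ##i##, a new throw ##\le i## leaves the state at ##i## (probability ##i/6##), and a throw ##j > i## moves it to ##j## (probability ##1/6## each). A sketch building that matrix:

```python
from fractions import Fraction

# States 1..6 = running maximum after n throws of a fair die.
P = [[Fraction(0)] * 6 for _ in range(6)]
for i in range(1, 7):
    P[i - 1][i - 1] = Fraction(i, 6)      # throw <= i: maximum stays at i
    for j in range(i + 1, 7):
        P[i - 1][j - 1] = Fraction(1, 6)  # throw j > i: maximum jumps to j

for row in P:                             # each row sums to 1
    print(*[str(x) for x in row])
```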
-
Solving Markov Chain Question on Two Switches
Hi, I need help with answering this question. Firstly, I'm not sure what the transition matrix should look like. Should there be 2 states? One where both switches are off and one where both switches are on? The question is: Suppose that each of 2 switches is either on or off during the day. On...- Jenny123
- Thread
- Chain Markov chain
- Replies: 1
- Forum: Set Theory, Logic, Probability, Statistics
-
Gambler's Ruin - Markov Chain problem
This is probably a noob question; my background in probability theory isn't great, but I was shown this problem in a lecture: "Suppose a gambler starts out with £n, and makes a series of £1 bets against the house. Let the probability of winning each bet be p, and of losing be q = 1 − p. If...- JesseC
- Thread
- Chain Markov chain
- Replies: 24
- Forum: Set Theory, Logic, Probability, Statistics
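The post is cut off, but for the standard finite version (absorbing barriers at £0 and £N) the ruin probability starting from £n is the classical result $$P(\text{ruin}) = \begin{cases} \dfrac{(q/p)^n - (q/p)^N}{1 - (q/p)^N}, & p \ne q, \\[1ex] 1 - \dfrac{n}{N}, & p = q = \tfrac12, \end{cases}$$ obtained by solving the recurrence ##u_n = p\,u_{n+1} + q\,u_{n-1}## with boundary conditions ##u_0 = 1##, ##u_N = 0##.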
-
Why Do Lines 3 and 4 Equate in Random Walk Probability Calculations?
Suppose X is a random walk with probability P(X_k=+1)=p and P(X_k=-1)=q=1-p, and S_n=X_1+X_2+...+X_n. Can anyone explain why line 3 equals line 4? P(S_k-S_0≠0 ,S_k-S_1≠0 ,…,S_k-S_{k-1}≠0) =P(X_k+X_{k-1}+⋯+X_1≠0 ,X_k+X_{k-1}+⋯+X_2≠0 ,…,X_k≠0) =P( X_k≠0 ,X_k+X_{k-1}≠0...- tanzl
- Thread
- Chain Markov chain Random Random walk
- Replies: 3
- Forum: Set Theory, Logic, Probability, Statistics
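The step usually rests on exchangeability of the i.i.d. increments: since $$(X_1, X_2, \ldots, X_k) \overset{d}{=} (X_k, X_{k-1}, \ldots, X_1),$$ any event expressed in the suffix sums ##X_k + X_{k-1} + \cdots + X_j## has the same probability as the corresponding event in the prefix sums ##X_1 + X_2 + \cdots + X_{k-j+1}##; the truncated line 4 presumably rewrites the conditions in that reversed form.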
-
How to Generate a 100 State Sequence Using a Markov Chain Model?
I have a state transition probability matrix and a state probability vector, [0.9 0.1; 0.1 0.9] and [0.4 0.6] respectively. Now I want to generate a sequence of states 0 and 1 according to these, say a 100-state sequence. Any sort of help would be appreciated. Thanks.- lemontree45
- Thread
- Chain Markov chain Model State
- Replies: 6
- Forum: Set Theory, Logic, Probability, Statistics
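A minimal NumPy sketch, assuming the states are labelled 0 and 1 and that [0.4 0.6] is the initial distribution over them:

```python
import numpy as np

rng = np.random.default_rng(0)

P = np.array([[0.9, 0.1],
              [0.1, 0.9]])   # state transition probability matrix
p0 = np.array([0.4, 0.6])    # initial state probability vector

n = 100
states = np.empty(n, dtype=int)
states[0] = rng.choice(2, p=p0)                    # draw the initial state
for t in range(1, n):
    states[t] = rng.choice(2, p=P[states[t - 1]])  # row of the current state
print(states)
```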
-
Persistence of State i in a Markov Chain
In a Markov chain, show that a state i is persistent if and only if the mean number of visits to the state i is infinite given the chain started in state i. I thought about looking at the mean recurrence time, but that's all I have so far.- xentity1x
- Thread
- Chain Markov chain State
- Replies: 1
- Forum: Calculus and Beyond Homework Help
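The standard route: let ##f## be the probability of ever returning to ##i## starting from ##i##. Each return happens afresh with probability ##f##, so the number of visits (counting time 0) is geometric and $$E[\#\text{visits to } i \mid X_0 = i] = \sum_{n=0}^{\infty} p_{ii}^{(n)} = \frac{1}{1-f},$$ which is infinite if and only if ##f = 1##, i.e. if and only if ##i## is persistent.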
-
Show all other states are transient in Markov chain
Let X be a Markov chain with a state s that is absorbing, i.e. p_{ss}(1) = 1. All other states communicate with s, i.e. i → s for all states i ∈ S. Show that all states in S except s are transient. I understand this intuitively, but I'm not really sure how to start the proof.- xentity1x
- Thread
- Chain Markov chain States Transient
- Replies: 3
- Forum: Calculus and Beyond Homework Help
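A sketch of the usual argument: for ##i \ne s##, pick a positive-probability path from ##i## to ##s## that does not revisit ##i## (trim any loops), say of probability ##\alpha > 0##. On that event the chain is stuck at ##s## forever and never returns to ##i##, so the return probability satisfies $$f_{ii} \le 1 - \alpha < 1,$$ which is exactly transience of ##i##.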
-
Any fast way to compute the fixed vector of a Markov chain transition matrix?
* I have already posted this in General Math, but I guess it is more of a linear algebra problem. Currently I am using a rather simple method: solve for the vector w in (M-I)w=0 (replacing one equation by w1+w2+...+wn=1). Is there any faster way to do this? Thank you.- samuelandjw
- Thread
- Chain Markov chain Matrix Vector
- Replies: 2
- Forum: Linear and Abstract Algebra
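One common shortcut is to fold the normalisation into a single least-squares call instead of replacing an equation by hand; a sketch, keeping the post's convention (M − I)w = 0 (column-stochastic M):

```python
import numpy as np

def fixed_vector(M):
    """Solve (M - I) w = 0 together with w1 + ... + wn = 1 in one call."""
    n = M.shape[0]
    A = np.vstack([M - np.eye(n),     # stationarity equations
                   np.ones((1, n))])  # normalisation row
    b = np.zeros(n + 1)
    b[-1] = 1.0
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

M = np.array([[0.9, 0.2],            # column-stochastic example
              [0.1, 0.8]])
print(fixed_vector(M))               # -> approx [0.6667, 0.3333]
```

For large sparse chains, power iteration (repeatedly applying M to a probability vector until it stops changing) is usually faster still.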
-
Do Markov Chain Transition Matrices Sum by Row or Column?
Homework Statement Homework Equations The Attempt at a Solution A few things. First of all, the homework problem notes that "all the columns should sum to 1," whereas Wikipedia says ∑_j P_ij = 1 when we sum along row i. Second of all, I don't know where to...- Jamin2112
- Thread
- Chain Markov chain
- Replies: 2
- Forum: Calculus and Beyond Homework Help
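Both sources are right under their own convention: $$\text{row-stochastic: } \sum_j P_{ij} = 1,\quad \pi P = \pi; \qquad \text{column-stochastic: } \sum_i P_{ij} = 1,\quad P\pi = \pi.$$ Wikipedia uses the row convention; many linear-algebra texts (and apparently this homework) use columns. The two descriptions are transposes of each other, so results translate directly.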
-
Difference between martingale and Markov chain
What is the difference between a martingale and a Markov chain? As far as I can tell, if a process is a martingale, then the future expected value depends on the current value of the process, while in a Markov chain the probability of a future value (not the expected value) depends on the...- ait.abd
- Thread
- Chain Difference Markov chain martingale
- Replies: 3
- Forum: Set Theory, Logic, Probability, Statistics
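For reference, the two defining properties side by side, for a process adapted to its own history: $$\text{Markov:}\quad P(X_{n+1} \in A \mid X_0, \ldots, X_n) = P(X_{n+1} \in A \mid X_n),$$ $$\text{martingale:}\quad E[X_{n+1} \mid X_0, \ldots, X_n] = X_n.$$ Neither implies the other: an asymmetric random walk is Markov but not a martingale, while a mean-zero process whose step sizes depend on the whole past can be a martingale without being Markov.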
-
Steady-state vector of infinite markov chain
Today we encountered a problem with Markov chains and are wondering if it can be solved analytically. Suppose we have a banded transition probability matrix M of the following form: $$M = \begin{bmatrix} P & P & 0 & 0 & 0 & \cdots \\ Q & 0 & P & 0 & 0 & \cdots \\ 0 & Q & 0 & P & 0 & \cdots \\ 0 & 0 & Q & 0 & P & \cdots \\ 0 & 0 & 0 & Q & 0 & \cdots \\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots \end{bmatrix}$$...- pokey909
- Thread
- Chain Infinite Markov chain Vector
- Replies: 1
- Forum: Set Theory, Logic, Probability, Statistics
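Assuming the rows index states 0, 1, 2, … and P + Q = 1, the usual analytic route for a banded chain like this is a geometric ansatz: putting ##\pi_n = r^n## into the interior balance equation ##\pi_n = P\,\pi_{n-1} + Q\,\pi_{n+1}## gives $$Q r^2 - r + P = 0 \quad\Longrightarrow\quad r \in \left\{1,\ \tfrac{P}{Q}\right\},$$ so for ##P < Q## the candidate ##\pi_n \propto (P/Q)^n## is normalisable; the boundary row still has to be checked against the first balance equation separately.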
-
Markov Chain Conditional Expectation
Hello, in relation to Markov chains, could you please clarify the following equations (the attached images are not shown): In particular, could you please expand on why the first line holds. Surely this, along with the first equation, implies that: I just don't see why they are all equal. Please could you...- tunaaa
- Thread
- Chain Conditional Conditional expectation Expectation Markov chain
- Replies: 3
- Forum: Set Theory, Logic, Probability, Statistics
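The attached equations are missing, but the identity such questions usually hinge on is the Markov property in expectation form, which follows from conditioning (the tower rule): $$E\big[f(X_{n+1}) \mid X_0, \ldots, X_n\big] = E\big[f(X_{n+1}) \mid X_n\big] = \sum_j p_{X_n\, j}\, f(j),$$ so the expressions coincide because the extra conditioning information is irrelevant given the current state.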
-
Stationary distribution of countable Markov chain
How do I find the stationary distribution of the Markov chain with the countable state space {0, 1, 2, ..., n, ...}, where each point, including 0, can either a. return to 0 with probability 1/2, or b. move to the right n -> n+1 with probability 1/2? Thanks.- shoplifter
- Thread
- Chain Distribution Markov chain
- Replies: 1
- Forum: Set Theory, Logic, Probability, Statistics
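A worked check of the balance equations for this chain: the only way into state ##n+1## is from ##n## (probability 1/2), and every state feeds 0 with probability 1/2, so $$\pi_{n+1} = \tfrac12 \pi_n, \qquad \pi_0 = \tfrac12 \sum_{n \ge 0} \pi_n = \tfrac12,$$ giving the geometric stationary distribution ##\pi_n = 2^{-(n+1)}##, which indeed sums to 1.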
-
How to Obtain the Transition Probability Matrix in a Birth Death Markov Chain?
Hi, I am trying to model the behaviour of 2 independent ON-OFF sources. My state diagram is as follows: state 0 = both sources are OFF; state 1 = one of the sources is ON; state 2 = both sources are ON. The transition rates are given as BIRTH RATE = lambda(i) = (N-i)*lambda, DEATH RATE =...- smoodliar
- Thread
- Chain Death Markov chain
- Replies: 2
- Forum: Set Theory, Logic, Probability, Statistics
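The death rate is cut off in the post; a sketch assuming the usual companion ##\mu(i) = i\mu## and building the generator for N = 2 sources, then converting it to a transition probability matrix by uniformisation:

```python
import numpy as np

lam, mu, N = 1.0, 2.0, 2   # example rates; state i = number of ON sources

Q = np.zeros((N + 1, N + 1))
for i in range(N + 1):
    if i < N:
        Q[i, i + 1] = (N - i) * lam  # birth: an OFF source turns ON
    if i > 0:
        Q[i, i - 1] = i * mu         # death: an ON source turns OFF
    Q[i, i] = -Q[i].sum()            # generator rows sum to 0

# Uniformisation: P = I + Q / L is stochastic for any L >= max_i |Q[i, i]|.
L = np.max(-np.diag(Q))
P = np.eye(N + 1) + Q / L
print(P)
```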
-
Solving Markov Chain Problem for Water Distribution Co. in California
A water distribution company in southern California gets its water supply from the north and sells it to its customers in Orange County. Assume the following simplified scheme: 3 MG (millions of gallons) of water arrives from the north at the beginning of the month. The company can store up...- economist13
- Thread
- Chain Markov chain
- Replies: 1
- Forum: Calculus and Beyond Homework Help
-
Markov Chain - Find the Stationary Distribution
Ok, I had a homework problem that I cannot for the life of me figure out. I've tried to Google different sources that would show me how to find the stationary distribution of a Markov chain, but I can't seem to find one that makes sense to me. The transition matrix of a Markov chain is...- jgbradley1
- Thread
- Chain Distribution Markov chain
- Replies: 2
- Forum: Calculus and Beyond Homework Help
-
[Markov chain] Reading expected value from the transition matrix
Hello there, yet another trivial problem: I attended the 'stochastic processes' course some time ago, but the only thing I remember is that this kind of problem is really easy to compute; there is some simple pattern for it, I presume. Thanks for your help, rahl.- rahl___
- Thread
- Chain Expected value Markov chain Matrix Reading Transition Transition matrix Value
- Replies: 1
- Forum: Precalculus Mathematics Homework Help
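The matrix itself is not in the post, but the "simple pattern" for expected values in absorbing chains is the fundamental matrix: with ##Q## the transient-to-transient block of the transition matrix, $$N = (I - Q)^{-1}, \qquad t = N\mathbf{1},$$ where ##N_{ij}## is the expected number of visits to transient state ##j## starting from ##i##, and ##t_i## the expected number of steps until absorption.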
-
Markov Chain of Stochastic Processes
I would like to construct a model using a Markov chain that has a different stochastic process for each state in the chain. Is there a term for such a thing, or anything similar to it? Thanks- stargazer_iq
- Thread
- Chain Markov chain Stochastic Stochastic processes
- Replies: 2
- Forum: Set Theory, Logic, Probability, Statistics
-
How do I show that a Markov chain is irreducible?
How do I show that a Markov chain is irreducible?- ynotidas
- Thread
- Chain Markov chain
- Replies: 1
- Forum: Set Theory, Logic, Probability, Statistics
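On paper: exhibit, for every ordered pair of states ##(i, j)##, an ##n## with ##p^{(n)}(i,j) > 0## (a positive-probability path). For a finite chain this is equivalent to the transition graph being strongly connected, which is easy to check numerically; a sketch:

```python
import numpy as np
from scipy.sparse.csgraph import connected_components

def is_irreducible(P):
    """A finite chain is irreducible iff its transition graph
    (edge i -> j whenever P[i, j] > 0) is strongly connected."""
    n_comp, _ = connected_components(P > 0, directed=True, connection='strong')
    return n_comp == 1

P = np.array([[0.0, 1.0],
              [0.5, 0.5]])
print(is_irreducible(P))   # True: each state can reach the other
```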
-
Exponential of (Markov Chain) Transition matrix
Hi, I have a (Markov chain) transition matrix X which I understand; in particular, each row of this matrix sums to 1. I have used this transition matrix to construct its generator Y, i.e. Y is the continuously compounded transition matrix: X = exp(Y), X*X = exp(2Y), etc. Both X and Y...- NewStudent200
- Thread
- Chain Exponential Markov chain Matrix Transition Transition matrix
- Replies: 5
- Forum: Linear and Abstract Algebra
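A sketch of the round trip with SciPy. One caveat worth flagging: the matrix logarithm of a stochastic matrix is not guaranteed to be a valid generator (nonnegative off-diagonals; this is known as the embedding problem), so Y should be checked after the fact:

```python
import numpy as np
from scipy.linalg import expm, logm

X = np.array([[0.9, 0.1],
              [0.2, 0.8]])   # one-period transition matrix, rows sum to 1

Y = logm(X)                  # candidate generator with X = expm(Y)
print(Y)                     # rows sum to 0; off-diagonals should be >= 0
print(expm(2 * Y))           # two-period matrix ...
print(X @ X)                 # ... agrees with X*X up to rounding
```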
-
How Do You Represent a Queuing System with a Markov State Transition Diagram?
Homework Statement I have the following queuing system: http://img39.imageshack.us/img39/8264/immaginetd.jpg which models voice traffic arriving with parameters \alpha and \beta on both queue 1 and queue 2. When a voice source is active, it generates traffic with exponential inter-arrival times, which...- Fabio_vox
- Thread
- Chain Diagram Markov chain State
- Replies: 2
- Forum: Engineering and Comp Sci Homework Help
-
Markov Chain Homework: Expressing Transition Matrix P in Terms of G
Homework Statement Let X_0 be a random variable with values in a countable set I. Let Y_1, Y_2, ... be a sequence of independent random variables, uniformly distributed on [0, 1]. Suppose we are given a function G : I x [0, 1] -> I and define inductively for n >=...- hsong9
- Thread
- Chain Markov chain
- Replies: 1
- Forum: Calculus and Beyond Homework Help
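The recursion is cut off, but this is the standard construction ##X_{n+1} = G(X_n, Y_{n+1})##, and the transition matrix it induces is read off from the uniform variables: $$p_{ij} = P\big(G(i, Y) = j\big) = \mathrm{Leb}\{u \in [0,1] : G(i, u) = j\}, \qquad Y \sim U[0,1],$$ with the Markov property following because ##Y_{n+1}## is independent of ##(X_0, Y_1, \ldots, Y_n)##.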
-
What is the Probability That the Rat Will Survive in the Presence of the Cat?
Homework Statement A rat and a cat move between rooms 1 and 2 using different paths. Their motions are governed by their respective transition matrices: [0.9, 0.1 ; 0.2, 0.8] [0.6, 0.4 ; 0.3, 0.7] (a semicolon starts a new row of the matrix, as in MATLAB). If they are ever in the same room...- Andrusko
- Thread
- Chain Markov chain
- Replies: 1
- Forum: Calculus and Beyond Homework Help
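Because the two animals move independently, the joint (rat, cat) position is itself a Markov chain whose transition matrix is the Kronecker product of the two given ones. A sketch of one step, assuming (hypothetically, since the post is cut off) the rat starts in room 1 and the cat in room 2; the full problem would additionally make the "same room" states absorbing:

```python
import numpy as np

R = np.array([[0.9, 0.1],
              [0.2, 0.8]])   # rat:  rooms 1, 2
C = np.array([[0.6, 0.4],
              [0.3, 0.7]])   # cat

# Joint states in order (r1,c1), (r1,c2), (r2,c1), (r2,c2).
J = np.kron(R, C)

p = np.kron([1.0, 0.0], [0.0, 1.0])  # rat in room 1, cat in room 2
p = p @ J                            # joint distribution after one move
print(p[0] + p[3])                   # probability they meet this step
```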
-
Solving Markov Chain Problem for Proportions in Areas A, B & C
Homework Statement I have a scenario in which I have to find the proportion of time a person spends in each area, using Markov chains. I was given a word problem, which I have put into a matrix, and the question asks what proportion of time is spent in each of areas A, B and C. Homework...- subopolois
- Thread
- Chain Markov chain
- Replies: 7
- Forum: Calculus and Beyond Homework Help
-
How Do You Calculate the Probability of a Rat Being in Room 4 After Two Moves?
I've attached a diagram of 4 rooms which a rat must move through. Each period he changes his room (his state). As you can see if you click on the image, the rat cannot access room 2 from room 3, and vice versa. If I assume the rat begins in room 1, how do I calculate the probability he will be in...- JeffNYC
- Thread
- Chain Markov chain
- Replies: 2
- Forum: Linear and Abstract Algebra
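Once the diagram is turned into a 4×4 transition matrix ##P##, the recipe is a matrix power: with ##e_1## the row vector picking out room 1, $$P(X_2 = 4 \mid X_0 = 1) = \big(e_1 P^2\big)_4 = \sum_k p_{1k}\, p_{k4}.$$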
-
Markov chain calibration to a set of cumulated frequencies.
Homework Statement Hi! I have been given the following task: a population of firms can assume three states: good, bad, bankrupt (default). The cumulated frequencies of default (DP) from year 1 to 10 are given. Find an appropriate transition matrix (TM). I'm given a matrix of historical cumulated...- ibimbo
- Thread
- Calibration Chain Frequencies Markov chain Set
- Replies: 3
- Forum: Calculus and Beyond Homework Help
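A hedged sketch of the usual setup: make default (D) absorbing in the transition matrix ##P##; the model's cumulated default frequency over ##t## years starting from state ##i## is then $$\mathrm{DP}_i(t) = \big(P^t\big)_{iD}, \qquad t = 1, \ldots, 10,$$ and calibration means choosing the free entries of ##P## so these match the historical cumulated frequencies.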
-
Markov chain - stationary distribution
Homework Statement A taxi is located either at the airport or in the city. From the city, the next trip is to the airport with probability 1/4, or to somewhere else in the city with probability 3/4. From the airport, the next trip is always to the city. (a) Find the stationary distribution. (b)...- Carolyn
- Thread
- Chain Distribution Markov chain
- Replies: 1
- Forum: Calculus and Beyond Homework Help
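A worked check for part (a), with states ordered (airport, city): $$P = \begin{pmatrix} 0 & 1 \\ 1/4 & 3/4 \end{pmatrix}, \qquad \pi P = \pi,\ \ \pi_a + \pi_c = 1 \ \Longrightarrow\ \pi = \left(\tfrac15,\ \tfrac45\right),$$ since ##\pi_a = \tfrac14 \pi_c## forces ##\pi_c = 4\pi_a##.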
-
Aperiodicity of a Markov Chain
Homework Statement The transition matrix is $$\begin{pmatrix} 0 & 0 & 1 \\ 0 & 0 & 1 \\ 1/3 & 2/3 & 0 \end{pmatrix}$$ "argue that this chain is aperiodic" Homework Equations Definition of aperiodicity: there must exist a time n such that there is a non-zero probability of going from state i to state j for all i and j. The...- .....
- Thread
- Chain Markov chain
- Replies: 3
- Forum: Calculus and Beyond Homework Help
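For reference, the sharper form of the quoted definition: the period of state ##i## is $$d(i) = \gcd\{n \ge 1 : p^{(n)}_{ii} > 0\},$$ and the chain is aperiodic when every state has period 1. A standard way to argue it is to exhibit two return loops through one state whose lengths are coprime (e.g. lengths 2 and 3).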
-
Stationary distribution of Markov chain
Hi all, I'm given a Markov chain Q_k, k>0, with stationary transition probabilities. The state space is uncountable. What I want to show is that the chain is asymptotically stationary, that is, it converges in distribution to some random variable Q. All I have at hand is a k-independent upper...- Pere Callahan
- Thread
- Chain Distribution Markov chain
- Replies: 5
- Forum: Set Theory, Logic, Probability, Statistics
-
Solve Markov Chain Problem: Find Initial Probability Vector p(1)
[SOLVED] Markov chain problem Hi, all! This is not a homework thing or anything like that; I'm just curious. Suppose a Markov chain p(n+1)=Q*p(n), where Q=[q11, q12; q21, q22] is the transition probability matrix and p(n)=[p1(n); p2(n)] is the probability vector. Given Q and p(1), we can...- celestra
- Thread
- Chain Markov chain
- Replies: 11
- Forum: Set Theory, Logic, Probability, Statistics
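Iterating the recursion gives ##p(n) = Q^{\,n-1} p(1)##, so when ##Q## is invertible the initial vector can be recovered: $$p(1) = \big(Q^{\,n-1}\big)^{-1} p(n).$$ Assuming the columns of Q sum to 1 (as the recursion p(n+1) = Q p(n) requires for probability vectors), the 2×2 case has ##\det Q = q_{11} + q_{22} - 1##, so this works exactly when ##q_{11} + q_{22} \ne 1##.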
-
What is the Power Spectrum of a Markov Chain?
I'm reading the Wikipedia article on them and I can't really get an understanding of what they are. I'm writing a paper for psychology, and I keep coming across articles that say 'learning can be modeled with Markov chains'. What does that mean?- seang
- Thread
- Chain Markov chain
- Replies: 3
- Forum: General Math
-
Calculating Probability of Markov Chain State in Discrete Time
Let \left( X_n \right)_{n \ge 0} be a Markov chain (discrete time). I have {\bf{P}} = \left[ p_{ij} \right]_{i,j} = \left[ P\left( X_1 = j \mid X_0 = i \right) \right]_{i,j} and the initial probability distribution {\bf{p}}^{\left( 0 \right)}. I need to calculate P\left(...- Zaare
- Thread
- Chain Discrete Markov chain Probability State Time
- Replies: 1
- Forum: Set Theory, Logic, Probability, Statistics
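The target expression is cut off, but the basic tool for all such computations is $$\mathbf{p}^{(n)} = \mathbf{p}^{(0)} \mathbf{P}^n, \qquad P(X_n = j) = \big(\mathbf{p}^{(0)} \mathbf{P}^n\big)_j,$$ with ##\mathbf{p}^{(0)}## as a row vector in the row-stochastic convention used in the post.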