Is This Process Markovian and What Is Its Transition Matrix?

  • Thread starter: russel
Hello there,
I'm stuck on a problem about Markov chains. Could anyone help?
Here it is:
There are two machines, each of which either operates or does not operate on a given day. Let $$X(n)$$ be the number of machines operating during the n-th day. Each machine operates on the n-th day, independently of the other, with probability $$\frac{1+X(n-1)}{4}$$. Show that the process $$\{ X(n):n=1,2,...\}$$ is Markovian and find its transition matrix.
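For concreteness, here is a minimal simulation sketch of the process (Python; the starting value X(0) = 0, the seed, and the helper name simulate_days are arbitrary choices for illustration, not part of the problem):

```python
import random

def simulate_days(n_days, x0=0, seed=0):
    """Simulate the two-machine process for n_days.

    X(n) = number of machines operating on day n; each machine operates
    independently on day n with probability (1 + X(n-1)) / 4.
    """
    rng = random.Random(seed)
    x = x0
    history = [x]
    for _ in range(n_days):
        p = (1 + x) / 4                              # per-machine probability today
        x = sum(rng.random() < p for _ in range(2))  # two independent machines
        history.append(x)
    return history

print(simulate_days(10))
```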
 
Hey russel and welcome to the forums.

You need to show us what you've tried and what you are thinking.

Markov chains first have to have valid probabilities, and then they need to satisfy the first-order Markov (conditional independence) property.

So the first hint is that the Markov property says that P(X(n+1) = a | X(n) = b, X(n-1) = c, ..., X(1) = z) = P(X(n+1) = a | X(n) = b). In other words, the next state depends only on the current state and not on anything earlier.

What have you thought about for this expression?
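(Not a proof, but a quick empirical sanity check of that property for this particular chain: simulate many three-day runs starting from X(0) = 0 and compare the two conditional frequencies. The 200,000-run count and the chosen states (1 and 2) are just illustrative; both estimates should come out near 1/4.)

```python
import random
from collections import Counter

def step(x, rng):
    """One day of the process: each of 2 machines runs with probability (1 + x) / 4."""
    p = (1 + x) / 4
    return sum(rng.random() < p for _ in range(2))

rng = random.Random(1)
triples = Counter()            # counts of observed (X(1), X(2), X(3))
for _ in range(200_000):
    x1 = step(0, rng)          # start from X(0) = 0 (an arbitrary choice)
    x2 = step(x1, rng)
    x3 = step(x2, rng)
    triples[(x1, x2, x3)] += 1

# Compare P(X3 = 2 | X2 = 1, X1 = 0) with P(X3 = 2 | X2 = 1).
num_a = triples[(0, 1, 2)]
den_a = sum(v for (a, b, _), v in triples.items() if a == 0 and b == 1)
num_b = sum(v for (_, b, c), v in triples.items() if b == 1 and c == 2)
den_b = sum(v for (_, b, _), v in triples.items() if b == 1)
print(num_a / den_a, num_b / den_b)   # both should be close to ((1 + 1) / 4) ** 2 = 0.25
```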
 
I tried to prove that P(X(n+1) | X(1),...,X(n)) = P(X(n+1) | X(n)) but I can't. Must I use P(A|B) = P(A,B)/P(B)? Is it easier to prove that it has independent increments and is therefore Markovian? The funny part is that I found the transition matrix...
Could I just say that the distribution of X(n) depends only on X(n-1) (as the given formula shows), so the process is Markovian?
 
russel said:
I tried to prove that P(X(n+1) | X(1),...,X(n)) = P(X(n+1) | X(n)) but I can't. Must I use P(A|B) = P(A,B)/P(B)? Is it easier to prove that it has independent increments and is therefore Markovian? The funny part is that I found the transition matrix...
Could I just say that the distribution of X(n) depends only on X(n-1) (as the given formula shows), so the process is Markovian?

You can equate that long conditional expression with your (1 + X(n))/4 and then show that only X(n) matters, which cancels all the other terms.

Usually what happens in the independence arguments is that if two events are independent then P(C and D) = P(C)P(D). In many of these proofs you end up with an expression like P(E)P(D)/[P(C)P(D)] = P(E)/P(C), where, say, P(E) = P(X1 = a, X0 = b) and P(C) = P(X0 = b).

In your case you are given that the transition probability depends only on X(n), so you can use this result directly; but in the future it's important to note that for many Markovian problems you will need to show independence as in the above scenario so that you get the right cancellation.
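(A sketch of how the transition matrix follows, assuming the binomial structure implied by the problem: given X(n) = i, each of the two machines runs independently the next day with probability p = (1 + i)/4, so X(n+1) ~ Binomial(2, p). The function name and print layout below are just for illustration.)

```python
from math import comb

def transition_matrix():
    """Transition matrix for X(n) on states {0, 1, 2}.

    From state i, each of the 2 machines runs independently with
    probability p = (1 + i) / 4, so the next state is Binomial(2, p).
    """
    P = []
    for i in range(3):
        p = (1 + i) / 4
        P.append([comb(2, j) * p**j * (1 - p)**(2 - j) for j in range(3)])
    return P

for row in transition_matrix():
    print([round(x, 4) for x in row])
# Expected rows: [9/16, 6/16, 1/16], [1/4, 1/2, 1/4], [1/16, 6/16, 9/16]
```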
 