MHB Hidden Markov Model Calculation

betsyrocamora
Can someone help me with this question?

Let us define a hidden Markov model with two states A and B, such that
$$P(X_0=A)=0.6,\quad P(X_0=B)=0.4,\quad P(X_1=A\ |\ X_0=A)=0.3,\quad P(X_1=B\ |\ X_0=B)=0.8.$$
What is the value of $P(X_3=A)$?
 
Hello betsyrocamora,

Welcome to MHB! (Wave)

Can you show what you have tried and where you are stuck so our helpers have a better idea how best to help you?
 
betsyrocamora said:
Can someone help me with this question?

Let us define a hidden Markov model with two states A and B, such that
$$P(X_0=A)=0.6,\quad P(X_0=B)=0.4,\quad P(X_1=A\ |\ X_0=A)=0.3,\quad P(X_1=B\ |\ X_0=B)=0.8.$$
What is the value of $P(X_3=A)$?

Welcome to MHB, betsyrocamora! :)

This looks like a question intended to teach what a hidden Markov model actually is.
Do your notes perhaps contain a worked example?
Or perhaps an example for a Markov chain?
 
I like Serena said:
Welcome to MHB, betsyrocamora! :)

This looks like a question intended to teach what a hidden Markov model actually is.
Do your notes perhaps contain a worked example?
Or perhaps an example for a Markov chain?

No, I know what a hidden Markov model is, but with this one I am a little lost. I tried letting $j$ denote state A and using $p^{(3)}_{ij} := P(X_3=j\ |\ X_0=i)$, which satisfies the Chapman-Kolmogorov equations; after that I think I need the law of total probability, but I got lost there, haha.
 
betsyrocamora said:
No, I know what a hidden Markov model is, but with this one I am a little lost. I tried letting $j$ denote state A and using $p^{(3)}_{ij} := P(X_3=j\ |\ X_0=i)$, which satisfies the Chapman-Kolmogorov equations; after that I think I need the law of total probability, but I got lost there, haha.

It seems to me you're making it unnecessarily complex.

From your given data we can deduce that $P(X_1=B\ |\ X_0=A)=0.7$ and $P(X_1=A\ |\ X_0=B)=0.2$, since the transition probabilities out of each state must sum to 1.
You can collect these numbers in a transition matrix
$$M = \begin{bmatrix} 0.3 & 0.2 \\ 0.7 & 0.8 \end{bmatrix},$$
where column $j$ holds the probabilities of moving to each state from state $j$.
Then, assuming the chain is time-homogeneous (the transition probabilities do not depend on $n$), we get:
$$\begin{bmatrix} P(X_n=A) \\ P(X_n=B) \end{bmatrix} = M^n \begin{bmatrix} P(X_0=A) \\ P(X_0=B) \end{bmatrix}$$
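If you want to sanity-check your hand computation, the recursion $v_{n+1} = M v_n$ is easy to iterate numerically. Here is a minimal Python sketch (the variable names are my own, and the matrix entries are the ones deduced above):

```python
# Transition matrix M, state order (A, B):
# column j holds P(X_{n+1} = i | X_n = j).
M = [[0.3, 0.2],   # P(next=A | prev=A), P(next=A | prev=B)
     [0.7, 0.8]]   # P(next=B | prev=A), P(next=B | prev=B)

# Initial distribution: P(X0=A), P(X0=B).
v = [0.6, 0.4]

# Apply M three times to obtain the distribution of X3.
for _ in range(3):
    v = [M[0][0] * v[0] + M[0][1] * v[1],
         M[1][0] * v[0] + M[1][1] * v[1]]

print(v[0])  # P(X3 = A), approximately 0.2226
```

Each pass through the loop is one application of the law of total probability over the previous state, which is exactly the Chapman-Kolmogorov step mentioned earlier in the thread.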
 