Hidden Markov Model Calculation

In summary, from the given data it can be determined that the probability of a transition from state A to state B is 0.7 and the probability of a transition from state B to state A is 0.2.
  • #1
betsyrocamora
Can someone help me with this question?

Let us define a hidden Markov model with two states A and B, such that

$$P(X_0=A)=0.6,\quad P(X_0=B)=0.4,\quad P(X_1=A\ |\ X_0=A)=0.3,\quad P(X_1=B\ |\ X_0=B)=0.8.$$

What is the value of $P(X_3=A)$?
 
  • #2
Hello betsyrocamora,

Welcome to MHB! (Wave)

Can you show what you have tried and where you are stuck so our helpers have a better idea how best to help you?
 
  • #3
betsyrocamora said:
Can someone help me with this question?

Let us define a hidden Markov model with two states A and B, such that

$$P(X_0=A)=0.6,\quad P(X_0=B)=0.4,\quad P(X_1=A\ |\ X_0=A)=0.3,\quad P(X_1=B\ |\ X_0=B)=0.8.$$

What is the value of $P(X_3=A)$?

Welcome to MHB, betsyrocamora! :)

This looks like a question that is intended to teach what a hidden Markov model actually is.
Do your notes perhaps contain a worked example?
Or perhaps an example for a Markov chain?
 
  • #4
I like Serena said:
Welcome to MHB, betsyrocamora! :)

This looks like a question that is intended to teach what a hidden Markov model actually is.
Do your notes perhaps contain a worked example?
Or perhaps an example for a Markov chain?

No, I know what a hidden Markov model is, but with this one I am a little lost. I have tried letting $j$ denote state A and using $p^{(3)}_{ij} := P(X_3=j\ |\ X_0=i)$, which satisfies the Chapman–Kolmogorov equations, and after that I think the law of total probability comes in, but I get lost there, haha.
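For readers who want to follow this route, a minimal sketch of how the Chapman–Kolmogorov equations and the law of total probability fit together, using the $p^{(n)}_{ij} := P(X_n=j\ |\ X_0=i)$ notation from the post above:

$$P(X_3=A) = \sum_{i\in\{A,B\}} P(X_0=i)\,p^{(3)}_{iA}, \qquad p^{(3)}_{iA} = \sum_{k\in\{A,B\}} p^{(2)}_{ik}\,p^{(1)}_{kA}.$$

This is the same computation as the matrix form given in the next post.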
 
  • #5
betsyrocamora said:
No, I know what a hidden Markov model is, but with this one I am a little lost. I have tried letting $j$ denote state A and using $p^{(3)}_{ij} := P(X_3=j\ |\ X_0=i)$, which satisfies the Chapman–Kolmogorov equations, and after that I think the law of total probability comes in, but I get lost there, haha.

It seems to me you're making it unnecessarily complex.

From your given data we can deduce that $P(X_1=B\ |\ X_0=A)=0.7$ and $P(X_1=A\ |\ X_0=B)=0.2$.
You can write these numbers in a matrix $M$.
Then, assuming the transition probabilities do not depend on $n$ (a time-homogeneous chain), we get:
$$\begin{bmatrix} P(X_n=A) \\ P(X_n=B) \end{bmatrix} = M^n \begin{bmatrix} P(X_0=A) \\ P(X_0=B) \end{bmatrix}$$
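To make this concrete, here is a minimal Python sketch (using numpy) of that matrix calculation; the state order in the vectors is (A, B), and the only inputs are the numbers given in the thread:

```python
import numpy as np

# Column-stochastic transition matrix M for state order (A, B):
# column j gives the distribution of X_{n+1} given X_n = j.
M = np.array([[0.3, 0.2],   # P(X1=A | X0=A) = 0.3, P(X1=A | X0=B) = 0.2
              [0.7, 0.8]])  # P(X1=B | X0=A) = 0.7, P(X1=B | X0=B) = 0.8

v0 = np.array([0.6, 0.4])   # [P(X0=A), P(X0=B)]

# v3 = M^3 v0, applying the formula above with n = 3.
v3 = np.linalg.matrix_power(M, 3) @ v0

print("P(X3=A) =", v3[0])   # approximately 0.2226
print("P(X3=B) =", v3[1])   # approximately 0.7774
```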
 

1. What is a Hidden Markov Model (HMM)?

A Hidden Markov Model is a statistical model used to predict the probability of a sequence of unobservable or hidden states based on a sequence of observable outcomes. It is used in a variety of fields, including speech recognition, pattern recognition, and bioinformatics.

2. How does a Hidden Markov Model work?

A Hidden Markov Model works by using a set of observed data to estimate the probability of a sequence of hidden states. This is done through a process called the forward-backward algorithm, which calculates the probability of each hidden state at each time step from the state transition probabilities, the emission probabilities, and the observed data.
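As an illustration of the forward pass mentioned above, here is a minimal Python sketch of the forward recursion. The transition numbers reuse the chain from the thread; the emission probabilities and observation sequence are invented purely for illustration:

```python
import numpy as np

def forward(pi, A, B, obs):
    """Forward pass: probability of an observation sequence under an HMM.

    pi  : initial state distribution, shape (S,)
    A   : transition matrix, A[i, j] = P(next state j | current state i)
    B   : emission matrix,   B[i, k] = P(observing symbol k | state i)
    obs : list of observation indices
    """
    alpha = pi * B[:, obs[0]]           # alpha_0(i) = pi_i * P(obs_0 | state i)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # alpha_t(j) = sum_i alpha_{t-1}(i) A[i, j] * P(obs_t | j)
    return alpha.sum()                  # P(obs_0, ..., obs_T)

# Toy example: two hidden states (A, B) with the thread's transition
# probabilities in row-stochastic form and made-up emission probabilities.
pi = np.array([0.6, 0.4])
A  = np.array([[0.3, 0.7],
               [0.2, 0.8]])
B  = np.array([[0.9, 0.1],
               [0.2, 0.8]])
print(forward(pi, A, B, [0, 1, 1]))  # probability of observing symbols 0, 1, 1
```

The backward pass runs an analogous recursion in reverse; combining the two gives the per-time-step state probabilities mentioned above.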

3. What is the difference between a Hidden Markov Model and a regular Markov Model?

The main difference between a Hidden Markov Model and a regular Markov Model is that in a Hidden Markov Model, the states are not directly observable. Instead, they are inferred from the observed data. This makes HMMs more flexible and useful for modeling complex processes.

4. What are some common applications of Hidden Markov Models?

Hidden Markov Models have a wide range of applications, including speech recognition, handwriting recognition, protein structure prediction, and financial market analysis. They are also commonly used in natural language processing tasks such as part-of-speech tagging and language generation.

5. What are the limitations of Hidden Markov Models?

While Hidden Markov Models are useful in many scenarios, they do have some limitations. They assume that the underlying process is stationary and that the observations are conditionally independent given the hidden states. They also require a large amount of training data and can struggle with long sequences of data. Additionally, HMMs are limited in their ability to capture complex relationships between hidden states and observations.
