HMM (Hidden Markov Model) Evaluation

  • Context: Graduate
  • Thread starter: jiapei100
  • Tags: Matrix
SUMMARY

The discussion focuses on Hidden Markov Model (HMM) evaluation methods, specifically the forward algorithm, the backward algorithm, and the forward-backward algorithm. The user, JIA, implemented an HMM with 2 hidden states and 3 observation symbols, reporting a tiny discrepancy between the probabilities computed by the forward algorithm (0.0090887999999999993) and the backward algorithm (0.0090888000000000010). This difference is attributed to floating-point precision in the computation. The user seeks clarification on whether this is a common occurrence or indicative of a coding error.

PREREQUISITES
  • Understanding of Hidden Markov Models (HMM)
  • Familiarity with forward and backward algorithms
  • Knowledge of probability theory and state transition matrices
  • Experience with numerical precision in computational algorithms
NEXT STEPS
  • Research "Hidden Markov Model forward algorithm implementation"
  • Study "Hidden Markov Model backward algorithm mechanics"
  • Explore "Numerical precision issues in computational algorithms"
  • Examine "Forward-backward algorithm for HMM evaluation"
USEFUL FOR

Data scientists, machine learning practitioners, and statisticians interested in HMM evaluation techniques and numerical precision in algorithm implementations.

jiapei100
Hi, all:

I have a question about HMM evaluation:

There are 3 methods to carry out HMM evaluation.
1) forward algorithm
2) backward algorithm
3) forward-backward algorithm

Sometimes the forward algorithm and the backward algorithm do not give exactly the same result.
Can anybody (mathematician) help to explain this clearly?

I designed my data as: 2 hidden states, 3 observation symbols, and a sequence of length 4.

1) initial state probabilities of states 1 and 2: 0.6 and 0.4, respectively

2) transition probabilities:
from state 1 to state 1: 0.7
from state 1 to state 2: 0.3
from state 2 to state 1: 0.4
from state 2 to state 2: 0.6

3) observation (emission) probabilities:
from state 1 to observation 1: 0.1
from state 1 to observation 2: 0.4
from state 1 to observation 3: 0.5
from state 2 to observation 1: 0.6
from state 2 to observation 2: 0.3
from state 2 to observation 3: 0.1

4) the observation sequence is known as: 0 -> 1 -> 2 -> 0,
that is,
observation 1, then observation 2, then observation 3, then observation 1
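For reference, the standard alpha/beta recursions for exactly this model can be sketched in plain Python. This is a reconstruction from the parameters above, not the poster's code; the function names `forward` and `backward` are mine:

```python
# Forward and backward evaluation of P(O | lambda) for the model in the post:
# 2 hidden states, 3 observation symbols, observation sequence 0,1,2,0.

pi = [0.6, 0.4]                      # initial state probabilities
A  = [[0.7, 0.3],                    # A[i][j] = P(next state j | state i)
      [0.4, 0.6]]
B  = [[0.1, 0.4, 0.5],               # B[i][k] = P(observation k | state i)
      [0.6, 0.3, 0.1]]
obs = [0, 1, 2, 0]                   # the observation sequence
N = 2                                # number of hidden states

def forward(obs):
    # alpha[i] = P(o_1..o_t, q_t = i); initialize with t = 1, then recurse.
    alpha = [pi[i] * B[i][obs[0]] for i in range(N)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(N)) * B[j][o]
                 for j in range(N)]
    return sum(alpha)

def backward(obs):
    # beta[i] = P(o_{t+1}..o_T | q_t = i); initialize with beta_T = 1.
    beta = [1.0] * N
    for o in reversed(obs[1:]):
        beta = [sum(A[i][j] * B[j][o] * beta[j] for j in range(N))
                for i in range(N)]
    # Terminate with the initial distribution and the first observation.
    return sum(pi[i] * B[i][obs[0]] * beta[i] for i in range(N))

print(forward(obs))   # ~0.0090888
print(backward(obs))  # ~0.0090888, possibly differing in the last bits
```

Both recursions evaluate the same quantity, P(O | lambda), but they sum and multiply the same terms in a different order, which is why the printed values can disagree in the final digits.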

According to my implementation, the forward algorithm gives the probability 0.0090887999999999993,
while the backward algorithm gives the probability 0.0090888000000000010.

I'm wondering: is this a precision problem in the computation,
or is there some other bug hidden in my code?
(Sorry that I can't share my code at the moment;
I'm guessing Julius has its own HMM that could compute the above simple example.)

The difference between the two probabilities from my HMM looks like a precision issue,
but I'm just not certain about this.
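For what it's worth, a difference that small is exactly what floating-point rounding produces: IEEE-754 addition is not associative, so summing the same terms in a different order (which is what the forward and backward recursions do) can change the last bits of the result. A minimal illustration:

```python
# IEEE-754 double addition is not associative: the same three terms
# summed in different orders give results that differ in the last bits.
left  = (0.1 + 0.2) + 0.3
right = 0.1 + (0.2 + 0.3)
print(left == right)        # False
print(abs(left - right))    # on the order of 1e-16
```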

Can anybody give a hand to confirm this?

Cheers
JIA
 