Hi,

I am trying to teach myself Hidden Markov Models. I am using the text

"www.cs.sjsu.edu/~stamp/RUA/HMM.pdf" [Broken] as material. The introduction with

the example was reasonable, but now I am having trouble understanding

some of the derivations.

I can follow the math and use the formulas to get results, but

I also want to understand the meaning behind them.

One question that arises with hidden Markov models is how to determine

the likelihood of the observed sequence O given the model,

using the notation below:

b = observation probabilities ( [itex]b_{x_t}(O_t)[/itex] = probability of observing [itex]O_t[/itex] while in state [itex]x_t[/itex] )

a = state transition probabilities ( [itex]a_{x_t,x_{t+1}}[/itex] = probability of moving from state [itex]x_t[/itex] to state [itex]x_{t+1}[/itex] )

[itex]\pi[/itex] = initial state distribution

O = observation sequence

X = state sequence

[itex]\lambda[/itex] = the hidden Markov model

( Section 4, Page 6 in the Text )

[itex]P(O | X, \lambda) = b_{x_0}(O_0) \, b_{x_1}(O_1) \cdots b_{x_{T-1}}(O_{T-1}) [/itex]

[itex]P(X|\lambda) = \pi_{x_0} a_{x_0,x_1} a_{x_1,x_2} \cdots a_{x_{T-2},x_{T-1}}[/itex]

[itex]P(O, X|\lambda) = \frac{P(O \cap X \cap \lambda)}{P( \lambda)}[/itex]

[itex]P(O | X, \lambda) *P(X|\lambda) = \frac{P(O \cap X \cap \lambda)}{P(X \cap \lambda)} \frac{P(X \cap \lambda)}{P(\lambda)}=\frac{P(O \cap X \cap \lambda)}{P(\lambda)}[/itex]

[itex]P(O, X|\lambda) = P(O | X, \lambda) * P(X|\lambda) [/itex]
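To make the product formula concrete, here is a small Python sketch that computes [itex]P(O, X|\lambda)[/itex] for one specific state sequence by multiplying [itex]\pi[/itex], the a's, and the b's. The 2-state, 2-symbol parameters are invented purely for illustration:

```python
# Hypothetical 2-state, 2-symbol HMM (parameters invented for illustration)
pi = [0.6, 0.4]                    # initial state distribution
a = [[0.7, 0.3],                   # a[i][j] = P(next state j | current state i)
     [0.4, 0.6]]
b = [[0.9, 0.1],                   # b[j][k] = P(observing symbol k | state j)
     [0.2, 0.8]]

def joint_prob(O, X):
    """P(O, X | lambda) = pi_{x0} b_{x0}(O_0) * prod_t a_{x_{t-1},x_t} b_{x_t}(O_t)."""
    p = pi[X[0]] * b[X[0]][O[0]]
    for t in range(1, len(O)):
        p *= a[X[t-1]][X[t]] * b[X[t]][O[t]]
    return p

print(joint_prob(O=[0, 1, 0], X=[0, 0, 1]))
```

This is the probability that the model follows exactly the path X *and* emits exactly the observations O along that path.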

[itex]P(O | \lambda) = \sum\limits_{X} P(O,X|\lambda)[/itex]

[itex]P(O | \lambda) = \sum\limits_{X} P(O | X, \lambda) * P(X|\lambda)[/itex]
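The sum can be brute-forced on a toy model: enumerate every possible state sequence X, compute [itex]P(O|X,\lambda) \, P(X|\lambda)[/itex] for each, and add them up. As a sanity check, summing the resulting likelihoods over every possible observation sequence of the same length should give 1, since the model must generate *some* observations via *some* path. A Python sketch with made-up parameters:

```python
from itertools import product

# Hypothetical 2-state, 2-symbol HMM (parameters invented for illustration)
pi = [0.6, 0.4]
a = [[0.7, 0.3], [0.4, 0.6]]
b = [[0.9, 0.1], [0.2, 0.8]]

def likelihood(O):
    """P(O | lambda) = sum over all state sequences X of P(O, X | lambda)."""
    total = 0.0
    for X in product(range(2), repeat=len(O)):   # all 2^T state sequences
        p = pi[X[0]] * b[X[0]][O[0]]
        for t in range(1, len(O)):
            p *= a[X[t-1]][X[t]] * b[X[t]][O[t]]
        total += p
    return total

# Likelihoods of all possible length-3 observation sequences sum to 1.
print(sum(likelihood(O) for O in product(range(2), repeat=3)))
```

This exhaustive sum is exponential in T, which is why the text later introduces the forward algorithm; but it computes the same quantity as the formula above.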

My question is: why do I get the likelihood of the observed sequence

by summing over all possible state sequences? Can someone please

explain it differently?

**Physics Forums | Science Articles, Homework Help, Discussion**


# Hidden Markov Models - Likelihood of
