HMM with continuous observation - PDFs to probabilities

SUMMARY

This discussion focuses on the transition from discrete-observation Hidden Markov Models (HMMs) to continuous-observation HMMs, specifically addressing the implications of using probability density functions (PDFs) instead of probabilities. The key issue raised is that when employing PDFs, the forward and backward coefficients in the HMM become products of PDFs, leading to higher-order density units, which complicates the interpretation of parameters. The Baum-Welch algorithm is mentioned as not being affected by these units, but the author expresses concern over the breakdown of parameter interpretation.

PREREQUISITES
  • Understanding of Hidden Markov Models (HMMs)
  • Familiarity with probability density functions (PDFs)
  • Knowledge of Gaussian distributions
  • Experience with the Baum-Welch algorithm for parameter estimation
NEXT STEPS
  • Study Rabiner's 1989 tutorial on Hidden Markov Models
  • Explore the implications of using Gaussian mixture models in HMMs
  • Research the mathematical foundations of probability densities and their applications in HMMs
  • Investigate alternative methods for parameter estimation in continuous-observation HMMs
USEFUL FOR

Data scientists, machine learning practitioners, and researchers working with Hidden Markov Models, particularly those focusing on continuous observation and parameter estimation techniques.

rynlee
So I am working with a Hidden Markov Model with continuous observation, and something has been bothering me that I am hoping someone might be able to address.

Going from a discrete-observation HMM to a continuous-observation HMM is actually quite straightforward (see, for example, Rabiner's 1989 tutorial on HMMs). You just replace the probability of observing symbol k in state i, b_i(k), with the PDF of a distribution (typically Gaussian), b_i(v_k), centered around some mean value.
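
For concreteness, a minimal sketch of that swap in Python (assuming numpy/scipy; B, mu, and sigma are just illustrative values, not taken from any particular model):

import numpy as np
from scipy.stats import norm

# Discrete case: b_i(k) is a probability from a row-stochastic emission matrix.
B = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.3, 0.6]])
def b_discrete(i, k):
    return B[i, k]                 # unitless probability

# Continuous case: b_i(O_t) is a Gaussian density around a per-state mean.
mu    = np.array([0.0, 5.0])       # illustrative per-state means
sigma = np.array([1.0, 2.0])       # illustrative per-state std devs
def b_continuous(i, o_t):
    return norm.pdf(o_t, loc=mu[i], scale=sigma[i])   # density, units 1/[O]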

Here's the thing: if you do that, then the forward and backward coefficients become products of PDFs instead of products of probabilities, which gives them increasingly higher-order density units. That is quite disconcerting. For example, if you use the Baum formalism to re-estimate the parameters defining the HMM, training against some data set, you end up setting the probability of observing each state in the first timestep, π_i, equal to a PDF (well, a product of two PDFs; again see Rabiner), which doesn't make any sense...
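
For reference, the re-estimate in question is Rabiner's π_i = γ_i(1). A sketch with made-up alpha/beta numbers, just to show where the densities sit (the values stand in for unscaled forward/backward passes):

import numpy as np

# Unscaled forward/backward values at t = 1 for two states; with Gaussian
# emissions, alpha_i(1) and beta_i(1) each carry powers of 1/[O].
alpha_1 = np.array([0.12, 0.03])   # illustrative
beta_1  = np.array([1e-4, 3e-4])   # illustrative

# Rabiner: pi_i = gamma_i(1) = alpha_i(1)*beta_i(1) / sum_j alpha_j(1)*beta_j(1)
# Each factor on its own is a density-valued quantity, even though the
# numerator and denominator carry the same units.
pi_new = alpha_1 * beta_1 / np.sum(alpha_1 * beta_1)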

Looking at several papers in the literature, however, this seems to be what people are doing; nobody comments on the shift from probabilities to probability densities. So I'm hoping someone can explain how this situation is OK, or what I am missing.

Thank you!
 
rynlee said:
... if you do that, then the forward and backward coefficients become products of PDFs, instead of products of probabilities, which gives them increasingly higher-order density units...
That doesn't quite sound right; can you give an example?
 
Sure. Suppose, for the sake of argument, your Gaussian mixture model has only one mixture component:

N(mu_i, sigma_i, O_t) = 1/(sqrt(2*pi)*sigma_i) * exp(-(O_t - mu_i)^2 / (2*sigma_i^2))

and
b_i(O_t) = N(mu_i, sigma_i, O_t)

for each state 1 <= i <= N.

Because of the leading term in the Gaussian, b_i(O_t) has units of 1/sigma (i.e., inverse observation units), as we would expect for a probability density.

So when you calculate alpha inductively,

alpha_j(t) = sum[alpha_i(t-1) * a_ij, i = 1:N] * b_j(O_t)

each successive alpha has units of (1/sigma)^t, as opposed to the unitless probability it would be in the discrete case.
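
In code, that induction looks like this (plain numpy/scipy, unscaled; A, pi, mu, sigma, and O are made-up values):

import numpy as np
from scipy.stats import norm

A  = np.array([[0.9, 0.1],         # illustrative transition matrix a_ij
               [0.2, 0.8]])
pi = np.array([0.5, 0.5])          # illustrative initial distribution
mu    = np.array([0.0, 5.0])       # illustrative per-state means
sigma = np.array([1.0, 2.0])       # illustrative per-state std devs
O = np.array([0.3, 4.7, 5.1])      # illustrative observation sequence

def b(o_t):
    return norm.pdf(o_t, loc=mu, scale=sigma)   # vector of b_i(O_t), units 1/[O]

alpha = pi * b(O[0])               # units (1/[O])^1
for t in range(1, len(O)):
    alpha = (alpha @ A) * b(O[t])  # picks up another factor of 1/[O] each step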

Now, the Baum-Welch algorithm shouldn't care whether alpha and beta carry units or not, since it only looks at relative quantities, but it remains highly disconcerting that the interpretation of the parameters breaks down.
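
One way to see why the relative quantities come out clean is the scaling trick from Rabiner's tutorial: dividing alpha by its sum at each step pushes all the (1/sigma)^t units into the scale factors c_t, leaving the scaled alphas unitless, and the gammas/xis Baum-Welch uses are ratios in which those factors cancel. A sketch, with the same made-up A, pi, mu, sigma, and O as above:

import numpy as np
from scipy.stats import norm

A  = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.5, 0.5])
mu, sigma = np.array([0.0, 5.0]), np.array([1.0, 2.0])
O = np.array([0.3, 4.7, 5.1])
def b(o_t):
    return norm.pdf(o_t, loc=mu, scale=sigma)   # vector of b_i(O_t)

alpha_hat = pi * b(O[0])
c = [alpha_hat.sum()]              # c_1 carries the 1/[O] units
alpha_hat = alpha_hat / c[0]       # scaled alpha is a unitless distribution
for t in range(1, len(O)):
    alpha_hat = (alpha_hat @ A) * b(O[t])
    c.append(alpha_hat.sum())      # each c_t absorbs one factor of 1/[O]
    alpha_hat = alpha_hat / c[-1]

# sum(log(c_t)) is a log *density* of the sequence, not a log probability;
# the gamma/xi ratios used in re-estimation are unit-free either way.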
 
