
Why Hidden Markov Models (HMMs) are like vintage cars

What's this blog post about?

The article weighs the benefits and drawbacks of using Hidden Markov Models (HMMs) in speech recognition, comparing them to vintage cars. Despite their proven effectiveness, HMMs are often overlooked in favor of newer, more complex models such as RNN-transducers. The author argues that hybrid machine learning architectures, including HMM-based systems, deserve more attention for their efficiency and practicality. One reason for the lack of enthusiasm is the clunky pipeline associated with HMMs, which can discourage developers; another is the absence of a unified Python library for HMMs comparable to modern frameworks like PyTorch or TensorFlow. The author calls for an imperative, unified codebase to make HMM-based approaches more accessible and attractive to developers. The article concludes by encouraging developers to explore older models, identify gaps, and upgrade them where necessary, stressing that HMMs should not be dismissed for their age or simplicity, since they remain effective in many applications.
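To make the discussion concrete, here is a minimal sketch of the forward algorithm, the core inference routine in HMM-based recognizers. This is an illustrative toy, not code from the article or from Deepgram: the two-state model, its probabilities, and the observation symbols are all made-up values chosen only to show the recursion.

```python
def forward(obs, states, start_p, trans_p, emit_p):
    """Compute the likelihood P(obs) under an HMM via the forward algorithm."""
    # alpha[s] = probability of the observation prefix seen so far,
    # ending in hidden state s
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        # Sum over all predecessor states, then emit the next observation
        alpha = {
            s: sum(alpha[prev] * trans_p[prev][s] for prev in states) * emit_p[s][o]
            for s in states
        }
    return sum(alpha.values())

# Toy two-state model (hypothetical values for illustration only)
states = ("s1", "s2")
start_p = {"s1": 0.6, "s2": 0.4}
trans_p = {"s1": {"s1": 0.7, "s2": 0.3}, "s2": {"s1": 0.4, "s2": 0.6}}
emit_p = {"s1": {"a": 0.5, "b": 0.5}, "s2": {"a": 0.1, "b": 0.9}}

likelihood = forward(("a", "b", "a"), states, start_p, trans_p, emit_p)
```

In a real speech system this recursion runs over acoustic feature frames rather than symbols, and is combined with training (Baum-Welch) and decoding (Viterbi) steps, but the same few lines capture the model's core simplicity that the author highlights.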

Company
Deepgram

Date published
Nov. 3, 2023

Author(s)
Ben Luks

Word count
1296

Hacker News points
None found.

Language
English


By Matt Makai. 2021-2024.