Incremental MAP estimation of HMMs for efficient training and improved performance

Yoshihiko Gotoh, Michael M. Hochberg, Daniel J. Mashao, Harvey F. Silverman

Research output: Contribution to journal › Conference article › peer-review

8 Citations (Scopus)

Abstract

Continuous density observation hidden Markov models (CD-HMMs) have been shown to perform better than their discrete counterparts. However, because the observation distribution is usually represented with a mixture of multivariate normal densities, the training time for a CD-HMM can be prohibitively long. This paper presents a new approach to speeding up the convergence of CD-HMM training using a stochastic, incremental variant of the EM algorithm. The algorithm randomly selects a subset of data from the training set, updates the model using maximum a posteriori (MAP) estimation, and then iterates until convergence. Experimental results show that the convergence of this approach is nearly an order of magnitude faster than that of the standard batch training algorithm. In addition, incremental learning of the model parameters improved recognition performance compared with the batch version.
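The loop described in the abstract (draw a random subset, re-estimate with a MAP correction, repeat) can be illustrated with a minimal sketch. The code below is not the authors' implementation; it is a simplified, hypothetical example of incremental EM with MAP-style mean updates for a single Gaussian mixture observation density, using diagonal covariances and a fixed prior weight `tau`, rather than the full CD-HMM training procedure of the paper.

```python
import numpy as np

def incremental_map_em(data, n_components=4, batch_size=256,
                       n_iters=50, tau=10.0, seed=0):
    """Sketch of incremental (mini-batch) EM with MAP shrinkage on the means."""
    rng = np.random.default_rng(seed)

    # Initialise mixture parameters from random samples of the data.
    means = data[rng.choice(len(data), n_components, replace=False)].copy()
    variances = np.tile(data.var(axis=0), (n_components, 1)) + 1e-6
    weights = np.full(n_components, 1.0 / n_components)

    # Prior (MAP) centres: here simply the initial means, weighted by tau.
    prior_means = means.copy()

    for _ in range(n_iters):
        # 1. Randomly select a subset of the training data.
        batch = data[rng.choice(len(data), batch_size, replace=False)]

        # 2. E-step on the subset: posterior responsibilities per component.
        log_resp = np.zeros((len(batch), n_components))
        for k in range(n_components):
            diff = batch - means[k]
            log_resp[:, k] = (np.log(weights[k])
                              - 0.5 * np.sum(np.log(2 * np.pi * variances[k]))
                              - 0.5 * np.sum(diff ** 2 / variances[k], axis=1))
        log_resp -= log_resp.max(axis=1, keepdims=True)
        resp = np.exp(log_resp)
        resp /= resp.sum(axis=1, keepdims=True)

        # 3. M-step with a MAP correction: shrink each mean toward its
        #    prior centre, with strength controlled by tau.
        nk = resp.sum(axis=0) + 1e-10
        for k in range(n_components):
            sample_mean = resp[:, k] @ batch / nk[k]
            means[k] = (nk[k] * sample_mean + tau * prior_means[k]) / (nk[k] + tau)
            diff = batch - means[k]
            variances[k] = (resp[:, k] @ (diff ** 2)) / nk[k] + 1e-6
        weights = nk / nk.sum()

    return weights, means, variances
```

Because each iteration touches only a small batch rather than the full training set, parameter estimates start moving after far less computation, which is the source of the speed-up the abstract reports.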

Original language: English
Pages (from-to): 457-460
Number of pages: 4
Journal: Proceedings - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing
Volume: 1
Publication status: Published - 1995
Externally published: Yes
Event: Proceedings of the 1995 20th International Conference on Acoustics, Speech, and Signal Processing. Part 1 (of 5) - Detroit, MI, USA
Duration: 9 May 1995 - 12 May 1995

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering
