Abstract
Continuous-density observation hidden Markov models (CD-HMMs) have been shown to outperform their discrete counterparts. However, because the observation distribution is usually represented by a mixture of multivariate normal densities, training a CD-HMM can be prohibitively slow. This paper presents a new approach to speeding up the convergence of CD-HMM training using a stochastic, incremental variant of the EM algorithm. The algorithm randomly selects a subset of the training data, updates the model by maximum a posteriori estimation, and iterates until convergence. Experimental results show that this approach converges nearly an order of magnitude faster than the standard batch training algorithm. In addition, incremental learning of the model parameters improved recognition performance compared with the batch version.
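The loop the abstract describes — draw a random subset, run an E-step on it, fold the resulting sufficient statistics into running totals, and re-estimate — can be sketched for a plain Gaussian mixture (the CD-HMM observation model without the Markov chain). This is an illustrative assumption, not the paper's exact procedure: it uses a fixed stepwise interpolation constant and a maximum-likelihood M-step in place of the MAP update, and the function name and parameters are hypothetical.

```python
import math
import random

def incremental_em_gmm(data, k=2, batch_size=20, n_iters=200, step=0.1, seed=0):
    """Incremental-EM sketch for a 1-D Gaussian mixture (illustrative only).

    Each iteration runs an E-step on a random mini-batch, blends the
    batch sufficient statistics into running totals with weight `step`,
    and re-estimates the mixture parameters from those totals.
    """
    rng = random.Random(seed)
    # Spread the initial means across the data quantiles (deterministic init).
    srt = sorted(data)
    means = [srt[(2 * j + 1) * len(data) // (2 * k)] for j in range(k)]
    variances = [1.0] * k
    weights = [1.0 / k] * k
    # Running sufficient statistics: responsibility mass, 1st and 2nd moments.
    s0 = [1.0 / k] * k
    s1 = [m / k for m in means]
    s2 = [(m * m + 1.0) / k for m in means]
    for _ in range(n_iters):
        batch = rng.sample(data, batch_size)
        # E-step on the random subset: per-point posterior responsibilities.
        b0, b1, b2 = [0.0] * k, [0.0] * k, [0.0] * k
        for x in batch:
            lik = [weights[j] / math.sqrt(2 * math.pi * variances[j])
                   * math.exp(-(x - means[j]) ** 2 / (2 * variances[j]))
                   for j in range(k)]
            z = sum(lik) or 1e-300
            for j in range(k):
                r = lik[j] / z
                b0[j] += r / batch_size
                b1[j] += r * x / batch_size
                b2[j] += r * x * x / batch_size
        # Stepwise interpolation of the running statistics.
        for j in range(k):
            s0[j] = (1 - step) * s0[j] + step * b0[j]
            s1[j] = (1 - step) * s1[j] + step * b1[j]
            s2[j] = (1 - step) * s2[j] + step * b2[j]
        # M-step from the interpolated statistics (ML here; the paper uses MAP).
        total = sum(s0)
        for j in range(k):
            weights[j] = s0[j] / total
            means[j] = s1[j] / s0[j]
            variances[j] = max(s2[j] / s0[j] - means[j] ** 2, 1e-3)
    return weights, means, variances
```

On well-separated data the mini-batch updates recover the component means long before a full-batch EM pass over the same amount of data would, which is the effect the abstract quantifies as roughly an order-of-magnitude speed-up.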
Original language | English |
---|---|
Pages (from-to) | 457-460 |
Number of pages | 4 |
Journal | Proceedings - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing |
Volume | 1 |
Publication status | Published - 1995 |
Externally published | Yes |
Event | 1995 20th International Conference on Acoustics, Speech, and Signal Processing, Part 1 (of 5), Detroit, MI, USA, 9–12 May 1995 |
ASJC Scopus subject areas
- Software
- Signal Processing
- Electrical and Electronic Engineering