On-line estimation of hidden Markov model parameters

Jun Mizuno, Tatsuya Watanabe, Kazuya Ueki, Kazuyuki Amano, Eiji Takimoto, Akira Maruoka

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

12 Citations (Scopus)

Abstract

In modeling signals such as speech with the Hidden Markov Model (HMM), it is often necessary to adapt not only to the inherent nonstationarity of the signal but also to changes of the sources (speakers) that produce it. The well-known Baum-Welch algorithm adjusts an HMM so as to optimize the fit between the model and the observed signal. In this paper we develop an algorithm, which we call the on-line Baum-Welch algorithm, by incorporating a learning rate into the off-line Baum-Welch algorithm. The algorithm proceeds in a series of trials. In each trial it produces an HMM M_t, then receives a symbol sequence w_t and incurs the loss -ln Pr(w_t | M_t), the negative log-likelihood of M_t evaluated at w_t. The performance of the algorithm is measured by its additional total loss, called the regret, over the total loss of a standard algorithm that serves as the criterion for measuring relative loss; we take the off-line Baum-Welch algorithm as this standard. As a baseline on-line method for comparison, we also consider the Gradient Descent algorithm. Our experiments show that the on-line Baum-Welch algorithm performs well compared to the Gradient Descent algorithm. We carry out experiments not only on artificial data but also on reasonably realistic data obtained by transforming acoustic waveforms into symbol sequences through vector quantization. The results show that the on-line Baum-Welch algorithm adapts to changes of speakers very well.
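
The on-line protocol described above can be made concrete with a small sketch. The Python below is illustrative only and is not the paper's algorithm: it implements the trial loop and the per-trial loss -ln Pr(w_t | M_t) for a discrete HMM, and the update rule shown (interpolating the current parameters with a single Baum-Welch re-estimate using a learning rate eta) is merely one assumption about how a learning rate might be incorporated. The names run_online, baum_welch_step and eta are hypothetical. The regret would be obtained by subtracting the total loss of the off-line Baum-Welch comparator from the total loss returned here.

import numpy as np

def forward_loglik(pi, A, B, w):
    # ln Pr(w | M) for an HMM M = (pi, A, B) via the scaled forward algorithm.
    alpha = pi * B[:, w[0]]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in w[1:]:
        alpha = (alpha @ A) * B[:, o]
        loglik += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return loglik

def baum_welch_step(pi, A, B, w):
    # One Baum-Welch (EM) re-estimation on a single sequence w (length >= 2).
    T, N = len(w), len(pi)
    alpha = np.zeros((T, N)); c = np.zeros(T)
    alpha[0] = pi * B[:, w[0]]; c[0] = alpha[0].sum(); alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, w[t]]
        c[t] = alpha[t].sum(); alpha[t] /= c[t]
    beta = np.zeros((T, N)); beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, w[t + 1]] * beta[t + 1])) / c[t + 1]
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    xi = np.zeros((N, N))
    for t in range(T - 1):
        x = alpha[t][:, None] * A * (B[:, w[t + 1]] * beta[t + 1])[None, :]
        xi += x / x.sum()
    new_pi = gamma[0]
    new_A = xi / xi.sum(axis=1, keepdims=True)
    new_B = np.zeros_like(B)
    for t in range(T):
        new_B[:, w[t]] += gamma[t]
    new_B /= new_B.sum(axis=1, keepdims=True)
    return new_pi, new_A, new_B

def run_online(sequences, n_states, n_symbols, eta=0.3, seed=0):
    # Hypothetical on-line update: interpolate toward one Baum-Welch re-estimate
    # with learning rate eta (an assumption, not the update analysed in the paper).
    rng = np.random.default_rng(seed)
    pi = rng.dirichlet(np.ones(n_states))
    A = rng.dirichlet(np.ones(n_states), size=n_states)
    B = rng.dirichlet(np.ones(n_symbols), size=n_states)
    total_loss = 0.0
    for w in sequences:
        total_loss += -forward_loglik(pi, A, B, w)   # loss -ln Pr(w_t | M_t)
        pi_new, A_new, B_new = baum_welch_step(pi, A, B, w)
        pi = (1 - eta) * pi + eta * pi_new
        A = (1 - eta) * A + eta * A_new
        B = (1 - eta) * B + eta * B_new
    return total_loss  # compare with the off-line Baum-Welch total loss to get the regret

# Toy usage with random symbol sequences (not the acoustic data used in the paper):
rng = np.random.default_rng(1)
seqs = [rng.integers(0, 4, size=30) for _ in range(50)]
print(run_online(seqs, n_states=3, n_symbols=4))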

Original language: English
Title of host publication: Discovery Science - 3rd International Conference, DS 2000, Proceedings
Editors: Setsuo Arikawa, Shinichi Morishita
Publisher: Springer Verlag
Pages: 155-169
Number of pages: 15
ISBN (Print): 9783540413523
DOIs
Publication status: Published - 2000
Event: 3rd International Conference on Discovery Science, DS 2000 - Kyoto, Japan
Duration: 2000 Dec 4 - 2000 Dec 6

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 1967
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 3rd International Conference on Discovery Science, DS 2000
Country/Territory: Japan
City: Kyoto
Period: 2000-12-04 - 2000-12-06

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • General Computer Science
