Exploiting prosody hierarchy and dynamic features for pitch modeling and generation in HMM-based speech synthesis

Chi-Chun Hsia, Chung-Hsien Wu, Jung-Yun Wu

Research output: Contribution to journal › Article › peer-review

30 Citations (Scopus)

Abstract

This paper proposes a method for modeling and generating pitch in hidden Markov model (HMM)-based Mandarin speech synthesis by exploiting the prosody hierarchy and dynamic pitch features. The prosodic structure of a sentence is represented by a prosody hierarchy, which is constructed from prosodic breaks predicted by a supervised classification and regression tree (S-CART). The S-CART is trained by maximizing the proportional reduction of entropy so as to minimize prosodic-break prediction errors. The pitch contour of a speech sentence is estimated using the STRAIGHT algorithm and, based on the predicted prosodic structure, decomposed into prosodic (static) features at the prosodic word, syllable, and frame layers. Dynamic features at each layer are estimated to preserve the temporal correlation between adjacent units. A hierarchical prosody model for pitch contour generation is then constructed using an unsupervised CART (U-CART), with the minimum description length (MDL) criterion adopted in U-CART training. Objective and subjective evaluations with statistical hypothesis testing were conducted, and the results were compared with the corresponding results for HMM-based pitch modeling. The comparison confirms the improved performance of the proposed method.
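
As a rough illustration of the dynamic pitch features mentioned in the abstract, the sketch below computes regression-window deltas of a frame-level log-F0 contour and stacks them with the static values into per-frame observation vectors, as is common practice in HMM-based pitch modeling. The window length, the edge-replication boundary handling, and all names are assumptions for illustration only; the abstract does not specify the exact configuration used in the paper.

```python
import numpy as np

def delta_features(f0, window=1):
    """Regression-window delta of a frame-level (log-)F0 contour.

    Minimal sketch: the window length and the edge replication applied at
    the contour boundaries are assumptions, not details from the paper.
    """
    taus = np.arange(-window, window + 1)          # e.g. [-1, 0, 1]
    denom = float(np.sum(taus ** 2))               # normalizer of the regression window
    padded = np.pad(f0, window, mode="edge")       # replicate first/last frame at the edges
    delta = np.zeros(len(f0))
    for t in range(len(f0)):
        frame = padded[t:t + 2 * window + 1]       # static values f0[t-window .. t+window]
        delta[t] = float(np.dot(taus, frame)) / denom
    return delta

# Example: stack static log-F0 with its delta and delta-delta features.
log_f0 = np.log(np.array([120.0, 125.0, 131.0, 128.0, 122.0]))  # toy voiced contour (Hz)
d1 = delta_features(log_f0)
d2 = delta_features(d1)
obs = np.stack([log_f0, d1, d2], axis=1)           # one (static, delta, delta-delta) row per frame
```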

Original language: English
Article number: 5443736
Pages (from-to): 1994-2003
Number of pages: 10
Journal: IEEE Transactions on Audio, Speech, and Language Processing
Volume: 18
Issue number: 8
DOIs
Publication status: Published - 2010

All Science Journal Classification (ASJC) codes

  • Acoustics and Ultrasonics
  • Electrical and Electronic Engineering
