Automatic violin synthesis using expressive musical term features

Chih-Hong Yang, Pei-Ching Li, Alvin W. Y. Su, Li Su, Yi-Hsuan Yang

Research output: Conference article, peer-reviewed

6 citations (Scopus)

Abstract

The control of interpretational properties such as duration, vibrato, and dynamics is important in music performance. Musicians continuously manipulate such properties to achieve different expressive intentions. This paper presents a synthesis system that automatically converts a mechanical, deadpan interpretation into distinct expressions by controlling these expressive factors. Extending prior work on expressive musical term (EMT) analysis, we derive a subset of essential features as control parameters, such as the relative time position of the energy peak in a note and the mean temporal length of the notes. An algorithm is proposed to manipulate the energy contour (i.e., the dynamics) of a note. The intended expressions of the synthesized sounds are evaluated using the machine model developed in the prior work. Ten musical expressions, such as Risoluto and Maestoso, are considered, and the evaluation is performed on held-out music pieces. Our evaluations show that it is easier for the machine to recognize the expressions of the synthetic version than those of real recordings by an amateur student. While a listening test is planned as a next step for further validation, this work represents, to the best of our knowledge, the first attempt to build and quantitatively evaluate a system for EMT analysis/synthesis.
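To illustrate one of the control parameters named above, the relative time position of the energy peak within a note can be derived from a short-time energy contour. The sketch below is an assumption for illustration only, not the authors' implementation; the frame size, hop size, and function name are invented here:

```python
import numpy as np

def relative_energy_peak_position(note_samples, frame_len=512, hop=256):
    """Locate the energy peak of a note as a fraction of its duration.

    Returns a value in [0, 1): near 0 means the peak is at the onset
    (e.g. a decaying note), near 1 means the note swells toward its end.
    """
    # Short-time energy contour: sum of squared samples per frame.
    n_frames = max(1, 1 + (len(note_samples) - frame_len) // hop)
    energy = np.array([
        np.sum(note_samples[i * hop : i * hop + frame_len] ** 2)
        for i in range(n_frames)
    ])
    # Relative position of the frame with maximum energy.
    return int(np.argmax(energy)) / n_frames

# Example: a synthetic note whose amplitude decays from the onset,
# so the energy peak sits near the beginning of the note.
t = np.linspace(0, 1, 8000)
note = np.sin(2 * np.pi * 440 * t) * np.exp(-3 * t)
print(relative_energy_peak_position(note))  # close to 0 (peak at onset)
```

A deadpan-to-expressive conversion along these lines would shift such a peak position (together with the other features) toward the values the analysis stage associates with the target expressive term.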

Original language: English
Pages (from-to): 209-215
Number of pages: 7
Journal: Proceedings of the International Conference on Digital Audio Effects, DAFx
Publication status: Published - 2016
Event: 19th International Conference on Digital Audio Effects, DAFx 2016 - Brno, Czech Republic
Duration: 5 September 2016 - 9 September 2016

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Signal Processing
  • Music

