TY - GEN
T1 - Analysis of expressive musical terms in violin using score-informed and expression-based audio features
AU - Li, Pei-Ching
AU - Su, Li
AU - Yang, Yi-Hsuan
AU - Su, Alvin W.Y.
PY - 2015/1/1
Y1 - 2015/1/1
N2 - The manipulation of different interpretational factors, including dynamics, duration, and vibrato, constitutes the realization of different expressions in music. Therefore, a deeper understanding of the workings of these factors is critical for advanced expressive synthesis and computer-aided music education. In this paper, we propose the novel task of automatic expressive musical term classification as a direct means to study the interpretational factors. Specifically, we consider up to 10 expressive musical terms, such as Scherzando and Tranquillo, and compile a new dataset of solo violin excerpts featuring the realization of different expressive terms by different musicians for the same set of classical music pieces. Under a score-informed scheme, we design and evaluate a number of note-level features characterizing the interpretational aspects of music for the classification task. Our evaluation shows that the proposed features lead to significantly higher classification accuracy than a baseline feature set commonly used in music information retrieval tasks. Moreover, taking the contrast of feature values between an expressive version of a music piece and its corresponding non-expressive version (if available) greatly improves the accuracy of classifying the expressive version. We also draw insights from analyzing the feature relevance and the class-wise accuracy of the predictions.
UR - http://www.scopus.com/inward/record.url?scp=84989244870&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84989244870&partnerID=8YFLogxK
M3 - Conference contribution
T3 - Proceedings of the 16th International Society for Music Information Retrieval Conference, ISMIR 2015
SP - 809
EP - 815
BT - Proceedings of the 16th International Society for Music Information Retrieval Conference, ISMIR 2015
A2 - Müller, Meinard
A2 - Wiering, Frans
PB - International Society for Music Information Retrieval
T2 - 16th International Society for Music Information Retrieval Conference, ISMIR 2015
Y2 - 26 October 2015 through 30 October 2015
ER -