Detecting emotional expression of music with feature selection approach

Fang Chen Hwang, Jeen-Shing Wang, Pau-Choo Chung, Ching Fang Yang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Citations (Scopus)

Abstract

This paper presents a mechanism for detecting the emotional expression of music with a feature selection approach. Happiness, sadness, anger, and peace are considered in the classification problem. Thirty-seven features, covering rhythm, dynamics, pitch, and timbre, were extracted to represent the characteristics of the music samples. Kernel-based class separability (KBCS) was introduced to prioritize features for emotion classification, because not all features are equally important for characterizing emotional expression. Two feature transformation techniques, principal component analysis (PCA) and linear discriminant analysis (LDA), were applied after feature selection; including these two techniques effectively improves the classification accuracy. Finally, a k-nearest neighbor (k-NN) classifier is adopted. The results indicate that the proposed method can achieve an accuracy of almost 90%.
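
The abstract outlines a pipeline of feature selection, PCA/LDA transformation, and k-NN classification. The sketch below illustrates that kind of pipeline in Python with scikit-learn, under stated assumptions: the 37-dimensional feature vectors are taken as already extracted, the data here is a random placeholder, and a generic univariate ranking (mutual information via SelectKBest) stands in for the paper's KBCS criterion, which is not a standard library routine.

# Minimal sketch of a feature-selection -> PCA/LDA -> k-NN pipeline,
# assuming 37-dimensional rhythm/dynamics/pitch/timbre features per clip.
# KBCS is not available in scikit-learn; mutual information is used as a stand-in.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Placeholder data: X is (n_samples, 37) features, y has 4 emotion labels
# (happiness, sadness, anger, peace). Replace with real extracted features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 37))
y = rng.integers(0, 4, size=200)

pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(mutual_info_classif, k=20)),   # stand-in for KBCS ranking
    ("pca", PCA(n_components=10)),
    ("lda", LinearDiscriminantAnalysis(n_components=3)),  # at most n_classes - 1 components
    ("knn", KNeighborsClassifier(n_neighbors=5)),
])

# Cross-validated accuracy of the full pipeline.
scores = cross_val_score(pipeline, X, y, cv=5)
print("mean accuracy:", scores.mean())

With real feature matrices, the number of selected features, PCA components, and k for the classifier would be tuned by cross-validation; the values above are illustrative only.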

Original language: English
Title of host publication: ICOT 2013 - 1st International Conference on Orange Technologies
Pages: 282-286
Number of pages: 5
DOIs: 10.1109/ICOT.2013.6521213
Publication status: Published - 2013 Jul 12
Event: 1st International Conference on Orange Technologies, ICOT 2013 - Tainan, Taiwan
Duration: 2013 Mar 12 - 2013 Mar 16

Publication series

Name: ICOT 2013 - 1st International Conference on Orange Technologies

Other

Other: 1st International Conference on Orange Technologies, ICOT 2013
Country: Taiwan
City: Tainan
Period: 13-03-12 - 13-03-16

Fingerprint

Feature extraction
Discriminant analysis
Principal component analysis
Classifiers

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications

Cite this

Hwang, F. C., Wang, J-S., Chung, P-C., & Yang, C. F. (2013). Detecting emotional expression of music with feature selection approach. In ICOT 2013 - 1st International Conference on Orange Technologies (pp. 282-286). [6521213] (ICOT 2013 - 1st International Conference on Orange Technologies). https://doi.org/10.1109/ICOT.2013.6521213
Hwang, Fang Chen ; Wang, Jeen-Shing ; Chung, Pau-Choo ; Yang, Ching Fang. / Detecting emotional expression of music with feature selection approach. ICOT 2013 - 1st International Conference on Orange Technologies. 2013. pp. 282-286 (ICOT 2013 - 1st International Conference on Orange Technologies).
@inproceedings{0a457d6ce9c047a0a5550c0613b4c910,
title = "Detecting emotional expression of music with feature selection approach",
abstract = "This paper presents a mechanism for detecting the emotional expression of music with a feature selection approach. Happiness, sadness, anger, and peace are considered in the classification problem. Thirty-seven features, covering rhythm, dynamics, pitch, and timbre, were extracted to represent the characteristics of the music samples. Kernel-based class separability (KBCS) was introduced to prioritize features for emotion classification, because not all features are equally important for characterizing emotional expression. Two feature transformation techniques, principal component analysis (PCA) and linear discriminant analysis (LDA), were applied after feature selection; including these two techniques effectively improves the classification accuracy. Finally, a k-nearest neighbor (k-NN) classifier is adopted. The results indicate that the proposed method can achieve an accuracy of almost 90{\%}.",
author = "Hwang, {Fang Chen} and Jeen-Shing Wang and Pau-Choo Chung and Yang, {Ching Fang}",
year = "2013",
month = "7",
day = "12",
doi = "10.1109/ICOT.2013.6521213",
language = "English",
isbn = "9781467359368",
series = "ICOT 2013 - 1st International Conference on Orange Technologies",
pages = "282--286",
booktitle = "ICOT 2013 - 1st International Conference on Orange Technologies",

}

Hwang, FC, Wang, J-S, Chung, P-C & Yang, CF 2013, Detecting emotional expression of music with feature selection approach. in ICOT 2013 - 1st International Conference on Orange Technologies., 6521213, ICOT 2013 - 1st International Conference on Orange Technologies, pp. 282-286, 1st International Conference on Orange Technologies, ICOT 2013, Tainan, Taiwan, 13-03-12. https://doi.org/10.1109/ICOT.2013.6521213

Detecting emotional expression of music with feature selection approach. / Hwang, Fang Chen; Wang, Jeen-Shing; Chung, Pau-Choo; Yang, Ching Fang.

ICOT 2013 - 1st International Conference on Orange Technologies. 2013. p. 282-286 6521213 (ICOT 2013 - 1st International Conference on Orange Technologies).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Detecting emotional expression of music with feature selection approach

AU - Hwang, Fang Chen

AU - Wang, Jeen-Shing

AU - Chung, Pau-Choo

AU - Yang, Ching Fang

PY - 2013/7/12

Y1 - 2013/7/12

N2 - This paper presents a mechanism for detecting the emotional expression of music with a feature selection approach. Happiness, sadness, anger, and peace are considered in the classification problem. Thirty-seven features, covering rhythm, dynamics, pitch, and timbre, were extracted to represent the characteristics of the music samples. Kernel-based class separability (KBCS) was introduced to prioritize features for emotion classification, because not all features are equally important for characterizing emotional expression. Two feature transformation techniques, principal component analysis (PCA) and linear discriminant analysis (LDA), were applied after feature selection; including these two techniques effectively improves the classification accuracy. Finally, a k-nearest neighbor (k-NN) classifier is adopted. The results indicate that the proposed method can achieve an accuracy of almost 90%.

AB - This paper presents a mechanism for detecting the emotional expression of music with a feature selection approach. Happiness, sadness, anger, and peace are considered in the classification problem. Thirty-seven features, covering rhythm, dynamics, pitch, and timbre, were extracted to represent the characteristics of the music samples. Kernel-based class separability (KBCS) was introduced to prioritize features for emotion classification, because not all features are equally important for characterizing emotional expression. Two feature transformation techniques, principal component analysis (PCA) and linear discriminant analysis (LDA), were applied after feature selection; including these two techniques effectively improves the classification accuracy. Finally, a k-nearest neighbor (k-NN) classifier is adopted. The results indicate that the proposed method can achieve an accuracy of almost 90%.

UR - http://www.scopus.com/inward/record.url?scp=84879874398&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84879874398&partnerID=8YFLogxK

U2 - 10.1109/ICOT.2013.6521213

DO - 10.1109/ICOT.2013.6521213

M3 - Conference contribution

SN - 9781467359368

T3 - ICOT 2013 - 1st International Conference on Orange Technologies

SP - 282

EP - 286

BT - ICOT 2013 - 1st International Conference on Orange Technologies

ER -

Hwang FC, Wang J-S, Chung P-C, Yang CF. Detecting emotional expression of music with feature selection approach. In ICOT 2013 - 1st International Conference on Orange Technologies. 2013. p. 282-286. 6521213. (ICOT 2013 - 1st International Conference on Orange Technologies). https://doi.org/10.1109/ICOT.2013.6521213