TY - JOUR
T1 - Automatic ECG-Based Emotion Recognition in Music Listening
AU - Hsu, Yu-Liang
AU - Wang, Jeen-Shing
AU - Chiang, Wei-Chun
AU - Hung, Chien-Han
N1 - Funding Information:
This work was supported by the Ministry of Science and Technology of the Republic of China, Taiwan, under Grant Nos. MOST 106-3011-E-006-002 and MOST 106-2221-E-035-004.
Publisher Copyright:
© 2010-2012 IEEE.
PY - 2020/1/1
Y1 - 2020/1/1
N2 - This paper presents an automatic ECG-based algorithm for human emotion recognition. First, we adopt a musical induction method to induce participants' real emotional states and collect their ECG signals without any deliberate laboratory setting. Afterward, we develop an automatic ECG-based emotion recognition algorithm to recognize human emotions elicited by listening to music. Physiological ECG features extracted from time-domain, frequency-domain, and nonlinear analyses of ECG signals are used to find emotion-relevant features and to correlate them with emotional states. Subsequently, we develop a sequential forward floating selection-kernel-based class separability-based (SFFS-KBCS-based) feature selection algorithm and utilize generalized discriminant analysis (GDA) to effectively select significant ECG features associated with emotions and to reduce the dimensions of the selected features, respectively. Positive/negative valence, high/low arousal, and four types of emotions (joy, tension, sadness, and peacefulness) are recognized using least squares support vector machine (LS-SVM) recognizers. The results show that the correct classification rates for the positive/negative valence, high/low arousal, and four-emotion classification tasks are 82.78, 72.91, and 61.52 percent, respectively.
AB - This paper presents an automatic ECG-based algorithm for human emotion recognition. First, we adopt a musical induction method to induce participants' real emotional states and collect their ECG signals without any deliberate laboratory setting. Afterward, we develop an automatic ECG-based emotion recognition algorithm to recognize human emotions elicited by listening to music. Physiological ECG features extracted from time-domain, frequency-domain, and nonlinear analyses of ECG signals are used to find emotion-relevant features and to correlate them with emotional states. Subsequently, we develop a sequential forward floating selection-kernel-based class separability-based (SFFS-KBCS-based) feature selection algorithm and utilize generalized discriminant analysis (GDA) to effectively select significant ECG features associated with emotions and to reduce the dimensions of the selected features, respectively. Positive/negative valence, high/low arousal, and four types of emotions (joy, tension, sadness, and peacefulness) are recognized using least squares support vector machine (LS-SVM) recognizers. The results show that the correct classification rates for the positive/negative valence, high/low arousal, and four-emotion classification tasks are 82.78, 72.91, and 61.52 percent, respectively.
UR - http://www.scopus.com/inward/record.url?scp=85039791780&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85039791780&partnerID=8YFLogxK
U2 - 10.1109/TAFFC.2017.2781732
DO - 10.1109/TAFFC.2017.2781732
M3 - Article
AN - SCOPUS:85039791780
SN - 1949-3045
VL - 11
SP - 85
EP - 99
JO - IEEE Transactions on Affective Computing
JF - IEEE Transactions on Affective Computing
IS - 1
M1 - 8219396
ER -