Automatic ECG-Based Emotion Recognition in Music Listening

Yu-Liang Hsu, Jeen-Shing Wang, Wei-Chun Chiang, Chien-Han Hung

Research output: Contribution to journal › Article › peer-review

11 Citations (Scopus)


This paper presents an automatic emotion recognition algorithm based on electrocardiogram (ECG) signals. First, we adopt a musical induction method to elicit participants' real emotional states and collect their ECG signals without any deliberate laboratory setting. We then develop an automatic ECG-based emotion recognition algorithm to recognize human emotions elicited by listening to music. Physiological ECG features extracted from time-domain, frequency-domain, and nonlinear analyses of the ECG signals are used to find emotion-relevant features and to correlate them with emotional states. Subsequently, we develop a sequential forward floating selection-kernel-based class separability-based (SFFS-KBCS-based) feature selection algorithm and utilize generalized discriminant analysis (GDA) to effectively select significant emotion-related ECG features and to reduce the dimensionality of the selected features, respectively. Positive/negative valence, high/low arousal, and four types of emotions (joy, tension, sadness, and peacefulness) are recognized using least squares support vector machine (LS-SVM) recognizers. The results show that the correct classification rates for the positive/negative valence, high/low arousal, and four-class emotion classification tasks are 82.78, 72.91, and 61.52 percent, respectively.
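To make the pipeline concrete, the sketch below shows a minimal NumPy implementation of two of its ingredients: a handful of time-domain heart-rate-variability features (mean RR, SDNN, RMSSD — a small subset of the feature set described above) and an LS-SVM binary classifier, which replaces the standard SVM quadratic program with a single linear system. This is an illustrative sketch only, not the authors' implementation; the RBF kernel, the `gamma`/`sigma` values, and the feature subset are assumptions for demonstration.

```python
import numpy as np

def hrv_time_features(rr_ms):
    """Basic time-domain HRV features from an RR-interval series (ms):
    mean RR, SDNN (std of RR), RMSSD (root mean square of successive diffs)."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    return np.array([rr.mean(), rr.std(ddof=1), np.sqrt(np.mean(diff ** 2))])

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """LS-SVM training: solve the KKT linear system
        [ 0   y^T          ] [b    ]   [0]
        [ y   Omega + I/g  ] [alpha] = [1]
    where Omega_ij = y_i y_j K(x_i, x_j) and y is in {-1, +1}."""
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, bias b

def lssvm_predict(X_train, y, alpha, b, X_new, sigma=1.0):
    """Decision: sign( sum_i alpha_i y_i K(x, x_i) + b )."""
    K = rbf_kernel(X_new, X_train, sigma)
    return np.sign(K @ (alpha * y) + b)
```

In practice one classifier of this form could be trained per task (valence, arousal) on the GDA-reduced feature vectors; the linear-system formulation is what distinguishes the LS-SVM from a conventional SVM trained by quadratic programming.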

Original language: English
Article number: 8219396
Pages (from-to): 85-99
Number of pages: 15
Journal: IEEE Transactions on Affective Computing
Issue number: 1
Publication status: Published - 2020 Jan 1

All Science Journal Classification (ASJC) codes

  • Software
  • Human-Computer Interaction

