Emotion-Specific Facial Activation Maps Based on Infrared Thermal Image Sequences

Bo Lin Jian, Chieh Li Chen, Min Wei Huang, Her Terng Yau

Research output: Contribution to journal › Article › peer-review


Abstract

Research on emotion recognition has gained increasing attention in recent years. Inner emotions and thought activity can be inferred by analyzing facial expressions, behavioral responses, audio, and physiological signals. Facial expression is now recognized as an important form of non-verbal interaction. In this paper, emotion-specific activation maps were constructed from infrared thermal facial image sequences as an alternative approach to determining the correlation between emotional triggers and changes in facial temperature. During testing, material from the International Affective Picture System was used to create emotional clips that triggered three different types of emotion in the subjects, whose infrared thermal facial image sequences were recorded simultaneously. For processing, an image calibration protocol was first employed to reduce the variance produced by irregular micro-shifts of the subjects' faces, followed by independent component analysis and statistical analysis to create the facial emotional activation maps. The results showed that this approach resolves the problem of selecting local facial regions when analyzing frame temperatures. The emotion-specific facial activation maps provide visualized results that facilitate the observation and understanding of the information.
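
The abstract outlines a three-stage pipeline (frame registration, independent component analysis, statistical mapping) without reproducing the implementation. The Python sketch below illustrates one plausible realization of that pipeline under stated assumptions: phase-correlation registration, scikit-learn's FastICA, and a two-sided z-score threshold all stand in for details the abstract does not specify, and build_activation_map is a hypothetical function name, not the authors' code.

    import numpy as np
    from scipy import stats
    from scipy.ndimage import shift
    from skimage.registration import phase_cross_correlation
    from sklearn.decomposition import FastICA

    def build_activation_map(frames, stimulus, n_components=10, alpha=0.05):
        """frames: (T, H, W) thermal sequence; stimulus: (T,) trigger indicator."""
        # 1) Calibration: register each frame to the first frame to cancel
        #    the irregular micro-shifts of the subject's face.
        ref = frames[0].astype(float)
        aligned = np.empty(frames.shape, dtype=float)
        for t in range(frames.shape[0]):
            offset, _, _ = phase_cross_correlation(
                ref, frames[t].astype(float), upsample_factor=10)
            aligned[t] = shift(frames[t].astype(float), offset)

        # 2) Spatial ICA: each frame is one sample over H*W pixel features,
        #    so the unmixed sources are component time courses and the
        #    mixing-matrix columns are the corresponding spatial maps.
        T, H, W = aligned.shape
        X = aligned.reshape(T, H * W)
        ica = FastICA(n_components=n_components, random_state=0)
        sources = ica.fit_transform(X)      # (T, n_components) time courses
        spatial_maps = ica.mixing_          # (H*W, n_components) maps

        # 3) Keep the component whose time course best tracks the stimulus.
        corr = [abs(stats.pearsonr(sources[:, k], stimulus)[0])
                for k in range(n_components)]
        best = int(np.argmax(corr))

        # 4) Statistical map: zero out pixels whose component weights are
        #    not significant under a two-sided z-test (an assumed choice).
        w = spatial_maps[:, best]
        z = (w - w.mean()) / w.std()
        cutoff = stats.norm.ppf(1 - alpha / 2)
        return np.where(np.abs(z) > cutoff, w, 0.0).reshape(H, W), corr[best]

The z-score mask is only one reading of the abstract's "statistical analysis" step; a per-pixel comparison of stimulus versus baseline frames would be equally consistent with the description, since the abstract leaves the exact test unspecified.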

Original language: English
Article number: 8679994
Pages (from-to): 48046-48052
Number of pages: 7
Journal: IEEE Access
Volume: 7
Publication status: Published - 2019

All Science Journal Classification (ASJC) codes

  • Computer Science (all)
  • Materials Science (all)
  • Engineering (all)
