Detection, tracking, and classification of action units in facial expression

James Jenn Jier Lien, Takeo Kanade, Jeffrey F. Cohn, Ching Chung Li

Research output: Contribution to journal › Article › peer-review

207 Citations (Scopus)


Most of the current work on automated facial expression analysis attempts to recognize a small set of prototypic expressions, such as joy and fear. Such prototypic expressions, however, occur infrequently, and human emotions and intentions are communicated more often by changes in one or two discrete features. To capture the full range of facial expression, detection, tracking, and classification of fine-grained changes in facial features are needed. We developed the first version of a computer vision system that is sensitive to subtle changes in the face. The system includes three modules to extract feature information: dense-flow extraction using a wavelet motion model, facial-feature tracking, and edge and line extraction. The feature information thus extracted is fed to discriminant classifiers or hidden Markov models that classify it into FACS action units, the descriptive system for coding fine-grained changes in facial expression. The system was tested on image sequences from 100 male and female subjects of varied ethnicity. Agreement with manual FACS coding was strong for the results based on dense-flow extraction and facial-feature tracking, and strong to moderate for edge and line extraction.
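The abstract does not give implementation details of the flow and tracking modules, but the optical-flow principle behind feature tracking can be sketched with a single-window, single-iteration Lucas-Kanade-style translation estimate. Everything below (the synthetic Gaussian "feature", window size, and parameter values) is an illustrative assumption, not the paper's actual tracker:

```python
import numpy as np

def lk_translation(prev, curr):
    """One Lucas-Kanade step: least-squares translation d = (dx, dy)
    solving Ix*dx + Iy*dy + It ~ 0 over the whole window.
    Illustrative sketch, not the paper's tracker."""
    Ix = np.gradient(prev, axis=1)   # spatial gradient in x (first frame)
    Iy = np.gradient(prev, axis=0)   # spatial gradient in y
    It = curr - prev                 # temporal derivative between frames
    G = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(G, b)     # estimated (dx, dy)

# Synthetic "facial feature": a Gaussian blob shifted 0.5 px to the right.
y, x = np.mgrid[0:40, 0:40].astype(float)
prev = np.exp(-((x - 20.0) ** 2 + (y - 20.0) ** 2) / 18.0)
curr = np.exp(-((x - 20.5) ** 2 + (y - 20.0) ** 2) / 18.0)
dx, dy = lk_translation(prev, curr)
```

On this smooth synthetic input the estimate recovers the sub-pixel shift closely; real trackers iterate this step, use image pyramids for large motions, and track many small windows around facial landmarks.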
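To illustrate the HMM classification stage, here is a minimal sketch: score a quantized feature sequence under one discrete HMM per action unit using the scaled forward algorithm, and pick the highest-likelihood label. The toy models, symbol alphabet, and AU labels are invented for illustration and do not reflect the paper's trained parameters:

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | HMM) for a discrete HMM.
    pi: (S,) initial state probs; A: (S, S) transitions; B: (S, V) emissions."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    alpha /= c                        # rescale to avoid underflow
    log_lik = np.log(c)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate states, weight by emission
        c = alpha.sum()
        alpha /= c
        log_lik += np.log(c)
    return log_lik

def classify_au(obs, models):
    """Pick the action-unit label whose HMM scores the sequence highest."""
    return max(models, key=lambda au: forward_log_likelihood(obs, *models[au]))

# Toy 2-state HMMs over 3 quantized feature symbols (hypothetical values).
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1],
              [0.1, 0.9]])
models = {
    "AU12": (pi, A, np.array([[0.1, 0.2, 0.7],   # favors symbol 2
                              [0.2, 0.3, 0.5]])),
    "AU4":  (pi, A, np.array([[0.7, 0.2, 0.1],   # favors symbol 0
                              [0.5, 0.3, 0.2]])),
}
```

Usage: `classify_au([2, 2, 2, 1, 2], models)` returns `"AU12"` because that model's emissions make the observed symbols far more probable at every step.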

Original language: English
Pages (from-to): 131-146
Number of pages: 16
Journal: Robotics and Autonomous Systems
Issue number: 3
Publication status: Published - 2000 May 31

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Software
  • Mathematics (all)
  • Computer Science Applications

