Exploring Macroscopic and Microscopic Fluctuation of Facial Expression for Mood Disorder Classification

  • 張 家誠

Student thesis: Master's Thesis


In the clinical diagnosis of mood disorders, a large portion of patients with bipolar disorder (BD) are misdiagnosed as having unipolar depression (UD). Clinicians have confirmed that BD patients generally show "reduced affect" during clinical treatment. It is therefore desirable to build an objective, one-time diagnostic assistance system using machine learning techniques. In this thesis, facial expressions of the BD, UD, and control groups, elicited by emotional video clips, are collected to explore the temporal fluctuation characteristics of the three groups. The differences in facial expressions among mood disorders are investigated by observing macroscopic and microscopic fluctuations, and corresponding methods for feature extraction and modeling are proposed. Finally, decision-level fusion combines the results of the two approaches to improve classification performance.

From the macroscopic viewpoint, action units (AUs) are applied to describe the temporal transformation of facial muscles, and the modulation spectrum is used to extract the short-term variation of the AUs. An artificial neural network (ANN) then characterizes the interval-based mood disorder, and the intervals over different emotions and AUs are integrated using the geometric average and the product rule to obtain the macroscopic result. From the microscopic viewpoint, motion vectors (MVs) are employed to observe subtle changes in facial expression; motion-vector changes quantized into eight basic orientations describe the micro fluctuation. Wavelet decomposition is then applied to extract the entropy and energy of different frequency bands, and an autoencoder neural network is adopted for dimensionality reduction by extracting bottleneck features. Finally, to describe the long-term variation across different emotional elicitations, a long short-term memory (LSTM) network is employed to model the mood disorders.

For evaluation of the proposed method, 12 subjects per group (i.e., BD, UD, and control) are included in K-fold (K = 12) cross-validation experiments. The macroscopic expression features reach 61.1% classification accuracy, and the microscopic features achieve 67.7%. The proposed fusion of both classification results obtains 72.2% accuracy, which indicates that the AU and MV descriptors are complementary to each other.
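The eight-orientation description of motion-vector changes can be sketched as a quantized orientation histogram. This is a minimal illustration, not the thesis implementation; the function name and the equal-sector binning convention are assumptions.

```python
import numpy as np

def orientation_histogram(dx, dy, bins=8):
    """Quantize motion vectors into `bins` orientation sectors (a sketch).

    dx, dy: arrays of motion-vector components for one frame or region.
    Each vector's angle is mapped to one of `bins` equal sectors, and a
    normalized histogram describes the micro fluctuation.
    """
    ang = np.arctan2(dy, dx)                                  # angle in (-pi, pi]
    idx = np.floor((ang + np.pi) / (2 * np.pi) * bins).astype(int) % bins
    hist = np.bincount(idx, minlength=bins).astype(float)
    return hist / max(hist.sum(), 1.0)                        # normalized histogram

# Illustrative motion vectors: right, left, up, and diagonal.
dx = np.array([1.0, -1.0, 0.0, 0.5])
dy = np.array([0.0, 0.0, 1.0, 0.5])
h = orientation_histogram(dx, dy)
print(h.sum())  # 1.0 -- the histogram is a distribution over 8 sectors
```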
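The wavelet step, extracting energy and entropy per frequency band, can be sketched with a multilevel Haar decomposition. The Haar wavelet, the decomposition level, and all names here are illustrative assumptions, not the settings used in the thesis.

```python
import numpy as np

def haar_band_features(signal, level=3):
    """Per-band energy and Shannon entropy from a Haar wavelet decomposition.

    Decomposes a 1-D fluctuation time series into `level` detail bands
    plus a final approximation, then describes each band by its energy
    and by the entropy of its normalized coefficient energies.
    """
    a = np.asarray(signal, dtype=float)
    bands = []
    for _ in range(level):
        bands.append((a[0::2] - a[1::2]) / np.sqrt(2))  # detail (high-frequency) band
        a = (a[0::2] + a[1::2]) / np.sqrt(2)            # next approximation
    bands.append(a)                                     # coarsest approximation band
    feats = []
    for c in bands:
        e = c ** 2
        energy = e.sum()
        p = e / (energy + 1e-12)                        # normalized energy distribution
        feats.extend([energy, -np.sum(p * np.log2(p + 1e-12))])
    return np.array(feats)

# A deterministic toy signal standing in for a motion-fluctuation series.
t = np.linspace(0, 1, 256, endpoint=False)
feats = haar_band_features(np.sin(2 * np.pi * 5 * t))
print(feats.shape)  # (8,): (level + 1) bands x 2 features each
```

A library such as PyWavelets would replace the hand-rolled decomposition in practice; the manual version keeps the sketch self-contained.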
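The decision-level fusion step can be sketched with the product rule over per-classifier class posteriors; the geometric average of the posteriors selects the same class, since it is a monotonic rescaling of the product. The function name and the example numbers below are illustrative assumptions, not the thesis implementation.

```python
import numpy as np

def product_rule_fusion(posteriors):
    """Fuse per-classifier class posteriors with the product rule.

    posteriors: array of shape (n_classifiers, n_classes), each row a
    distribution over classes. The geometric mean over classifiers is
    computed in log space for numerical stability, then renormalized.
    """
    p = np.asarray(posteriors, dtype=float)
    fused = np.exp(np.mean(np.log(p + 1e-12), axis=0))  # geometric mean per class
    return fused / fused.sum()                          # renormalize to a distribution

# Illustrative posteriors over the three classes (BD, UD, control)
# from the macroscopic (AU) and microscopic (MV) classifiers.
au_posterior = [0.5, 0.3, 0.2]
mv_posterior = [0.6, 0.1, 0.3]
fused = product_rule_fusion([au_posterior, mv_posterior])
print(fused.argmax())  # 0 -- both classifiers lean toward the first class
```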
Date of Award: 2016 Aug 17
Original language: English
Supervisor: Chung-Hsien Wu