Current approaches to automated facial expression analysis have focused on a small set of prototypic expressions (e.g., joy or anger). Prototypic expressions occur infrequently in everyday life, however, and emotion expression is far more varied. Capturing the full range of emotion expression therefore requires automated discrimination of fine-grained changes in facial expression. We developed and implemented an optical-flow-based approach (feature point tracking) that is sensitive to subtle changes in facial expression. In image sequences from 100 young adults, action units and action unit combinations in the brow and mouth regions were selected for analysis if they occurred a minimum of 25 times in the image database. Selected facial features were tracked automatically using a hierarchical algorithm for estimating optical flow. Image sequences were randomly divided into training and test sets. Feature point tracking demonstrated high concurrent validity with human coding using the Facial Action Coding System (FACS).
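The abstract does not specify the implementation of the hierarchical optical flow estimator, so the following is only a minimal, self-contained sketch of the general technique it names: coarse-to-fine (pyramidal) Lucas-Kanade tracking of a single feature point. All function names, window sizes, and the synthetic Gaussian test image are illustrative assumptions, not the authors' code.

```python
import numpy as np

def lk_step(I0, I1w, x, y, win=9):
    """One Lucas-Kanade update: solve the least-squares system
    [Ix Iy] d = -(I1w - I0) over a window centered at (x, y)."""
    r = win // 2
    Iy, Ix = np.gradient(I0)          # np.gradient returns d/drow, d/dcol
    sl = (slice(y - r, y + r + 1), slice(x - r, x + r + 1))
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -(I1w[sl] - I0[sl]).ravel()
    d, *_ = np.linalg.lstsq(A, b, rcond=None)
    return d                          # incremental (dx, dy)

def warp(img, d):
    """Bilinearly sample img at (x + dx, y + dy), undoing a displacement d."""
    H, W = img.shape
    ys, xs = np.mgrid[0:H, 0:W].astype(float)
    xs = np.clip(xs + d[0], 0, W - 1.001)
    ys = np.clip(ys + d[1], 0, H - 1.001)
    x0, y0 = xs.astype(int), ys.astype(int)
    fx, fy = xs - x0, ys - y0
    return (img[y0, x0] * (1 - fx) * (1 - fy) + img[y0, x0 + 1] * fx * (1 - fy)
            + img[y0 + 1, x0] * (1 - fx) * fy + img[y0 + 1, x0 + 1] * fx * fy)

def track_point(I0, I1, p, levels=3, win=9, iters=5):
    """Coarse-to-fine tracking: estimate the displacement of point p at the
    top of an image pyramid, then propagate and refine it down to full
    resolution, so large motions stay within the linearization range."""
    pyr0, pyr1 = [I0], [I1]
    for _ in range(levels - 1):
        pyr0.append(pyr0[-1][::2, ::2])
        pyr1.append(pyr1[-1][::2, ::2])
    d = np.zeros(2)
    for lvl in range(levels - 1, -1, -1):
        x = int(round(p[0] / 2 ** lvl))
        y = int(round(p[1] / 2 ** lvl))
        for _ in range(iters):
            d += lk_step(pyr0[lvl], warp(pyr1[lvl], d), x, y, win)
        if lvl:
            d *= 2                    # rescale for the next finer level
    return d

# Synthetic check: a Gaussian blob translated by (dx, dy) = (3, 2) pixels.
Y, X = np.mgrid[0:64, 0:64].astype(float)
I0 = np.exp(-((X - 32) ** 2 + (Y - 32) ** 2) / 72.0)
I1 = np.exp(-((X - 35) ** 2 + (Y - 34) ** 2) / 72.0)
d = track_point(I0, I1, (32, 32))    # should recover roughly (3, 2)
```

A tracker of this family recovers sub-pixel displacements of individual feature points, which is what makes the approach sensitive to the subtle, fine-grained facial motions the abstract describes, rather than only to prototypic expressions.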