DESCRIPTION (Applicant's Abstract): Facial expression communicates information about emotional response and plays a critical role in the regulation of interpersonal behavior. Current human-observer-based methods for measuring facial expression are labor intensive, qualitative, and difficult to standardize across laboratories and over time. To make feasible more rigorous, quantitative measurement of facial expression in diverse applications, we formed an interdisciplinary research group with expertise in facial expression analysis and image processing. In the funding period, we developed and demonstrated the first version of an automated system for measuring facial expression in digitized images. The system can discriminate nine combinations of FACS action units in the upper and lower face, quantify the timing and topography of action unit intensity in the brow region, and geometrically normalize image sequences within a range of plus or minus 20 degrees of out-of-plane rotation. In the competing renewal, we will increase the number of action unit combinations that are recognized, implement convergent methods of quantifying action unit intensity, increase the generalizability of action unit estimation to a wider range of image orientations, test facial image processing (FIP) on image sequences from directed facial action tasks and laboratory studies of emotion regulation, and facilitate the integration of FIP into existing data management and statistical analysis software for use by behavioral science researchers and clinicians. With these goals completed, FIP will eliminate the need for human observers in coding facial expression, promote standardized measurement, make possible the collection and processing of larger, more representative data sets, and open new areas of investigation and clinical application.