Infant AFAR: Automated facial action recognition in infants

Itir Önal Ertuğrul*, Yeojin Amy Ahn, Maneesh Bilalpur, Daniel S. Messinger, Matthew L. Speltz, Jeffrey F. Cohn

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

Automated detection of facial action units in infants is challenging. Infant faces have different proportions, less texture, fewer wrinkles and furrows, and unique facial actions relative to adults. For these and related reasons, action unit (AU) detectors that are trained on adult faces may generalize poorly to infant faces. To train and test AU detectors for infant faces, we trained convolutional neural networks (CNNs) in adult video databases and fine-tuned these networks in two large, manually annotated, infant video databases that differ in context, head pose, illumination, video resolution, and infant age. AUs were those central to expression of positive and negative emotion. AU detectors trained in infants greatly outperformed ones trained previously in adults. Training AU detectors across infant databases afforded greater robustness to between-database differences than did training database-specific AU detectors and outperformed the previous state of the art in infant AU detection. The resulting AU detection system, which we refer to as Infant AFAR (Automated Facial Action Recognition), is available to the research community for further testing and applications in infant emotion, social interaction, and related topics.
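The abstract describes the general recipe of pretraining AU detectors on adult face video and fine-tuning them on infant video. The sketch below illustrates that recipe only; the paper's actual network architecture, AU set, frozen layers, and hyperparameters are not stated here, so the ResNet-18 backbone, the six-AU output head, and all training settings are illustrative assumptions rather than the Infant AFAR implementation.

```python
# Minimal fine-tuning sketch (assumptions: backbone, AU count, hyperparameters).
import torch
import torch.nn as nn
from torchvision import models

NUM_AUS = 6  # assumed number of target AUs tied to positive/negative affect

# Stand-in for a CNN pretrained on adult faces (ImageNet weights as a placeholder).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Replace the classifier with a multi-label AU head (one logit per AU).
model.fc = nn.Linear(model.fc.in_features, NUM_AUS)

# Freeze early convolutional blocks so fine-tuning mainly adapts later layers
# to infant facial proportions and texture (an assumed design choice).
for name, param in model.named_parameters():
    if name.startswith(("conv1", "bn1", "layer1", "layer2")):
        param.requires_grad = False

criterion = nn.BCEWithLogitsLoss()  # multi-label: each AU independently present/absent
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)

def fine_tune_epoch(loader, device="cpu"):
    """One pass over an infant-face DataLoader yielding (frames, au_labels)."""
    model.to(device).train()
    for frames, au_labels in loader:  # frames: (B, 3, H, W); au_labels: (B, NUM_AUS) in {0, 1}
        logits = model(frames.to(device))
        loss = criterion(logits, au_labels.float().to(device))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

Freezing the earliest layers and fine-tuning the rest is one common way to transfer adult-face features to infant faces; the study's cross-database training (fine-tuning on both infant databases together) would correspond to building the loader from both corpora.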
Original language: English
Pages (from-to): 1024-1035
Journal: Behavior Research Methods
Volume: 55
Early online date: 10 May 2022
DOIs
Publication status: Published - Apr 2023
