The image features of emotional faces that predict the initial eye movement to a face

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

Emotional facial expressions are important visual communication signals that indicate a sender's intent and emotional state to an observer. As such, it is not surprising that reactions to different expressions are thought to be automatic and independent of awareness. What is surprising is that studies show inconsistent results concerning such automatic reactions, particularly when using different face stimuli. We argue that automatic reactions to facial expressions can be better explained, and better understood, in terms of quantitative descriptions of their low-level image features rather than in terms of the emotional content (e.g. angry) of the expressions. Here, we focused on overall spatial frequency (SF) and localized Histograms of Oriented Gradients (HOG) features. We used machine learning classification to reveal the SF and HOG features that are sufficient for classification of the initial eye movement towards one of two simultaneously presented faces. Interestingly, the identified features serve as better predictors than the emotional content of the expressions. We therefore propose that our modelling approach can further specify which visual features drive these and other behavioural effects related to emotional expressions, which can help resolve the inconsistencies found in this line of research.
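To make the HOG feature type mentioned above concrete, the sketch below computes a simplified HOG descriptor for an image patch: gradient orientations are binned into per-cell histograms weighted by gradient magnitude. This is an illustrative, NumPy-only approximation, not the paper's actual pipeline; the cell size, bin count, and the omission of block normalization are assumptions for brevity.

```python
import numpy as np

def hog_features(img, cell=8, n_bins=9):
    """Simplified HOG descriptor: per-cell histograms of unsigned
    gradient orientations, weighted by gradient magnitude.
    (Illustrative only; full HOG also applies block normalization.)"""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)                       # gradient magnitude
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180   # unsigned orientation in [0, 180)
    H, W = img.shape
    feats = []
    for i in range(0, H - cell + 1, cell):
        for j in range(0, W - cell + 1, cell):
            m = mag[i:i + cell, j:j + cell].ravel()
            a = ang[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=n_bins, range=(0, 180), weights=m)
            feats.append(hist)
    return np.concatenate(feats)

# Example: a 32x32 synthetic patch with a single vertical edge
img = np.zeros((32, 32))
img[:, 16:] = 1.0
f = hog_features(img)
print(f.shape)  # 4x4 cells x 9 bins -> (144,)
```

A classifier trained on such descriptors (plus spatial-frequency content) could then be used, as in the study, to predict which of two faces attracts the first saccade.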
Original language: English
Article number: 8287
Number of pages: 14
Journal: Scientific Reports
Volume: 11
Issue number: 1
DOIs
Publication status: Published - Dec 2021

Keywords

  • Emotion
  • Machine Learning
  • Vision
