Scientific publications
Read about the research that supports the FaceReader Ecosystem
Over the past 20+ years, our facial coding platform and its embedded technologies have been both the subject of and the preferred instrument for numerous peer-reviewed scientific studies. Below we present a comprehensive overview of the literature that has emerged from these studies, highlighting and validating the cutting-edge technology of FaceReader Online.
2016
30 citations
More emotional facial expressions during episodic than during semantic autobiographical retrieval
El Haj, Antoine, Nandrino
There is a substantial body of research on the relationship between emotion and autobiographical memory. Using facial analysis software, our study addressed this relationship by investigating basic emotional facial expressions that may be detected during autobiographical recall. Participants were asked to retrieve 3 autobiographical memories, each of which was triggered by one of the following cue words: happy, sad, and city. The autobiographical recall was analyzed with facial analysis software that detects and classifies basic emotional expressions. Analyses showed that emotional cues triggered the corresponding basic facial expressions. Furthermore, we dissociated episodic and semantic retrieval, observing more emotional facial expressions during episodic than during semantic retrieval, regardless of the emotional valence of cues. Our study provides insight into facial expressions that are associated with emotional autobiographical memory. It also highlights an ecological tool to reveal physiological changes that are associated with emotion and memory.
2016
56 citations
What facial appearance reveals over time: When perceived expressions in neutral faces reveal stable emotion dispositions
Adams, Garrido, Albohn, Hess, Kleck
This study investigates whether neutral facial expressions can reveal individuals’ stable emotional dispositions, particularly focusing on age and gender differences. Through a series of experiments, the researchers found that perceived expressions in neutral faces predicted self-reported positive affect, but this effect was significant only for elderly women. The findings suggest that age-related facial changes may convey information about a person’s emotional disposition, with gender differences potentially influencing these perceptions. The study highlights the complex interplay between facial appearance, age, gender, and perceived emotional states.
2015
30 citations
Association between facial expression and PTSD symptoms among young children exposed to the Great East Japan Earthquake: a pilot study
Fujiwara
“Emotional numbing” is a symptom of post-traumatic stress disorder characterized by a loss of interest in usually enjoyable activities, feeling detached from others, and an inability to express a full range of emotions. Emotional numbing is usually assessed through self-report, and is particularly difficult to ascertain among young children. We conducted a pilot study to explore the use of facial expression ratings in response to a comedy video clip to assess emotional reactivity among preschool children directly exposed to the Great East Japan Earthquake. This study included 23 child participants. Child PTSD symptoms were measured using a modified version of the Parent’s Report of the Child’s Reaction to Stress scale. Children were filmed while watching a 2-min video compilation of natural scenes followed by a 2-min video clip from a television comedy. Children’s facial expressions were processed using the Noldus FaceReader software, which implements the Facial Action Coding System. We investigated the association between PTSD symptom scores and facial emotion reactivity using linear regression analysis. Children with higher PTSD symptom scores showed a significantly greater proportion of neutral facial expressions, controlling for sex, age, and baseline facial expression. This pilot study suggests that facial emotion reactivity, measured using facial expression recognition software, has the potential to index emotional numbing in young children. This pilot study adds to the emerging literature on using experimental psychopathology methods to characterize children’s reactions to disasters.
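The regression design described above, relating the proportion of neutral expressions to PTSD symptom scores while controlling for sex, age, and baseline expression, can be sketched with ordinary least squares. Everything below is illustrative: the variable names and the synthetic data are assumptions, not the study’s measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 23  # pilot sample size reported in the study

# Synthetic, illustrative predictors (not the study's data)
ptsd_score = rng.uniform(0, 30, n)
sex = rng.integers(0, 2, n).astype(float)  # covariate
age = rng.uniform(3, 6, n)                 # covariate (preschool ages)
baseline_neutral = rng.uniform(0, 1, n)    # covariate

# Outcome: proportion of neutral expressions during the comedy clip,
# generated here with a small positive dependence on PTSD score
neutral_prop = (0.3 + 0.01 * ptsd_score + 0.1 * baseline_neutral
                + rng.normal(0, 0.02, n))

# Design matrix with an intercept column, then an OLS fit
X = np.column_stack([np.ones(n), ptsd_score, sex, age, baseline_neutral])
beta, *_ = np.linalg.lstsq(X, neutral_prop, rcond=None)

print("adjusted coefficient for PTSD score:", beta[1])
```

The coefficient `beta[1]` is the association of interest after adjusting for the three covariates, mirroring the “controlling for sex, age, and baseline facial expression” step in the abstract.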
2015
16 citations
Deceit and facial expression in children: the enabling role of the “poker face” child and the dependent personality of the detector.
Gadea, Alino, Espert, Salvador
This study examines the interplay between children’s deceptive behaviors and their facial expressions, focusing on the “poker face”—a neutral expression that conceals emotions. It also explores how the personality traits of the observer, particularly dependency, influence the detection of deceit. The research suggests that children who can maintain a poker face are more successful in deceiving others. Additionally, observers with dependent personalities may be less adept at identifying deceit, potentially due to their reliance on others and desire for approval. These findings highlight the complex dynamics between a child’s ability to mask emotions and the observer’s personality in the context of deception.
2015
216 citations
Deep learning based FACS Action Unit occurrence and intensity estimation
A. Gudi, H. E. Tasli, T. M. den Uyl and A. Maroulis
Ground truth annotation of the occurrence and intensity of FACS Action Unit activation requires a great amount of attention. The efforts towards achieving a common platform for AU evaluation have been addressed in the FG 2015 Facial Expression Recognition and Analysis challenge. Participants are invited to estimate AU occurrence and intensity on a common benchmark dataset. Conventional approaches towards achieving automated methods are to train multiclass classifiers or to use regression models. In this paper, we propose a novel application of a deep convolutional neural network to recognize AUs as part of the FERA 2015 challenge. The 7-layer network is composed of 3 convolutional layers and a max-pooling layer. The final fully connected layers provide the classification output. For the selected tasks of the challenge, we have trained two different networks for the two different datasets, where one focuses on the AU occurrences and the other on both occurrences and intensities of the AUs. The occurrence and intensity of AU activation are estimated using specific neuron activations of the output layer. This way, we are able to create a single network architecture that can simultaneously be trained to produce binary and continuous classification output.
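The core design idea, shared convolutional features feeding both a binary occurrence head and a continuous intensity head, can be sketched as a toy forward pass. This is a minimal illustration with random, untrained weights and assumed shapes (including the number of AUs); it is not the paper’s actual FERA 2015 network.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, w):
    """Valid convolution of a single-channel image with one kernel."""
    kh, kw = w.shape
    h, w_ = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((h, w_))
    for i in range(h):
        for j in range(w_):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return out

def maxpool2(x):
    """2x2 max pooling with stride 2."""
    h, w = x.shape[0] // 2, x.shape[1] // 2
    return x[:2 * h, :2 * w].reshape(h, 2, w, 2).max(axis=(1, 3))

def relu(x):
    return np.maximum(x, 0)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Toy "face" input and random (untrained) weights -- illustrative only
image = rng.normal(size=(16, 16))
kernel = rng.normal(size=(3, 3))
n_aus = 12  # assumed number of Action Units scored

# Shared feature extraction: conv -> ReLU -> max-pool -> flatten
feat = maxpool2(relu(conv2d(image, kernel))).ravel()

# Two output heads reading the same features
W_occ = rng.normal(size=(n_aus, feat.size)) * 0.1
W_int = rng.normal(size=(n_aus, feat.size)) * 0.1
occurrence = sigmoid(W_occ @ feat)  # binary head: P(AU active) in [0, 1]
intensity = W_int @ feat            # continuous head: AU intensity estimate

print(occurrence.shape, intensity.shape)
```

The point of the sketch is the last four lines: one feature stack serves two kinds of output neurons, which is what lets a single architecture be trained for both binary occurrence detection and continuous intensity estimation.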
2015
52 citations
Don’t look blank, happy, or sad: Patterns of facial expressions of speakers in banks’ YouTube videos predict video’s popularity over time
P. Lewinski
There has been little focus on nonverbal communication in social media advertising campaigns. We propose that specific patterns of facial expressions predict the popularity of YouTube videos among users of social media. To test that proposition, we used a neuromarketing tool—FaceReader—to code facial videos of professional speakers who participated in the YouTube social media campaigns of 2 large commercial banks. We analyzed more than 25,000 video frames of 16 speakers’ 6 basic facial expressions. We found that a lower incidence of affiliative facial emotions and a higher incidence of nonemotional expressions explained an additional 25% of variance in the videos’ popularity after 8 months (t2), compared to popularity at t1 as the only baseline predictor. We further showed that the disaffiliative facial emotions of the speakers did not contribute as an indicator of the future performance of social media content. We hope that these findings will open new lines of research in corporate communication by incorporating neuromarketing and nonverbal communication to understand not only what content is effective but how it should be presented.
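The “additional 25% of variance” claim corresponds to a hierarchical regression: compare the R² of a baseline model (popularity at t1 only) against a model that adds the facial-expression predictors. The sketch below uses synthetic data and assumed variable names to show how such an incremental R² is computed; it does not reproduce the study’s figures.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 16  # number of speakers analyzed

def r_squared(X, y):
    """R^2 of an OLS fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Synthetic, illustrative data (not the study's measurements)
pop_t1 = rng.normal(size=n)       # baseline popularity at t1
affiliative = rng.normal(size=n)  # incidence of affiliative emotions
neutral = rng.normal(size=n)      # incidence of nonemotional expressions
pop_t2 = pop_t1 - 0.8 * affiliative + 0.8 * neutral + rng.normal(0, 0.5, n)

# Step 1: baseline predictor only; step 2: add expression predictors
r2_base = r_squared(pop_t1[:, None], pop_t2)
r2_full = r_squared(np.column_stack([pop_t1, affiliative, neutral]), pop_t2)

print("incremental variance explained:", r2_full - r2_base)
```

Because the baseline model is nested in the full model, the R² difference is never negative; the interesting question is whether the increment is large, as the 25% figure reported here suggests.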