Scientific publications
Read about the research that supports the FaceReader Ecosystem
Over the past 20+ years, our facial coding platform and its embedded technologies have served as both the subject of and the preferred instrument for numerous accredited scientific studies. Below we present a comprehensive overview of the literature that has emerged from these studies, highlighting and validating the cutting-edge technology of FaceReader Online.
2017
14 citations
Does the ability to express different emotions predict different indices of physical health? A skill-based study of physical symptoms and heart rate variability
Tuck, Adams, Consedine
The study investigated whether the ability to regulate emotional expressions predicts physical health indices. A cross-sectional study with 117 adults assessed participants’ skills in enhancing and suppressing expressions of amusement, sadness, and anger. Findings revealed that a greater ability to enhance sad expressions was associated with higher heart rate variability, while the ability to enhance expressions of joy correlated with lower symptom interference. Additionally, the capacity to flexibly regulate expressions of joy and sadness was linked to reduced symptom interference. These results suggest that expressive regulatory skills are relevant to health and may offer novel avenues for research and intervention.
2017
6 citations
Memory Effect in Expressed Emotions During Long Term Group Interactions
Gorbunov, R., Barakova, E., Rauterberg, M.
Long-term group interactions can be assessed through games where participants choose between cooperative or egoistic strategies. This study analyzed facial expressions of astronauts during repeated game interactions in the Mars-500 isolation experiment. Using FaceReader software, the researchers identified a memory effect in collective emotional expressions between experiments separated by two weeks. This suggests the potential to predict the development of interpersonal relationships in isolated groups, informing the design of long-term interaction behaviors in artificial agents.
2017
5 citations
Multimodal Observation and Interpretation of Subjects Engaged in Problem Solving
Guntz, Balzarini, Vaufreydaz, Crowley
In this paper, the authors present initial findings from a pilot experiment aimed at capturing and interpreting multimodal signals from human experts solving challenging chess problems. The study investigates how observations of eye-gaze, posture, emotion, and other physiological signals can model the cognitive state of subjects. It also explores integrating multiple sensor modalities to enhance the reliability of detecting human displays of awareness and emotion. Chess players were observed tackling problems of increasing difficulty while their behavior was recorded. These recordings help estimate a participant’s situational awareness and predict their ability to respond effectively to challenging situations. The results indicate that a multimodal approach is more accurate than a unimodal one. By combining body posture, visual attention, and emotion, the multimodal approach achieved up to 93% accuracy in determining a player’s chess expertise, compared to 86% with a unimodal approach. The experiment also validates the use of the equipment as a general and reproducible tool for studying participants engaged in screen-based interaction and problem-solving.
2017
72 citations
Predictably Angry—Facial Cues Provide a Credible Signal of Destructive Behavior
van Leeuwen, Noussair, Offerman, Suetens, van Veelen, van de Ven
Evolutionary explanations of anger as a commitment device hinge on two key assumptions. The first is that it is predictable, ex ante, whether someone will get angry when feeling that he or she has been badly treated. The second is that anger is associated with destructive behavior. We test the validity of these two assumptions. We collected photos of responders in an ultimatum game before they were informed about the game that they would be playing, and we filmed responders with webcams during play. We then showed pairs of photos, each consisting of one responder who rejected and one who accepted a low offer, to an independent group of observers. We find that observers are better than chance at detecting who rejected the low offer, performing 10% better than random guessing. We also find that anger at receiving a low offer is associated with rejection.
2016
38 citations
It’s a two-way street: Automatic and controlled processes in children’s emotional responses to moral transgressions
Dys & Malti
This study examined children’s automatic, spontaneous emotional reactions to everyday moral transgressions and their relations with self-reported emotions, which are more complex and infused with controlled cognition. We presented children with six everyday moral transgression scenarios in an experimental setting, and both their spontaneous facial emotional reactions and self-reported emotions in the role of the transgressor were recorded. We found that across age, self-reported guilt was positively associated with spontaneous fear, and self-reported anger was positively related to spontaneous sadness. In addition, we found a developmental increase in spontaneous sadness and decrease in spontaneous happiness. These results support the importance of automatic and controlled processes in evoking children’s emotional responses to everyday moral transgressions. We conclude by providing potential explanations for how automatic and controlled processes function in children’s everyday moral experiences and how these processes may change with age.
2016
30 citations
More emotional facial expressions during episodic than during semantic autobiographical retrieval
El Haj, Antoine, Nandrino
There is a substantial body of research on the relationship between emotion and autobiographical memory. Using facial analysis software, our study addressed this relationship by investigating basic emotional facial expressions that may be detected during autobiographical recall. Participants were asked to retrieve 3 autobiographical memories, each of which was triggered by one of the following cue words: happy, sad, and city. The autobiographical recall was analyzed with facial analysis software that detects and classifies basic emotional expressions. Analyses showed that emotional cues triggered the corresponding basic facial expressions. Furthermore, we dissociated episodic and semantic retrieval, observing more emotional facial expressions during episodic than during semantic retrieval, regardless of the emotional valence of cues. Our study provides insight into facial expressions that are associated with emotional autobiographical memory. It also highlights an ecological tool to reveal physiological changes that are associated with emotion and memory.