Scientific publications
Read about the research that supports the FaceReader Ecosystem
Over the past 20+ years, our facial coding platform and its embedded technologies have been both the subject of and the preferred instrument for numerous peer-reviewed scientific studies. Below we present a comprehensive overview of the literature that has emerged from these studies, highlighting and validating the cutting-edge technology of FaceReader Online.
2018
44 citations
The effects of robot facial emotional expressions and gender on child–robot interaction in a field study
Cameron, Millings, Fernando, Collins, Moore, Sharkey, Evers, Prescott
Emotions and emotional expression significantly influence social interactions, making them crucial in the development of social robots. This study investigated how life-like affective facial expressions in the humanoid robot Zeno affect children’s behavior and attitudes toward the robot. Findings reveal that robot expressions have varying effects based on participant gender. Male participants interacting with a responsive, facially expressive robot exhibited positive affective responses and reported greater liking toward the robot compared to those interacting with the same robot maintaining a neutral expression. Female participants showed no significant differences across conditions. The study discusses broader implications regarding gender differences in human–robot interaction, emphasizing the importance of the robot’s gender appearance and advancing the understanding of how interactions with expressive robots could lead to task-appropriate symbiotic relationships.
2018
35 citations
Conveying facial expressions to blind and visually impaired persons through a wearable vibrotactile device
Buimer, Bittner, Kostelijk, van der Geest, Nemri, van Wezel, Zhao
In face-to-face social interactions, blind and visually impaired persons (VIPs) lack access to nonverbal cues such as facial expressions, body posture, and gestures, which may impair interpersonal communication. In this study, a wearable sensory substitution device (SSD) consisting of a head-mounted camera and a haptic belt was evaluated to determine whether vibrotactile cues around the waist could convey facial expressions to users, and whether VIPs would want such a device in daily living situations. Ten VIPs and ten sighted persons (SPs) participated in the study, in which validated sets of pictures, silent videos, and videos with audio of facial expressions were presented to each participant. A control measurement was first performed to determine how accurately participants could identify facial expressions while relying on their functional senses. After a short training, participants were asked to determine facial expressions while wearing the emotion feedback system. VIPs using the device showed significant improvements in their ability to determine which facial expressions were shown: a significant increase in accuracy of 44.4% was found across all types of stimuli when comparing the scores of the control and supported phases. The greatest improvements achieved with the support of the SSD were found for silent stimuli. SPs also showed consistent, though not statistically significant, improvements while supported. Overall, the study shows that vibrotactile cues are well suited to convey facial expressions to VIPs in real time, and that participants became skilled with the device after a short training session. Further testing and development of the SSD is required to improve its accuracy and aesthetics for potential daily use.
2018
11 citations
Multimodal Observation and Classification of People Engaged in Problem Solving: Application to Chess Players
Guntz, Balzarini, Vaufreydaz, Crowley
In this paper, the authors present initial results from a pilot experiment aimed at interpreting multimodal observations of human experts engaged in solving challenging chess problems. The study investigates how observations of eye-gaze, posture, emotion, and other physiological signals can be utilized to model the cognitive state of subjects. It also explores the integration of multiple sensor modalities to enhance the reliability of detecting human displays of awareness and emotion. Potential applications for such cognitive model-based systems include promoting healthy autonomous aging and developing automated training systems. By observing chess players tackling problems of increasing difficulty and recording their behavior, the researchers aim to estimate participants’ situational awareness and predict their ability to respond effectively to challenging situations. Feature selection was performed to construct a multimodal classifier that relies on the most relevant features from each modality. Initial results indicate that eye-gaze, body posture, and emotion are effective features for capturing such awareness. This experiment also validates the use of the equipment as a general and reproducible tool for studying participants engaged in screen-based interaction and problem-solving.
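The feature-selection-plus-fusion pipeline described above can be sketched in a few lines. This is an illustrative sketch only, not the authors' implementation: the feature counts, the difference-of-class-means scoring, and the nearest-centroid classifier are assumptions chosen to keep the example self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120
y = rng.integers(0, 2, size=n)  # hypothetical label, e.g. "aware" vs. "not aware"

# Hypothetical per-modality feature matrices for n trials;
# for illustration, only the gaze features carry class signal.
gaze = rng.normal(size=(n, 8)) + y[:, None] * 1.5
posture = rng.normal(size=(n, 6))
emotion = rng.normal(size=(n, 7))

# Early fusion: concatenate the modalities into one feature vector.
X = np.hstack([gaze, posture, emotion])

# Score each feature by the absolute difference of class means
# (a simple stand-in for univariate feature selection).
scores = np.abs(X[y == 1].mean(axis=0) - X[y == 0].mean(axis=0))
top = np.argsort(scores)[::-1][:8]  # keep the 8 most relevant features
Xs = X[:, top]

# Nearest-centroid classifier trained on the selected features.
c0, c1 = Xs[y == 0].mean(axis=0), Xs[y == 1].mean(axis=0)
pred = (np.linalg.norm(Xs - c1, axis=1) < np.linalg.norm(Xs - c0, axis=1)).astype(int)
acc = (pred == y).mean()
```

With this synthetic setup the selected features come almost entirely from the gaze block, mirroring the paper's finding that some modalities are far more informative than others.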
2017
16 citations
Positive facial expressions during retrieval of self-defining memories
Gandolphe, Nandrino, Delelis, Ducro, Lavallee, Saloppe, Moustafa, El Haj
This study investigates facial expressions during the retrieval of self-defining memories—vivid and emotionally intense memories of enduring concerns or unresolved conflicts. Participants self-rated the emotional valence of their self-defining memories, while autobiographical retrieval was analyzed using facial analysis software. This software synthesizes facial expression information to categorize expressions as neutral, happy, sad, surprised, angry, scared, or disgusted. Findings revealed that participants exhibited more emotional than neutral facial expressions during the retrieval of self-defining memories, with a predominance of positive over negative expressions. Interestingly, participants attributed positive valence to the retrieved memories. These findings demonstrate a consistency between facial expressions and the subjective emotional experience of self-defining memories, providing valuable physiological insights into the emotional experience of the past.
2017
66 citations
The Effect of Parental Modeling on Child Pain Responses: The Role of Parent and Child Sex
Boerner, Chambers, McGrath, LoLordo, Uher
Social modeling is a process by which pain behaviors are learned, and research has found that parents act as models for their children’s behavior. Although social learning theory predicts that same-sex models have a greater effect, no experimental investigation to date has examined the role of the sex of the model or observer in social learning of pediatric pain. The present study recruited 168 parent–child dyads in which children were generally healthy and 6 to 8 years old. Unbeknownst to their child, parents were randomly assigned to exaggerate their expression of pain, minimize their expression of pain, or act naturally during the cold pressor task (CPT). Parents completed the CPT while their child observed; children then completed the CPT themselves. Children whose parents were in the exaggerate condition reported higher anxiety than children of parents in the minimize condition. Additionally, girls in the exaggerate condition rated their overall pain intensity during the CPT significantly higher than boys in the same condition. No child sex differences were observed in pain intensity for the control or minimize conditions. Parental expressions of pain thus affect children’s anxiety, and parental exaggeration of pain expression has sex-specific effects on children’s own subsequent pain experience.
Perspective: This article describes how parental expressions of pain influence children’s pain and anxiety, specifically examining the relevance of parent and child sex in this process. These findings have implications for children of parents with chronic pain, or situations in which parents experience pain in the presence of their child.
2017
27 citations
Sensory-specific satiety: Added insights from autonomic nervous system responses and facial expressions
He, Boesveldt, Delplanque, de Graaf, de Wijk
As a food is consumed, its perceived pleasantness declines relative to that of other foods. Although this phenomenon, referred to as sensory-specific satiety, is well established through measures of food intake and pleasantness ratings, this study aimed to gain more insight into the mechanisms underlying such cognitive output behavior using two measures from emotion research, namely autonomic nervous system (ANS) responses and facial expressions. Twenty-four healthy female participants visited four times in a hungry state and received four different semi-liquid meals delivered via a time-controlled pump, leading to sensory-specific satiety. Before and after the meals, they were presented with a sip of each of the four test meals while ANS responses and facial expressions were recorded. As expected, pleasantness ratings showed a significant decrease after eating the same meal or a meal similar in taste, and a smaller decrease after eating a meal with a different taste. In general, consumption of the test meals resulted in increased heart rate, reduced skin conductance and skin temperature, and intensified angry and disgusted facial expressions. In addition, skin conductance, skin temperature, and sad and angry expressions also showed effects reflecting sensory-specific satiety. In conclusion, ANS responses and facial expressions indicate that sensory-specific satiety 1) reduces not only the food’s pleasantness but also arousal, and 2) is possibly mediated by changes in food emotions.