Scientific publications

Read about the research that supports the FaceReader Ecosystem

Over the past 20+ years, our facial coding platform and its embedded technologies have been both the subject of and the preferred instrument for numerous accredited scientific studies. Below we present a comprehensive overview of the literature that has emerged from these studies, highlighting and validating the cutting-edge technology of FaceReader Online.
2015
16 citations
Deceit and facial expression in children: the enabling role of the “poker face” child and the dependent personality of the detector.
Gadea, Alino, Espert, Salvador
This study examines the interplay between children’s deceptive behaviors and their facial expressions, focusing on the “poker face”—a neutral expression that conceals emotions. It also explores how the personality traits of the observer, particularly dependency, influence the detection of deceit. The research suggests that children who can maintain a poker face are more successful in deceiving others. Additionally, observers with dependent personalities may be less adept at identifying deceit, potentially due to their reliance on others and desire for approval. These findings highlight the complex dynamics between a child’s ability to mask emotions and the observer’s personality in the context of deception.
2015
216 citations
Deep learning based FACS Action Unit occurrence and intensity estimation
A. Gudi, H. E. Tasli, T. M. den Uyl and A. Maroulis
Ground truth annotation of the occurrence and intensity of FACS Action Unit activation requires a great amount of attention. Efforts towards a common platform for AU evaluation have been addressed in the FG 2015 Facial Expression Recognition and Analysis (FERA 2015) challenge, in which participants are invited to estimate AU occurrence and intensity on a common benchmark dataset. Conventional approaches to automating this task are to train multiclass classifiers or to use regression models. In this paper, we propose a novel application of a deep convolutional neural network to recognize AUs as part of the FERA 2015 challenge. The 7-layer network is composed of 3 convolutional layers and a max-pooling layer; the final fully connected layers provide the classification output. For the selected tasks of the challenge, we trained two different networks for the two datasets: one focuses on AU occurrences and the other on both occurrences and intensities of the AUs. The occurrence and intensity of AU activation are estimated from specific neuron activations of the output layer. In this way, a single network architecture can simultaneously be trained to produce binary and continuous classification output.
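The architecture described above can be sketched as follows. This is a minimal PyTorch illustration, not the authors' implementation: the filter sizes, channel counts, hidden-layer width, input resolution, and number of AUs (12) are all assumptions chosen for the sketch. The key idea it demonstrates is the paper's single-network output scheme, where dedicated output neurons encode binary AU occurrence and continuous AU intensity side by side.

```python
import torch
import torch.nn as nn

class AUNet(nn.Module):
    """Sketch of a 7-layer CNN in the spirit of Gudi et al. (FERA 2015):
    3 convolutional layers, one max-pooling layer, and fully connected
    layers whose output neurons encode AU occurrence and intensity."""

    def __init__(self, n_aus: int = 12):
        super().__init__()
        self.n_aus = n_aus
        self.features = nn.Sequential(
            nn.Conv2d(1, 64, kernel_size=5), nn.ReLU(),   # conv 1
            nn.MaxPool2d(kernel_size=3, stride=2),        # single pooling stage
            nn.Conv2d(64, 64, kernel_size=5), nn.ReLU(),  # conv 2
            nn.Conv2d(64, 128, kernel_size=4), nn.ReLU(), # conv 3
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(3072), nn.ReLU(),   # fully connected layer
            nn.Linear(3072, 2 * n_aus),       # 2 output neurons per AU
        )

    def forward(self, x):
        out = self.classifier(self.features(x))
        # First n_aus neurons: occurrence probability (binary task).
        occurrence = torch.sigmoid(out[:, : self.n_aus])
        # Remaining n_aus neurons: raw intensity estimate (continuous task).
        intensity = out[:, self.n_aus :]
        return occurrence, intensity

net = AUNet()
occ, inten = net(torch.randn(2, 1, 48, 48))  # batch of 2 grayscale face crops
```

Training would combine a binary loss on `occurrence` with a regression loss on `intensity`, which is how one network can serve both sub-challenges at once.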
2015
52 citations
Don’t look blank, happy, or sad: Patterns of facial expressions of speakers in banks’ YouTube videos predict video’s popularity over time
P. Lewinski
There has been little focus on nonverbal communication in social media advertising campaigns. We propose that specific patterns of facial expressions predict the popularity of YouTube videos among users of social media. To test that proposition, we used a neuromarketing tool—FaceReader—to code facial videos of professional speakers who participated in the YouTube social media campaigns of 2 large commercial banks. We analyzed more than 25,000 video frames of 16 speakers’ 6 basic facial expressions. We found that a lower incidence of affiliative facial emotions and a higher incidence of nonemotional expressions explained an additional 25% of variance in a video’s popularity after 8 months (t2), with popularity at t1 as the only baseline predictor. We further showed that the disaffiliative facial emotions of the speakers did not contribute as an indicator of the future performance of social media content. We hope that these findings will open new lines of research in corporate communication by incorporating neuromarketing and nonverbal communication to understand not only what content is effective but how it should be presented.
2015
72 citations
The relation between continuous and discrete emotional responses to food odors with facial expressions and non-verbal reports
He, Boesveldt, de Graaf, de Wijk
Traditional sensory and hedonic tests are often of limited value in predicting market performance. Investigating emotional responses to food stimuli may contribute to a better understanding of consumers’ eating behavior. In the present study, 26 female participants were exposed to an orange (pleasant) and a fish (unpleasant) odor presented in three different concentrations perceived as weak, medium, and strong intensity, in a semi-random order via an olfactometer. Emotional responses to those food odors were measured discretely using non-verbal subjective reports, and continuously using facial expressions. Non-verbal reports reflected primarily the odor’s valence, with positive emotions, such as joy, satisfaction, and hope, related to orange, and negative emotions, such as dissatisfaction, fear, and disgust, related to fish. Facial expressions varied dynamically over the 4 s following stimulation, whereby expressions at 1250 and 2000 ms were associated most strongly with odor valence and odor intensity, respectively. The correlation between non-verbal subjective reports and facial expressions reached a maximum during the second second after exposure. Pleasant odors were associated with neutral and surprised expressions, and with fewer expressions of disgust. More intense odors were associated with fewer neutral expressions and more expressions of disgust. Facial expressions reflect the dynamic sequential unfolding of different emotional responses, whereas non-verbal reports primarily reflect the end result of valence appraisal. The distinction between initial and subsequent reactions detected by facial expressions may offer a valuable new perspective for sensory and consumer research.
2015
11 citations
Consumers’ Economic Behavior and Emotions: The Case of iPhone 6 in Neuromarketing
Neto & Filipe
In the current era, consumers’ fascination with many notable brands in the market is rising considerably. Many companies try to consistently reproduce this effect in their current customers and thus seek to create a strong identification with their brand, which allows the company to add new economic value. However, the vast majority of these companies encounter limitations in how far traditional marketing can achieve the emotional state necessary to generate brand identification. It is therefore necessary to go further and use more effective tools for the study of consumer behavior. An interesting possibility is the use of neuromarketing, which emphasizes research methods for studying people’s emotional responses when facing stimuli related to a specific brand. From this analysis, companies may adjust their commercial and economic strategy to take advantage of their brand’s competitive positioning in the market.
2015
56 citations
Presence of life-like robot expressions influences children’s enjoyment of human-robot interactions in the field
Cameron, Fernando, Collins, Millings, Moore, Sharkey, Evers, Prescott
Emotions and their expression significantly influence social interactions, making them crucial in the development of social robots. This study, part of a collaborative EU project, investigated how lifelike affective facial expressions in the humanoid robot Zeno affect children’s behaviors and attitudes. Findings revealed gender-based differences: male participants exhibited positive affective responses and a greater liking for the robot when it displayed both positive and negative facial expressions during an interactive game, compared to a neutral expression. Female participants showed no significant differences between conditions. This research is the first to demonstrate the impact of lifelike emotional expressions on children’s behavior in real-world settings. The study discusses broader implications, emphasizing gender differences in human-robot interaction and the importance of the robot’s gender appearance. It contributes to advancing the understanding of how interactions with expressive robots can foster task-appropriate symbiotic relationships.
