Publication Tag: Cognitive Science

An overview of all publications that have the tag you selected.

2017
72 citations
Predictably Angry—Facial Cues Provide a Credible Signal of Destructive Behavior
van Leeuwen, Noussair, Offerman, Suetens, van Veelen, van de Ven
Evolutionary explanations of anger as a commitment device hinge on two key assumptions. The first is that it is predictable, ex ante, whether someone will get angry when feeling that he or she has been badly treated. The second is that anger is associated with destructive behavior. We test the validity of these two assumptions. We collected photos of responders in an ultimatum game before they were informed about the game that they would be playing, and we filmed responders with webcams during play. We then showed an independent group of observers pairs of photos, each consisting of one responder who rejected a low offer and one who accepted it. We find that observers are better than chance at detecting who rejected the low offer, performing about 10% better than random guessing. We also find that anger at receiving a low offer is associated with rejection.
2014
132 citations
Risk Aversion and Emotions
Nguyen & Noussair
We consider the relationship between emotions and decision-making under risk. Specifically, we examine the emotional correlates of risk-averse decisions. In our experiment, individuals’ facial expressions are monitored with face reading software, as they are presented with risky lotteries. We then correlate these facial expressions with subsequent decisions in risky choice tasks. We find that a more positive emotional state is positively correlated with greater risk taking. The strength of a number of emotions, including fear, happiness, anger and surprise, is positively correlated with risk aversion.
2012
106 citations
Affective learning: Empathetic agents with emotional facial and tone of voice expressions
Moridis & Economides
Empathetic behavior is considered an effective method for Embodied Conversational Agents (ECAs) to provide feedback to learners’ emotions. This study examines the impact of ECAs’ emotional facial and tone of voice expressions, combined with empathetic verbal behavior, when displayed as feedback to students’ fear, sadness, and happiness during a self-assessment test. Three identical female agents were used: (1) an ECA performing parallel empathy with neutral emotional expressions; (2) an ECA performing parallel empathy while displaying emotional expressions relevant to the student’s emotional state; and (3) an ECA performing parallel empathy by displaying relevant emotional expressions followed by reactive empathy expressions aimed at altering the student’s emotional state. Results indicate that an agent performing parallel empathy with emotional expressions relevant to the student’s state may cause the emotion to persist. Moreover, the agent performing both parallel and reactive empathy effectively altered a fearful emotional state to a neutral one.
2017
20 citations
Clusters of Nonverbal Behaviors Differ According to Type of Question and Veracity in Investigative Interviews in a Mock Crime Context
Matsumoto & Hwang
Evaluating truthfulness and detecting deception is a capstone skill of criminal justice professionals, and researchers have long examined nonverbal cues to aid in such determinations. This paper examines the notion that testing clusters of nonverbal behaviors is a more fruitful way of making such determinations than single, specific behaviors. Participants from four ethnic groups participated in a mock crime and either told the truth or lied in an investigative interview. Fourteen nonverbal behaviors of the interviewees were coded from the interviews; differences in the behaviors were tested according to type of question and veracity condition. Different types of questions produced different nonverbal reactions. Clusters of nonverbal behaviors differentiated truth tellers from liars, and the specific clusters were moderated by question. Accuracy rates ranged from 62.6 to 72.5% and were above deception detection accuracy rates for humans and random data. These findings have implications for practitioners as well as future research and theory.
2014
96 citations
Dynamics of autonomic nervous system responses and facial expressions to odors
He, Boesveldt, de Graaf, de Wijk
This study investigates the temporal dynamics of autonomic nervous system (ANS) responses and facial expressions elicited by olfactory stimuli. Participants were exposed to pleasant and unpleasant odors while their ANS responses and facial expressions were recorded. Results indicated that unpleasant odors triggered immediate ANS activation and facial expressions of disgust, whereas pleasant odors elicited more gradual ANS responses and facial expressions of pleasure. These findings suggest that the human body reacts more swiftly and intensely to negative olfactory stimuli, highlighting the adaptive significance of rapid detection and response to potentially harmful substances. The study provides insights into the interplay between olfactory perception, emotional processing, and physiological reactions.
2018
11 citations
Multimodal Observation and Classification of People Engaged in Problem Solving: Application to Chess Players
Guntz, Balzarini, Vaufreydaz, Crowley
In this paper, the authors present initial results from a pilot experiment aimed at interpreting multimodal observations of human experts engaged in solving challenging chess problems. The study investigates how observations of eye-gaze, posture, emotion, and other physiological signals can be utilized to model the cognitive state of subjects. It also explores the integration of multiple sensor modalities to enhance the reliability of detecting human displays of awareness and emotion. Potential applications for such cognitive model-based systems include promoting healthy autonomous aging and developing automated training systems. By observing chess players tackling problems of increasing difficulty and recording their behavior, the researchers aim to estimate participants’ situational awareness and predict their ability to respond effectively to challenging situations. Feature selection was performed to construct a multimodal classifier that relies on the most relevant features from each modality. Initial results indicate that eye-gaze, body posture, and emotion are effective features for capturing such awareness. This experiment also validates the use of the equipment as a general and reproducible tool for studying participants engaged in screen-based interaction and problem-solving.
2017
5 citations
Multimodal Observation and Interpretation of Subjects Engaged in Problem Solving
Guntz, Balzarini, Vaufreydaz, Crowley
In this paper, the authors present initial findings from a pilot experiment aimed at capturing and interpreting multimodal signals from human experts solving challenging chess problems. The study investigates how observations of eye-gaze, posture, emotion, and other physiological signals can model the cognitive state of subjects. It also explores integrating multiple sensor modalities to enhance the reliability of detecting human displays of awareness and emotion. Chess players were observed tackling problems of increasing difficulty while their behavior was recorded. These recordings help estimate a participant’s situational awareness and predict their ability to respond effectively to challenging situations. The results indicate that a multimodal approach is more accurate than a unimodal one. By combining body posture, visual attention, and emotion, the multimodal approach achieved up to 93% accuracy in determining a player’s chess expertise, compared to 86% with a unimodal approach. The experiment also validates the use of the equipment as a general and reproducible tool for studying participants engaged in screen-based interaction and problem-solving.
2017
6 citations
Memory Effect in Expressed Emotions During Long Term Group Interactions
Gorbunov, Barakova, Rauterberg
Long-term group interactions can be assessed through games where participants choose between cooperative or egoistic strategies. This study analyzed facial expressions of astronauts during repeated game interactions in the Mars-500 isolation experiment. Using FaceReader software, the researchers identified a memory effect in collective emotional expressions between experiments separated by two weeks. This suggests the potential to predict the development of interpersonal relationships in isolated groups, informing the design of long-term interaction behaviors in artificial agents.
2015
16 citations
Deceit and facial expression in children: the enabling role of the “poker face” child and the dependent personality of the detector.
Gadea, Alino, Espert, Salvador
This study examines the interplay between children’s deceptive behaviors and their facial expressions, focusing on the “poker face”—a neutral expression that conceals emotions. It also explores how the personality traits of the observer, particularly dependency, influence the detection of deceit. The research suggests that children who can maintain a poker face are more successful in deceiving others. Additionally, observers with dependent personalities may be less adept at identifying deceit, potentially due to their reliance on others and desire for approval. These findings highlight the complex dynamics between a child’s ability to mask emotions and the observer’s personality in the context of deception.
2016
30 citations
More emotional facial expressions during episodic than during semantic autobiographical retrieval
El Haj, Antoine, Nandrino
There is a substantial body of research on the relationship between emotion and autobiographical memory. Using facial analysis software, our study addressed this relationship by investigating basic emotional facial expressions that may be detected during autobiographical recall. Participants were asked to retrieve three autobiographical memories, each of which was triggered by one of the following cue words: happy, sad, and city. The autobiographical recall was analyzed with software for facial analysis that detects and classifies basic emotional expressions. Analyses showed that emotional cues triggered the corresponding basic facial expressions. Furthermore, we dissociated episodic and semantic retrieval, observing more emotional facial expressions during episodic than during semantic retrieval, regardless of the emotional valence of cues. Our study provides insight into facial expressions that are associated with emotional autobiographical memory. It also highlights an ecological tool to reveal physiological changes that are associated with emotion and memory.