Publication Tag: Human-Computer Interaction

An overview of all publications with this tag.

2006
113 citations
The FaceReader: Measuring instant fun of use
Zaman & Shrimpton-Smith
Recently, more and more attention has been paid to emotions in the domain of Human-Computer Interaction. When evaluating a product, one can no longer ignore the emotions the product induces. This paper examines the value of a new instrument to measure emotions: the FaceReader. We assess the extent to which the FaceReader is useful when conducting usability evaluations by comparing its data with two other sources: user questionnaires and the researcher’s logs. Preliminary analysis shows that the FaceReader is an effective tool for measuring instant emotions and fun of use, although it needs to be combined with another observation method. As regards the user questionnaire, our results indicate that it reflects the content of the application or the outcome of a task rather than an accurate self-report of how the user felt while accomplishing the task.
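The comparison at the heart of this paper, instrument output against self-report, can be illustrated with a minimal sketch. The variable names and values below are hypothetical placeholders, not the paper's actual data:

```python
import numpy as np

# Hypothetical per-task scores: mean FaceReader valence during each task
# and the user's post-task questionnaire rating (1-7). Illustrative only.
facereader_valence = np.array([0.42, -0.10, 0.15, 0.55, -0.30, 0.05])
questionnaire_score = np.array([6, 5, 5, 6, 4, 5])

# A weak correlation would support the paper's point: questionnaires
# reflect task outcome rather than moment-to-moment emotion.
r = np.corrcoef(facereader_valence, questionnaire_score)[0, 1]
print(f"Pearson r between instant emotion and self-report: {r:.2f}")
```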
2017
39 citations
The Emotional, Cognitive, Physiological, and Performance Effects of Variable Time Delay in Robotic Teleoperation
Yang & Dorneich
The study investigates the impact of intermittent and variable time delays on operators during robotic teleoperation tasks. Participants navigated a remote-controlled robot through mazes of varying complexity while identifying targets. Introducing feedback lag led to increased frustration, anger, and cognitive workload, while decreasing usability and task performance. The effects of variable time delay were more pronounced than those of task complexity, and their combined impact was additive. Understanding these emotional and physiological responses is crucial for designing robotic systems that can effectively mitigate negative operator states.
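The manipulation studied here, intermittent and variable feedback lag, can be emulated in simulation with a simple delivery-time model. This is a generic sketch, not the authors' apparatus; tick granularity and delay bounds are arbitrary:

```python
import random

def delayed_frames(frames, min_delay=1, max_delay=5, seed=0):
    """Attach a variable delivery time to each frame, preserving order.

    Frame i, sent at tick i, is delivered at tick i + delay_i, but never
    before the previous frame (video feedback stays in order), so the
    operator experiences bursts and stalls rather than reordered frames.
    """
    rng = random.Random(seed)
    last_delivery = 0
    for tick, frame in enumerate(frames):
        delivery = max(last_delivery, tick + rng.randint(min_delay, max_delay))
        last_delivery = delivery
        yield delivery, frame

for delivered_at, frame in delayed_frames(range(10)):
    print(f"frame {frame} delivered at tick {delivered_at}")
```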
2007
36 citations
Unobtrusive Multimodal Emotion Detection in Adaptive Interfaces: Speech and Facial Expressions
Truong, van Leeuwen, Neerincx
Two unobtrusive modalities for automatic emotion recognition are discussed: speech and facial expressions. First, an overview is given of emotion recognition studies based on a combination of speech and facial expressions. We then identify difficulties concerning data collection, data fusion, system evaluation, and emotion annotation that one is most likely to encounter in emotion recognition research, as well as possible applications for emotion recognition, such as health monitoring and e-learning systems. Finally, we discuss the growing need for agreed standards in automatic emotion recognition research.
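Data fusion, one of the difficulties this overview names, is often handled at the decision level by combining per-modality classifier outputs. A minimal late-fusion sketch; the label set and weights are illustrative, not from the paper:

```python
import numpy as np

EMOTIONS = ["neutral", "happy", "angry"]  # illustrative label set

def late_fusion(p_speech, p_face, w_speech=0.4, w_face=0.6):
    """Combine per-modality class posteriors by weighted averaging.

    p_speech, p_face: probability vectors over EMOTIONS from two
    independent classifiers. The weights (fixed here) would normally be
    tuned on validation data to favor the more reliable modality.
    """
    fused = w_speech * np.asarray(p_speech) + w_face * np.asarray(p_face)
    return EMOTIONS[int(np.argmax(fused))], fused

label, scores = late_fusion([0.2, 0.5, 0.3], [0.1, 0.3, 0.6])
print(label, scores)  # -> angry [0.14 0.38 0.48]
```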
2012
127 citations
The effect of emotional feedback on behavioral intention to use computer based assessment
Terzis, Moridis, Economides
This study introduces emotional feedback as a construct in an acceptance model, exploring its effect on behavioral intention to use Computer Based Assessment (CBA). A female Embodied Conversational Agent with empathetic encouragement behavior was displayed as emotional feedback. The research investigates the impact of Emotional Feedback on Behavioral Intention to Use a CBA system, Perceived Playfulness, Perceived Usefulness, Perceived Ease of Use, Content, and Facilitating Conditions. A survey questionnaire was completed by 134 students. Results demonstrate that Emotional Feedback has a direct effect on Behavioral Intention to Use a CBA system and on other crucial determinants of Behavioral Intention. The proposed acceptance model for computer-based assessment, extended with the Emotional Feedback variable, explains approximately 52% of the variance of Behavioral Intention.
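The "52% of the variance" figure is an R² statistic for the Behavioral Intention construct. As a toy illustration of how such a figure arises (synthetic data, ordinary least squares standing in for the structural model actually used in acceptance research):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 134  # same sample size as the study; the data here are synthetic

# Synthetic standardized construct scores (e.g., Emotional Feedback,
# Perceived Usefulness, Perceived Ease of Use). Illustrative only.
X = rng.normal(size=(n, 3))
# Behavioral Intention generated with noise, so R^2 lands below 1.
y = 0.5 * X[:, 0] + 0.4 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(scale=0.7, size=n)

# Ordinary least squares via lstsq (with intercept column).
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
residuals = y - A @ beta
r_squared = 1 - residuals.var() / y.var()
print(f"R^2 = {r_squared:.2f}")  # share of BI variance explained
```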
2012
56 citations
UX_Mate: From Facial Expressions to UX Evaluation
Staiano, Menéndez, Battocchi, De Angeli, Sebe
In this paper, the authors propose and evaluate UX_Mate, a non-invasive system for the automatic assessment of User eXperience (UX). Additionally, they contribute a novel database of annotated and synchronized videos of interactive behavior and facial expressions. UX_Mate is a modular system that tracks users’ facial expressions, interprets them based on pre-set rules, and generates predictions about the occurrence of target emotional states, which can be linked to interaction events. The system simplifies UX evaluation by providing indications of event occurrences. UX_Mate offers several advantages over other state-of-the-art systems: easy deployment in the user’s natural environment, avoidance of invasive devices, and significant cost reduction. The paper reports on a pilot and a validation study involving a total of 46 users, where UX_Mate was used to identify interaction difficulties. The studies show encouraging results that open possibilities for automatic real-time UX evaluation in ecological environments.
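The "pre-set rules" pipeline can be pictured as a mapping from tracked facial Action Units (AUs) to target states. The AU combinations below follow common FACS conventions (AU4 brow lowerer, AU12 lip corner puller) but are illustrative, not UX_Mate's actual rule set:

```python
# Illustrative rule table: emotion -> AUs that must all exceed a
# threshold in the same frame. Not UX_Mate's actual rules.
RULES = {
    "frustration": {"AU4": 0.5, "AU7": 0.3},   # brow lowerer + lid tightener
    "delight":     {"AU6": 0.4, "AU12": 0.5},  # cheek raiser + lip corner puller
}

def detect_events(frames):
    """Scan per-frame AU intensities and emit (timestamp, emotion) events.

    frames: iterable of (timestamp, {AU name: intensity in [0, 1]}).
    An event fires when every AU in a rule meets its threshold; a UX
    evaluator could then align these events with interaction logs.
    """
    for t, aus in frames:
        for emotion, rule in RULES.items():
            if all(aus.get(au, 0.0) >= thr for au, thr in rule.items()):
                yield t, emotion

stream = [(0.0, {"AU4": 0.7, "AU7": 0.4}), (0.5, {"AU6": 0.6, "AU12": 0.8})]
print(list(detect_events(stream)))  # [(0.0, 'frustration'), (0.5, 'delight')]
```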
2014
16 citations
Emotion and Interface Design
Lockner & Bonnardel
The emotional experience of an interactive system has garnered significant interest in the HCI community. While predicting or controlling these experiences through design choices is challenging, our user study offers a different perspective. We found that certain controllable aspects of interactive products consistently elicited specific emotional responses from participants, despite influences from contextual factors. This paper discusses these findings and their implications for designing emotional experiences in interactive devices.
2015
58 citations
EEVEE: the Empathy-Enhancing Virtual Evolving Environment
Jackson, Michon, Geslin, Carignan, Beaudoin
Empathy is a complex emotional and cognitive faculty often impaired in various psychopathologies, such as schizophrenia, and challenging to measure in real-world contexts. To address this, we developed the Empathy-Enhancing Virtual Evolving Environment (EEVEE), a platform comprising: avatars capable of expressing emotions at varying intensities based on the Facial Action Coding System (FACS); systems for measuring observers’ physiological responses; and a multimodal interface linking avatar behavior to observer responses. Validation data indicate that healthy adults can discern different negative emotions, including pain, expressed by avatars at varying intensities. Additionally, masking parts of an avatar’s face does not hinder the detection of different pain levels. EEVEE offers a unique tool to study and potentially modulate empathy in an ecological manner across various populations, notably those with neurological or psychiatric disorders.
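Expressing an emotion "at varying intensities" typically means scaling a set of FACS Action Unit activations between a neutral and a peak pose. A minimal blend sketch; the AUs listed (brow lowering and orbit tightening are commonly associated with pain expression) and their peak values are illustrative, not EEVEE's actual parameters:

```python
# Illustrative peak AU activations for a pain expression.
PAIN_PEAK = {"AU4": 1.0, "AU6": 0.8, "AU7": 0.7, "AU9": 0.6, "AU10": 0.5}

def expression_at(intensity, peak=PAIN_PEAK):
    """Linearly interpolate from neutral (all AUs at 0) to the peak pose.

    intensity in [0, 1]; the result would drive an avatar's blend shapes.
    """
    intensity = max(0.0, min(1.0, intensity))
    return {au: value * intensity for au, value in peak.items()}

for level in (0.25, 0.5, 1.0):
    print(level, expression_at(level))
```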
2018
11 citations
Multimodal Observation and Classification of People Engaged in Problem Solving: Application to Chess Players
Guntz, Balzarini, Vaufreydaz, Crowley
In this paper, the authors present initial results from a pilot experiment aimed at interpreting multimodal observations of human experts engaged in solving challenging chess problems. The study investigates how observations of eye-gaze, posture, emotion, and other physiological signals can be utilized to model the cognitive state of subjects. It also explores the integration of multiple sensor modalities to enhance the reliability of detecting human displays of awareness and emotion. Potential applications for such cognitive model-based systems include promoting healthy autonomous aging and developing automated training systems. By observing chess players tackling problems of increasing difficulty and recording their behavior, the researchers aim to estimate participants’ situational awareness and predict their ability to respond effectively to challenging situations. Feature selection was performed to construct a multimodal classifier that relies on the most relevant features from each modality. Initial results indicate that eye-gaze, body posture, and emotion are effective features for capturing such awareness. This experiment also validates the use of the equipment as a general and reproducible tool for studying participants engaged in screen-based interaction and problem-solving.
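The feature-selection step described here, keeping the most informative features from each modality before classification, can be sketched with scikit-learn. The synthetic data and the binary label stand in for the paper's eye-gaze, posture, and emotion streams:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)

# Synthetic stand-in for per-trial features concatenated across
# modalities (e.g., gaze dispersion, posture sway, valence statistics).
X = rng.normal(size=(120, 30))
# Hypothetical binary label, e.g., whether the player solved the problem.
y = (X[:, 0] + 0.8 * X[:, 5] + rng.normal(scale=0.5, size=120) > 0).astype(int)

clf = Pipeline([
    ("select", SelectKBest(f_classif, k=8)),  # keep 8 most relevant features
    ("model", LogisticRegression(max_iter=1000)),
])
scores = cross_val_score(clf, X, y, cv=5)
print(f"accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```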
2012
29 citations
Relating Perceived Web Page Complexity to Emotional Valence and Eye Movement Metrics
Goldberg
Initial impressions of visual complexity significantly impact both consumer and enterprise web page designs. To integrate complexity assessment methods into usability tools, a study was conducted comparing subjective ratings, eye tracking, JPEG-compressed file size, and emotional valence measures. Professional enterprise users performed search tasks and evaluated web page complexity. Multivariate factor analysis and ordinal logistic regressions on subjective ratings revealed that perceptions of page complexity were influenced by self-assessed search difficulty and page density. Lower complexity ratings correlated with increased fixation durations and reduced search areas. Facial analysis indicated that aggregated emotional valence improved with higher clarity ratings. Overall, both pre-attentive eye tracking and emotional valence measures were associated with conscious subjective complexity judgments. Further research is recommended to attribute complexity-inducing features to measurable qualities.
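One of the compared measures, JPEG-compressed file size, works as a cheap proxy for visual complexity because busier pages compress less well. A minimal sketch using Pillow; the screenshot path is a placeholder:

```python
import io
from PIL import Image

def jpeg_complexity(path, quality=75):
    """Return the JPEG-compressed byte size of a page screenshot.

    At a fixed quality setting, a larger size indicates more visual
    detail; this study compared such a measure against subjective
    complexity ratings and eye-movement metrics.
    """
    img = Image.open(path).convert("RGB")
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    return buf.tell()

# Usage (path is hypothetical):
# print(jpeg_complexity("page_screenshot.png"))
```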
2014
66 citations
Measuring Software Screen Complexity: Relating Eye Tracking, Emotional Valence, and Subjective Ratings
Goldberg
