Browsing

Publication Tag: User Experience

An overview of all publications that have the tag you selected.

2017
39 citations
The Emotional, Cognitive, Physiological, and Performance Effects of Variable Time Delay in Robotic Teleoperation
Yang & Dorneich
The study investigates the impact of intermittent and variable time delays on operators during robotic teleoperation tasks. Participants navigated a remote-controlled robot through mazes of varying complexity while identifying targets. Introducing feedback lag led to increased frustration, anger, and cognitive workload, while decreasing usability and task performance. The effects of variable time delay were more pronounced than those of task complexity, and their combined impact was additive. Understanding these emotional and physiological responses is crucial for designing robotic systems that can effectively mitigate negative operator states.
2016
8 citations
Toward physiological indices of emotional state driving future ebook interactivity
van Erp, Hogervorst, van der Werf
Ebooks of the future may respond to the emotional experience of the reader. Physiological measures could capture a reader’s emotional state and use it to enhance the reading experience, for instance by adding matching sounds or changing the storyline, thereby creating a hybrid art form between literature and gaming. We describe the theoretical foundation of the emotional and creative brain and review the neurophysiological indices that can be used to drive future ebook interactivity in a real-life situation. As a case study, we report the neurophysiological measurements of a bestselling author during nine days of writing, which can potentially be compared later to those of the readers. In designated calibration blocks, the artist wrote emotional paragraphs for emotional pictures. Analyses showed that we can reliably distinguish writing blocks from resting, but we found no reliable differences related to the emotional content of the writing. The study shows that measurements of EEG, heart rate, skin conductance, facial expression, and subjective ratings can be done over several hours a day and for several days in a row. In follow-up phases, we will measure 300 readers with a similar setup.
2012
56 citations
UX_Mate: From Facial Expressions to UX Evaluation
Staiano, Menéndez, Battocchi, De Angeli, Sebe
In this paper, the authors propose and evaluate UX_Mate, a non-invasive system for the automatic assessment of User eXperience (UX). Additionally, they contribute a novel database of annotated and synchronized videos of interactive behavior and facial expressions. UX_Mate is a modular system that tracks users’ facial expressions, interprets them based on pre-set rules, and generates predictions about the occurrence of target emotional states, which can be linked to interaction events. The system simplifies UX evaluation by providing indications of event occurrences. UX_Mate offers several advantages over other state-of-the-art systems: easy deployment in the user’s natural environment, avoidance of invasive devices, and significant cost reduction. The paper reports on a pilot and a validation study involving a total of 46 users, where UX_Mate was used to identify interaction difficulties. The studies show encouraging results that open possibilities for automatic real-time UX evaluation in ecological environments.
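As an illustration only: the abstract does not give UX_Mate's actual rules, but a rule-based mapping from expression intensities to predicted emotional episodes, linked to the nearest interaction event, could look like the following sketch. The thresholds, field names, and the frustration rule are assumptions, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ExpressionFrame:
    """One timestamped sample of estimated expression intensities (0..1)."""
    t: float          # seconds since session start
    anger: float
    surprise: float
    happiness: float

@dataclass
class InteractionEvent:
    """A logged UI event (click, page load, error dialog, ...)."""
    t: float
    name: str

def detect_frustration(frames: List[ExpressionFrame],
                       threshold: float = 0.4,
                       min_duration: float = 1.0) -> List[float]:
    """Return the start times of episodes where anger stays above
    `threshold` for at least `min_duration` seconds -- one example of a
    pre-set rule applied to an expression trace."""
    hits: List[float] = []
    start: Optional[float] = None
    reported = False
    for f in frames:
        if f.anger >= threshold:
            if start is None:
                start, reported = f.t, False
            if not reported and f.t - start >= min_duration:
                hits.append(start)
                reported = True  # report each episode only once
        else:
            start, reported = None, False
    return hits

def nearest_event(t: float, events: List[InteractionEvent]) -> Optional[InteractionEvent]:
    """Link a detected emotional episode to the closest interaction event."""
    return min(events, key=lambda e: abs(e.t - t)) if events else None
```

A detected episode would then be linked to the closest logged event (for example, a failed form submission) to flag a potential interaction difficulty for the evaluator.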
2014
16 citations
Emotion and Interface Design
Lockner & Bonnardel
The emotional experience of an interactive system has garnered significant interest in the HCI community. While predicting or controlling these experiences through design choices is challenging, our user study offers a different perspective. We found that certain controllable aspects of interactive products consistently elicited specific emotional responses from participants, despite influences from contextual factors. This paper discusses these findings and their implications for designing emotional experiences in interactive devices.
2014
66 citations
Measuring Software Screen Complexity: Relating Eye Tracking, Emotional Valence, and Subjective Ratings
Goldberg
2011
29 citations
Methodologies for Evaluating Player Experience in Game Play
Chu, Yin Wong, Weng Khong
Player experience constitutes one of the most significant factors in determining the success of games. Games that do not provide a compelling user experience rarely attract strong interest from players. The concept of player experience is often used interchangeably with concepts such as fun, flow, fulfillment, enjoyment, engagement, satisfaction, pleasure, and playability. In this paper, we reviewed, analyzed, and discussed the different attributes and methodologies used to evaluate player experience in game play. We summarized the findings in a playability matrix based on an analysis of methodologies for evaluating player experience in game play. The matrix was constructed from a literature analysis and comprises attributes covering qualitative and quantitative, verbal and non-verbal, and empirical and non-empirical methods.
2015
56 citations
Presence of life-like robot expressions influences children’s enjoyment of human-robot interactions in the field
Cameron, Fernando, Collins, Millings, Moore, Sharkey, Evers, Prescott
Emotions and their expression significantly influence social interactions, making them crucial in the development of social robots. This study, part of a collaborative EU project, investigated how lifelike affective facial expressions in the humanoid robot Zeno affect children’s behaviors and attitudes. Findings revealed gender-based differences: male participants exhibited positive affective responses and a greater liking for the robot when it displayed both positive and negative facial expressions during an interactive game, compared to a neutral expression. Female participants showed no significant differences between conditions. This research is the first to demonstrate the impact of lifelike emotional expressions on children’s behavior in real-world settings. The study discusses broader implications, emphasizing gender differences in human-robot interaction and the importance of the robot’s gender appearance. It contributes to advancing the understanding of how interactions with expressive robots can foster task-appropriate symbiotic relationships.
2011
30 citations
Refining a User Behaviour Model Based on the Observation of Emotional States
Aguiar, Vieira, Galy, Mercantini, Santoni
This paper presents a refined user behavior model that incorporates the observation of emotional states to enhance the understanding of user interactions. By integrating emotional state data, the model aims to provide a more comprehensive representation of user behavior, which can be applied to improve user experience and system design. The study demonstrates the effectiveness of this approach through various applications and discusses its implications for future research in user behavior modeling.
2021
14 citations
Complex website tasks increase the expression anger measured with FaceReader Online
L. Talen and T.E. den Uyl
To stand out among the many websites that exist, a website should offer users a good experience. Earlier research found that a good experience influences important user behavior statistics, such as continued use of the website. Complexity seems to play a role in the usability of websites, and a blockage or delay in reaching a goal leads to negative feelings such as frustration. In this study, FaceReader Online, a tool to measure facial expressions via the internet, was used to measure the effect of the complexity of website design and website tasks on the facial expression of anger. Because the expression scores had a low intensity, we calculated a metric for the peak expression. The results indicated that in the more complex tasks the facial expression of anger was higher. These results suggest that the automatically detected facial expression of anger could be used to measure usability aspects of a website.
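The abstract does not state how the peak-expression metric was computed; one plausible formulation (the percentile cutoff is an assumption, not the paper's formula) is to summarize each low-intensity trace by the mean of its highest samples:

```python
import numpy as np

def peak_expression(intensities: np.ndarray, percentile: float = 95.0) -> float:
    """Summarize a low-intensity expression trace by its upper tail:
    the mean of all samples at or above the given percentile.
    This is one plausible 'peak' metric; the paper's exact definition
    is not given in the abstract."""
    cutoff = np.percentile(intensities, percentile)
    return float(intensities[intensities >= cutoff].mean())
```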
16 citations
Enhancing emotion recognition in VIPs with haptic feedback
H.P. Buimer, M. Bittner, T. Kostelijk, T.M. van der Geest, R. J. A. van Wezel and Y. Zhao
The rise of smart technologies has created new opportunities to support blind and visually impaired persons (VIPs). One of the biggest problems identified in previous research on the difficulties VIPs face in daily life is recognizing people and their facial expressions. In this study, a system was developed to detect faces, recognize their emotions, and provide vibrotactile feedback about the emotions expressed. The prototype system was tested to determine whether vibrotactile feedback through a haptic belt is capable of enhancing social interactions for VIPs. The system consisted of commercially available technologies: a Logitech C920 webcam mounted on a cap, a Microsoft Surface Pro 4 carried in a mesh backpack, an Elitac tactile belt worn around the waist, and the VicarVision FaceReader software application, which recognizes facial expressions. In preliminary tests with the system, both visually impaired and sighted persons were presented with sets of stimuli consisting of actors displaying six emotions derived from the validated Amsterdam Dynamic Facial Expression Set and the Warsaw Set of Emotional Facial Expression Pictures, with matching audio using nonlinguistic affect bursts. Subjects had to determine the emotions expressed in the videos without and, after a training period, with haptic feedback. An exit survey was conducted to gain insight into users’ opinions on the perceived usefulness and benefits of the emotional feedback, and their willingness to use the prototype as assistive technology in daily life. Haptic feedback about facial expressions may improve the ability of VIPs to determine emotions expressed by others and, as a result, increase their confidence during social interactions. More studies are needed to determine whether this is a viable method to convey information and enhance social interactions in the daily life of VIPs.
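As a sketch of how such a pipeline might translate a recognized emotion into a vibrotactile signal: the tactor layout, emotion labels, and the belt.send_pulse call below are hypothetical and not taken from the prototype described above.

```python
# Hypothetical mapping from a recognized emotion to a tactor position on a
# waist-worn belt; six positions for the six basic emotions used as stimuli.
EMOTION_TO_TACTOR = {
    "happiness": 0,
    "surprise": 1,
    "fear": 2,
    "anger": 3,
    "disgust": 4,
    "sadness": 5,
}

def vibrate_for(emotion: str, intensity: float, belt) -> None:
    """Trigger a short pulse on the tactor assigned to the recognized emotion,
    scaling pulse strength with the classifier's confidence/intensity.
    `belt` is assumed to expose a send_pulse(position, strength) method."""
    tactor = EMOTION_TO_TACTOR.get(emotion)
    if tactor is not None:
        belt.send_pulse(tactor, strength=min(max(intensity, 0.0), 1.0))
```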
