Let Me Finish First – The Effect of Interruption-Handling Strategy on the Perceived Personality of a Social Agent
This paper presents an experiment comparing three strategies an artificial agent can adopt when interrupted by a human conversational partner. The agent either ignored the interruption (the most common behavior in conversational engines to date), yielded the turn to the human right away, or acknowledged the interruption, finished its thought, and then responded to the content of the interruption. Our results show that this change in the agent's conversational behavior had a significant impact on the personality traits people assigned to the agent, as well as on how much they enjoyed interacting with it. Moreover, the data also indicate that human interlocutors adapted their own conversational behavior in response. Our findings suggest that the interactive behavior of an artificial agent should be carefully designed to match its desired personality and the intended conversational dynamics.
Name Pronunciation Extraction and Reuse in Human-Agent Conversation
An Automatic Evaluation Framework for Social Conversations with Robots
Improving a Robot’s Turn-Taking Behavior in Dynamic Multiparty Interactions
We present ongoing work to develop a robust and natural turn-taking behavior for a social agent to engage a dynamically changing group in a conversation.
Improving VIP Viewer Gaze Estimation and Engagement Using Adaptive Dynamic Anamorphosis
Anamorphosis for 2D displays can provide viewer-centric perspective viewing, enabling 3D appearance, eye contact, and engagement by adapting dynamically in real time to a single moving viewer's viewpoint, but at the cost of distorted viewing for other viewers. We present a method for constructing non-linear projections that combines anamorphic rendering of selected objects with normal perspective rendering of the rest of the scene. Our study uses a scene of five characters, one of which is selectively rendered in anamorphic perspective.
FaceMagic: Real-Time Facial Detail Effects on Mobile
We present a novel real-time face detail reconstruction method capable of recovering high quality geometry on consumer mobile devices.
Smile and Laugh Dynamics in Naturalistic Dyadic Interactions: Intensity Levels, Sequences and Roles
Smiles and laughs have been the subject of many studies over the past decades, owing to their frequent occurrence in interactions and their social and emotional functions in dyadic conversations. In this paper we extend previous work by providing a first study of the influence that one partner's smiles and laughs have on their interlocutor's, taking these expressions' intensities into account. Our second contribution is a study of the patterns of laugh and smile sequences during the dialogs, again taking intensity into account. Finally, we discuss the effect of the interlocutor's role on smiling and laughing. To this end, we use a database of naturalistic dyadic conversations collected and annotated for the purpose of this study.
Dynamic Emotional Language Adaptation in Multiparty Interactions with Agents
In order to achieve more believable interactions with artificial agents, there is a need to produce dialogue that is not only relevant, but also emotionally appropriate and consistent. This paper presents a comprehensive system that models the emotional state of users and an agent to dynamically adapt dialogue utterance selection.
The Role of Closed-Loop Hand Control in Handshaking Interactions
In this paper we investigate the role of haptic feedback in human-robot handshaking by comparing different force controllers. The basic hypothesis is that human handshaking force control balances an intrinsic (open-loop) and an extrinsic (closed-loop) contribution. We use an underactuated anthropomorphic robotic hand, the Pisa/IIT hand, instrumented with a set of pressure sensors that estimate the grip force applied by humans. In a first set of experiments, we ask subjects to mimic a given force profile applied by the robot hand, to understand how humans perceive, and are able to reproduce, a handshaking force.
On the Role of Stiffness and Synchronization in Human-Robot Handshaking
This paper presents a system for soft human-robot handshaking, using a soft robot hand in conjunction with a lightweight, impedance-controlled robot arm. Using this system, we study how different factors influence the perceived naturalness and give the robot different personality traits. Capitalizing on recent findings regarding handshake grasp force regulation, and on studies of the impedance control of the human arm, we investigate the role of arm stiffness as well as the kinaesthetic synchronization of human and robot arm motions during the handshake. The system is implemented using a lightweight anthropomorphic arm, with a Pisa/IIT SoftHand wearing a sensorized silicone glove as the end-effector.