Smile and Laugh Dynamics in Naturalistic Dyadic Interactions: Intensity Levels, Sequences and Roles

Smiles and laughs have been the subject of many studies over the past decades, due to their frequent occurrence in interactions as well as their social and emotional functions in dyadic conversations. In this paper, we push forward previous work by providing a first study of the influence that one interacting partner’s smiles and laughs have on their interlocutor’s, taking these expressions’ intensities into account. Our second contribution is a study of the patterns of laugh and smile sequences during the dialogs, again taking intensity into account. Finally, we discuss the effect of the interlocutor’s role on smiling and laughing. To achieve this, we use a database of naturalistic dyadic conversations that was collected and annotated for the purpose of this study.
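
As a rough illustration of the sequence analysis described above, the sketch below computes a transition matrix over smile/laugh intensity states for one participant; the state labels and annotation format are assumptions made for illustration, not the study's actual annotation scheme.

```python
from collections import Counter

# Hypothetical intensity-labelled states; the real annotation scheme may differ.
STATES = ["none", "low_smile", "high_smile", "low_laugh", "high_laugh"]

def transition_matrix(sequence):
    """Estimate state-to-state transition probabilities from one annotated sequence."""
    counts = Counter(zip(sequence, sequence[1:]))
    matrix = {s: {t: 0.0 for t in STATES} for s in STATES}
    for s in STATES:
        total = sum(counts[(s, t)] for t in STATES)
        for t in STATES:
            matrix[s][t] = counts[(s, t)] / total if total else 0.0
    return matrix

# Toy annotated sequence for one dialog participant.
example = ["none", "low_smile", "high_smile", "high_laugh", "low_smile", "none"]
print(transition_matrix(example)["low_smile"])
```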



Realistic and Interactive Robot Gaze

This paper describes the development of a system for lifelike gaze in human-robot interactions using a humanoid animatronic bust. We present a general architecture that addresses gaze interactions not only from a technological standpoint, but also through the lens of character animation, where the fidelity and believability of motion are paramount; that is, we seek to create an interaction which demonstrates the illusion of life.


PoseMMR: A Collaborative Mixed Reality Authoring Tool for Character Animation

Augmented reality devices enable new approaches to character animation; for example, since character posing is three-dimensional in nature, interfaces with higher degrees of freedom (DoF) should outperform 2D interfaces. We present PoseMMR, which allows Multiple users to animate characters in a Mixed Reality environment, much as a stop-motion animator manipulates a physical puppet, frame by frame, to create a scene. We explore how PoseMMR can facilitate immersive posing, animation editing, version control, and collaboration, and provide a set of guidelines to foster the development of immersive technologies as tools for collaborative authoring of character animation.
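
A minimal sketch of the kind of frame-by-frame, multi-user data model such a tool might maintain is shown below; the class names, joint representation, and edit log are illustrative assumptions, since the abstract does not describe PoseMMR's internal representation.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

# Hypothetical data model for frame-by-frame character posing (assumed, not PoseMMR's).
Quaternion = Tuple[float, float, float, float]

@dataclass
class Pose:
    joint_rotations: Dict[str, Quaternion]  # joint name -> local rotation

@dataclass
class Timeline:
    frames: Dict[int, Pose] = field(default_factory=dict)         # keyed by frame index
    history: List[Tuple[str, int]] = field(default_factory=list)  # (user, frame) edit log

    def set_pose(self, user: str, frame: int, pose: Pose) -> None:
        """Record a pose for one frame and log who edited it (toy version control)."""
        self.frames[frame] = pose
        self.history.append((user, frame))

timeline = Timeline()
timeline.set_pose("animator_a", 0, Pose({"elbow_l": (0.0, 0.0, 0.0, 1.0)}))
```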


The Role of Closed-Loop Hand Control in Handshaking Interactions

In this paper, we investigate the role of haptic feedback in human-robot handshaking by comparing different force controllers. The basic hypothesis is that human handshaking force control balances an intrinsic (open-loop) and an extrinsic (closed-loop) contribution. We use an underactuated anthropomorphic robotic hand, the Pisa/IIT hand, instrumented with a set of pressure sensors that estimate the grip force applied by humans. In a first set of experiments, we ask subjects to mimic a given force profile applied by the robot hand, to understand how humans perceive and are able to reproduce a handshaking force.
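
As a rough sketch of the hypothesized balance, the snippet below blends a feedforward (open-loop) squeeze profile with a feedback (closed-loop) correction on the sensed grip force; the gains, blending weight, and profile are illustrative values, not the controllers compared in the paper.

```python
def grip_command(t, sensed_force, alpha=0.5, k_p=2.0, target_force=5.0):
    """Return a grip effort in [0, 1] for an underactuated hand at time t (seconds)."""
    feedforward = min(t / 0.5, 1.0) * 0.6  # open-loop: ramp up to a nominal squeeze
    feedback = k_p * (target_force - sensed_force) / target_force  # closed-loop correction
    command = alpha * feedforward + (1.0 - alpha) * max(0.0, feedback)
    return max(0.0, min(1.0, command))

# Example: a quarter of a second into the handshake, with 3 N sensed grip force.
print(grip_command(t=0.25, sensed_force=3.0))
```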


On the Role of Stiffness and Synchronization in Human-Robot Handshaking

This paper presents a system for soft human-robot handshaking, using a soft robot hand in conjunction with a lightweight and impedance-controlled robot arm. Using this system, we study how different factors influence the perceived naturalness, and give the robot different personality traits. Capitalizing on recent findings regarding handshake grasp force regulation, and on studies of the impedance control of the human arm, we investigate the role of arm stiffness as well as the kinaesthetic synchronization of human and robot arm motions during the handshake. The system is implemented using a lightweight anthropomorphic arm, with a Pisa/IIT Softhand wearing a sensorized silicone glove as the end-effector.
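
A minimal sketch of a Cartesian impedance law of the kind referred to here is shown below, with the commanded force following F = K*(x_des - x) + D*(x_des_dot - x_dot); the stiffness and damping values are illustrative, not the settings studied in the paper.

```python
def impedance_force(x, x_dot, x_des, x_des_dot, K=200.0, D=25.0):
    """Restoring force [N] pulling the end-effector toward the desired trajectory."""
    return K * (x_des - x) + D * (x_des_dot - x_dot)

# Lowering K gives a more compliant, "softer" handshake; raising it a stiffer one.
# Tracking the human's measured hand motion as (x_des, x_des_dot) would correspond
# to the kinaesthetic synchronization the abstract mentions.
print(impedance_force(x=0.02, x_dot=0.0, x_des=0.0, x_des_dot=0.1))
```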


MakeSense: Automated Sensor Design for Proprioceptive Soft Robots

Soft robots have applications in safe human-robot interaction, manipulation of fragile objects, and locomotion in challenging and unstructured environments. In this paper, we present a computational method for augmenting soft robots with proprioceptive sensing capabilities. Our method automatically adds a minimal stretch-receptive sensor network to user-provided soft robotic designs, optimized to perform well under a set of user-specified deformation-force pairs. The sensorized robots are able to reconstruct their full deformation state under interaction forces. We cast our sensor design as a sub-selection problem, selecting from a large set of fabricable sensors the minimal subset that minimizes the error when sensing the specified deformation-force pairs. Unique to our approach is the use of an analytical gradient of our reconstruction performance measure with respect to the selection variables. We demonstrate our technique on bending-bar and gripper examples, and illustrate more complex designs with a simulated tentacle.
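
The snippet below gives a toy version of the sensor sub-selection idea, greedily choosing a small subset of candidate sensors that minimizes a least-squares reconstruction error over sampled deformation states; the paper's method instead optimizes selection variables with an analytical gradient, so this greedy, synthetic-data version is only an illustrative stand-in.

```python
import numpy as np

rng = np.random.default_rng(0)
n_candidates, n_states, n_samples = 20, 5, 100
S = rng.normal(size=(n_candidates, n_states))  # candidate sensor responses per state dimension
X = rng.normal(size=(n_samples, n_states))     # sampled deformation states
readings = X @ S.T                             # what every candidate sensor would read

def reconstruction_error(subset):
    """RMS error of deformation states reconstructed from the chosen sensors only."""
    R = readings[:, subset]                    # readings of the selected sensors
    W, *_ = np.linalg.lstsq(R, X, rcond=None)  # linear decoder: readings -> state
    return float(np.sqrt(np.mean((R @ W - X) ** 2)))

selected = []
for _ in range(6):                             # greedily pick 6 sensors
    best = min((c for c in range(n_candidates) if c not in selected),
               key=lambda c: reconstruction_error(selected + [c]))
    selected.append(best)

print(selected, reconstruction_error(selected))
```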
