Project Date: April 2016

Robots are becoming increasingly involved in our lives, from working alongside us in factories to empowering us to remain independent as we age. Yet despite operating around us, robots tend to be purely functional entities: they go about their tasks methodically, attending to humans only when they need to coordinate with us to accomplish some task. Little is known about how to make robots relatable to humans, a topic at the core of the People and Robots Initiative.

The goal of this project is for robots to leverage "body language" to express their state of awareness, hesitation, excitement, disappointment, and so on. The project is a collaboration among roboticists, computer graphics experts, and professional dancers. The team will develop and test methods for transferring motion capture data of a human dancer expressing emotions to a robot arm in a manner that preserves the emotional content of the motion. Experiments will use the PR2 and the Kinova arm to evaluate these methods with human subjects via the Amazon Mechanical Turk platform.
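To make the retargeting idea concrete, here is a minimal illustrative sketch of one simple approach: linearly remapping each recorded human joint angle into a robot joint's narrower range while keeping the trajectory's timing intact, so the overall "shape" of the motion survives. All function names, joint ranges, and data here are hypothetical assumptions for illustration; they are not the project's actual method or API.

```python
# Hypothetical sketch: retarget a timestamped human joint-angle trajectory
# onto a robot joint with different limits. Assumed ranges and sample data
# are illustrative only.

def retarget_angle(theta, human_range, robot_range):
    """Linearly map an angle from the human's joint range into the robot's."""
    h_lo, h_hi = human_range
    r_lo, r_hi = robot_range
    t = (theta - h_lo) / (h_hi - h_lo)   # normalize to [0, 1]
    t = max(0.0, min(1.0, t))            # clamp samples outside the range
    return r_lo + t * (r_hi - r_lo)

def retarget_trajectory(trajectory, human_range, robot_range):
    """Retarget a trajectory of (time, angle) samples one sample at a time."""
    return [(time, retarget_angle(angle, human_range, robot_range))
            for time, angle in trajectory]

# Example: a human shoulder sweep (radians) mapped to a narrower robot joint.
human_traj = [(0.0, -1.0), (0.5, 0.0), (1.0, 1.5)]
robot_traj = retarget_trajectory(human_traj, (-1.5, 1.5), (-1.0, 1.0))
```

Because the mapping preserves timestamps and relative angle ordering, the velocity profile of the original motion is scaled but not reshaped; more sophisticated methods would also account for the arm's kinematics and dynamics.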