As robots become integrated into human environments, they increasingly interact directly with people. This is particularly true for assistive robots, which help people through social interactions (such as tutoring) or physical interactions (such as preparing a meal). Developing effective human-robot interactions in these settings requires a multidisciplinary approach that combines fundamental algorithms from robotics with insights from cognitive science. My research brings together these two areas to extend the science of human-robot interaction, with a particular focus on assistive robotics. In developing cognitively inspired algorithms for robot behavior, I seek to answer fundamental questions about human-robot interaction: What makes a robot appear intelligent? How can robots communicate their internal states to human partners to improve collaboration? Conversely, how can robots "read" human behaviors that reveal people's goals, intentions, and difficulties, in order to identify where assistance is needed?

In this talk, I describe my vision for robots that collaborate with humans on complex tasks by leveraging natural, intuitive human behaviors. I explain how models of human attention, drawn from cognitive science, can help select robot behaviors that improve human performance on a collaborative task. I detail my work on algorithms that predict people's mental states from their eye gaze and provide assistance in response to those predictions. I also show how breaking the seamlessness of an interaction can make robots appear smarter. Throughout the talk, I describe how techniques and knowledge from cognitive science help us develop robot algorithms that lead to more effective interactions between people and their robot partners.

Bio:
Henny Admoni is a postdoctoral fellow at the Robotics Institute at Carnegie Mellon University, where she investigates human-robot interaction with Siddhartha Srinivasa in the Personal Robotics Lab. Henny develops and studies intelligent robots that improve people's lives by providing assistance through social and physical interactions. She examines how nonverbal communication, such as eye gaze and pointing, can reveal underlying human intentions and improve human-robot communication in assistive interactions. Henny completed her PhD in Computer Science at Yale University with Professor Brian Scassellati; her dissertation focused on modeling the complex dynamics of nonverbal behavior for socially assistive human-robot interaction. Henny holds an MS in Computer Science from Yale University and a joint BA/MA degree in Computer Science from Wesleyan University. Her scholarship has been recognized with awards including the NSF Graduate Research Fellowship, the Google Anita Borg Memorial Scholarship, and the Palantir Women in Technology Scholarship.