Human-Robot Interaction

Along with models of human behavior, agents need algorithms to decide how to act and communicate while interacting with humans. Timely decisions are critical, as high response times lead to sluggish interaction or miscoordination. Gauging the right level of communication is equally challenging: too little communication limits performance, while too much causes information overload. Moreover, agents do not have complete knowledge of their environment and cannot directly observe humans' mental states. Thus, interaction necessitates novel algorithms that can reason with incomplete models, partial observability, and limited planning time.

To address this challenge, I am developing computational frameworks that enable decision-making during human-robot interactions. Two of these frameworks, AdaCoRL and CommPlan, have been applied to sequential human-robot dyadic tasks with large problem sizes (over one million states), short planning times (less than a second), and temporally-extended actions. Ongoing research along this thread explores decision-making in increasingly general contexts, including human-robot groups.
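The communication trade-off described above can be illustrated with a minimal value-of-information sketch. This is a generic toy model, not the AdaCoRL or CommPlan algorithm: the intents, reward table, observation likelihoods, and communication cost are all hypothetical.

```python
# Toy model of "when should the robot ask?" under partial observability.
# All names and numbers below are illustrative assumptions.

INTENTS = ["fetch_part", "inspect", "idle"]   # hidden human intents (assumed)

# Reward for each robot action given the human's true intent (assumed values).
R = {
    "hand_tool": {"fetch_part": 2.0, "inspect": 0.0, "idle": -1.0},
    "wait":      {"fetch_part": 0.0, "inspect": 1.0, "idle":  0.5},
}

def update_belief(belief, likelihoods):
    """Bayes-update the belief over intents after observing the human."""
    posterior = {i: belief[i] * likelihoods[i] for i in INTENTS}
    z = sum(posterior.values())
    return {i: p / z for i, p in posterior.items()}

def expected_reward(action, belief):
    return sum(belief[i] * R[action][i] for i in INTENTS)

def value_of_information(belief):
    # Expected reward if a query revealed the intent (best action per intent)...
    informed = sum(belief[i] * max(R[a][i] for a in R) for i in INTENTS)
    # ...minus the best the robot can do acting under uncertainty.
    uninformed = max(expected_reward(a, belief) for a in R)
    return informed - uninformed

def should_ask(belief, comm_cost=0.3):
    """Communicate only when resolving uncertainty is worth the overhead."""
    return value_of_information(belief) > comm_cost
```

With a uniform belief the value of information exceeds the cost, so the robot asks; once the belief concentrates on one intent, asking no longer pays and the robot acts silently. This captures the trade-off in miniature: too little communication loses task reward, too much wastes the human's attention.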


  1. ICRA
    Human-Guided Motion Planning in Partially Observable Environments
    Carlos Quintero-Pena*, Constantinos Chamzas*, Zhanyi Sun, Vaibhav Unhelkar, Lydia E Kavraki
    In International Conference on Robotics and Automation (ICRA), 2022
  2. HRI
    Decision-Making for Bidirectional Communication in Sequential Human-Robot Collaborative Tasks
    Vaibhav Unhelkar*, Shen Li*, Julie Shah
    In ACM/IEEE International Conference on Human-Robot Interaction (HRI), 2020
  3. Thesis
    Effective Information Sharing for Human-Robot Collaboration
    Vaibhav Unhelkar
    Doctoral dissertation, Massachusetts Institute of Technology, 2020
  4. CoRL
    Semi-Supervised Learning of Decision-Making Models for Human-Robot Collaboration
    Vaibhav Unhelkar*, Shen Li*, Julie Shah
    In Conference on Robot Learning (CoRL), 2019
  5. IJCAI
    Learning and Communicating the Latent States of Human-Machine Collaboration
    Vaibhav Unhelkar, Julie Shah
    In International Joint Conference on Artificial Intelligence (IJCAI), 2018
  6. RA-M
    Mobile Robots for Moving-Floor Assembly Lines
    Vaibhav Unhelkar, Stefan Dörr, Alexander Bubeck, Przemyslaw A Lasota, Jorge Perez, Ho Chit Siu, James C Boerkoel Jr, Quirin Tyroller, Johannes Bix, Stefan Bartscher, Julie Shah
    IEEE Robotics & Automation Magazine (RA-M), 2018
  7. RA-L
    Human-Aware Robotic Assistant for Collaborative Assembly: Integrating Human Motion Prediction with Planning in Time
    Vaibhav Unhelkar*, Przemyslaw A Lasota*, Quirin Tyroller, Rares-Darius Buhai, Laurie Marceau, Barbara Deml, Julie Shah
    IEEE Robotics and Automation Letters (RA-L) 3 (3), 2018
  8. AAAI
    ConTaCT: Deciding to Communicate in Time-critical, Collaborative Tasks in Unknown, Deterministic Domains
    Vaibhav Unhelkar, Julie Shah
    In AAAI Conference on Artificial Intelligence (AAAI), 2016
  9. ICRA
    Human-Robot Co-Navigation Using Anticipatory Indicators of Human Walking Motion
    Vaibhav Unhelkar*, Claudia Perez-D'Arpino*, Leia Stirling, Julie Shah
    In International Conference on Robotics and Automation (ICRA), 2015
  10. HRI Companion
    Challenges in Developing a Collaborative Robotic Assistant for Automotive Assembly Lines
    Vaibhav Unhelkar, Julie Shah
    In ACM/IEEE International Conference on Human-Robot Interaction (HRI) Extended Abstracts, 2015
  11. HRI
    Comparative Performance of Human and Mobile Robotic Assistants in Collaborative Fetch-and-Deliver Tasks
    Vaibhav Unhelkar, Ho Chit Siu, Julie Shah
    In ACM/IEEE International Conference on Human-Robot Interaction (HRI), 2014