Object Transfer Point Estimation for Prompt Human to Robot Handovers

Bibliographic Details
Main Author: Nemlekar, Heramb
Other Authors: Zhi Li, Advisor
Format: Others
Published: Digital WPI, 2019
Online Access: https://digitalcommons.wpi.edu/etd-theses/1297
https://digitalcommons.wpi.edu/cgi/viewcontent.cgi?article=2296&context=etd-theses
Description
Summary: Handing over objects is the foundation of many human-robot interaction and collaboration tasks. When a human hands an object to a robot, the human chooses where the object will be transferred. The robot needs to predict this point of transfer accurately so it can reach out proactively, rather than waiting for the final position to be presented. We first conduct a human-to-robot handover motion study to analyze the effect of user height, arm length, position, orientation, and robot gaze on the object transfer point. Our study presents new observations on the effect of the robot's gaze on the point of object transfer. Next, we present an efficient method for predicting the Object Transfer Point (OTP), which synthesizes (1) an offline OTP calculated from human preferences observed in the handover motion study with (2) a dynamic OTP predicted from the observed human motion. Our proposed OTP predictor is implemented on a humanoid nursing robot and experimentally validated in human-robot handover tasks. Compared to using only a static or a dynamic OTP estimator, it achieves better accuracy in the earlier phase of the handover (up to 45% of the handover motion) and renders fluent handovers with a reach-to-grasp response time (about 3.1 seconds) close to that of a natural human receiver. In addition, the OTP prediction accuracy is maintained across the robot's visible workspace by using a user-adaptive reference frame.
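
The abstract describes synthesizing an offline OTP prior with a dynamic, motion-based estimate, but does not give the fusion rule. As a minimal sketch of that idea, the Python snippet below blends the two estimates with a weight that shifts from the static prior toward the dynamic prediction as the handover progresses; the name blend_otp, the phase parameter, and the linear weighting schedule are illustrative assumptions, not the thesis's actual method.

    import numpy as np

    def blend_otp(static_otp, dynamic_otp, phase):
        """Blend an offline (static) OTP prior with a dynamic estimate.

        phase: fraction of the handover motion completed, in [0, 1].
        The linear schedule below is an assumption for illustration,
        not the weighting used in the thesis.
        """
        w = float(np.clip(phase, 0.0, 1.0))
        return (1.0 - w) * np.asarray(static_otp) + w * np.asarray(dynamic_otp)

    # Hypothetical 3-D positions (metres) in a user-adaptive frame.
    static_estimate = [0.45, 0.10, 1.05]   # prior from user height/arm-length preferences
    dynamic_estimate = [0.50, 0.05, 1.10]  # extrapolated from the observed hand motion
    print(blend_otp(static_estimate, dynamic_estimate, phase=0.3))

Early in the reach (small phase) the output stays near the preference-based prior, consistent with the reported accuracy advantage during the first 45% of the handover motion; as more of the human's motion is observed, the estimate converges to the dynamic prediction.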