Fast Explicit-Input Assistance for Teleoperation in Clutter

N. Walker, X. Yang, A. Garg, M. Cakmak, D. Fox, and C. Pérez-D’Arpino, “Fast Explicit-Input Assistance for Teleoperation in Clutter,” arXiv:2402.02612 [cs.RO], Mar. 2024.

Abstract

The performance of prediction-based assistance for robot teleoperation degrades in unseen or goal-rich environments due to incorrect or quickly changing intent inferences. Poor predictions can confuse operators or cause them to change their control input to implicitly signal their goal. We present a new assistance interface for robotic manipulation where an operator can explicitly communicate a manipulation goal by pointing the end-effector. The pointing target specifies a region for local pose generation and optimization, providing interactive control over grasp and placement pose candidates. We compare the explicit pointing interface to an implicit inference-based assistance scheme in a within-subjects user study (N=20) where participants teleoperate a simulated robot to complete a multi-step singulation and stacking task in cluttered environments. We find that operators prefer the explicit interface, experience fewer pick failures, and report lower cognitive workload. Our code is available at: github.com/NVlabs/fast-explicit-teleop.
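To make the core idea in the abstract concrete, below is a minimal, hypothetical sketch of "a pointing target specifies a region for local pose generation and optimization": candidate grasp poses are sampled in a neighborhood of the pointed location, pruned with a crude clearance check against a clutter point cloud, and scored by proximity to the target. This is not the authors' implementation (see the linked repository for that); all function names, parameters, and the scoring heuristic here are illustrative assumptions.

import numpy as np

def sample_local_grasp_candidates(target, radius=0.05, n=64, rng=None):
    """Sample candidate grasp positions and approach directions near `target`.

    Illustrative only: real pose generation would respect gripper geometry
    and object surfaces rather than drawing random offsets.
    """
    rng = np.random.default_rng() if rng is None else rng
    offsets = rng.normal(scale=radius / 2.0, size=(n, 3))
    positions = target + offsets
    # Random unit approach vectors stand in for orientation optimization.
    approaches = rng.normal(size=(n, 3))
    approaches /= np.linalg.norm(approaches, axis=1, keepdims=True)
    return positions, approaches

def score_candidates(positions, target, obstacle_points, clearance=0.02):
    """Prefer candidates close to the pointing target that keep clearance from clutter."""
    dist_to_target = np.linalg.norm(positions - target, axis=1)
    # Crude collision proxy: distance from each candidate to the nearest obstacle point.
    d = np.linalg.norm(positions[:, None, :] - obstacle_points[None, :, :], axis=2)
    min_clearance = d.min(axis=1)
    feasible = min_clearance > clearance
    scores = -dist_to_target + 0.5 * min_clearance
    scores[~feasible] = -np.inf  # discard candidates too close to obstacles
    return scores

if __name__ == "__main__":
    target = np.array([0.4, 0.0, 0.1])              # point indicated by the operator
    clutter = np.random.default_rng(0).uniform(      # stand-in obstacle point cloud
        low=[0.3, -0.1, 0.0], high=[0.5, 0.1, 0.2], size=(200, 3))
    pos, app = sample_local_grasp_candidates(target, rng=np.random.default_rng(1))
    scores = score_candidates(pos, target, clutter)
    best = int(np.argmax(scores))
    print("best candidate position:", pos[best], "score:", scores[best])

In the paper's interface, a set of such locally generated candidates is what the operator interactively selects among, rather than a single automatically chosen pose.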

BibTeX Entry

@article{walker2024explicit,
  author = {Walker, Nick and Yang, Xuning and Garg, Animesh and Cakmak, Maya and Fox, Dieter and P\'{e}rez-D'Arpino, Claudia},
  title = {Fast Explicit-Input Assistance for Teleoperation in Clutter},
  year = {2024},
  month = mar,
  eprint = {2402.02612},
  archiveprefix = {arXiv},
  primaryclass = {cs.RO},
  wwwtype = {working},
  wwwpdf = {https://arxiv.org/pdf/2402.02612.pdf},
  wwwcode = {https://github.com/NVlabs/fast-explicit-teleop}
}