A Simulator for Human-Robot Interaction in Virtual Reality

Date

2021-05-06

Citation of Original Publication

M. Murnane, P. Higgins, M. Saraf, F. Ferraro, C. Matuszek and D. Engel, "A Simulator for Human-Robot Interaction in Virtual Reality," 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 2021, pp. 470-471, doi: 10.1109/VRW52623.2021.00117.

Rights

© 2021 IEEE.  Personal use of this material is permitted.  Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

Abstract

We present a suite of tools for modeling a robot, its sensors, and the surrounding environment in VR, with the goal of collecting training data for real-world robots. The virtual robot observes a rigged avatar, created in our photogrammetry facility, that embodies a VR user. We are particularly interested in verbal human-robot interactions, which can be combined with the robot's sensor data for grounded language learning. Because virtual scenes, tasks, and robots are easily reconfigured compared to their physical analogs, our approach is highly versatile for preparing a wide range of robot scenarios across many use cases.
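The abstract describes pairing a VR user's utterances with the virtual robot's sensor observations to form grounded language learning data. The following is a minimal, hypothetical sketch (not taken from the paper) of how such paired records might be represented; all names and fields are illustrative assumptions.

```python
# Hypothetical sketch: pairing a transcribed utterance with the virtual
# robot's sensor frames so the pair can serve as a grounded-language
# training example. Field names and file paths are illustrative only.
from dataclasses import dataclass, field
from typing import List


@dataclass
class SensorFrame:
    timestamp: float          # seconds since the start of the VR session
    rgb_path: str             # rendered camera image from the virtual robot
    depth_path: str           # simulated depth image
    avatar_pose: List[float] = field(default_factory=list)  # rigged-avatar joint angles


@dataclass
class InteractionExample:
    utterance: str            # transcribed speech from the VR user
    frames: List[SensorFrame] # sensor frames overlapping the utterance


# Example: a single paired record, as might be logged during a VR session.
example = InteractionExample(
    utterance="pick up the red block on the table",
    frames=[
        SensorFrame(timestamp=12.4,
                    rgb_path="rgb/000124.png",
                    depth_path="depth/000124.png"),
    ],
)
print(example.utterance, len(example.frames))
```

A record of this shape would let downstream grounded-language models align spoken commands with what the simulated robot perceived at the time, under the assumption that utterances and sensor frames share a common session clock.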