Virtual Reality and Photogrammetry for Improved Reproducibility of Human-Robot Interaction Studies

dc.contributor.author: Murnane, Mark
dc.contributor.author: Breitmeyer, Max
dc.contributor.author: Matuszek, Cynthia
dc.contributor.author: Engel, Don
dc.date.accessioned: 2019-10-04T14:31:32Z
dc.date.available: 2019-10-04T14:31:32Z
dc.date.issued: 2019-08-15
dc.description: 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)
dc.description.abstract: Collecting data in robotics, especially human-robot interactions, traditionally requires a physical robot in a prepared environment, which presents substantial scalability challenges. First, robots provide many possible points of system failure, while the availability of human participants is limited. Second, for tasks such as language learning, it is important to create environments which provide interesting, varied use cases. Traditionally, this requires prepared physical spaces for each scenario being studied. Finally, the expense associated with acquiring robots and preparing spaces places serious limitations on the reproducibility of experiments. We therefore propose a novel mechanism for using virtual reality to simulate robotic sensor data in a series of prepared scenarios. This allows for a reproducible data set which other labs can recreate using commodity VR hardware. The authors demonstrate the effectiveness of this approach with an implementation that includes a simulated physical context, a reconstruction of a human actor, and a reconstruction of a robot. This evaluation shows that even a simple “sandbox” environment allows us to simulate robot sensor data, as well as the movement (e.g., view-port) and speech of humans interacting with the robot in a prescribed scenario.
dc.description.sponsorship: This material is based upon work supported by the National Science Foundation under Grant No. 1531491 and by Next Century Corporation.
dc.format.extent: 2 pages
dc.genre: conference papers and proceedings preprints
dc.identifier: doi:10.13016/m2jl4c-87zk
dc.identifier.citation: M. Murnane, M. Breitmeyer, C. Matuszek and D. Engel, "Virtual Reality and Photogrammetry for Improved Reproducibility of Human-Robot Interaction Studies," 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 2019, pp. 1092-1093. doi: 10.1109/VR.2019.8798186. URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8798186&isnumber=8797678
dc.identifier.uri: https://doi.org/10.1109/VR.2019.8798186
dc.identifier.uri: http://hdl.handle.net/11603/14974
dc.language.iso: en_US
dc.publisher: IEEE
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Computer Science and Electrical Engineering Department Collection
dc.relation.ispartof: UMBC Faculty Collection
dc.relation.ispartof: UMBC Student Collection
dc.relation.ispartof: UMBC Office for the Vice President of Research
dc.rights: This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
dc.rights: © 2019 IEEE.
dc.subject: Virtual Reality
dc.subject: VR
dc.subject: Photogrammetry
dc.subject: Human-Robot Interaction
dc.subject: Virtual Presence
dc.subject: H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems—Artificial, Augmented and Virtual Realities
dc.subject: I.2.9 [Artificial Intelligence]: Robotics—Operator Interfaces
dc.subject: I.2.9 [Artificial Intelligence]: Robotics—Sensors
dc.title: Virtual Reality and Photogrammetry for Improved Reproducibility of Human-Robot Interaction Studies
dc.type: Text
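
The abstract above describes simulating a robot's sensor data from a virtual scene rather than collecting it on physical hardware. As a loose illustration of that idea only (not the authors' implementation, which per the record uses commodity VR hardware and reconstructions of a human actor and a robot), the minimal Python sketch below renders one synthetic depth frame of a toy scene from an assumed camera pose and pinhole intrinsics; every value, name, and the scene itself are illustrative assumptions.

```python
"""Minimal sketch (illustrative only): rendering a synthetic depth frame of a
virtual scene from a simulated robot camera pose, the kind of sensor data a
VR-based pipeline could log for later, reproducible replay."""
import numpy as np

# Hypothetical pinhole intrinsics for the simulated depth sensor.
W, H = 320, 240
FX, FY, CX, CY = 277.0, 277.0, 160.0, 120.0

def render_depth(points_world, cam_pose):
    """Project world-frame points through the simulated camera and keep the
    nearest hit per pixel, approximating one depth frame (z-buffer style)."""
    R, t = cam_pose[:3, :3], cam_pose[:3, 3]
    pts_cam = (points_world - t) @ R            # world -> camera coordinates
    depth = np.full((H, W), np.inf)
    for x, y, z in pts_cam:
        if z <= 0:                              # point is behind the camera
            continue
        u, v = int(FX * x / z + CX), int(FY * y / z + CY)
        if 0 <= u < W and 0 <= v < H:
            depth[v, u] = min(depth[v, u], z)
    return depth

# Toy "scene": a flat wall two meters in front of the camera.
xs, ys = np.meshgrid(np.linspace(-1, 1, 200), np.linspace(-0.75, 0.75, 150))
wall = np.stack([xs.ravel(), ys.ravel(), np.full(xs.size, 2.0)], axis=1)

frame = render_depth(wall, np.eye(4))           # camera at the world origin
print("valid depth pixels:", np.isfinite(frame).sum())
```

In practice the same frame could be recorded alongside the human participant's tracked head pose and speech, which is what makes the captured scenario replayable by other labs.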

Files

Original bundle
Name: Virtual Reality and Photogrammetry for Improved Reproducibility.pdf
Size: 134.76 KB
Format: Adobe Portable Document Format