Gaze Controlled Interactions for AR/VR

Date

2020-07-17

Department

University of Baltimore. Yale Gordon College of Arts and Sciences

Program

University of Baltimore. Master of Science in Interaction Design and Information Architecture

Rights

Attribution-NonCommercial-NoDerivs 3.0 United States
This item may be protected under Title 17 of the U.S. Copyright Law. It is made available by the University of Baltimore for non-commercial research and educational purposes.

Abstract

This simulation study assessed the impact of gaze and hand gestures as a primary input method for the aerospace industry using Augmented Reality (AR) and Virtual Reality (VR). Fifteen NASA employees with varying levels of AR/VR expertise participated in an interactive simulation of driving a rover on Mars using a mechanical robot, a 360 Fly Camera, an HTC Vive, and a Leap Motion controller. Each participant interacted with two prototypes, a control and an experimental condition: in the control, they drove the rover simulation with on-screen buttons; in the experimental condition, with directional hand gestures. Due to technological limitations, eye movements were not used as input. Participants were evaluated on rate of success when navigating between 3D-printed rockets, perceived ease of use, and qualitative preference. AR/VR slightly impaired perception of target objects in the physical environment. Despite the small sample size, results showed a preference for hand gestures, with near-significant differences in rate of success and perceived ease of use.
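
The abstract does not describe how directional hand gestures were translated into rover drive commands. As a rough illustration only, not the study's implementation, the following Python sketch classifies a hand sweep by its dominant palm-displacement axis, the kind of mapping a Leap Motion prototype might use. All names (classify_gesture, THRESHOLD_MM), the sample data, and the coordinate-frame assumptions are hypothetical; the sketch deliberately avoids the real Leap Motion SDK and works from plain position tuples.

from typing import Iterable, Optional, Tuple

# Hypothetical jitter threshold in millimetres (the unit Leap Motion
# reports positions in); smaller motions are ignored as hand tremor.
THRESHOLD_MM = 40.0

def classify_gesture(palm_positions: Iterable[Tuple[float, float, float]]) -> Optional[str]:
    """Map a short stream of (x, y, z) palm positions to 'forward',
    'back', 'left', or 'right'; return None if the hand has not moved
    past the jitter threshold."""
    positions = list(palm_positions)
    if len(positions) < 2:
        return None
    x0, _, z0 = positions[0]
    x1, _, z1 = positions[-1]
    dx, dz = x1 - x0, z1 - z0  # lateral and depth displacement
    if max(abs(dx), abs(dz)) < THRESHOLD_MM:
        return None
    # The dominant axis decides the command; its sign decides direction.
    # Assumes a Leap-style frame where -z points away from the user.
    if abs(dz) >= abs(dx):
        return "forward" if dz < 0 else "back"
    return "right" if dx > 0 else "left"

# Example: a palm sweeping about 60 mm to the right yields 'right'.
samples = [(0.0, 200.0, 0.0), (20.0, 200.0, 2.0), (60.0, 201.0, 5.0)]
print(classify_gesture(samples))  # -> 'right'

A dominant-axis rule like this is one simple way to keep the four drive commands mutually exclusive, so a diagonal sweep resolves to a single command rather than two.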