Gaze Controlled Interactions for AR/VR
Author/Creator
Date
2020-07-17
Type of Work
62 leaves
application/pdf
Text
theses
Department
University of Baltimore. Yale Gordon College of Arts and Sciences
Program
University of Baltimore. Master of Science in Interaction Design and Information Architecture
Rights
Attribution-NonCommercial-NoDerivs 3.0 United States
This item may be protected under Title 17 of the U.S. Copyright Law. It is made available by the University of Baltimore for non-commercial research and educational purposes.
http://creativecommons.org/licenses/by-nc-nd/3.0/us/
Subjects
Augmented Reality
Virtual Reality
Gaze
Human computer interaction
Interaction Design
Mixed Reality
Aerospace
Gesture
Abstract
This simulation study assessed the impact of gaze and hand gestures as a primary input method for the aerospace industry, utilizing Augmented Reality (AR) and Virtual Reality (VR). Fifteen NASA employees with varying levels of AR/VR expertise participated in an interactive simulation of driving a rover on Mars using a mechanical robot, a 360fly camera, an HTC Vive, and a Leap Motion controller. Each participant interacted with two prototypes, a control and an experimental condition: they were instructed to drive the rover simulation using on-screen buttons in the control condition and directional hand gestures in the experimental condition. Due to technological limitations, eye movements were not used. Participants were evaluated on rate of success when navigating between 3D-printed rockets, perceived ease of use, and qualitative preference. AR/VR slightly impaired the perception of target objects in the physical environment. Despite the small sample size, results showed a preference for hand gestures, with near-significant differences in rate of success and perceived ease of use.
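To illustrate the kind of gesture scheme the abstract describes, the sketch below maps a tracked palm displacement to directional drive commands. This is a hypothetical illustration only: the function name, axes, and deadzone threshold are assumptions, not the thesis's actual implementation (which used a Leap Motion controller).

```python
# Hypothetical sketch of mapping palm displacement to rover commands.
# DEADZONE_MM and palm_to_command are illustrative assumptions.

DEADZONE_MM = 40  # ignore small hand jitter around the neutral position


def palm_to_command(dx_mm, dz_mm):
    """Translate palm displacement (in mm) from a neutral point into a command.

    dx_mm: lateral offset (positive = right)
    dz_mm: forward offset (positive = away from the user)
    Returns one of: 'forward', 'backward', 'left', 'right', 'stop'.
    """
    if abs(dx_mm) < DEADZONE_MM and abs(dz_mm) < DEADZONE_MM:
        return "stop"
    # The dominant axis wins, so a diagonal pose yields one clear command.
    if abs(dz_mm) >= abs(dx_mm):
        return "forward" if dz_mm > 0 else "backward"
    return "right" if dx_mm > 0 else "left"


print(palm_to_command(5, 10))    # within deadzone -> 'stop'
print(palm_to_command(10, 120))  # -> 'forward'
print(palm_to_command(-90, 30))  # -> 'left'
```

A deadzone like this is a common design choice in gesture interfaces: it prevents a resting hand from issuing spurious drive commands.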
The following license files are associated with this item:
- Creative Commons