LESSONS LEARNED WHILE PORTING SIMULATED LUNAR XR UNITY PROJECT TO THE APPLE VISION PRO
Rights
This work was written as part of one of the author's official duties as an Employee of the United States Government and is therefore a work of the United States Government. In accordance with 17 U.S.C. 105, no copyright protection is available for such works under U.S. Law.
Public Domain
Abstract
We will share the lessons we learned while porting the NASA open-source eXtended Reality (XR) software, the Mixed Reality Exploration Toolkit (MRET), to the Apple Vision Pro. Why on earth would anyone do such a thing?! Because NASA wants to go back to the Moon. NASA's Artemis Program aims to establish a permanent human presence near the lunar south pole. The Augmented Reality Data Visualization Analog Research Campaign (ARDVARC) project was funded to assess the use of Augmented Reality (AR) data visualization in both rover and astronaut analog missions under low-angle lighting conditions similar to those present at the lunar south pole. Findings will contribute to data visualization methods and concepts of operations for future Artemis missions to the lunar surface.

The ARDVARC team needed a Mixed or Augmented Reality platform that would allow them to augment terrain with precisely positioned information and data while using commercial off-the-shelf (COTS) devices. The project simulates the lunar south pole by conducting operations at night at a field site illuminated by a strong light that matches the sun angle and long, dark shadows of the lunar south pole. These extreme conditions required an extreme headset: the Apple Vision Pro (AVP), which can track its own position using embedded LiDAR sensors. ARDVARC used a rover-mounted LiDAR to scan the Cinder Lake Crater Fields near Flagstaff, Arizona for development and later on-site testing, while MRET was ported to the AVP and extended with additional capabilities to fulfill the mission needs: real-time data visualizations including an overlay of the rover traverse path onto the real-world terrain, an overlay of elevation or slope data, and an overlay of the nominal astronaut traverse path or waypoints to assist with wayfinding during the analog Extra Vehicular Activities (EVAs).

This talk will focus on ARDVARC and the challenges and lessons learned from porting a large Unity project to the Apple Vision Pro. After the talk, please come for the demo!
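To give a concrete flavor of the traverse-path overlay described above, here is a minimal, hypothetical Unity C# sketch of rendering a world-anchored polyline over scanned terrain. The class name TraversePathOverlay, the waypoints field, and the heightOffset parameter are all illustrative assumptions for this abstract, not code from MRET or the ARDVARC project.

using UnityEngine;

// Hypothetical sketch: draws a traverse path as a polyline anchored to
// the terrain, the kind of overlay the abstract describes. Names and
// data layout are illustrative, not taken from MRET.
[RequireComponent(typeof(LineRenderer))]
public class TraversePathOverlay : MonoBehaviour
{
    // Path waypoints in the terrain's local frame (e.g., derived from a
    // rover-mounted LiDAR scan).
    public Vector3[] waypoints;

    // Small vertical offset so the line renders just above the terrain mesh.
    public float heightOffset = 0.05f;

    void Start()
    {
        var line = GetComponent<LineRenderer>();
        line.useWorldSpace = true;
        line.positionCount = waypoints.Length;

        for (int i = 0; i < waypoints.Length; i++)
        {
            // Transform from terrain-local to world space so the overlay
            // stays registered to the real terrain as the headset moves.
            Vector3 world = transform.TransformPoint(waypoints[i]);
            line.SetPosition(i, world + Vector3.up * heightOffset);
        }
    }
}

Attached to a GameObject whose transform registers the scanned terrain to the physical site, this component would keep the path visually pinned to the ground as the wearer walks, which is the basic behavior the analog EVAs rely on.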
