
NASA SUITS Challenge

An AR and voice user interface for the next-gen NASA spacesuits







The NASA SUITS Challenge is a competition that asks participants to design and develop spacesuit information displays within augmented reality (AR) environments. As NASA pursues Artemis—landing American astronauts on the Moon—it is essential that crewmembers on spacewalks be equipped with the human-autonomy-enabling technologies necessary for the elevated demands of lunar surface exploration.

As a team, we developed an AR and voice interaction system on HoloLens 2 to help astronauts conduct lunar surface exploration more easily. Our team was selected as a finalist, and in May 2022 we were invited to Johnson Space Center to evaluate our project in on-site testing with NASA engineers.





How might we help astronauts better navigate and explore the Moon?


Astronauts face many challenges on the lunar surface, including limited mobility in pressurized suits and drastic lighting conditions. Current crewed lunar missions also rely largely on radio communication with NASA’s Mission Control Center, which leaves crews with little autonomy.

NASA wants to create a visual display and control system for the next-generation lunar spacesuit in order to make future extravehicular activities (EVAs) more efficient and effective. The goal is to help extravehicular crewmembers perform navigation and system-state monitoring tasks during a moonwalk.


An AR and speech-based user interface to aid astronauts in communication, navigation, and lunar exploration


For our final solution, we developed an AR application that assists astronauts with core lunar tasks: navigation, science sampling, and search & rescue. It supports both gestural and voice interaction, giving astronauts greater autonomy and efficiency than current workflows allow.
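To make the interaction model concrete, below is a minimal sketch of how recognized voice keywords might be routed to the three interface modes. The keyword phrases, mode names, and Python framing are illustrative assumptions, not the project's actual implementation; on HoloLens 2, the speech input itself is handled by the platform's speech recognizer.

from typing import Callable, Dict

class ModeController:
    """Routes recognized voice keywords to the matching interface mode."""

    def __init__(self) -> None:
        self.active_mode = "idle"
        # Keyword-to-handler registry; the phrases are illustrative guesses.
        self._commands: Dict[str, Callable[[], None]] = {
            "start navigation": lambda: self._set_mode("navigation"),
            "sampling mode": lambda: self._set_mode("sampling"),
            "send help": lambda: self._set_mode("rescue"),
        }

    def _set_mode(self, mode: str) -> None:
        self.active_mode = mode

    def on_speech(self, utterance: str) -> None:
        # Called with each phrase the speech recognizer reports.
        handler = self._commands.get(utterance.lower().strip())
        if handler:
            handler()

ui = ModeController()
ui.on_speech("Sampling Mode")
print(ui.active_mode)  # -> "sampling"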







Long-Range Navigation


Today, EVA crews rely on intravehicular (IVA) crewmembers or Mission Control (MCC) for directions to sampling locations and other sites of interest. With our solution, astronauts can set their desired destinations on a lunar map, and the system automatically guides them there along a highlighted 3D path.

The route also takes registered danger zones, such as cratered areas, into account and navigates around them. If a hazard comes into Moon Buddy’s field of view, a 3D pop-up warning appears that labels the hazard, outlines the hazardous area in red, and shows how far away it is.
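The hazard-aware routing can be pictured as graph search over a lunar map in which danger zones are impassable. The sketch below uses A* on a simple grid; the grid abstraction, function names, and Python framing are assumptions made for illustration, not the project's actual implementation.

import heapq
from typing import List, Optional, Set, Tuple

Cell = Tuple[int, int]

def plan_route(start: Cell, goal: Cell, size: int,
               hazards: Set[Cell]) -> Optional[List[Cell]]:
    """Return a start-to-goal path that detours around hazard cells."""
    def h(c: Cell) -> int:  # Manhattan-distance heuristic
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1])

    frontier = [(h(start), start)]
    came_from = {start: start}
    cost = {start: 0}
    while frontier:
        _, cur = heapq.heappop(frontier)
        if cur == goal:
            path = [cur]
            while cur != start:  # walk back to the start to recover the route
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if not (0 <= nxt[0] < size and 0 <= nxt[1] < size):
                continue
            if nxt in hazards:  # never route through a registered danger zone
                continue
            new_cost = cost[cur] + 1
            if nxt not in cost or new_cost < cost[nxt]:
                cost[nxt] = new_cost
                came_from[nxt] = cur
                heapq.heappush(frontier, (new_cost + h(nxt), nxt))
    return None  # no hazard-free route exists

# Example: route around a small crater field on a 10x10 map.
print(plan_route((0, 0), (9, 9), 10, hazards={(4, y) for y in range(9)}))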







Science Sampling


To make geology research during EVAs more efficient, our system includes a dedicated sampling mode for collecting moon rocks, activated by voice command. Sampling mode brings up a note-taking widget that transcribes the astronaut’s voice notes into text, along with a camera that attaches a field-of-view screenshot to the notes.
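As a rough sketch of what one sampling record might hold, the snippet below pairs a transcribed voice note with a field-of-view screenshot. The transcribe and capture_view services are hypothetical placeholders; in the real system, the headset’s own speech-to-text and camera would fill those roles.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SampleNote:
    """One geology sample record created in sampling mode."""
    sample_id: str
    transcript: str       # text transcribed from the astronaut's voice note
    screenshot_path: str  # field-of-view capture attached to the note
    location: tuple       # (x, y) position on the lunar map
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

def create_note(sample_id, audio, view, location,
                transcribe, capture_view) -> SampleNote:
    """Build a note from a voice recording and the current field of view."""
    return SampleNote(
        sample_id=sample_id,
        transcript=transcribe(audio),        # speech-to-text (assumed service)
        screenshot_path=capture_view(view),  # save FOV screenshot (assumed)
        location=location,
    )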







Search & Rescue


In the Moon’s harsh environment, the risk of emergencies is significantly heightened. If an astronaut’s vitals fall below a set threshold, the system automatically sends distress messages through the emergency alert channel to notify the rest of the crew. Crewmembers who receive the message and accept the rescue task are switched into long-range navigation mode and guided to the distressed astronaut’s location.
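The distress logic can be sketched as a simple threshold check plus a broadcast, as below. The vital names and threshold values are illustrative assumptions, not NASA limits, and the broadcast callback stands in for the emergency alert channel.

VITAL_THRESHOLDS = {
    "suit_o2_pct": 20.0,       # illustrative minimum oxygen reserve
    "suit_pressure_psi": 3.5,  # illustrative minimum suit pressure
}

def check_vitals(vitals: dict) -> list:
    """Return the names of any vitals below their threshold."""
    return [name for name, floor in VITAL_THRESHOLDS.items()
            if vitals.get(name, float("inf")) < floor]

def monitor(astronaut_id: str, vitals: dict, location: tuple, broadcast) -> None:
    """Send a distress message when a vital falls out of range."""
    violations = check_vitals(vitals)
    if violations:
        broadcast({                   # emergency alert channel (assumed API)
            "type": "distress",
            "astronaut": astronaut_id,
            "violations": violations,
            "location": location,     # target handed to long-range navigation
        })

# Example: low oxygen triggers a distress broadcast.
monitor("EV1", {"suit_o2_pct": 15.0, "suit_pressure_psi": 4.1},
        (12.0, -3.5), broadcast=print)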







We were thrilled to learn that our solution was selected as a finalist and that we were invited to conduct on-site human-in-the-loop user testing at Johnson Space Center in May 2022.


Our project was selected by NASA for on-site testing at Johnson Space Center










