NASA SUITS
We completed the NASA-sponsored Spacesuit User Interface Challenge 2022, which focused on leveraging mixed-reality technologies to address key aspects of the Artemis mission, including navigation, lunar search and rescue, and geology solutions. We developed our multi-modal mixed-reality system HOSHI from 0 to 1 and brought it to the NASA Johnson Space Center in May 2022.
Goals
To develop novel approaches to address Artemis mission challenges and showcase the potential of mixed-reality solutions
To envision future interfaces for lunar missions to inspire and revolutionize the human spaceflight experience
What I did
Conducted user research and led the end-to-end design in a team of four
Acted as the main designer for the map and navigation portion
Supported the engineering team in an iterative process
VEGA, the intelligent voice system
Navigation path and aids
Geology sampling and note-taking through the headset
The full demo video
The Process
Understanding context, NASA's requirements, and astronauts' needs
Literature Review
200+ pages of reading and synthesis to learn about extravehicular activity (EVA) and the capabilities of the xEVA suit
Mission Briefing
5 NASA mission briefing sessions to learn what NASA prioritizes when designing support for lunar missions
User Interview
6 interviews with former astronauts and NASA specialists to understand the pain points of current EVA operations
Designing AR for space work was new to us, so we synthesized our research findings into a set of design tenets: key constraints and considerations for this unique problem space that would guide our future design decisions. The same synthesis also surfaced astronauts' major pain points.
Design tenets
Safety first
The lunar environment is harsh and unpredictable, making astronauts' well-being the top priority and a prerequisite of mission success.
Functionality over learnability
Astronauts are intelligent, undergo extensive training, and can effectively acquire any skills necessary to operate complex systems.
Accessibility in mind
Working on the moon is comparable to dealing with situational impairments: reduced gravity, a bulky spacesuit, tools, and extreme conditions.
Assistive AR
Astronauts follow specific procedures and have limited cognitive and physical resources, so novel technologies should only be introduced when clearly beneficial.
Astronauts' pain points
Sound-focused, no visual cues
Astronauts currently receive information from the spacesuit through a primarily audio-based system, with no form of visual input or output.
Lack of situational awareness
The lighting conditions and absence of familiar references on the moon make it challenging for astronauts to accurately perceive and navigate their surroundings.
Limited autonomy
Astronauts rely heavily on commands from the mission control center, which limits their autonomy and can hinder quick decision-making, especially in unexpected situations.
Key iterations
The evolution of the menu
We began by brainstorming menus as the entry point for all system functionalities. Drawing inspiration from 2D interfaces, mixed-reality toolkits, games, and vehicle heads-up displays, we developed multiple menu options.
A key challenge we faced when deciding on the menu was uncertainty about the best UI practices for designing in mixed reality. In 2021, with extended reality design guidelines still emerging, we employed physical prototyping as a low-cost method to explore basic concepts like depth, size, and movement of interface elements.
Through physical prototyping, we realized that a large menu obstructing the main view could pose risks while astronauts are moving. To mitigate this, we opted for a minimal menu positioned at the edge of the field of view. We kept the menu visually accessible, and selectable with eye gaze, to accommodate situations where voice or gesture controls are unavailable.
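To make the gaze fallback concrete, here is a minimal sketch of dwell-based activation for an edge-anchored menu. It is illustrative only: the class, thresholds, and structure are our own assumptions for this write-up, and the actual system was built in Unity for HoloLens 2.

```python
import time
from dataclasses import dataclass
from typing import Optional

DWELL_SECONDS = 1.0       # assumed dwell time before the menu opens
GAZE_TOLERANCE_DEG = 5.0  # assumed angular radius of the menu hotspot

@dataclass
class GazeMenu:
    dwell_start: Optional[float] = None
    is_open: bool = False

    def update(self, gaze_angle_to_menu_deg: float, now: float) -> None:
        """Call once per frame with the angle between the gaze ray and the menu anchor."""
        if gaze_angle_to_menu_deg <= GAZE_TOLERANCE_DEG:
            if self.dwell_start is None:
                self.dwell_start = now            # gaze just entered the hotspot
            elif now - self.dwell_start >= DWELL_SECONDS:
                self.is_open = True               # dwell completed: open the menu
        else:
            self.dwell_start = None               # gaze left before the dwell finished

menu = GazeMenu()
menu.update(2.0, time.monotonic())  # gaze rests near the menu edge this frame
```

The point of the dwell threshold is that a brief, incidental glance at the edge of the view never opens the menu; only a deliberate, sustained gaze does.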
The evolution of the map
To enhance situational awareness and help astronauts orient themselves, I initially ideated map solutions, including a large map view for route planning and overviews, and a game-like mini-map to indicate the astronaut's current position.
Similarly, physical prototyping helped me determine best practices for incorporating a map in mixed reality and surface the potential challenges of using a mini-map as a user interface element.
A fixed mini-map on the screen could obstruct the field of view, conflicting with our safety-first and assistive-AR tenets. Although a large overview map can be helpful when astronauts are relatively stationary, we shifted our approach and reworked the mini-map concept into environment-based directional aids. To refine the visual language and ensure it benefited astronauts, we conducted remote testing with NASA specialists.
During remote testing, specialists favored arrows over minimal dots for directional guidance and validated concepts such as a range indicator for geological sites.
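As a rough illustration of how such directional aids can be driven, the sketch below computes the relative bearing for a guidance arrow and checks a range indicator. Every name and value here, including the 50 m radius, is a hypothetical stand-in rather than the system's actual logic.

```python
import math

SITE_RANGE_M = 50.0  # assumed radius for the geological-site range indicator

def relative_bearing(user_xy, heading_deg, target_xy):
    """Degrees to rotate the guidance arrow: 0 = straight ahead, positive = to the right."""
    dx = target_xy[0] - user_xy[0]
    dy = target_xy[1] - user_xy[1]
    bearing = math.degrees(math.atan2(dx, dy))              # bearing from north
    return (bearing - heading_deg + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)

def within_site_range(user_xy, site_xy, radius=SITE_RANGE_M):
    """True when the astronaut is inside the site's range indicator."""
    return math.dist(user_xy, site_xy) <= radius

# Facing north at the origin, a site due east yields an arrow at +90 degrees.
assert round(relative_bearing((0.0, 0.0), 0.0, (10.0, 0.0))) == 90
```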
The evolution of the navigation path
Astronauts confirmed that a navigation path would be beneficial in the lunar mission scenario. Drawing inspiration from mobile and vehicle AR, as well as computer vision displays, we brainstormed various visual representations for the navigation path.
Again, physical prototyping revealed that a navigation path rendered on the ground might be perceived as something in the way. To find a better placement, we collaborated with the software team to build the path in Unity and deploy it to HoloLens 2, our standard team approach for determining object placement and display. After testing various positions, we found that placing the sphere navigation path slightly above eye level was ideal, as it guided users without appearing intrusive.
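The placement rule we converged on is simple enough to sketch. The offset value and function below are illustrative assumptions for this write-up; the real path was implemented in Unity for HoloLens 2.

```python
EYE_HEIGHT_M = 1.7        # assumed nominal eye height of the wearer
ABOVE_EYE_OFFSET_M = 0.3  # assumed vertical offset that kept the path unobtrusive

def sphere_positions(route_xz):
    """Map 2D route waypoints (x, z) to 3D sphere positions (x, y, z) above eye level."""
    y = EYE_HEIGHT_M + ABOVE_EYE_OFFSET_M
    return [(x, y, z) for x, z in route_xz]

print(sphere_positions([(0.0, 0.0), (2.0, 5.0)]))
# [(0.0, 2.0, 0.0), (2.0, 2.0, 5.0)]
```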
Our own method to communicate design
We started with Figma to prototype the workflow, following a more standard approach. We also came up with a template that combines interface, depth, and voice control in a single view.
Final testing at NASA Johnson Space Center
Our final testing took place in late May 2022 at the Johnson Space Center rock yard, simulating conditions similar to EVAs. We conducted two rounds of testing with NASA specialists. We received excellent feedback on our environment-integrated design, particularly on features like the navigation aids.
Takeaways
Navigating ambiguity
Entering an unfamiliar domain and working with new technology was challenging but also exciting—like exploring the universe. It required thorough research to understand the problem space and design context. In the end, our efforts paid off, leading to the creation of a user-centric product.
Embracing change and adaptation
Throughout the design and development process, we encountered shifting priorities, unexpected testing results, technical challenges, and environmental constraints. Being ready to adapt quickly was key to overcoming these complexities and finding effective solutions.