UX DESIGN CASE STUDY
NASA:
Wearable EVA User Interface (UI)
In the future, extravehicular activities (EVAs) such as exploration missions will take place much farther from Earth than in the past. Astronauts on these missions will require greater autonomy and will need to monitor critical health and suit information, mission procedures, navigation, and communications systems. The goal of this prototype user interface was to develop an initial concept that is usable, intuitive, modular, and device agnostic.
Context
This project was completed through the Bioastronautics contract to provide services for the International Space Station and Orion programs with a focus on health, safety, and productivity of astronauts in space.
The working group was within Human Systems Engineering under the Human Health and Performance directorate.
Problem space
EVAs conducted away from Earth orbit, or around future habitats, will demand increased autonomy and additional safety precautions. Astronauts need to be able to quickly find and interpret health and mission information for both themselves and their EVA teammates. The EMU (extravehicular mobility unit) spacesuits used for current EVAs display very limited information, showing only twelve alphanumeric characters at a time.
For mission procedures, they have a notebook attached to the cuff of their gloves. This prototype was designed as a proof of concept for a user interface (UI) to be used in the form factor of an arm-cuff display and a head-mounted display.
User definition
The users will be astronauts participating in EVAs. They will be highly trained and specialized individuals with years of physical and mental training prior to any EVA attempts. These astronauts will be exploring new and harsh environments, with limited physical maneuverability and hardware due to suit limitations. Their lives will be on the line, and monitoring their health and suit consumables, namely oxygen (O2), the carbon dioxide scrubber (CO2), battery (Batt), and cooling water (H2O), is essential for their survival.
Team & role
With payload, physical space, and budget at a premium, each NASA project requires input from a variety of professionals to design the most efficient solution. This UI needed, and will continue to need, input from teams specializing in fields such as human factors, software, and hardware engineering. For this prototype, I served as the designer, interning for a human factors engineer specializing in displays and controls, Aniko Sandor, PhD. The project began before my involvement, and I was tasked with iterating on the existing prototype's design. These iterations were based on the findings of previous research and on interviews with astronaut crewmembers and EVA subject matter experts (SMEs). For the new versions, I needed to document and justify changes, getting feedback from SMEs through interviews and user testing along the way.
Design process
Before starting the iteration cycles, I worked on defining the high-level, device-agnostic aspects of the design process.
Constraints
Astronauts are on a strict schedule and were unfortunately unavailable for usability testing or in-depth interviews for this phase of the project. Instead, we conducted our tests with human factors engineers with backgrounds ranging from human-computer interaction to software development. Some were former military pilots as well.
Another difficulty was imagining what these future EVA missions would be like. Modern EVAs take place in Earth orbit, not around distant celestial bodies. Communications with Apollo astronauts on the Moon took a little over a second, and communications with our Mars rover (Curiosity) can take over ten minutes. As we venture farther from Earth, we would likely need bases or habitats with some degree of independence and autonomy. With none of these existing yet, it was challenging to envision tasks and scenarios (beyond our science fiction).
Lastly, the cuff and head-mounted displays were limited in resolution (1280x720 for the former and 800x600 for the latter), making it hard to present dense information without compromising legibility. We also did not have a hardware prototype to mimic a potential controls system; the prototype ran on a laptop and used a mouse as the input device for interactivity.
Discovering user requirements
I knew that this project was very much a proof of concept and would be highly subject to change based on feedback and future EVA mission details. I made sure to keep a changelog detailing differences between minor and major versions for future designers and developers. It was exciting and intimidating to work on a project incorporating something that did not exist yet. Luckily, some user requirements could be gathered from past interviews and a literature review. I learned about the cognitive strain caused by unknown environments, and the physical constraints added by the spacesuits' limited movement and maneuverability. There was also no technological equivalent to review, the closest being the twelve-character display and the wrist booklet.
The astronauts hoped to be able to view all critical consumables at a glance, which becomes especially important near the end of an EVA, after prolonged mental workload and when resources are running low. They wanted alerts only when needed, since tasks and procedures can already complicate and distract. Other status information was deemed important: "buddy" (EVA teammate) consumable levels, communications signal, objective markers, and graphs showing physiological exertion and consumable use.
The existing conceptual prototype addressed some of these aspects, but not all. In addition, the displays and controls team wanted to integrate their studies on consumable visualization methods. At the time, I did not know what an EVA actually entailed, so I watched videos of EVAs on the International Space Station to see what actions were performed and what information was tracked.
Designing the prototypes
For this project, I used Axure for prototyping and Adobe Illustrator for any additional graphics. Although the focus was on the consumable information, I rearranged the layout to incorporate space for features such as navigation, communications, media (audio/video recordings), and the task procedures. I wanted the layout to be predictable and consistent when swapping focus between those features. The design also needed a visual hierarchy that kept health information in the most visible location; for example, the consumables were placed in the top left because people tend to look there first.
For visual inspiration, I reviewed an interactive medium that presents many of the same kinds of systems this UI needs to convey: video games. Modern games have achieved high visual fidelity and gameplay complexity. Players need to track their health and consumables, navigate new environments, and complete objectives. While tending to these aspects, they may be under pressure to succeed, in intense scenarios involving an elevated heart rate and high cognitive load.
I worked on making sure the UI was usable on each of the test devices: a cuff display and two head-mounted displays. They had fairly low resolutions by modern standards and different aspect ratios. I decided on a more modular design, a series of stacking rectangular elements not unlike responsive web design. This would also allow sections to be modified or removed without much change to the overall design. Other changes included implementing standards created for Orion (the latest crewed spacecraft): the coloring system, font size, sidebars, and buttons.
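To give a concrete sense of the stacking idea, here is a minimal TypeScript sketch, not part of the actual Axure prototype, of how panels could be declared once with priorities and then kept or dropped per display. The panel names, priorities, and pixel heights are hypothetical placeholders.

```typescript
// Hypothetical sketch of the stacking-panel idea: panels are declared once with a
// priority, and each target display keeps only what fits its resolution.

interface Panel {
  id: "consumables" | "procedures" | "buddies" | "navigation" | "comms" | "media";
  priority: number;    // 1 = most critical (health and suit consumables)
  minHeightPx: number; // smallest legible height for this panel (placeholder values)
}

interface Display {
  name: string;
  width: number;
  height: number;
}

const PANELS: Panel[] = [
  { id: "consumables", priority: 1, minHeightPx: 160 },
  { id: "procedures",  priority: 2, minHeightPx: 200 },
  { id: "buddies",     priority: 3, minHeightPx: 120 },
  { id: "navigation",  priority: 4, minHeightPx: 160 },
  { id: "comms",       priority: 5, minHeightPx: 80 },
  { id: "media",       priority: 6, minHeightPx: 80 },
];

// Keep the highest-priority panels that fit, in priority order, so the layout
// stays predictable when moving between the cuff and head-mounted displays.
function layoutFor(display: Display): Panel[] {
  const sorted = [...PANELS].sort((a, b) => a.priority - b.priority);
  const visible: Panel[] = [];
  let usedHeight = 0;
  for (const panel of sorted) {
    if (usedHeight + panel.minHeightPx <= display.height) {
      visible.push(panel);
      usedHeight += panel.minHeightPx;
    }
  }
  return visible;
}

// The cuff display (1280x720) fits more panels than the head-mounted one (800x600).
console.log(layoutFor({ name: "cuff", width: 1280, height: 720 }).map(p => p.id));
console.log(layoutFor({ name: "hmd",  width: 800,  height: 600 }).map(p => p.id));
```

The design choice this illustrates is that critical information never disappears when the display shrinks; lower-priority sections are the ones that get dropped or collapsed.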
With our focus on the consumables system, I started sketching potential designs for displaying that information, keeping in mind both the rules I had set and the documented user needs. The existing prototype's colors needed improvement to show emergencies, and a more detailed consumable breakdown, including buddies' statuses, needed to be available.
The sketches were then built in Axure to add interactivity. The first version had a main content block along with two sidebars that never changed in size; if more information needed to be displayed, it was shown in the main content area. The Caution and Warning coloring system was implemented for the health and suit consumables (red = emergency, yellow = caution, white = safe), and the font size was increased beyond the system-standardized values for legibility. Other improvements were made to lower-priority systems such as navigation and communications.
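As an illustration of the Caution and Warning mapping described above, the following TypeScript sketch maps a consumable's remaining percentage to a display color. The threshold values are placeholders chosen for the example, not flight rules and not the prototype's actual logic.

```typescript
// Illustrative sketch of the Caution and Warning color mapping for consumables.
// Threshold values are invented placeholders.

type CwColor = "white" | "yellow" | "red";

interface ConsumableStatus {
  name: "O2" | "CO2" | "Batt" | "H2O";
  percentRemaining: number; // 0-100
  cautionAt: number;        // at or below this -> yellow
  warningAt: number;        // at or below this -> red
}

function cwColor(c: ConsumableStatus): CwColor {
  if (c.percentRemaining <= c.warningAt) return "red";    // emergency
  if (c.percentRemaining <= c.cautionAt) return "yellow"; // caution
  return "white";                                         // safe
}

// Example: O2 at 18% with these placeholder thresholds would render in red.
const o2: ConsumableStatus = { name: "O2", percentRemaining: 18, cautionAt: 40, warningAt: 20 };
console.log(cwColor(o2)); // "red"
```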
Usability testing
A user test protocol was drafted, and five human factors engineers gave feedback on the interface, completed tasks, and answered questions in ninety-minute sessions. The tasks consisted of putting themselves into scenarios and finding relevant information in the interface. I wanted to use Axure's ability to program conditional responses to create a more immersive testing experience, but the software's method for if/then statements and storing variables quickly became cumbersome. Having a developer program the prototype would have been ideal, as participants could then more realistically simulate a scenario in which they feel time pressure to find key information.
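As a rough sketch of the kind of scripted, time-pressured scenario a developer-built prototype could support, the TypeScript below drains a consumable on a timer and raises an alert the participant must locate. The consumable, thresholds, and timings are invented for illustration and were not part of the Axure prototype.

```typescript
// Minimal sketch of scripted scenario logic: a timer drains O2 and raises a
// caution alert; the participant is expected to find the relevant detail view.

interface ScenarioState {
  o2Percent: number;
  alertRaised: boolean;
  alertAcknowledged: boolean;
}

const state: ScenarioState = { o2Percent: 30, alertRaised: false, alertAcknowledged: false };

// Drain O2 once per simulated second and raise the caution alert at a threshold.
const tick = setInterval(() => {
  state.o2Percent -= 1;
  if (!state.alertRaised && state.o2Percent <= 25) {
    state.alertRaised = true;
    console.log("CAUTION: O2 low. Participant should locate the O2 detail view.");
  }
  if (state.o2Percent <= 0) clearInterval(tick);
}, 1000);

// Called by the prototype when the participant opens the O2 detail view.
function acknowledgeO2Alert(elapsedSeconds: number): void {
  state.alertAcknowledged = true;
  console.log(`O2 alert acknowledged after ${elapsedSeconds}s`);
  clearInterval(tick);
}
```

Branching logic like this, trivial to express in code, was exactly what became cumbersome to build with Axure's conditional interactions.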
Testing showed that the design's adaptability across devices was praised, but the iconography was confusing. It was not clear what was selected versus what was highlighted for importance, and other visual indicators, such as the "mini-map" and battery charging, were not always interpreted correctly. It was a challenge to create visual focus with an interface composed of very rudimentary shapes and colors; the look was very "analog". Screen resolution also limited what could be displayed at once.
Iterating on the design
The next iteration added a new stick figure concept to the prototype, and the font size was changed back to the Orion spacecraft standards. The stick figure concept was one of the visualizations the team was working on and wanted user feedback on. Testing revealed that the figures were confusing: some participants thought they indicated suit issues localized to those limbs, when the team intended the figure's "pose" to represent a shortage of a particular consumable. The decreased font size also made the text difficult to read on the head-mounted displays.
The prototype went through a couple more versions with quick feedback sessions from SMEs. The final version removed the stick figure concept, introduced new icons, and displayed fewer elements at a time (one sidebar was removed entirely). Each element could be toggled hidden or expanded for more detail, and a few lower-priority components, such as the communication status with "buddies", became pop-ups. A detailed report was drafted, documenting all the academic literature and technical documents used; changes between iterations were explained and accounted for, and the testing process was clearly defined.
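The hide/expand behavior of the final version could be modeled with a simple per-panel state, as in this hypothetical sketch; the three-state model and panel names are my assumptions for illustration, not the prototype's implementation.

```typescript
// Sketch of per-panel visibility in the final iteration, assuming a simple
// three-state model: hidden, collapsed summary, or expanded detail.

type PanelState = "hidden" | "collapsed" | "expanded";

const panelStates = new Map<string, PanelState>([
  ["consumables", "expanded"],
  ["buddies", "collapsed"],   // lower-priority info shown as a summary / pop-up
  ["navigation", "collapsed"],
]);

// Cycle a panel through its states when its toggle control is pressed.
function togglePanel(id: string): PanelState {
  const order: PanelState[] = ["hidden", "collapsed", "expanded"];
  const current = panelStates.get(id) ?? "hidden";
  const next = order[(order.indexOf(current) + 1) % order.length];
  panelStates.set(id, next);
  return next;
}

console.log(togglePanel("buddies")); // "expanded"
```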
Lessons learned
These are the main takeaways I will apply to future work.
Next steps for the project
In my ideal version of this project, I would have loved to meet the astronauts and test the prototype with them, the actual users. It was also difficult to simultaneously moderate, observe behavior, take notes, and track quantitative goals like task completion percentage; I would have liked the study to include more quantitative measures.
It would also be great to send astronauts to the Moon and beyond, so we have more granular data from those EVAs; some of the hypothetical tasks could then be made more accurate. I hope future versions will also have working hardware to support more realistic user input than a mouse.
Empathy and teamwork are key
In the beginning, it was overwhelming to learn about everything that needs to be done to be space ready; it encompasses many disciplines and large amounts of technical knowledge. I learned quite a bit from other interns from around the country, from textile artists to hardware engineers. It served as a reminder of how much the field of human factors, or user experience, can vary depending on the product we are trying to enhance. It opened me up to the many different perspectives that exist, and to the fact that it is impossible to know everything out there, or to catch every single usability error.
Although it is very likely the final version of this user interface will look nothing like what I produced, I hope I provided the design groundwork for a smooth transition to whoever takes over next, be it someone in UX or another field. -🌳