The Deep Space Habitat project was developed by NASA in 2010 as a platform to test new hardware, software, and technologies. It was part of the Desert RATS (Desert Research and Technology Studies) analog missions in the northern Arizona desert and lives on as HERA (Human Exploration Research Analog).

I was in charge of interfacing with the components in the various subsystems (Power, Avionics, etc.) and integrating them into the common HaTS (Habitat Testbed Systems) architecture. Our goal was to bring together diverse hardware and software, with an emphasis on COTS (Commercial Off-the-Shelf) products.

This project was where I first began exploring augmented reality and how it could help crew members diagnose problems and interact with the habitat.

Making a Complex System Accessible

  • The Deep Space Habitat consisted of a heterogeneous mix of off-the-shelf and custom components. This complexity made fault tracing a difficult task, even for the crew, let alone the system owners.

  • Strategically placed markers on the different segments of the habitat allowed us to reveal system components that would otherwise be hidden in the subfloors and walls (see the marker lookup sketch after this list).

  • Tapping into the system telemetry bus provided live data visualizations, which were useful for troubleshooting misbehaving components. System connectivity was modeled in SysML, which allowed us to trace both data and power paths (see the sketches after this list).
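
To give a sense of how the marker-based overlays worked, here is a minimal sketch of mapping a recognized marker ID to the hidden component it annotates. The marker IDs, component names, locations, and channel names are hypothetical, not the actual habitat configuration.

```python
# Minimal sketch: mapping fiducial marker IDs to the hidden components they
# annotate. All IDs, names, and locations below are hypothetical examples.
from dataclasses import dataclass

@dataclass
class HiddenComponent:
    name: str
    location: str           # e.g. which subfloor panel or wall cavity
    telemetry_channel: str  # channel to query for live data

# Hypothetical registry keyed by the fiducial marker ID the AR client detects.
MARKER_REGISTRY = {
    101: HiddenComponent("Power Distribution Unit 2", "subfloor, segment B", "pdu2.bus_voltage"),
    102: HiddenComponent("Avionics Switch 1", "wall cavity, segment A", "avsw1.link_status"),
}

def overlay_for_marker(marker_id: int) -> str:
    """Return the overlay text shown when a marker is recognized."""
    component = MARKER_REGISTRY.get(marker_id)
    if component is None:
        return "Unknown marker"
    return f"{component.name} ({component.location}) -- channel {component.telemetry_channel}"

print(overlay_for_marker(101))
```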
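For the telemetry side, the basic idea was to watch live channel values and flag anything outside its nominal range. Below is a minimal sketch of that check; the channel names, limits, and the read_sample() stand-in are hypothetical, since the real values came off the habitat's telemetry bus.

```python
# Minimal sketch: watching live telemetry for out-of-range values to flag a
# misbehaving component. Channels, limits, and read_sample() are hypothetical.
import random
import time

LIMITS = {
    "pdu2.bus_voltage": (26.0, 30.0),   # volts, hypothetical nominal range
    "avsw1.temperature": (10.0, 45.0),  # degrees C, hypothetical nominal range
}

def read_sample(channel):
    """Stand-in for a read from the telemetry bus; returns a random value."""
    low, high = LIMITS[channel]
    return random.uniform(low - 2.0, high + 2.0)

def watch(cycles=5):
    for _ in range(cycles):
        for channel, (low, high) in LIMITS.items():
            value = read_sample(channel)
            status = "OK" if low <= value <= high else "OUT OF RANGE"
            print(f"{channel}: {value:.1f} [{status}]")
        time.sleep(0.1)

watch()
```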
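The SysML connectivity model is what made path tracing possible: once connectivity is captured as a graph, tracing a power or data path is just a graph search. The sketch below shows that idea with a hypothetical, hand-written connectivity dictionary standing in for the exported SysML model.

```python
# Minimal sketch: tracing a power or data path through a connectivity graph.
# The component names and links are hypothetical; in the project the
# connectivity came from the SysML model rather than a hand-written dict.
from collections import deque

# Directed edges labeled by path type ("power" or "data").
CONNECTIVITY = {
    "shore_power":   [("pdu1", "power")],
    "pdu1":          [("pdu2", "power"), ("avionics_rack", "power")],
    "pdu2":          [("eclss_fan", "power")],
    "avionics_rack": [("telemetry_bus", "data")],
    "telemetry_bus": [("crew_display", "data")],
}

def trace_path(source, target, kind):
    """Breadth-first search for a path of the given kind ("power" or "data")."""
    queue = deque([[source]])
    visited = {source}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for neighbor, edge_kind in CONNECTIVITY.get(path[-1], []):
            if edge_kind == kind and neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None

print(trace_path("shore_power", "eclss_fan", "power"))
# ['shore_power', 'pdu1', 'pdu2', 'eclss_fan']
```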