Deep space human exploration missions present a number of challenges. Distance from Earth makes communication less reliable and mission management more complex, and it places a greater burden on human crews. Managing the complexity of the onboard systems, processes, and resources, including health systems and payloads, will present new kinds of crew challenges and stresses not experienced in Earth orbit, where the ground station manages much of the mission. Autonomous cognitive agents that act as “virtual assistants” could interact with the crew and with onboard systems to handle tasks that would be too burdensome or time-consuming for the crew alone. Cognitive agents based on modular, extensible cognitive architectures are needed to enable effective interaction, reasoning, problem solving, and teaming with human crews. In Phase I, we explored use cases and design concepts, developed designs and simple prototypes, and conducted an initial feasibility assessment. Based on our Phase I findings, SoarTech proposes in Phase II to develop a comprehensive working prototype of a cognitive architecture-based virtual assistant to support human exploration in deep space, and to demonstrate it in a representative environment. In performing this Phase II work, we will leverage our team’s considerable background in cognitive architectures, interactive systems, cognitive systems engineering, user-centered design, and space operations. SoarTech has been researching, developing, evaluating, and integrating interactive cognitive systems for more than 20 years, including the design and use of cognitive architectures to develop multi-modal interfaces, synthetic teammates, and cognitive agents that allow natural and intuitive interaction with computing systems. Our two astronaut subject matter experts have a combined 438 days of spaceflight over five missions on the ISS and Space Shuttle, including multiple EVAs.
As NASA moves toward more independent astronaut crews and deep space missions, the Autonomous Virtual Assistant (AVA) will help astronauts perform tasks, diagnose problems, and brainstorm solutions without help from ground teams. AVA could serve aboard Orion and the Lunar Gateway as well as on the ISS and bases on the Moon or Mars. AVA could also support ground teams performing complex tasks, assist terrestrial NASA researchers with data analysis and experiment design, and help astronauts train or refresh on specific systems or procedures.
Defense applications include operating complex automated weapon systems and supporting intelligence, surveillance, and reconnaissance (ISR) tasks across the services. Civilian uses include virtual assistants in power and manufacturing plants that help manage, monitor, and analyze operations. Medical teams need tools that can query data (e.g., medical records), support diagnosis, and assess treatments.