Addressing the challenges associated with the dramatic increase in the complexity of the National Airspace System (NAS) has required the introduction of autonomous capabilities to maintain efficiency and safety. However, as increasingly autonomous (IA) capabilities and systems are introduced into existing human-centric environments, the roles and responsibilities of humans change, especially in collaborative environments such as the Airport Operations Area (AOA) and Urban Air Mobility (UAM). The integration of autonomous capabilities into traditionally human-centric environments with the goal of Human-Automation Interaction and Teaming (HAIT) makes it difficult for IA systems to avoid being brittle (i.e., working well in the lab under nominal circumstances but performing poorly in unexpected situations) and accident-prone (i.e., failing to account for emergent behavior caused by unexpected human decisions or environmental changes) as they attempt to work collaboratively with people. Therefore, we propose the Virtual EnviRonment for InFormative analYsis (VERIFY) framework, which links physical spaces to a virtual environment (i.e., mixed reality). VERIFY will serve as a research tool for proactively understanding how humans and IA systems will need to work collaboratively as a HAIT to address and mitigate system hazards and unexpected events. By using a mixed-reality environment, researchers can explore multiple environmental variables simultaneously to understand their impact on individual tasks across multiple HAIT arrangements. The objective is to leverage use cases to define key characteristics for probabilistic scenario generation, which in turn defines HAIT test cases for evaluation within a mixed-reality environment. This approach also enables engineers to safely employ both physical and virtual hazards when training adaptive and non-deterministic systems and human operators to work alongside each other under nominal and off-nominal conditions.
A variety of NASA technologies and missions could benefit from this effort, e.g., the ISS robots Astrobee and R2, the OSAM Restore-L system, the in-Space Assembled Telescope, the Lunar Surface Science Mobility System, Commercial Lunar Payload Services, and future manned Mars missions. NASA’s Human Research Program also funds research on human-automation interaction and teaming. Finally, there are various STMD technology demonstrations that would also benefit.
The designs and techniques developed under this project will have direct application to human-automation interaction and teaming efforts with TRACLabs' DOD customers, e.g., the Air Force Space and Missile Systems Center, U.S. Army TARDEC, and the Army Futures Command. Additional customers integrating IA systems into human-centric environments include automotive and oil & gas manufacturers.