On-board hazard detection is critical to the success of landed missions, as available orbiter data does not capture the lunar terrain at a resolution that enables identification of potentially mission-threatening rocks and craters at the centimeter scale. Current state-of-the-art hazard detection technologies typically use LiDAR data to address the low and variable illumination conditions of landing operations; however, incorporating image data can yield a hazard detection solution that updates more frequently and at higher resolution. The proposed work applies a deep learning approach to this problem: the highly parallelizable nature of learning-based computation naturally extends to hardware acceleration, providing the additional computational power needed to compute and combine hazard maps across both LiDAR and camera data. The output of this development will be a demonstration of the feasibility and performance of a deep-learning-based hazard detection system that leverages both LiDAR and image data to achieve mission-speed performance on path-to-flight hardware.
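The combination of per-sensor hazard maps described above can be illustrated with a minimal sketch. The function name, fusion weights, and decision threshold below are illustrative assumptions only; in the proposed system the combination would be learned by the network rather than fixed by hand. The sketch assumes the two maps are already co-registered, per-pixel hazard probabilities in [0, 1].

```python
import numpy as np

def fuse_hazard_maps(lidar_map, image_map, w_lidar=0.6, w_image=0.4):
    """Fuse co-registered per-pixel hazard probability maps from the
    LiDAR and camera pipelines via a confidence-weighted average.
    (Illustrative placeholder for a learned fusion stage.)"""
    assert lidar_map.shape == image_map.shape
    fused = w_lidar * lidar_map + w_image * image_map
    return np.clip(fused, 0.0, 1.0)

# Example: a 4x4 terrain patch where the LiDAR map flags a rock that
# the image map largely misses (e.g., the rock sits in shadow).
lidar = np.zeros((4, 4)); lidar[1, 2] = 0.9
image = np.zeros((4, 4)); image[1, 2] = 0.2
fused = fuse_hazard_maps(lidar, image)
hazards = fused > 0.5  # threshold into a binary safe/unsafe map
```

Even in this toy example, the fused map retains the LiDAR-detected hazard (0.6 × 0.9 + 0.4 × 0.2 = 0.62 > 0.5) despite the weak image-side response, which is the qualitative behavior a learned multi-sensor fusion would be trained to achieve under poor illumination.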
The proposing team is currently developing a LiDAR-based hazard detection module for Astrobotic’s Griffin Mission One, which will deliver NASA’s VIPER rover to the lunar south pole, planned for late 2023. Techniques developed in the proposed work will benefit from the V&V infrastructure built for this and future missions. Additionally, Astrobotic will leverage the LunaRay Suite, which can generate and verify accurate terrain data, including terrain models, photometrically accurate imagery, and simulated LiDAR data at specified locations, times, and viewing positions. This capability will produce a large and widely varied training dataset, enabling the training of a robust network. By providing a robustly trained solution on relevant hardware, the proposing team seeks to drive forward the market for applied deep learning technologies in the space industry.
As landing precision requirements continue to grow with increasingly complex mission scenarios, customers will look for a flexible solution that uses as much data as possible to produce an accurate result. Astrobotic’s own participation in NASA’s CLPS program will provide an internal customer, enabling demonstration of this technology on a landed mission. With flight heritage and demonstrated successes, this system will become a strong candidate sensor for future missions through the CLPS and Artemis programs.
The ability of an airborne system to track objects in real time may be of interest to the DOD for gathering intelligence and ensuring troop safety in uncertain environments. The DOD may also be interested in a hazard detection system for missions landing in uncertain areas. Hardware acceleration for deep learning applications has a host of potential uses beyond spaceflight, such as in the autonomous vehicle sector.