The annual impact of wildfires is enormous and is likely to worsen due to climate change. Mapping the destruction wrought by these fires offers insight that can aid fire and forest management, climate and carbon cycle research, and even support private and corporate risk-tolerance analysis. Satellite imagery of the Earth is captured regularly, providing a vast wealth of detailed information that greatly benefits from automated analysis and has particular utility for burned area detection and mapping. Existing state-of-the-art burned area algorithms are accurate but slow, relying on region-specific models and human-in-the-loop quality assessments.
Meanwhile, a powerful array of tools and techniques has been developed in the deep-learning community for semantic segmentation, and convolutional neural networks (CNNs) have become increasingly capable. However, benchmark datasets are not generally composed of remote sensing imagery, and adapting state-of-the-art techniques to this data raises important considerations.
To address this need and fully leverage the available data, Toyon will implement a core Feature Extraction Module (FEM) that incorporates a modern suite of best practices in CNN architecture and design, transfer learning, and remote sensing imagery analysis. Additionally, Toyon will implement two enhancement modules: a pre-processing Super-Resolution Module (SRM) and a post-processing Contextual Representation Module (CRM). The SRM is a CNN that produces high-resolution auxiliary outputs for human viewing but is trained jointly with the core segmentation model to enhance the latter's performance. The CRM uses candidate semantic segmentations to refine and augment the FEM-derived features. Both enhancement modules are entirely general and can be applied to a variety of other remote sensing tasks. Toyon will combine the three modules to produce a sophisticated burned-area detector that can automatically process satellite imagery on an ongoing basis.
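The three-module pipeline above can be sketched in miniature. The snippet below is only an illustrative data-flow sketch, not the proposed implementation: simple numpy stand-ins replace the learned CNNs, and the function names (`srm_upsample`, `fem_features`, `crm_refine`) are hypothetical. It shows how the SRM pre-processes imagery, the FEM extracts per-pixel features, and the CRM augments those features with context drawn from a candidate segmentation.

```python
import numpy as np

def srm_upsample(img, scale=2):
    # Stand-in for the Super-Resolution Module: nearest-neighbour
    # upsampling replaces the learned super-resolution CNN.
    return img.repeat(scale, axis=0).repeat(scale, axis=1)

def fem_features(img, n_filters=4, rng=None):
    # Stand-in for the Feature Extraction Module: fixed random linear
    # filters replace the learned convolutional backbone.
    rng = np.random.default_rng(0) if rng is None else rng
    h, w, c = img.shape
    weights = rng.standard_normal((c, n_filters))
    return img.reshape(-1, c) @ weights  # shape (h*w, n_filters)

def crm_refine(features, candidate_seg):
    # Stand-in for the Contextual Representation Module: each pixel's
    # features are augmented with the mean feature vector of its
    # candidate class, i.e. its segment-level context.
    labels = candidate_seg.reshape(-1)
    context = np.zeros_like(features)
    for k in np.unique(labels):
        mask = labels == k
        context[mask] = features[mask].mean(axis=0)
    return np.concatenate([features, context], axis=1)

# Pipeline: low-res image -> SRM -> FEM -> CRM-refined features.
low_res = np.random.default_rng(1).random((4, 4, 3))   # toy 4x4 RGB tile
high_res = srm_upsample(low_res, scale=2)              # (8, 8, 3)
feats = fem_features(high_res)                         # (64, 4)
candidate = (high_res[..., 0] > 0.5).astype(int)       # toy candidate mask
refined = crm_refine(feats, candidate)                 # (64, 8)
```

The refined features, now twice as wide, would feed the final burned/unburned classifier; in the proposed system each stand-in would be a trained network and the SRM's high-resolution output would also be retained as a human-viewable product.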
Potential NASA applications include burned area detection and mapping but also extend to satellite and remote sensing imagery analysis more generally. Specifically, any semantic segmentation model can be immediately enhanced by applying our techniques and modules. Moreover, other downstream tasks related to remote sensing analysis can benefit from the task-tailored super-resolution images produced by the SRM and the context-incorporation method exemplified by our CRM.
Non-NASA researchers in climate and carbon science would benefit from our burned area detection tool. Researchers in remote sensing more broadly and anyone trying to draw insights from satellite imagery would benefit from incorporating our advanced enhancement modules into their models. This could have particular impact in the insurance and real-estate industries.