Crop inspection is as essential today as it was at the birth of agriculture. Then and now, growers move through their fields to visually inspect plants: identifying and assessing infection and disease, deficiencies in water and nutrients, and the impact of environmental factors, as well as timing harvests and predicting expected yield.
However, human inspection is subject to the limits of human attention, varying levels of skill and knowledge, and the biases of the inspector. Bloomfield built Flash to let specialty crop growers inexpensively, easily and continuously inspect their crops, plant by plant, gaining the benefits of close inspection while reducing or eliminating the deficiencies of human inspection. To date, Bloomfield has inspected grapes in 12 vineyards across 4 states, with plans to inspect vineyards in the EU and apple orchards in the US and UK.
Bloomfield will adapt its existing deep learning, camera-enabled, cloud-based plant assessment tool, Flash, to design and build a self-contained, edge-computing variant for Controlled Environment Agriculture (CEA) that meets NASA’s needs by determining:
(1) Plant stresses and/or infestation
(2) Plant performance against expectations
(3) Recommendations for improving plant health and performance in the unique and challenging environment of space
These will require the achievement of three objectives:
(1) Prove continuous, precise, accurate and reliable deep learning image-based plant health and performance assessment works in CEA
(2) Prove a novel multi-spectral imaging tool is also capable of capturing precise images for deep learning analysis
(3) Prove a plant inspection tool using multi-spectral imaging under controlled lighting conditions is capable of functioning reliably and continuously at the edge
This STTR provides Bloomfield with the opportunity to meet NASA’s needs while advancing Bloomfield’s product road map of providing value not just outdoors but also in CEA.
Variations in ambient lighting make image-based AI difficult, but in CEA regulating the lighting solves this problem. Our camera converts incident irradiant energy into discrete pixel values: incident photons are converted to electrical charge, and the resulting analog signal is quantized into digital data. Using the LEDs already installed in the ISS APH growth chamber for multi-spectral imaging is novel; a filter wheel on the camera will be synced with the in-situ growth LEDs, which serve as a flash to control exposure, enabling inspection for biotic and abiotic plant stress.
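The photon-to-pixel conversion described above can be sketched as a simple sensor model. This is an illustrative sketch only: the quantum efficiency, gain, and bit depth values below are assumed for the example and do not reflect Flash's actual camera calibration.

```python
# Illustrative sensor model: incident photons are converted to electrons
# (scaled by the sensor's quantum efficiency), amplified by the sensor gain,
# and quantized by an N-bit ADC into a discrete pixel value (digital number).
# All parameter values are assumptions for illustration.

def photons_to_dn(photons: float,
                  quantum_efficiency: float = 0.6,  # electrons per photon (assumed)
                  gain: float = 0.05,               # digital numbers per electron (assumed)
                  bit_depth: int = 12) -> int:
    """Convert an incident photon count to a quantized digital number (DN)."""
    electrons = photons * quantum_efficiency
    dn = electrons * gain
    max_dn = 2 ** bit_depth - 1          # ADC full scale, e.g. 4095 for 12-bit
    return min(int(round(dn)), max_dn)   # clip saturated pixels at full scale

# Example: 50,000 photons -> 30,000 e- -> 1,500 DN on a 12-bit ADC
print(photons_to_dn(50_000))  # 1500
```

Regulating the LED flash and filter wheel fixes the incident irradiance for each spectral band, so the pixel values produced by this conversion are repeatable across captures rather than varying with ambient light.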
An acute need exists in specialty crops – which require frequent and intensive inspection – for objective, precise and consistent actionable reporting on the health of every plant in real time. The deep network detection pipelines, edge processing capacity and novel camera system developed here will be ideally suited to, and commercialized for, specialty crops grown in CEA settings.