Objective

This project will use the National Aeronautics and Space Administration's (NASA) groundbreaking airborne fluid lensing and Multispectral Imaging, Detection, and Active Reflectance (MiDAR) technologies, both invented by project Principal Investigator Ved Chirayath for NASA's Earth and Planetary Science applications. These will be combined with NASA's Neural Multimodal Observation and Training Network for Global Fluid Lensing Marine Habitat Mapping (NeMO-Net) convolutional neural network for the automated detection, localization, and characterization of underwater military munitions.

Technical Approach

Fluid lensing remains the highest-resolution aquatic remote sensing technology in the world, providing distortion-free, sub-centimeter, three-dimensional (3D) imagery through breaking waves to depths of 20 m. Fluid lensing and the NASA FluidCam instrument have enabled next-generation conservation solutions for coral reefs, among other critically endangered shallow marine systems. To date, dozens of global airborne campaigns have validated fluid lensing's ability to image through breaking waves and through refractive and reflective wave distortions, producing detailed bathymetric and 3D color images of littoral zones, including contiguous terrestrial-to-offshore mapping to depths of 20 m.
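
To make the distortion concrete, the minimal Python sketch below applies Snell's law at the air-water interface to show how the local slope of a passing wave displaces the apparent position of a submerged target. The nadir-viewing geometry, sinusoidal wave, and function names are illustrative assumptions for this sketch, not part of the FluidCam pipeline.

```python
import numpy as np

N_AIR, N_WATER = 1.0, 1.33  # approximate refractive indices of air and seawater

def refracted_angle(theta_i):
    """Snell's law: n_air * sin(theta_i) = n_water * sin(theta_t)."""
    return np.arcsin(np.clip(N_AIR * np.sin(theta_i) / N_WATER, -1.0, 1.0))

def apparent_shift(depth, surface_slope):
    """Horizontal displacement (m) of the image of a target at `depth` (m),
    seen by a nadir-viewing camera through a surface element tilted by
    `surface_slope` (rad). A flat surface (slope 0) gives zero shift; wave
    slope bends the ray toward the local normal and moves the target image."""
    theta_i = surface_slope                    # incidence angle vs. local surface normal
    theta_t = refracted_angle(theta_i)         # transmitted angle in the water
    return depth * np.tan(theta_i - theta_t)   # ray deviation accumulated over depth

# Illustrative sinusoidal wave: 10 cm amplitude, 2 m wavelength.
x = np.linspace(0.0, 2.0, 5)                   # positions along the surface (m)
slope = 0.1 * (2 * np.pi / 2.0) * np.cos(2 * np.pi * x / 2.0)
print(apparent_shift(depth=5.0, surface_slope=slope))  # shift at each position (m)
```

Fluid lensing must invert this time-varying distortion field, and additionally exploits moments when wave curvature magnifies the scene; the sketch shows only the forward distortion that must be removed.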

MiDAR, awarded NASA's 2019 Invention of the Year, is an active multispectral imaging, detection, and optical communications instrument that uses fluid lensing to image objects at depth in 13 active spectral bands, extending fluid lensing into the ultraviolet and spanning the entire visible spectrum. MiDAR is currently being used under NASA and National Geographic grants to detect anthropogenic marine debris and, with its active ultraviolet sources, can detect biofouled and otherwise well-camouflaged materials of abiotic origin.
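
As a rough illustration of the active-reflectance principle, the sketch below recovers per-band reflectance by differencing source-on and source-off frames and normalizing by a calibrated source term. The frame-differencing scheme, array shapes, and function names are assumptions for illustration, not MiDAR's actual calibration chain.

```python
import numpy as np

def active_reflectance(lit_frames, dark_frame, source_irradiance):
    """Estimate per-band reflectance from an active-illumination sensor.

    lit_frames:        (bands, H, W) images captured while each spectral
                       source is on (13 bands for MiDAR, per the text).
    dark_frame:        (H, W) image with all sources off, capturing the
                       ambient light that is subtracted out.
    source_irradiance: (bands,) calibrated irradiance of each source at
                       the scene, used to normalize the signal.
    """
    signal = lit_frames - dark_frame[None, :, :]        # remove ambient term
    refl = signal / source_irradiance[:, None, None]    # normalize per band
    return np.clip(refl, 0.0, None)                     # reflectance is non-negative

# Toy example: 13 bands of 4x4 imagery.
rng = np.random.default_rng(0)
lit = rng.uniform(0.2, 1.0, size=(13, 4, 4))
dark = rng.uniform(0.0, 0.1, size=(4, 4))
irr = np.linspace(0.5, 1.5, 13)
print(active_reflectance(lit, dark, irr).shape)  # (13, 4, 4)
```

Differencing against the source-off frame removes the ambient illumination term, which is what lets an active instrument recover consistent spectra independent of sunlight conditions.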

The project will use fluid lensing and MiDAR to image submerged munitions in 3D across multiple spectral bands, free of wave distortion and through challenging environments such as wave breaks and surf. The project team plans to use the state-of-the-art NASA NeMO-Net neural network to automatically detect, localize, and characterize underwater munitions and to scale the approach for rapid assessment.

NASA NeMO-Net, also invented by Ved Chirayath, is an ongoing global marine habitat mapping effort whose machine learning software runs on NASA's supercomputers. NeMO-Net uses a Fully Convolutional Neural Network to perform semantic segmentation and masking of remote sensing imagery from drones, manned aircraft, and satellites, including FluidCam, MiDAR, and WorldView. Deep Laplacian Pyramid Super-Resolution Networks, alongside Domain Adversarial Neural Networks, augment low-resolution imagery with high-resolution fluid lensing and MiDAR datasets and learn domain-invariant features across multiple instruments. NeMO-Net has achieved the highest classification accuracies reported for benthic habitat mapping to date and is already being augmented to detect anthropogenic debris, an ideal application for submerged munitions.
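
For readers unfamiliar with the approach, the sketch below shows the general shape of a fully convolutional segmentation network: every layer is convolutional, so the output is a per-pixel class map at the input resolution rather than a single label. The layer widths, class count, and 13-band input are placeholders, and the super-resolution and domain adversarial components described above are omitted; this is not the NeMO-Net architecture itself.

```python
import torch
import torch.nn as nn

class TinyFCN(nn.Module):
    """Minimal fully convolutional network for semantic segmentation:
    an encoder that downsamples, a decoder that upsamples back to the
    input resolution, and a 1x1 convolution producing per-pixel class
    scores. Channel widths and depth are placeholders."""
    def __init__(self, in_bands=13, n_classes=5):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_bands, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                     # H/2 x W/2
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 2, stride=2), nn.ReLU(),  # back to H x W
            nn.Conv2d(32, n_classes, 1),                         # per-pixel class logits
        )

    def forward(self, x):                     # x: (N, bands, H, W)
        return self.decoder(self.encoder(x))  # (N, classes, H, W)

# Toy forward pass: one 13-band 64x64 tile, e.g. "munition" vs. seabed classes.
logits = TinyFCN()(torch.randn(1, 13, 64, 64))
mask = logits.argmax(dim=1)                   # (1, 64, 64) per-pixel label map
```

Because every layer is convolutional, the same trained weights apply to tiles of any size, which is what allows such a network to scale from small training patches to large survey mosaics.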

Benefits

This experimental effort is anticipated to validate the ability of these novel technologies to detect submerged munitions cost-effectively and rapidly, using unmanned aircraft to image large areas and machine learning to process the resulting data at scale. Fluid lensing's ability to image through wave breaks and surf would significantly improve domain awareness in this field and represent a leap forward in mitigating the threat of submerged munitions.

(Anticipated Project Completion: 2025)