SBIR/STTR Award attributes
To protect marine mammals from ship strikes and sonar exposure, human watchstanders maintain a tedious and costly visual lookout, which is impossible on unmanned vessels like Sea Hunter. Automated video search avoids human fatigue while detecting surfaced marine mammals, and automated acoustic search detects mammals below the surface, but both approaches have limitations: cameras are affected by operating conditions, and hydrophones provide unreliable localizations and detect mammals only when they vocalize. Nautical Evaluation of Mammal Observations (NEMO) addresses these challenges by fusing the available sensors to optimize surface (video) and subsurface (acoustic) detection, classification, and localization in environments ranging from cold or dark (IR) to warm or bright (EO). NEMO is sensor-agnostic: its Vision Module combines state-of-the-art saliency, anomaly, and data-driven deep-learning appearance detectors to produce reliable, real-time mammal detections from EO and IR cameras. NEMO also integrates acoustic sensing, developed with our partners at Raytheon for their MS3 sonar, which is already onboard Sea Hunter. The Fusion Module probabilistically weights each sensor feed based on operating conditions and on domain knowledge encoded with our whale-expert partners at the New England Aquarium. Fused detections enable the external autonomy to select actions that balance the conservation of marine mammals with mission objectives.
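The condition-dependent weighting described above can be sketched in simplified form. The Python snippet below is an illustrative sketch only, not the NEMO implementation: the sensor names, the `reliability` heuristics, the specific weight values, and the weighted-average fusion rule are all assumptions chosen for exposition, motivated by the limitations the abstract names (cameras degraded by operating conditions, hydrophones dependent on vocalization).

```python
# Illustrative sketch of condition-weighted sensor fusion (hypothetical;
# the actual NEMO Fusion Module is not described beyond the abstract).

def reliability(sensor: str, conditions: dict) -> float:
    """Hypothetical reliability weight in [0, 1] for a sensor under
    the current operating conditions."""
    if sensor == "EO":          # visible-light camera degrades in darkness
        return 0.9 if conditions["daylight"] else 0.1
    if sensor == "IR":          # thermal contrast drops in warm water
        return 0.8 if conditions["cold_water"] else 0.4
    if sensor == "acoustic":    # hydrophones need vocalizing mammals
        return 0.7 if conditions["vocalizing"] else 0.2
    return 0.0

def fuse(detections: dict, conditions: dict) -> float:
    """Reliability-weighted average of per-sensor detection confidences."""
    weights = {s: reliability(s, conditions) for s in detections}
    total = sum(weights.values())
    if total == 0.0:
        return 0.0
    return sum(weights[s] * p for s, p in detections.items()) / total

# Example: night-time, cold water, whale vocalizing.
conditions = {"daylight": False, "cold_water": True, "vocalizing": True}
detections = {"EO": 0.2, "IR": 0.85, "acoustic": 0.9}
print(round(fuse(detections, conditions), 3))  # → 0.831
```

In this sketch the weak night-time EO confidence is largely discounted while the IR and acoustic detections dominate, mirroring how a fused estimate can stay reliable when any single sensor is degraded.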