In maritime robotics, Unmanned Surface Vehicles (USVs) need automated methods for detecting, tracking, and classifying vessels and other on-water hazards. JPL Robotics is developing a Contact Detection and Analysis System (CDAS) that processes camera images (in both visible and IR spectra) to provide 360-degree maritime situational awareness. This capability is required for safe navigation among other vessels, supporting JPL’s autonomous motion planner that complies with the International Regulations for Preventing Collisions at Sea (COLREGS). Additionally, it supports mission operations such as automated target recognition (ATR) and intelligence, surveillance, and reconnaissance (ISR).
Drawing on JPL’s experience developing on-water perception systems since 2006, this task focuses in particular on challenging scenarios: low-visibility weather conditions, littoral and riverine environments with heavy clutter, higher sea states, high-speed own-ship and contact motion, and semi-submerged hazards. The CDAS software will fuse input from both the JPL 360-degree camera head and the JPL Hammerhead stereo system for robust contact detection. Detected contacts must then be tracked to build an estimate of their velocity (for motion planning purposes) and classified by vessel type (many of which have COLREGS implications).
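The tracking step above, turning a sequence of position detections into a velocity estimate usable by a motion planner, can be illustrated with a minimal constant-velocity Kalman filter. This is a generic sketch under assumed noise parameters, not the actual CDAS estimator; the class and parameter names (`ContactTrack`, `Q`, `R`) are hypothetical.

```python
import numpy as np

class ContactTrack:
    """Illustrative single-contact tracker: state is [x, y, vx, vy] in meters
    and meters/second; measurements are (x, y) detections at a fixed rate.
    Noise levels Q and R below are assumed values for the sketch."""

    def __init__(self, xy, dt=1.0):
        self.x = np.array([xy[0], xy[1], 0.0, 0.0])  # start with zero velocity
        self.P = np.eye(4) * 10.0                    # large initial uncertainty
        # Constant-velocity transition: position advances by velocity * dt
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt
        # Position-only measurement model
        self.H = np.zeros((2, 4))
        self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = 0.05 * np.eye(4)  # process noise (assumed)
        self.R = 1.0 * np.eye(2)   # measurement noise (assumed)

    def update(self, xy):
        # Predict forward one time step
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct with the new position detection
        z = np.asarray(xy, dtype=float)
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

    @property
    def velocity(self):
        return self.x[2:]

# Usage: a contact moving roughly 2 m/s east, detected once per second
track = ContactTrack((0.0, 0.0))
for t in range(1, 10):
    track.update((2.0 * t, 0.0))
vx, vy = track.velocity  # converges toward (2.0, 0.0)
```

A real system would also handle data association (which detection belongs to which track), track initiation and deletion, and fusion across the camera head and stereo sensors; the filter above shows only the velocity-estimation core.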
Michael Wolf - Jet Propulsion Laboratory
Office of Naval Research (ONR)