
Learning Perception for Autonomous Navigation

The Future Combat System (FCS) program clearly demonstrates the Army's vision for robotics in the future force. However, current robots have serious limitations that obstruct this vision. Their 3-D perception is myopic, limited to ranges on the order of 10 meters. They are unable to use other perceptual cues, such as color and texture, to recognize obstacles and safe corridors beyond the range of their 3-D sensors. Furthermore, they are unable to learn from their own driving experience what terrain is traversable and what is not; hence, all such distinctions must be pre-programmed, which is effectively infeasible. These factors severely limit robots' driving speed, cause them to drive into cul-de-sacs, and leave them unable to recognize that they can drive through vegetation. For the end user, this translates into unacceptably low mission effectiveness, low survivability, and high training requirements for robots.
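The roughly 10-meter limit on stereo 3-D perception follows from geometry: depth error grows quadratically with range. The sketch below illustrates this with a standard stereo error model; the baseline, focal length, and disparity-uncertainty values are illustrative assumptions, not parameters from any specific robot in this program.

```python
# Why stereo 3-D perception is myopic: for a stereo rig with baseline b
# (meters), focal length f (pixels), and disparity uncertainty dd
# (pixels), the range error at depth Z is approximately
#     dZ = Z**2 * dd / (f * b),
# i.e. quadratic in Z. The numbers below are illustrative only.

def stereo_range_error(z_m, baseline_m=0.3, focal_px=600.0, disp_err_px=0.25):
    """Approximate 1-sigma range error (meters) at depth z_m."""
    return (z_m ** 2) * disp_err_px / (focal_px * baseline_m)

for z in (5, 10, 20, 40):
    print(f"range {z:2d} m -> error ~{stereo_range_error(z):.2f} m")
```

Doubling the range quadruples the error, so beyond a few tens of meters the 3-D map becomes too noisy for obstacle classification, which is what motivates using appearance cues such as color and texture at long range.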

JPL proposes several coordinated research thrusts to overcome these limitations. These will increase the speed and range of 3-D perception, exploit stereo vision and proprioception to learn about terrain from navigation experience, and embed prospects for learning in path planning.
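One way the second thrust is often realized is "near-to-far" self-supervision: terrain patches within stereo range receive traversability labels from geometry and from proprioception while driving, and a classifier trained on their appearance then predicts traversability for distant patches beyond stereo range. The sketch below shows the idea with a toy nearest-centroid classifier on mean patch color; the features, labels, and data are all synthetic assumptions for illustration, not the task's actual method.

```python
# Self-supervised near-to-far learning, minimal sketch:
# near-field patches are labeled by stereo geometry / driving experience;
# a color-appearance classifier extends those labels beyond stereo range.

def train_centroids(patches):
    """patches: list of (rgb_tuple, label), label in {'safe', 'obstacle'}.
    Returns the mean color (centroid) for each label."""
    sums = {}
    for rgb, label in patches:
        s, n = sums.get(label, ((0.0, 0.0, 0.0), 0))
        sums[label] = (tuple(a + b for a, b in zip(s, rgb)), n + 1)
    return {lab: tuple(v / n for v in s) for lab, (s, n) in sums.items()}

def classify(rgb, centroids):
    """Nearest-centroid decision on color appearance alone."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lab: dist2(rgb, centroids[lab]))

# Near-field patches, automatically labeled while driving (synthetic):
near = [((90, 140, 60), 'safe'),       # vegetation the robot pushed through
        ((100, 150, 70), 'safe'),
        ((120, 110, 100), 'obstacle'), # rock detected by stereo geometry
        ((110, 100, 95), 'obstacle')]
centroids = train_centroids(near)

# A far-field patch, beyond stereo range, classified by appearance:
print(classify((95, 145, 65), centroids))  # prints "safe"
```

Because the labels come from the robot's own sensing rather than hand annotation, the terrain model adapts to new environments without pre-programming, addressing the limitation described above.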

This work is funded by DARPA's Information Processing Technology Office under the Learning Applied to Ground Robots program.
Point of Contact: Larry Matthies
Sponsored By: Defense

People on this Task

Max Bajracharya
Larry Matthies