Jet Propulsion Laboratory
California Institute of Technology
JPL Robotics
Terrestrial and Military Robotics

Fig. 1: The "Urbie" urban robot during vision-guided, autonomous stair climbing.
The Mobility and Robotics Section conducts research for non-NASA sponsors on a variety of topics, including perception for autonomous navigation of unmanned ground, air, and sea surface vehicles (UGVs, UAVs, and USSVs), as well as object recognition from ground and overhead vantage points to serve a variety of applications.

Perception research for autonomous UGVs addresses real-time 3-D perception, multi-sensor terrain classification, and learning from experience to improve navigation performance. JPL pioneered real-time stereo vision for 3-D perception in off-road navigation, continues to improve these algorithms, and pursues custom hardware implementations of stereo vision for compact, low-power, high-speed vision systems. Recent achievements in this area include FPGA-based stereo processing that computes range images in under a millisecond at low power. In addition to stereo vision with visible-spectrum cameras for daylight operation, JPL has demonstrated autonomous navigation at night using thermal-infrared stereo cameras and has shown the ability to see in 3-D through a variety of atmospheric obscurants using thermal-infrared cameras. JPL has also developed small, two-axis scanning laser range finders for man-portable UGVs and conducts research on flash ladar applicable to standard UGVs. In urban robot navigation, JPL developed the first vision-and-control system for autonomous robot stair climbing (figure 1) and has improved vision algorithms for detecting and tracking people around robots for safety reasons (figure 2).
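The range computation at the heart of stereo vision is the standard rectified-stereo relation Z = f·B/d, where f is the focal length in pixels, B the camera baseline, and d the disparity. A minimal sketch in Python (the focal length, baseline, and disparity values below are illustrative, not parameters of any JPL system):

```python
import numpy as np

def disparity_to_range(disparity_px, focal_px, baseline_m, min_disp=0.5):
    """Convert a stereo disparity map (pixels) to a range map (meters).

    Uses the rectified-stereo relation Z = f * B / d.  Pixels with
    disparity below min_disp are marked invalid (NaN), since tiny
    disparities give unreliable, near-infinite ranges.
    """
    d = np.asarray(disparity_px, dtype=float)
    z = np.full_like(d, np.nan)
    valid = d >= min_disp
    z[valid] = focal_px * baseline_m / d[valid]
    return z

# Example: assumed 600-pixel focal length and 12 cm baseline.
disp = np.array([[8.0, 4.0],
                 [0.0, 2.0]])
rng = disparity_to_range(disp, focal_px=600.0, baseline_m=0.12)
# 8 px of disparity maps to 600 * 0.12 / 8 = 9 m; 0 px is invalid.
```

Real-time implementations compute a dense disparity map first (by matching along epipolar lines), then apply this conversion per pixel; the FPGA work mentioned above accelerates the matching step.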

Fig. 2: Demonstration of moving person detection. Left: input image. Right: input image labeled with detected moving person.

Multi-sensor terrain classification is essential in off-road UGV navigation to determine the traversability of complex terrain that can include potholes, ditches, bodies of water, vegetation, and a variety of man-made obstacles. Potholes, ditches, and related depressions, collectively called "negative obstacles," are particularly difficult to detect from ground level because of the shallow view angle. JPL developed a partial solution to this problem by showing that negative obstacles tend to stay warmer than surrounding terrain at night, so they have detectable signatures in nighttime thermal-infrared imagery. Water bodies are also important navigation hazards, and they are difficult to detect because range sensors on UGVs generally cannot measure distance to the water surface. JPL is developing algorithms that integrate several cues for reliable water detection (including reflection characteristics and thermal properties) and is modeling ladar beam propagation through water (figure 3). JPL also uses ladar sensing to recognize vegetation and assess its traversability, having pioneered the use of 3-D scatter analysis of ladar data to reason about material density for this purpose. In addition, JPL has applied multispectral classification to this problem, using visible and near-infrared imagery, texture classification with Gabor filters, and multispectral thermal-infrared classification.
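One of the water-detection cues mentioned above, the tendency of still water to reflect the sky, can be illustrated with a toy classifier: pixels below the horizon whose color closely matches the sky are candidate water. This is a simplified sketch for illustration only, not JPL's algorithm; the color-distance threshold and the externally supplied horizon row are assumptions.

```python
import numpy as np

def water_from_sky_reflection(image, horizon_row, tol=20.0):
    """Toy water-detection cue based on sky reflection.

    image       : HxWx3 array (float or uint8 color values)
    horizon_row : row index separating sky (above) from terrain (below)
    Returns a boolean HxW mask of candidate water pixels.  A real
    system would fuse this with thermal and ladar cues.
    """
    img = np.asarray(image, dtype=float)
    sky_mean = img[:horizon_row].reshape(-1, 3).mean(axis=0)
    dist = np.linalg.norm(img - sky_mean, axis=2)  # color distance to sky
    mask = dist < tol
    mask[:horizon_row] = False   # only label terrain pixels as water
    return mask

# Synthetic scene: sky-colored top half, brown ground, a small "pond"
# patch that reflects the sky color.
img = np.zeros((10, 10, 3))
img[:5] = [120, 150, 200]         # sky
img[5:] = [90, 70, 40]            # ground
img[7:9, 3:6] = [120, 150, 200]   # pond reflecting the sky
mask = water_from_sky_reflection(img, horizon_row=5)
```

Reflection alone fails when water reflects nearby terrain instead of sky (as in figure 3's pond), which is why multiple cues are integrated in practice.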

Fig. 3: Demonstration of water detection in natural terrain. Left: pond reflecting sky and terrain. Right: detected water regions.
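The 3-D scatter analysis of ladar data mentioned above can be illustrated by the eigenvalues of the local covariance ("scatter") matrix of a point patch: porous vegetation returns produce volumetric scatter (three comparable eigenvalues), while solid surfaces produce planar or linear structure. A minimal sketch; the 0.1 eigenvalue-ratio threshold is an assumption for illustration, not a published parameter.

```python
import numpy as np

def scatter_class(points, ratio=0.1):
    """Classify a local ladar point patch by covariance eigenvalues.

    points : Nx3 array of 3-D points.
    One dominant eigenvalue  -> "linear"  (e.g. a branch or wire)
    Two dominant eigenvalues -> "planar"  (e.g. ground or a wall)
    Three comparable         -> "scatter" (volumetric, e.g. vegetation)
    """
    pts = np.asarray(points, dtype=float)
    cov = np.cov(pts.T)
    l1, l2, l3 = np.sort(np.linalg.eigvalsh(cov))[::-1]  # descending
    if l2 < ratio * l1:
        return "linear"
    if l3 < ratio * l2:
        return "planar"
    return "scatter"

# Synthetic patches: a line, a flat plane, and a volumetric cloud.
line = np.c_[np.linspace(0, 1, 50), np.zeros(50), np.zeros(50)]
xs, ys = np.meshgrid(np.linspace(0, 1, 10), np.linspace(0, 1, 10))
plane = np.c_[xs.ravel(), ys.ravel(), np.zeros(100)]
cloud = np.random.default_rng(0).random((200, 3))
```

In practice this analysis runs over local neighborhoods of the full ladar scan, so a tree canopy labels as "scatter" while the ground beneath it labels as "planar".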

For small, man-portable UAVs, JPL is developing new hardware and algorithms for a number of uses. To support the identification and tracking of safe landing sites, we developed a 2x2-inch smart camera with a CMOS imager, digital signal processor, memory, and I/O. To create 3-D models of objects on the ground from aerial sensing, we are developing and implementing structure-from-motion algorithms. Related research on object recognition has included detecting unexploded ordnance from UGVs during test-range cleanup; current efforts focus on overhead-reconnaissance applications.
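The geometric core of structure-from-motion is triangulation: recovering a 3-D point from its observations in two views with known camera poses. A minimal linear (DLT) sketch; the camera intrinsics, baseline, and point below are made-up example values, not parameters of the smart camera described above.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3-D point from two views.

    P1, P2 : 3x4 camera projection matrices
    x1, x2 : (u, v) pixel observations of the same point
    Each observation contributes two linear constraints on the
    homogeneous point X; the SVD null vector solves the stack.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]          # dehomogenize

def project(P, X):
    """Project a 3-D point with camera matrix P to pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Example: two assumed cameras with a 1 m baseline along x.
K = np.array([[500., 0., 320.], [0., 500., 240.], [0., 0., 1.]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.], [0.], [0.]])])
X_true = np.array([0.5, 0.2, 5.0])
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

A full structure-from-motion pipeline additionally estimates the camera poses themselves from feature correspondences; triangulation is the step that then produces the 3-D model.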




