Small Body Orbiting

The JPL Mobility and Robotic Systems Section has long been engaged in developing vision systems for small-body (comet and asteroid) exploration. We are developing vision applications for both active (lidar and radar) and passive (visible and infrared) sensors. Our research and development activities cover a wide spectrum of applications, including Autonomous Orbit Determination (AOD); Hovering, Descent, Landing, and Ascent (HDLA); rendezvous; and station keeping. The capabilities developed by the Section can be summarized as follows:

Landmark Detection

Craters are abundant on many small planetary bodies (e.g. asteroids and comets) and are ideal natural landmarks. We have developed a sophisticated and robust crater-detection algorithm which has successfully detected craters in imagery of many different bodies (Figure 1). For non-cratered bodies, we have experimented with landmarks derived from local gradient information. Descriptors for these landmarks are computed and stored in a way that is invariant to image-plane rotation and some scale and lighting changes.

Fig. 1: Crater-detection examples.
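
The Section's crater detector and landmark descriptors are not reproduced here, but the gradient-based, rotation-invariant landmark idea can be illustrated. The sketch below is an assumption-laden stand-in that uses OpenCV's ORB (oriented keypoints with binary descriptors) rather than the Section's own algorithm; the function name and parameters are illustrative only.

    # Illustrative stand-in only: the Section's crater detector and landmark
    # descriptors are not public. ORB's oriented keypoints and binary
    # descriptors are used here to show gradient-derived landmarks whose
    # descriptors are invariant to image-plane rotation.
    import cv2

    def extract_landmarks(image_path, max_landmarks=500):
        """Detect keypoints and compute rotation-invariant descriptors."""
        img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        if img is None:
            raise IOError("could not read " + image_path)
        orb = cv2.ORB_create(nfeatures=max_landmarks)
        # Each keypoint carries an orientation estimated from local intensity
        # moments, so its descriptor tolerates in-plane rotation.
        keypoints, descriptors = orb.detectAndCompute(img, None)
        return keypoints, descriptors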

Landmark Identification

Reliable and efficient landmark identification between two images or between an image and a database is another active research area. We have successfully developed a suite of landmark-matching methods including cross-correlation matching, Fast Fourier Transform (FFT) matching, landmark-descriptor matching, and geometric-invariant matching. These methods taken together can satisfy requirements for most small-body-exploration activities including both onboard and ground operations.
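
As one concrete example, FFT matching can be realized as phase correlation between a landmark template and a search image. The sketch below is a minimal NumPy illustration, not the Section's implementation; the function name and zero-padding scheme are assumptions, and the other matching modes are omitted.

    # Minimal sketch of FFT-based landmark matching via phase correlation,
    # assuming a small grayscale landmark template and a larger search image
    # supplied as NumPy arrays.
    import numpy as np

    def phase_correlate(template, search):
        """Return the (row, col) offset of the template within the search image."""
        # Zero-pad the template to the search-image size so the spectra align.
        padded = np.zeros_like(search, dtype=float)
        padded[:template.shape[0], :template.shape[1]] = template
        # Normalized cross-power spectrum; its inverse FFT peaks at the shift.
        cross_power = np.fft.fft2(search) * np.conj(np.fft.fft2(padded))
        cross_power /= np.abs(cross_power) + 1e-12
        response = np.abs(np.fft.ifft2(cross_power))
        return np.unravel_index(np.argmax(response), response.shape)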

Estimation of Spacecraft and Landmark Position

We have pushed the following technologies to fairly advanced levels: terrain-relative motion estimation from two images (monocular motion), bundle adjustment for joint estimation of spacecraft and landmark positions, landmark-based pose estimation, and filtering of landmark measurements with inertial and attitude sensing to determine the spacecraft trajectory (Figure 2). Many of these capabilities have been developed and tested on a gantry test bed in the Machine Vision Laboratory as well as on real mission data, such as imagery from NEAR.

Fig. 2: Landmark-based spacecraft-orbit determination. Left pane: Imagery with detected landmarks shown. Red = rejected as outlier. Green = accepted. Right pane: Recovered position overlaid on ground-truth trajectory. Vision-based position errors are shown for each frame.
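
A minimal sketch of the landmark-based pose-estimation step is given below, assuming a catalog of 3D landmark positions matched to 2D image detections and a calibrated pinhole camera. It uses OpenCV's solvePnPRansac, whose inlier/outlier split plays the role of the accepted (green) and rejected (red) landmarks in Figure 2; the bundle adjustment and inertial filtering of the actual system are not shown, and the function name is an assumption.

    # Minimal sketch of landmark-based pose estimation, assuming a catalog of
    # 3D landmark positions (body-fixed frame), their matched 2D detections,
    # and a pinhole camera matrix K.
    import numpy as np
    import cv2

    def estimate_pose(landmarks_3d, detections_2d, K):
        """Recover camera pose; RANSAC separates accepted and rejected landmarks."""
        obj = np.asarray(landmarks_3d, dtype=np.float64).reshape(-1, 3)
        img = np.asarray(detections_2d, dtype=np.float64).reshape(-1, 2)
        ok, rvec, tvec, inliers = cv2.solvePnPRansac(
            obj, img, K, None, reprojectionError=2.0)
        if not ok:
            raise RuntimeError("pose estimation failed")
        R, _ = cv2.Rodrigues(rvec)            # body-to-camera rotation
        position = -R.T @ tvec                # spacecraft position in the body frame
        return R, position.ravel(), inliers   # inliers ~ the accepted (green) landmarks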

Surface Reconstruction and Hazard Detection and Avoidance

For spacecraft landing, the Section offers the following capabilities:

  • Structure from motion augmented with altimetry measurements for complete six-degree-of-freedom body-relative motion estimation.
  • Dense surface reconstruction using monocular imagery.
  • Landing-hazard detection (rocks, slopes, discontinuities, and roughness) (Figure 3).
  • Safe-site selection from terrain maps generated from imagery or scanning lidar.
  • Landing-site tracking for station keeping during sampling and precision landing.

Except for landing-site tracking, these capabilities have been demonstrated on the JPL Autonomous Helicopter test bed by autonomously landing the helicopter in unknown and hazardous terrain.

Fig. 3: Hazard (rock) detection result on a NEAR final-descent image.
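
The hazard-detection and safe-site-selection algorithms themselves are not reproduced here; the sketch below is an illustrative stand-in that derives slope and roughness maps from a reconstructed elevation grid and picks the lowest-hazard cell. The function name, thresholds, and hazard-combination rule are assumptions, not flight parameters.

    # Illustrative sketch of hazard mapping and safe-site selection from a
    # digital elevation map (2D array of heights on a regular grid).
    import numpy as np

    def select_safe_site(dem, cell_size_m, max_slope_deg=10.0,
                         max_roughness_m=0.3, window=5):
        """Return the (row, col) of the lowest-hazard cell in the elevation map."""
        dem = np.asarray(dem, dtype=float)
        # Slope from finite-difference gradients of the elevation map.
        dz_dy, dz_dx = np.gradient(dem, cell_size_m)
        slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

        # Roughness: elevation spread in a sliding window (rocks, discontinuities).
        half = window // 2
        roughness = np.full_like(dem, np.inf)   # border cells stay excluded
        for r in range(half, dem.shape[0] - half):
            for c in range(half, dem.shape[1] - half):
                patch = dem[r - half:r + half + 1, c - half:c + half + 1]
                roughness[r, c] = patch.max() - patch.min()

        # Combine normalized slope and roughness; mask cells over either limit.
        hazard = slope_deg / max_slope_deg + roughness / max_roughness_m
        hazard[(slope_deg > max_slope_deg) | (roughness > max_roughness_m)] = np.inf
        return np.unravel_index(np.argmin(hazard), hazard.shape)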
