Spirit and Opportunity

The Mobility and Robotic Systems section has participated extensively in the Mars Exploration Rovers (MER) project. Areas of contribution include landing, rover driving, instrument placement with the rover's arm, and ground control.

Entry, Descent, Landing (EDL)

Fig. 1: Matched image sequence from MER EDL.
The MER Descent Image Motion Estimation System (DIMES) is the first passive image-based system to estimate lander velocity during planetary descent. The DIMES algorithm combines sensor data from a descent imager, a radar altimeter and an inertial measurement unit in a novel way to create an autonomous, low-cost, robust and computationally efficient solution to the horizontal-velocity-estimation problem. DIMES performed successfully during both of the MER landings. During the landing in Gusev Crater, the landing system used the measurement provided by DIMES to remove a possibly catastrophic horizontal velocity.
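As a rough illustration of the geometry involved, the sketch below converts an image-plane feature shift between two descent images into a horizontal-velocity estimate using a pinhole camera model and a radar-derived altitude. The function name, the pinhole simplification, and all numbers are illustrative assumptions, not the actual DIMES algorithm, which also fuses IMU attitude data and includes extensive robustness checks.

```python
import numpy as np

def horizontal_velocity(pixel_shift, altitude_m, focal_px, dt_s):
    """Convert an image-plane feature shift (pixels) between two descent
    images into a ground-frame horizontal velocity (m/s) using the pinhole
    relation: ground shift = pixel shift * altitude / focal length."""
    ground_shift_m = np.asarray(pixel_shift, dtype=float) * altitude_m / focal_px
    return ground_shift_m / dt_s

# A feature that moved 10 px between frames taken 1 s apart,
# seen from 2000 m with a 1000 px focal length -> 20 m/s horizontally.
vx, vy = horizontal_velocity([10.0, 0.0], 2000.0, 1000.0, 1.0)
```

In the real system this estimate, combined with the IMU-propagated state, told the landing system how much horizontal velocity the retro-rocket firing needed to cancel.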

Surface Navigation

Fig. 2: Rover tracks on Mars.
The Mars Exploration Rovers have traversed more than 10 kilometers over the Martian surface, using a combination of vision-enhanced autonomous drives and simple directed moves, in which operators specify where to go and the motions are executed without taking pictures. When driving autonomously, MER vehicles use cameras to look ahead and avoid potential hazards using Stereo Vision and GESTALT (Grid-based Estimation of Surface Traversability Applied to Local Terrain). The same imagery can also be used by the Visual Odometry software, which tracks features in the terrain to measure how much the rover has slipped during drives on soft or highly sloped ground. Using these machine-vision technologies, each MER vehicle drives more safely, covers greater distances per day, and reaches more distant science targets. Roughly 30% of the distance driven by the MER vehicles in their first 18 months was accomplished using these autonomous robotics technologies.
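A minimal sketch of the grid-based traversability idea behind GESTALT: score each cell of a local height map and reject cells whose height step to a neighbor exceeds a threshold. The function, grid layout, and threshold are illustrative assumptions; the flight software evaluates richer goodness measures (slope, roughness, step height) over rover-sized patches of terrain.

```python
import numpy as np

def traversability_grid(heights, max_step_m=0.2):
    """Return a boolean goodness grid: a cell is traversable only if the
    height step to each 4-neighbor stays below max_step_m. A toy stand-in
    for GESTALT's per-cell goodness map; the threshold is illustrative."""
    h = np.asarray(heights, dtype=float)
    good = np.ones(h.shape, dtype=bool)
    ok_x = np.abs(np.diff(h, axis=0)) < max_step_m  # steps between rows
    ok_y = np.abs(np.diff(h, axis=1)) < max_step_m  # steps between columns
    good[:-1, :] &= ok_x
    good[1:, :] &= ok_x
    good[:, :-1] &= ok_y
    good[:, 1:] &= ok_y
    return good

# A lone 1 m rock in otherwise flat terrain blocks its own cell
# and each of its 4-neighbors.
terrain = np.zeros((5, 5))
terrain[2, 2] = 1.0
grid = traversability_grid(terrain)
```

The onboard planner then steers the rover through cells marked good, which is how stereo imagery translates into safe autonomous drives.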

Manipulator Arm

Fig. 3: The MER Instrument Deployment Device during calibration and testing at the Kennedy Space Center, prior to launch.
The MERs carry a unique in situ instrument suite designed to measure and understand the detailed geochemistry and morphology of the Martian surface. The suite includes the Mössbauer Spectrometer (MB), the Alpha Particle X-ray Spectrometer (APXS), the Microscopic Imager (MI), and the Rock Abrasion Tool (RAT). The deployment and placement of these instruments onto the Martian surface (both soil and rock targets) is controlled by the Instrument Deployment Device (IDD). With 5 degrees of freedom, the IDD is the most dexterous robotic manipulator yet flown to another planetary surface. It is mounted toward the front of the rover and can reach approximately 0.8 meters in front of the rover at full extension. The IDD weighs approximately 4 kg and carries a 2-kg payload (instruments and associated structure).
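For a flavor of the inverse kinematic Cartesian control such an arm requires, the sketch below solves the classic closed-form inverse kinematics of a planar two-link arm. This is a textbook toy, not the 5-DOF IDD solver; the link lengths, target, and elbow-down choice are all illustrative.

```python
import math

def planar_2link_ik(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar two-link arm (elbow-down
    solution). Returns (shoulder, elbow) joint angles in radians that place
    the tool tip at (x, y), or raises if the target is out of reach."""
    cos_elbow = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(cos_elbow) > 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Two 0.4 m links reaching a target 0.5 m away; check the answer by
# running the joint angles back through the forward kinematics.
s, e = planar_2link_ik(0.3, 0.4, 0.4, 0.4)
fx = 0.4 * math.cos(s) + 0.4 * math.cos(s + e)
fy = 0.4 * math.sin(s) + 0.4 * math.sin(s + e)
```

Commanding the arm in Cartesian space like this lets operators specify *where* an instrument should touch down, leaving the joint-angle solution to software.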

During rover-driving activities, the IDD is contained within a stowed volume that does not impact the rover's ability to traverse safely across the Martian terrain. Targeting for the placement of the in situ instruments on rock and soil targets is carried out using the front hazard-avoidance cameras (or front Hazcams) which are configured as a stereo camera pair. Onboard software controls the IDD based on sequences developed by ground operators. This software contains numerous low-level and high-level functions such as actuator current limiting based on temperature and pose, inverse kinematic Cartesian control, deflection compensation due to gravity and tilt-induced droop, model-based preloading of instruments on hard targets, and instrument placement using proximity feedback sensors.
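One of the functions listed above, compensation for gravity- and tilt-induced droop, can be caricatured as a small offset added to the commanded tool position. The linear sag model, coefficient, and function name below are illustrative assumptions, not the flight model.

```python
import math

def compensate_droop(target_xyz, extension_m, tilt_rad, k_droop=0.01):
    """Raise the commanded tool height to offset gravity-induced sag,
    modeled (illustratively) as linear in arm extension and scaled by the
    component of gravity acting across the arm at the rover's tilt."""
    x, y, z = target_xyz
    sag_m = k_droop * extension_m * math.cos(tilt_rad)
    return (x, y, z + sag_m)

# On level ground, a tool commanded to z = -0.200 m at 0.8 m extension
# is aimed 8 mm high so it lands on target once the arm sags.
cx, cy, cz = compensate_droop((0.5, 0.0, -0.2), 0.8, 0.0)
```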


Fig. 4: RSVP 3D graphical display of arm motion sequence.
RSVP (Rover Sequencing and Visualization Program) is used for the final generation of command sequences for the mission and the planning and validation of all mobility and manipulation activities. It is a suite consisting of two main components: RSVP-ROSE (Rover Sequence Editor) and RSVP-Hyperdrive. ROSE provides mission operations engineers with an intuitive interface for editing sequences of commands and modeling them using the multi-mission tool SEQGEN. In addition to constructing sequences for surface operations, ROSE was used to build the sequences that guided the spacecraft during the cruise phase of the mission. Hyperdrive is an immersive 3D simulation of the rover and its environment that enables operators to construct detailed rover motions and verify their safety. RSVP also provides advanced tools for visualizing telemetry received from the spacecraft and assessing the performance of past sequences.

Maestro (a.k.a. the Science Activity Planner) is the science visualization and planning tool for the MER Mission. On each sol (Martian day) of operations, Maestro is used to analyze the data arriving from the Spirit and Opportunity rovers and construct a plan of activities for the rovers to execute on the next sol. This plan is refined by additional tools before it is transmitted to the spacecraft. Maestro is also used as the operations interface for research rovers in development at JPL and was released as a public engagement tool for the Mars Exploration Rover mission. Maestro's successes in mission operations, technology development, and public engagement earned it NASA's Software of the Year Award in 2004. A screenshot from the current version of Maestro is shown below.

Fig. 5: Maestro display of science imagery.

People on This Project
Khaled Ali
Paul Backes
Paolo Bellutta
Chuck Bergh
Jeffrey Biesiadecki
Joseph Carsten
Yang Cheng
Brian Cooper
Anthony Ganino
Michael Garrett
Matthew Heverly
Andrew Johnson
Brett Kennedy
John Koehler
Todd Litwin
Mark Maimone
Michael McHenry
Joseph Melko
Tam Nguyen
Steve Peters
Arturo Rankin
Allen Sirota
Ashley Stroupe
Olivier Toupet
Julie Townsend
Ashitey Trebi-Ollennu
Vandi Verma
Richard Volpe
Reg Willson
Jeng Yen
