The capability and mobility of exploration robots are increasing rapidly, yet missions will always be constrained by one main resource: time. Time limits the number of samples a robot can collect, the sites it can analyze, and the availability of human oversight, so it is imperative that the robot choose intelligently when, where, and what to sample, a process known as adaptive sampling.
Current Mars rover operations illustrate the need for adaptive sampling techniques. Daily plans budget specific operations down to the minute, and each operation is given a maximum of a few hours to complete. However, the capabilities of the rover far exceed these limited plans. For example, the Mars 2020 mission is testing the Planetary Instrument for X-Ray Lithochemistry (PIXL), a spectrometer used to analyze rocks at a microscopic scale. A quick scan of a postage-stamp-sized surface can take an hour but is very noisy, while reducing this noise with a full scan of the surface takes multiple days. Scheduling constraints and communication delay prohibit scientists from reviewing data in real time, so they are unable to follow up on promising sample locations. Adaptive sampling techniques, however, enable the rover to quickly and autonomously choose a subset of interesting points for detailed follow-up scans.
This work advances the state of the art in adaptive sampling for exploration robotics. We take advantage of the fact that rover operations are typically not performed in a vacuum: extensive contextual data is often available, most commonly in the form of orbital imagery, rover camera images, and quick microscopic scans like those described above. Using this context, we apply Bayesian and nonparametric models to decide where best to sample under a limited budget. Unlike previous work, our approaches reduce the impact of noise on sample-site selection, a common problem when using contextual data. The thesis evaluates our methods in three main scenarios. We begin with the general case, in which noisy contextual information about an entire scene is available to the rover. The rover must choose sampling locations expected to yield either a maximally diverse sample set or a sample set that best represents the entire scene. Sampling a point reveals the true underlying data, altering the reward of future points.
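This selection loop can be pictured as a greedy farthest-point procedure over noisy contextual features, where each physical sample replaces the noisy feature with the revealed true value before the next choice is made. The sketch below is only an illustration of that structure, not the thesis's actual algorithm; the function name, the diversity criterion, and the `reveal_fn` callback are all hypothetical stand-ins.

```python
import numpy as np

def greedy_diverse_sample(context, budget, reveal_fn):
    """Greedily pick `budget` sample sites to maximize diversity.

    context   : (n, d) list/array of noisy contextual features per site.
    budget    : number of sites the rover may physically sample.
    reveal_fn : callable(index) -> (d,) true features observed once a
                site is sampled (hypothetical stand-in for a rover
                measurement).
    """
    features = np.array(context, dtype=float)
    chosen = []
    for _ in range(budget):
        if not chosen:
            # Start with the site farthest from the scene mean.
            dists = np.linalg.norm(features - features.mean(axis=0), axis=1)
        else:
            # Reward = distance to the nearest already-sampled site,
            # computed against the *revealed* (true) feature values.
            picked = features[chosen]
            dists = np.min(
                np.linalg.norm(features[:, None, :] - picked[None, :, :], axis=2),
                axis=1,
            )
            dists[chosen] = -np.inf  # never re-pick a site
        best = int(np.argmax(dists))
        chosen.append(best)
        # Sampling reveals the true data, altering future rewards.
        features[best] = reveal_fn(best)
    return chosen
```

Because each revealed value overwrites its noisy estimate, a site that looked distinctive under noise can stop attracting further nearby picks once its true data is known, which is the effect the abstract describes.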
Next, we demonstrate improvements to adaptive sampling techniques performed at the microscopic scale, as if the rover had stopped for a detailed inspection. Mimicking PIXL operations, the rover no longer has full contextual information; instead, it collects a quick scan of each point, one at a time, and decides whether to perform a full scan at that point before moving on. We demonstrate a Dirichlet-based technique for building a classification of samples, and show improved performance over existing sampling methods.
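One way to picture the online decision is as a Dirichlet posterior over sample classes: full scans accumulate class counts, and a noisy quick scan triggers a full scan when it suggests a class the rover has rarely seen. This is a hedged sketch of that idea only; the novelty rule, threshold, and function name are illustrative assumptions, not the criterion developed in the thesis.

```python
import numpy as np

def should_full_scan(class_counts, quick_scan_probs, threshold=0.5):
    """Decide whether a point merits a detailed follow-up scan.

    class_counts     : Dirichlet concentration parameters, one per known
                       sample class, updated as full scans classify points.
    quick_scan_probs : class probabilities inferred from the noisy quick
                       scan of the current point.
    threshold        : hypothetical cutoff on how common a class must be
                       before quick-scan evidence alone is trusted.
    """
    counts = np.asarray(class_counts, dtype=float)
    expected = counts / counts.sum()  # Dirichlet posterior mean over classes
    probs = np.asarray(quick_scan_probs, dtype=float)
    probs = probs / probs.sum()
    likely_class = int(np.argmax(probs))
    # Full-scan when the predicted class is still under-represented,
    # i.e. the point is likely novel relative to what has been sampled.
    return bool(expected[likely_class] < threshold)
```

Under this rule the rover spends its scarce full-scan time on points that would grow the classification, and skips points that look like well-characterized classes.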
Finally, we consider rover operations at the orbital level, demonstrating improvements to rover path selection using satellite maps as contextual data. Here we apply our Bayesian models to maximize the expected yield of samples collected across a number of prospective sampling paths, ultimately choosing the path most likely to maximize our understanding of the terrain.
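Choosing among prospective traverses can be framed as scoring each candidate path by the expected scientific value of the sites it visits, predicted from orbital imagery. The sketch below assumes a simple per-site Gaussian value model with an optional uncertainty bonus; the function, its parameters, and the scoring rule are hypothetical illustrations of this framing, not the thesis's Bayesian maps.

```python
import numpy as np

def best_path(paths, site_value_mean, site_value_var, risk_weight=0.0):
    """Choose the candidate path with the highest expected sample yield.

    paths           : list of lists of site indices along each traverse.
    site_value_mean : per-site mean scientific value predicted from
                      orbital data (hypothetical Gaussian model).
    site_value_var  : per-site variance of that prediction.
    risk_weight     : optional bonus for uncertain sites, trading off
                      exploration against expected value.
    """
    mean = np.asarray(site_value_mean, dtype=float)
    var = np.asarray(site_value_var, dtype=float)
    # Score each path by the summed expected value of its sites, plus an
    # optional uncertainty term that favors poorly understood terrain.
    scores = [mean[p].sum() + risk_weight * var[p].sum() for p in paths]
    return int(np.argmax(scores))
```

Setting `risk_weight > 0` shifts the choice toward paths whose outcome is uncertain, which is one simple way to encode "maximizing our understanding of the terrain" rather than raw expected value alone.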
Keywords: Adaptive sampling, science autonomy, planetary robotics, exploration robotics
Associated Center(s) / Consortia:
Field Robotics Center
Associated Project(s): Life in the Atacama
Greydon Foil, "Efficiently Sampling from Underlying Physical Models," doctoral dissertation, tech. report CMU-RI-TR-16-57, Robotics Institute, Carnegie Mellon University, October, 2016
@phdthesis{foil2016sampling,
  author  = "Greydon Foil",
  title   = "Efficiently Sampling from Underlying Physical Models",
  school  = "Robotics Institute, Carnegie Mellon University",
  month   = "October",
  year    = "2016",
  address = "Pittsburgh, PA",
}