NASA's Mars rover is really good at laser-blasting rocks without human input

Curiosity has been studying Mars on its own for a year

Photo: NASA / JPL-Caltech / MSSS

For the last year, the Curiosity rover has been studying the surface of Mars with more independence than ever before, saving human time and energy. The partly autonomous exploration is also helping people sidestep the constraints of working across vast distances in space.

This new capability is powered by software called Autonomous Exploration for Gathering Increased Science, or AEGIS. It allows the rover to control its own use of ChemCam, an instrument that determines the chemical composition of rocks by zapping them with a laser and analyzing the resulting gases. Between the software’s deployment in May 2016 and April 2017, the rover autonomously selected and fired on targets 52 times after driving to a new location.

Combined with the observations controlled by NASA scientists, the automation has helped increase the average number of laser firings from 256 per day to 327 per day. More laser firings means more data collected, and that means NASA gets a better understanding of what Mars is like, and — more importantly — what it used to be like.

Any time Curiosity rolls into a new area of Mars, even if it’s just a few feet from its last location, AEGIS can autonomously scan the environment using the rover’s cameras. From those images, it identifies and ranks the most promising patches of bedrock to study with ChemCam, then triggers the laser and performs the measurements. NASA announced the initiative last summer, and a paper published today in the journal Science Robotics details how well it’s gone.
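The loop described above — scan, identify, rank, fire — can be sketched in a few lines of code. Everything here is a hypothetical illustration: the function names, the toy detector, and the scores are invented for this sketch, not taken from the actual flight software.

```python
# Hypothetical sketch of an AEGIS-style targeting loop. The detector,
# ranking rule, and scores below are invented for illustration; they are
# not the flight software's actual code or parameters.

def autonomous_targeting(image, detect, rank, fire, max_targets=1):
    """Scan a post-drive image, rank candidate rocks, fire at the best."""
    candidates = detect(image)       # find rock-like regions in the frame
    ranked = rank(candidates)        # order them best-first
    chosen = ranked[:max_targets]    # keep only the top picks
    for target in chosen:
        fire(target)                 # trigger a ChemCam measurement
    return chosen

# Toy stand-ins for the real vision and instrument code:
fired = []
chosen = autonomous_targeting(
    image="post_drive_camera_frame",
    detect=lambda img: [("rock_a", 0.9), ("rock_b", 0.4)],
    rank=lambda cands: sorted(cands, key=lambda c: c[1], reverse=True),
    fire=fired.append,
)
# chosen == [("rock_a", 0.9)]
```

The key design point is that the whole decision happens onboard: no step in the loop waits for instructions from Earth.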

Curiosity doing science on its own means NASA gets more data; more data means NASA learns more about Mars

Samples of AEGIS target selection in various scenes. Green areas represent targets identified as top priority, orange shows secondary targets, red shows targets retained as possibilities, and blue shows rejected targets.
Photo: Francis et al., Sci. Robot. 2, eaan4582 (2017)

Yes, while scientists sleep on Earth, their robot on Mars is now doing some of their work for them. It’s helping lift a massive burden, too. Curiosity’s ChemCam laser has fired more than 440,000 times at around 1,500 targets since it landed on the Red Planet in August 2012, according to Raymond Francis, the study’s lead author. Before AEGIS, almost every one of those targets had to be selected back on Earth.

That’s an especially laborious process, because the science team is working with a robot that is anywhere from about 34 million to 250 million miles away, depending on where the two planets are in their orbits. A one-way signal between Earth and Mars can take more than 20 minutes. The Earth’s constant rotation also means that Mars isn’t always in view.
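The scale of that delay is easy to check from the distances alone — even at light speed, a one-way signal takes minutes. A quick back-of-the-envelope calculation (the separations used here are approximate closest and farthest Earth–Mars distances):

```python
# One-way light-travel time to Mars at approximate closest and farthest
# Earth-Mars separations.
SPEED_OF_LIGHT_MPS = 186_282  # miles per second

def one_way_delay_minutes(distance_miles):
    return distance_miles / SPEED_OF_LIGHT_MPS / 60

closest = one_way_delay_minutes(34e6)    # roughly 3 minutes
farthest = one_way_delay_minutes(250e6)  # roughly 22 minutes
```

A round trip — command out, confirmation back — doubles those figures, which is why waiting on Earth for every targeting decision costs so much observing time.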

AEGIS allows NASA to work around this problem in an entirely new way. The program was originally written for and used on the Opportunity rover, and was adapted for Curiosity two years ago. The 21,000 lines of code that make up AEGIS were added to the nearly 4 million lines of Curiosity’s flight software in late 2015, and after months of testing, scientists started using it in May of last year.

When Curiosity’s operators send the rover its commands for a day of driving, they now include AEGIS targeting sessions in those plans more than half the time. AEGIS is especially useful on those driving days because the rover can scan and study the best targets in its new “workspace,” Francis says in an interview with The Verge. And when this happens, the science team has new data to look at by the time they’re awake and talking to the rover.

Curiosity has fired its laser at Mars rocks almost half a million times since 2012

“You've got all this science time after [each] drive, and often you have a few hours of [Martian] daylight left, but Earth has not yet seen this new place that the rover is in,” Francis says. “And there's no ability for people on the Earth to make decisions about what to target. That decision has to be made on Mars, and now we can make it on Mars. So that makes use of those hours that otherwise you wouldn't have been able to do these kinds of measurements.”

Before AEGIS, rover operators had only a few options for maximizing science time on driving days. They could do more science with ChemCam in the morning, but that meant driving later in the day, which often meant using more of the rover’s energy to keep itself warm. The other option was what’s known as “blind targeting,” where the science team would tell the rover to shoot its laser at a specific angle without visual confirmation of what was there. This blind firing hit the targets the science team was looking for only about 24 percent of the time — better than nothing, but not great.

An animation that shows Curiosity using ChemCam to study different targets on Mars.
GIF: NASA/JPL-Caltech

By contrast, AEGIS has proven to be 93 percent accurate at finding the types of rock the science team is looking for. The software was also built in such a way that the science team can outfit AEGIS with different “target profiles,” which will allow Curiosity to look for different kinds of rocks as the robot rolls into new, unexplored Martian territory.
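In spirit, a "target profile" amounts to swapping the weights a scoring function applies to a candidate's features, so the same autonomy code can hunt different geology. A minimal sketch of the idea — the feature names, weights, and values below are invented for illustration, not the mission's actual profiles:

```python
# Illustrative only: feature names and weights are made up, not the
# mission's actual target profiles.
PROFILES = {
    "bedrock": {"size": 0.7, "smoothness": 0.3},
    "veins":   {"size": 0.2, "smoothness": 0.8},
}

def score(target, profile):
    """Weighted sum of a candidate's features under a given profile."""
    return sum(weight * target[feature] for feature, weight in profile.items())

# The same candidate ranks very differently depending on the active profile.
candidate = {"size": 0.4, "smoothness": 0.9}
bedrock_score = score(candidate, PROFILES["bedrock"])  # 0.55
vein_score = score(candidate, PROFILES["veins"])       # 0.80
```

Uploading a new profile would change only the weights, not the onboard selection code — which is what lets the team retune the rover's priorities as it rolls into new terrain.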

AEGIS is also helping the human operators back on Earth by using algorithms to refine their targeting of smaller features, like narrow veins of rock. So far, though, its main use has been to let Curiosity do its own scientific exploration. And it’s doing well enough that it’s shaping future missions. AEGIS is already being worked into Curiosity’s successor, the Mars 2020 rover, according to Francis.

“2020 is a very ambitious mission with a long drive list of places that it's going to have to go, and distances it's going to have to cover, and samples to take. And we expect that, as a result, faster work on board and more autonomous science is probably going to be a big part of how we do that,” he says.

Letting these robots do more tasks on their own could change how we approach space exploration

Francis thinks this is just the beginning of letting robots do more of the work, especially beyond Mars. “If you're flying by an asteroid, or a comet, or if you're near Saturn, and Enceladus has got a plume of water coming out of the undersurface ocean, Earth might not know exactly where that thing is going to be,” he says. “The spacecraft has to be able to react to that on its own.”

He also says autonomous software like AEGIS would be helpful for missions to extremely hostile worlds, like Venus. “The only landers that have gone there have had minutes of lifetime, tens of minutes, and so you don’t have a lot of time for cycling with Earth in the loop,” he says. AEGIS could be a solution to that. He argues that a lander equipped with AEGIS could quickly suss out the most important targets, study them, and get scientific data back to Earth before the spacecraft is ruined by the planet’s immense heat.

But while it’s tempting to imagine NASA sending fleets of robots out into the Solar System that are all capable of doing their own science, Francis says AEGIS won’t be replacing human scientists anytime soon. It’s a tool for the science team, not their replacement.

“We certainly don't have a long-term goal of replacing the scientists, because this is a science and exploration mission, and it won't get far without its science team,” he says. “AEGIS is making use of that time that otherwise couldn't have been used.”