Our eyes on Mars: How Curiosity sees the Red Planet

(CNN)NASA’s Curiosity rover sits on Mars with a view that no human has ever beheld in person.

But the cameras that act as its eyes capture the planet’s desolate, craggy, red-washed vistas, and relay these scenes back to Earth in a stream of images each day.
But Curiosity’s photography skills aren’t just a window into another world — the images, curated by a diligent, little-known team of scientists, deliver crucial information that informs future science.
The rover is larger than you might think: about seven feet tall and the size of a small SUV. If you study its face closely, you’ll notice its mismatched eyes.
Both are fixed-focal-length cameras with no zoom capability, in focus from about six feet ahead of the rover on out. The left eye is a 34 mm lens, while the right is a 100 mm telephoto.
The Mastcam, made up of these two lenses, can capture color images and video, which can be stitched together for Curiosity’s iconic selfies and panoramas. In early March, NASA shared a stunning panorama built from 1.8 billion pixels’ worth of images taken over the Thanksgiving holiday.
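As a rough sanity check, and an assumption-laden back-of-envelope rather than a NASA figure: with cameras in the one-megapixel class, a 1.8-billion-pixel mosaic implies well over a thousand individual frames, even before accounting for the overlap between neighboring shots.

```python
# Back-of-envelope estimate of how many frames a 1.8-gigapixel
# panorama requires. The per-frame pixel count is an assumption
# based on the article's note that Curiosity's cameras are roughly
# one megapixel; overlap between frames would push the real count higher.

PANORAMA_PIXELS = 1_800_000_000  # 1.8 billion pixels, per NASA
FRAME_PIXELS = 1_000_000         # ~1 megapixel per frame (assumption)

# Ignoring overlap, this is the minimum number of frames:
min_frames = PANORAMA_PIXELS // FRAME_PIXELS
print(min_frames)  # 1800
```

Since neighboring frames must overlap so they can be stitched, the true frame count exceeds this lower bound.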
It’s like a large version of Disney’s WALL-E, an endearing explorer sending back postcards to humanity on a daily basis.
The data, carried by radio waves, pings off orbiters around Mars and reaches NASA’s Deep Space Network of antennas around the world.
Image processing specialists at NASA’s Jet Propulsion Laboratory are the first people in the world who get access to the photos.
They quickly process them; in fact, the first digital image processing software was developed by JPL in 1966.
Photoshop doesn’t know how to handle the images sent back from Mars — the photo editing software used by photographers would stitch images together and create a jumbled picture of Mars, completely out of order.
But image processing specialists like Hallie Gengl know exactly what to do.
Each image comes embedded with 100 lines of data, which Gengl and others on her team use to figure out its orientation in the larger mosaic they’re assembling. Like individual paint pigments in a portrait, the images are quickly assembled, sent around to the rover team scientists and placed online for the public to view, including raw images, she said.
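The orientation step can be sketched in miniature. This is a hypothetical illustration: the metadata field names (`row`, `col`) and the tile structure are invented for the example, and bear no relation to the actual 100-line metadata format JPL uses.

```python
# A minimal sketch of metadata-driven mosaic assembly: each tile
# carries orientation metadata saying where it belongs, so tiles can
# arrive in any order and still be placed correctly. All field names
# here are hypothetical.

def place_tiles(tiles):
    """Order image tiles into a row-major mosaic using the grid
    position recorded in each tile's metadata."""
    mosaic = {}
    for tile in tiles:
        # Hypothetical metadata keys: grid row/column of this frame.
        key = (tile["meta"]["row"], tile["meta"]["col"])
        mosaic[key] = tile["pixels"]
    # Return tiles sorted top-to-bottom, left-to-right.
    return [mosaic[k] for k in sorted(mosaic)]

# Tiles arrive out of order, as downlinked data might:
tiles = [
    {"meta": {"row": 1, "col": 0}, "pixels": "C"},
    {"meta": {"row": 0, "col": 1}, "pixels": "B"},
    {"meta": {"row": 0, "col": 0}, "pixels": "A"},
]
print(place_tiles(tiles))  # ['A', 'B', 'C']
```

The point is the principle the article describes: without the embedded positioning data, generic photo software would stitch the frames "completely out of order."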
The images don’t receive much editing. The main concern is making sure they’re oriented the right way, especially when fitting them together in mosaics.
Without the images collected by Mastcam, as well as the black-and-white Navcams for navigation mounted beneath them, the rover would sit still on Mars.
This is because Curiosity isn’t autonomous. Instead, teams on Earth send commands to the rover. And without images, the drivers wouldn’t be able to tell Curiosity where to go. Curiosity doesn’t move unless it’s safe to proceed.
It’s those images that have enabled it to travel more than 13 miles across the surface of Mars since landing in 2012. Currently, it’s climbing and investigating a steep hill to learn more about its geologic formation. When Curiosity landed, that same hill was just a blip in the distance.
Curiosity landed in Gale Crater, a vast and dry ancient lake bed with a 16,404-foot mountain, Mount Sharp, at its center. Mount Sharp’s peak is taller than the rim of the crater. Streams and lakes likely filled Gale Crater billions of years ago, which is why NASA chose it as the rover’s landing site.
The image processing team is an unseen element to the rover, but “you can’t live without us,” Gengl said. “We’re getting data and we’re sharing it with the world.
“Every day, I feel like I’m vicariously living through robots on Mars. Especially for rovers, because we’re exploring a new location we’ve never been to. Seeing the images coming down, I can’t help but think, ‘I’m one of the first people to see this data of this location on Mars.’”

Seven years of Mars vistas

Every day for the last seven years, the rover team’s scientists have begun their day by waking up to the latest images Curiosity snapped of Mars and analyzing them for exciting features.
Curiosity’s broad science team includes 500 scientists around the world, 40% of whom are outside the US. They help make decisions about the science gathered by the rover’s ten instruments.
But everything starts with what they see. Each day, depending on how much bandwidth the rover has to capture and send back data, they’re usually looking at a panorama consisting of a few dozen images.
Then the team of roughly 30 scientists gets on a conference call. They discuss features relevant to their particular areas of focus, like geologists wanting to investigate unusual-looking rocks that appear in the images.
When the rover arrives at a new site, it’s commanded to capture as much of the area in detail as possible.
“That data set provides our first look at what’s really there,” said Ashwin Vasavada, the rover’s project lead scientist.
Vasavada said it’s not unusual to wake up to emails full of exclamation points from a geologist who noticed an interesting crack in a rock — and was awake at 3 a.m. to watch the image come down to Earth.
“Everyone has different things they’re excited about,” he said.
The rover takes new images every day, extends its robotic arm to investigate and place instruments strategically every few days and brings out its drill to sample material every few months, Vasavada said.
Every decision is a question of weighing the science that could be returned against the bandwidth and power used by the rover.
For example, the decision to keep the rover occupied over the Thanksgiving holiday made sense because the rover teams would be on break. Commands sent to the rover told it to sit still and focus on capturing images at the same time each day, which would provide even lighting for a wide panorama of its perspective.
The rover rarely has idle time when it’s not investigating specific things, so the timing worked out. And it happened during the winter season on Mars, when there’s less dust in the atmosphere. Vasavada said it was clearer on Mars during Thanksgiving than it was in Los Angeles.
After seven years — and 16 total spent on the mission from concept to landing for Vasavada — the rover teams are still excited to see what they learn each day.
They’ve checked off a lot of images and investigations on the wishlist.
But Vasavada and his colleagues want to investigate a transition in the rocks, from those rich in clay to others richer in sulfate minerals.
“It may indicate a big climatic change in early Mars, when there was a lot less water around and the rocks became more salty,” he said. “This is one aspect of the landing site we’ve been talking about for years. I’m excited to see it. That’s one thing I want to see before the mission ends.”
And lessons learned from Curiosity’s one-megapixel cameras, as well as the stationary InSight lander and previous rovers like Spirit and Opportunity, have informed future missions. The Perseverance rover, which will sport 20-megapixel, zoom-capable color cameras, is set to launch in July.
Curiosity requires between 10 and 12 images for a 360-degree panorama. Perseverance will only require five.
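Those frame counts follow from simple geometry: each frame covers a fixed horizontal field of view, neighboring frames must overlap a little for stitching, and the panorama has to span 360 degrees. A small sketch, using illustrative field-of-view and overlap values that are assumptions rather than published camera specs:

```python
import math

def frames_for_panorama(fov_deg, overlap_deg):
    """Frames needed to cover 360 degrees when each new frame
    advances by (fov - overlap) degrees of fresh coverage."""
    step = fov_deg - overlap_deg
    return math.ceil(360 / step)

# A ~34-degree field of view with ~4 degrees of overlap needs 12 frames,
# in line with the 10-12 frames the article cites for Curiosity:
print(frames_for_panorama(34, 4))  # 12

# A much wider (hypothetical) ~80-degree field of view with ~8 degrees
# of overlap needs only 5, matching the figure cited for Perseverance:
print(frames_for_panorama(80, 8))  # 5
```

The design lesson is that a wider field of view trims not just frame count but also the time, power, and bandwidth each panorama costs, which the article notes are the currencies every rover decision is weighed in.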
With rovers and landers on Mars, and orbiters above, it’s safe to say we’ll be waking up to new images from the surface of another planet for a long, long time.
