The next NASA rover to Mars or another planet may be equipped with a new "do-it-all" camera that can identify, photograph and even analyse patches of soil or rocks close up and far away, scientists say.
The prototype of the Astrobiological Imager, developed by the University of Arizona, consists of an off-the-shelf digital camera with modifications such as LEDs, which would allow for spectral analyses of rock samples.
Researchers said that a slightly more sophisticated version, mounted on a rover, could do what even NASA's latest and greatest Mars rover, Curiosity, can't: identify, photograph and even analyse patches of soil or rocks from afar and in extreme close-up, all with the same camera.
The Arizona team figured out how to take advantage of different lens adapters that can be mounted in front of a single camera to enable it to take images ranging from a macroscopic scale - think landscape - all the way down to a microscopic scale - think cells and bacteria.
HiRISE, the UA-led High Resolution Imaging Science Experiment instrument aboard NASA's Mars Reconnaissance Orbiter, has imaged the Red Planet in unprecedented detail.
But as a space-borne instrument, it can only resolve features about the size of a kitchen table and is not capable of microscopic imaging. If the table were set with plates or anything smaller, HiRISE wouldn't know, researchers said.
The Astrobiological Imager, however, could image the table from far away, then move closer to take detailed shots of the dinnerware, and finally zoom in to take high-resolution pictures of a single salt crystal left on one of the plates.
"The idea is contextual imaging to subsequently zoom in on areas of interest in a nested fashion, until you hit the sweet spot, which you want to image microscopically. For example, to find microbial communities in rock formations," said Wolfgang Fink of the UA Department of Electrical and Computer Engineering, who led the project.
"Mounted on a rover, our camera would be equipped with a rotating turret containing different adapter lenses," he said.
"From an astrobiological point of view, you need the context first, so we'd use it in wide-angle mode to look around in search of promising targets, then drive to, say, a rock pile, image individual rocks, then go close to image patches potentially containing life, and then zoom in to produce a microscopic image of anything that might be living on or beneath that rock surface," Fink said.
Fink and his team