Space rovers may soon be able to make decisions on their own, thanks to NASA's new 'smart' camera system that can not only photograph alien rocks but also extract scientific meaning from the images.
This would allow a planetary rover to decide for itself whether to keep exploring a particular area or move on, researchers said.
The Curiosity rover exploring Mars boasts impressive technology, but future rovers will need smarter systems to explore more distant worlds, they said.
To help future rover and space missions spend less time waiting for instructions from Earth, senior researcher Kiri Wagstaff and her colleagues developed an advanced two-lens camera, called TextureCam.
Although Curiosity and other rovers can already distinguish rocks from other objects in the photos they take, they must send images all the way back to Earth for scientific analysis of a particular rock.
This process costs time and limits the potential scientific scope of rover missions. TextureCam can do the analysis on its own, researchers said.
At the beginning of each Martian day, called a sol, scientists on Earth upload an agenda to a Mars rover. This scientific schedule details nearly all of the rover's movements: roll forward so many meters, snap a photo, scoop a soil sample, run rudimentary tests on it and move on.
Even moving at light speed, instructions from Earth take about 20 minutes to reach the surface of Mars. This 40-minute round-trip makes real-time control of the rover impossible.
On Jupiter's moon Europa, where astrobiologists suspect extraterrestrial life could exist, the delay balloons to over 90 minutes.
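The delays quoted above follow directly from the speed of light. As a rough illustration (the distances below are approximate orbital figures, not from the article), the one-way signal delay can be sketched as:

```python
# Rough one-way light-travel delay for a radio signal, assuming a
# straight-line path at the vacuum speed of light. Distances are
# approximate and vary with orbital positions.
SPEED_OF_LIGHT_KM_S = 299_792.458

def one_way_delay_minutes(distance_km: float) -> float:
    """Minutes for a radio signal to cover distance_km."""
    return distance_km / SPEED_OF_LIGHT_KM_S / 60

# Earth-Mars distance ranges from roughly 55 to 401 million km.
print(round(one_way_delay_minutes(401e6), 1))  # ~22 minutes at maximum separation

# Earth-Jupiter (and hence Europa) distance can approach ~968 million km.
print(round(one_way_delay_minutes(968e6), 1))  # ~54 minutes one way
```

At maximum separation a Mars command-and-confirm cycle thus takes on the order of 40 minutes, and a Europa round trip well over 90 minutes, consistent with the figures above.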
Mars orbiters can help speed up the data transfer rate, but the satellites pass into the correct alignment for only a few minutes each day. This constrained connection limits the number of Martian images Curiosity can send back to Earth.
"If the rover itself could prioritise what's scientifically important, it would suddenly have the capability to take more images than it knows it can send back.
That goes hand in hand with its ability to discover new things that weren't anticipated," said Wagstaff, a computer scientist and geologist at the Jet Propulsion Laboratory (JPL) in Pasadena.
When TextureCam's stereo cameras snap 3D images, a special processor separate from the rover's main computer analyses the pictures.
By recognising textures in the photos, the processor distinguishes between sand, rocks and sky. It then uses the size of, and distance to, the rocks in the picture to determine whether any are scientifically interesting layered rocks.
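The article does not detail TextureCam's actual classifier, but the idea of telling textured rock apart from smooth sand or sky can be sketched with a toy example: score each image patch by its local intensity variance, a crude texture cue. All names and the threshold below are hypothetical.

```python
# Toy texture cue, NOT TextureCam's real algorithm: smooth regions
# (sky, flat sand) have low intensity variance within a patch, while
# rough rock surfaces have high variance.

def patch_variance(patch):
    """Variance of pixel intensities in a flat list of pixel values."""
    n = len(patch)
    mean = sum(patch) / n
    return sum((p - mean) ** 2 for p in patch) / n

def label_patch(patch, rock_threshold=100.0):
    """Classify a patch as 'rock' (textured) or 'smooth' (sand/sky)."""
    return "rock" if patch_variance(patch) > rock_threshold else "smooth"

smooth = [128, 130, 127, 129]   # near-uniform intensities, e.g. sky
rough = [20, 200, 60, 240]      # high-contrast, rock-like texture
print(label_patch(smooth), label_patch(rough))  # smooth rock
```

A real system would combine many such texture features over the stereo images and learn the decision boundary from labelled examples rather than use a single hand-set threshold.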
The system's built-in