Move over, touchscreens and buttons! A new Google sensor lets you control gadgets using hand gestures made in the air.
Google has unveiled a tiny interaction sensor that uses radar to translate subtle hand movements into gesture controls for electronic devices such as watches and cellphones.
The sensor fits onto a chip and uses radar to track sub-millimetre hand gestures at high speed and with high accuracy, allowing them to control gadgets without physical contact.
The sensor could remove the need for designing knobs and buttons into the surface of products such as watches, cellphones and radios, Dezeen reported.
“Radar has been used for many different things: tracking cars, big objects, satellites and planes,” said Project Soli founder Ivan Poupyrev.
“We’re using them to track micro motions; twitches of human hands, then use it to interact with wearables and integrated things in other computer devices,” Poupyrev said.
The chip emits waves in the radio frequency spectrum at a target, then receives the reflected waves, which are passed to a computer circuit that interprets the differences between them.
The team at Google’s Advanced Technology and Projects group (ATAP) extracts information from the received data and identifies the user’s intent by comparing the signals to a database of stored gestures.
These include movements that mimic the use of volume knobs, sliders and buttons, creating a set of “virtual tools”.
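The matching step described above can be sketched in a simplified form. The code below is a toy illustration, not Google's actual pipeline: it assumes each gesture has been reduced to a small feature vector derived from the reflected signals, and identifies the user's intent by finding the nearest template in a stored gesture database. The gesture names and feature values are invented for illustration.

```python
import math

# Toy database of "virtual tool" gesture templates.
# Feature vectors are purely illustrative stand-ins for processed radar data.
GESTURE_DB = {
    "virtual_knob": [0.9, 0.1, 0.4],
    "virtual_slider": [0.2, 0.8, 0.5],
    "virtual_button": [0.1, 0.2, 0.9],
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(signal):
    """Return the stored gesture whose template is closest to the signal."""
    return min(GESTURE_DB, key=lambda name: distance(signal, GESTURE_DB[name]))

# A measurement close to the knob template is identified as the knob gesture.
print(classify([0.85, 0.15, 0.35]))  # prints "virtual_knob"
```

In practice the real system would use far richer signal processing and learned models rather than simple nearest-neighbour matching, but the principle is the same: compare incoming reflections against known gesture signatures and pick the best match.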