Her might be a work of fiction set in the future, but it gives a clear picture of where our computing technologies are headed: a world where the device, now the central part of all computing, is pushed to the sidelines as everyone interacts directly with the operating system. Throughout the movie, the protagonist interacts with his operating system, OS1, but rarely does he sit in front of a PC or hold a smartphone in his hand. Samantha, as his virtual assistant likes to call herself, is Siri on steroids, though she doesn't sound like it.
Across the world, multiple teams are working on ways to make computing more natural. The idea is to take away interfaces that need traditional input methods like keyboards and mice, and replace them with voice, gestures or, in extreme cases, thought. And don't think I am taking you on an Isaac Asimov-like trip of future technologies. In fact, some of these technologies are already here. Take Apple's Siri or Google's Now, both of which are effective at taking orders from you and executing simple searches. No, they still can't do what Samantha does, like sort your mail or analyse your voice and understand that you are in a bad mood.
Actually, Samantha does much more than sort mail. But as Vlad Sejnoha, chief technology officer of Nuance Communications, explains in a blog post, even Samantha's first, strictly utilitarian incarnation is impressive. "Her speech recognition, natural language understanding, speech generation, dialog, reasoning, planning, and learning all far exceed the current state of the art," Sejnoha writes. He should know, for Nuance is at the very forefront of the technologies showcased in the movie. Two-thirds of Fortune 100 companies rely on Nuance solutions, and its technology is also used in 7 billion mobile phones and 70 million cars. But what is really crucial for the success of voice technologies is the database, and Nuance has one of the largest libraries of speech data in the world. It is this large database that now lets devices understand much more than American accents.
Even as voice control becomes more common and effective in devices, gesture control is also making its presence felt. Though the implementations are mostly gimmicky, most top-end smartphones now understand some gestures. Even televisions are tuning in, with smart TVs from Samsung able to flip channels as you wave your hands.
At least two companies are working on rings that can convert gestures from your fingers into digital signals that devices can understand. One of them, Fin, is from India, while the other, Ring, is from a California-based start-up. On a slightly larger scale is Thalmic Labs, which is taking pre-orders for its MYO, a gesture-and-motion control armband that lets you use the movement of your hands to effortlessly control your phone or computer.
Another company that could make a huge difference in the way you interact with your computer is a Swedish firm called Tobii. They are the pioneers of eye-controlled devices and have been working for some years on perfecting their technology. You would not have to wait long to see (no pun intended) the Start menu pop up just by staring at the left-hand corner of your Windows PC. Lenovo is working with Tobii and has already showcased prototypes.
The next frontier, so to speak, in this evolution will be the ability to control your device with the power of thought. InteraXon already has software and hardware that lets you do this, though the entire process is cumbersome and awkward, to say the least. Who wants to wear a headband just to make a computer work? But the brain-controlled interface is closer than you think, especially when you realise that many gadgets for the physically challenged already use this technology to control everything from robotic arms to wheelchairs. Yes, you can control a computer just by thinking about it.