Researchers at Rice University in the US have developed a technology that could allow devices not only to see what their owners see but also to keep track of what they need to remember.
Unveiled on Tuesday at the International Symposium on Computer Architecture (ISCA 2016) in Seoul, South Korea, RedEye is a new technology from Rice’s Efficient Computing Group that could provide computers with continuous vision.
“The concept is to allow our computers to assist us by showing them what we see throughout the day,” said group leader Lin Zhong, co-author of the study.
“It would be like having a personal assistant who can remember someone you met, where you met them, what they told you and other specific information like prices, dates and times,” added Zhong, professor of electrical and computer engineering.
RedEye belongs to a class of technology the computing industry is developing for use in wearable, hands-free, always-on devices that are designed to support people in their daily lives.
“A key enabler of this technology is equipping our devices to see what we see and hear what we hear. Smell, taste and touch may come later, but vision and sound will be the initial sensory inputs,” Zhong noted.
Zhong’s colleague, former Rice graduate student Robert LiKamWa, tackled the technology’s main energy bottleneck: the conversion of images from analog to digital format.
“Real-world signals are analog, and converting them to digital signals is expensive in terms of energy,” he said, adding, “There is a physical limit to how much energy savings you can achieve for that conversion. We decided a better option might be to analyze the signals while they were still analog.”
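LiKamWa’s reasoning can be sketched with a toy energy model. This is purely illustrative, not RedEye’s actual design: the energy constants and the block-summary reduction scheme are hypothetical, chosen only to show why digitizing fewer samples saves energy when the ADC dominates the budget.

```python
# Toy energy model (hypothetical constants, not RedEye's real figures):
# the analog-to-digital converter (ADC) is assumed far more expensive
# per sample than an analog-domain operation.
E_ADC_PER_SAMPLE = 10.0  # hypothetical energy units per digitized sample
E_ANALOG_PER_OP = 0.5    # hypothetical energy units per analog operation


def digitize_then_process(pixels: int) -> float:
    """Conventional pipeline: digitize every pixel up front."""
    return pixels * E_ADC_PER_SAMPLE


def process_then_digitize(pixels: int, reduction: int) -> float:
    """Analog-first pipeline: touch each pixel once in the analog domain
    (e.g. summarizing each block of `reduction` pixels), then digitize
    only the reduced output."""
    analog_cost = pixels * E_ANALOG_PER_OP
    adc_cost = (pixels // reduction) * E_ADC_PER_SAMPLE
    return analog_cost + adc_cost


frame = 640 * 480  # one VGA frame, 307,200 pixels
print(digitize_then_process(frame))      # → 3072000.0 units
print(process_then_digitize(frame, 16))  # → 345600.0 units
```

Under these assumed costs, reducing the signal before conversion cuts the frame’s energy budget by almost an order of magnitude, which mirrors the motivation LiKamWa describes: there is a floor on how cheap each conversion can get, so the win comes from converting less.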