In a bid to deliver transparency in technology and stay ahead of ethical pitfalls, Google has said that its Artificial Intelligence (AI) calling system “Duplex” would now identify itself while making appointments.
Following the launch of the “Duplex” system, which lets AI mimic a human voice to make appointments and book tables, among other functions, tech critics raised a widespread outcry over the ethical dilemmas it poses.
Google clarified to The Verge that the experimental system would have a “disclosure built-in”, meaning that whenever Duplex engages in verbal communication with a human on the other end, it would make clear that the human is talking to an AI.
“We understand and value the discussion around Google Duplex. As we have said from the beginning, transparency in the technology is important,” a Google spokesperson was quoted as saying.
“We are designing this feature with disclosure built-in, and we will make sure the system is appropriately identified. What we showed at I/O was an early technology demo, and we look forward to incorporating feedback as we develop this into a product,” the spokesperson added.
Google CEO Sundar Pichai introduced Duplex earlier this week at the company’s annual developer conference, Google I/O, and demonstrated how the AI system could book an appointment at a salon and a table at a restaurant.
In the demo, the Google Assistant sounded like a human. It used Google DeepMind’s new WaveNet audio-generation technique and other advances in Natural Language Processing (NLP) to replicate human speech patterns.
However, tech critics raised questions about the ethics of the technology, saying it was developed without proper oversight or regulation.
According to tech critic Zeynep Tufekci, the demo was “horrifying” and the initial positive audience reaction at I/O was evidence that “Silicon Valley is ethically lost, rudderless and has not learned a thing”.
Google had originally said, in a blog post written by engineers Yaniv Leviathan and Yossi Matias, that “it’s important to us that users and businesses have a good experience with this service, and transparency is a key part of that”.