Amazon Alexa: Technology giant Amazon has announced that it will move a portion of the computing for its smart voice assistant Alexa to its own custom chips. The move is bad news for Nvidia, which has supplied the chips powering Alexa's computing until now. With the switch, Amazon hopes to make the work both cheaper and quicker.
As the process stands today, when a customer asks Alexa a question through Amazon's Echo smart speaker, the query is sent to one of Amazon's data centres, where it passes through several stages of processing. Once the system has computed a response, that response must be converted into audible speech so that the voice assistant can relay the answer to the user.
Amazon previously handled this computing with Nvidia chips. Most of the processing will now be rerouted to Amazon's own custom-made chip, called Inferentia, which was first announced in 2018. The chip is specifically designed to accelerate machine learning tasks such as image recognition and text-to-speech conversion at large volumes.
Companies like Microsoft, Google and Amazon, which are in the business of cloud computing, are among the biggest buyers of computing chips, a major boon for chip-making giants Intel and Nvidia.
However, several tech companies have recently been moving away from third-party chip providers. Days before Amazon's decision, iPhone maker Apple launched its latest generation of MacBooks, all of which use its in-house chips instead of the Intel processors found in previous generations.
Amazon has justified its move away from Nvidia by stating that shifting some of Alexa's workload to Inferentia yielded a 25% improvement in latency and a 30% reduction in cost.