Amazon shifts some voice assistant, face recognition computing to its own chips

First announced in 2018, the Amazon chip is custom-designed to speed up large volumes of machine learning tasks such as translating text to speech or recognising images.

November 13, 2020 12:01 pm | Updated 12:01 pm IST

Amazon.com Inc on Thursday said it shifted part of the computing for its Alexa voice assistant to its own custom-designed chips, aiming to make the work faster and cheaper while moving it away from chips supplied by Nvidia Corp.

When users of devices such as Amazon's Echo line of smart speakers ask the voice assistant a question, the query is sent to one of Amazon's data centers for several steps of processing. When Amazon's computers spit out an answer, that reply is in a text format that must be translated into audible speech for the voice assistant. Amazon previously handled that computing using chips from Nvidia but now the “majority” of it will happen using its own ‘Inferentia’ computing chip. First announced in 2018, the Amazon chip is custom-designed to speed up large volumes of machine learning tasks such as translating text to speech or recognising images.

Cloud computing customers such as Amazon, Microsoft Corp and Alphabet Inc's Google have become some of the biggest buyers of computing chips, driving booming data centre sales at Intel Corp, Nvidia and others.

But major technology companies are increasingly ditching traditional silicon providers to design their own chips. Apple on Tuesday introduced its first Mac computers with its own central processors, moving away from Intel chips.

Amazon said the shift to the Inferentia chip for some of its Alexa work has resulted in 25% better latency, a measure of response time, at a 30% lower cost.

Amazon has also said that ‘Rekognition’, its cloud-based facial recognition service, has started to adopt its own Inferentia chips. However, the company did not say which chips the facial recognition service had previously used or how much of the work had shifted to its own chips.

The service has come under scrutiny from civil rights groups because of its use by law enforcement. Amazon in June put a one-year moratorium on its use by police after the killing of George Floyd.
