It follows in the footsteps of companies like Apple and Google. Amazon has begun designing its own AI chips, according to an exclusive report from The Information. The hardware is designed for anything powered by Alexa, including the Echo, and would allow the virtual assistant to respond more quickly by adding speech recognition directly to the device.

Right now, whenever a user makes an inquiry on an Alexa-powered device, there is a delay while the virtual assistant contacts the cloud to interpret the request. While Echo devices would continue to rely on the cloud for complex inquiries, on-device speech recognition would allow Alexa to perform simple tasks, such as checking the time, without that cloud delay.

Amazon acquired chip designer Annapurna Labs back in 2015 and has slowly begun churning out its own processors, so it was only a matter of time before it started designing and producing chips specifically for its own hardware needs. The company has also begun hiring chip engineers for Amazon Web Services, signaling that it may be moving to its own proprietary chips for its data centers as well.

Google and Apple have both designed their own AI chips, and Google is also using its own chips to support services such as Street View, Photos, Search and Translate. Amazon is just the latest company to go down this route, though just because the company is reportedly designing these chips does not mean it will achieve the performance it desires from them.
Amazon has started designing a custom artificial intelligence chip that would power future Echo devices and improve the quality and response time of its Alexa voice assistant, according to a report today from The Information. The move closely follows rivals Apple and Google, both of which have already developed and deployed custom AI hardware at various scales.

Because AI tasks are so computationally intensive, they often call for custom-designed chips in the devices themselves, and even custom-designed servers in the data centers where AI algorithms are trained, developed, and deployed from the cloud. While Amazon is unlikely to physically produce the chips, given its lack of both fabrication experience and a manufacturing presence in China, the news does pose a risk to the businesses of companies like Nvidia and Intel. Both companies have shifted large portions of their chipmaking expertise toward AI and the future of the burgeoning field, and both make money by designing and manufacturing chips for companies like Apple, Amazon, and others.

Amazon, which seeks to stay competitive in the smart home hardware market and in the realm of consumer-facing AI products, has nearly 450 people with chip expertise on staff, reports The Information, thanks to key hires and acquisitions the e-commerce giant has made in the last few years. Those acquisitions include the $350 million purchase of Israeli chipmaker Annapurna Labs back in 2015, as well as the acquisition of security camera maker Blink late last year.

The plan is for Amazon to develop its own AI chips so Alexa-powered products in its ever-expanding Echo line can do more on-device processing instead of having to communicate with the cloud, a round trip that increases response times. Both Apple and Google have begun similar shifts. Apple has started developing its own chips for the iPhone, like the device's graphics processor and power management unit, cutting off longtime suppliers as a result.
(One Apple supplier, Imagination Technologies, was forced to sell itself to a private equity firm after Apple ended its contract with the company.) With regard to AI specifically, Apple designed a new neural engine as part of its A11 Bionic chip, which handles on-device processing for machine learning algorithms that power features like Face ID and ARKit apps.

Google, on the other hand, has developed its own AI hardware for years, starting with its custom ASIC known as the Tensor Processing Unit. Custom-designed for its TensorFlow machine learning platform, the TPU, which was upgraded to a second version last year, gives Google an edge when it comes to machine learning tasks. For instance, the TPU forms the basis of Google subsidiary DeepMind's AlphaGo system, and it helps Google stay competitive with fellow AI research powerhouse Facebook, which also designs its own AI training server hardware.

In just the last two years or so, Google has brought some of that expertise to the consumer arena, where it's begun developing custom AI chips to power devices like its Clips camera. Google also designed the image processor in the Pixel 2, and the company started poaching Apple engineers last year, perhaps to design the entire chipset of future Pixel devices. For Amazon to compete with both Google and Apple, it clearly sees an advantage in controlling the design of its own chips and, as an extension of that, in improving the AI products it already delivers.