
Amazon Is Struggling to Challenge Nvidia’s AI Chip Supremacy

In this post:

  • Amazon faces low AI chip adoption as large cloud customers prefer Nvidia.
  • Amazon’s internal documents note compatibility gaps and migration issues reported by customers.
  • Nvidia’s CUDA platform is identified as the main reason users prefer the brand.

Amazon has been developing its own AI chips to cut costs, which has also helped increase Amazon Web Services (AWS) profitability. However, the e-commerce giant is struggling to develop AI chips that can rival Nvidia’s industry-standard chips.

Project migration issues, compatibility gaps, and low usage are some of the concerns slowing the adoption of Amazon’s AI chips. The situation also puts at stake a significant share of the revenue Amazon generates from its cloud business. The challenges Amazon faces were identified through confidential documents and sources familiar with the matter, as reported by Business Insider.

Amazon’s In-House AI Chips Encounter Stifled Adoption

Trainium and Inferentia are the top-of-the-line Amazon-designed chips that debuted near the end of last year. The publication reported that last year, Trainium’s adoption rate among AWS customers was merely 0.5% compared to that of Nvidia’s graphics processing units.

Also read: Amazon Profit Exceeds Wall Street Expectations as AWS’s Generative AI Works Wonders

According to the report, Amazon conducted the assessment in April 2024 to measure how much each type of AI chip was used across its AWS services. Inferentia fared slightly better, with an adoption rate of 2.7%. Inferentia is a chip designed for inference, the computing process that runs trained AI models for end users. The report cited an internal document saying:

“Early attempts from customers have exposed friction points and stifled adoption.”

The statement refers to the difficulties large cloud customers have faced when transitioning to Amazon’s custom chips. Customers find Nvidia’s CUDA platform more appealing, and the report identifies it as a key reason.


Amazon’s Custom AI Chip Development Under Internal Review

AWS, the world’s largest cloud service provider, is developing its own in-house chips to support its operations. Amazon at times touts its AI chip efforts, but the picture painted in the internal documents differs from the one the company projects.

Singapore’s Minister for Communications and Information, Tan Kiat How, with AWS executives and partners. Source: AWS.

The internal documents state that the company is struggling with slow adoption, but Amazon’s CEO takes a different view. On the company’s first-quarter earnings call, CEO Andy Jassy said demand for AWS chips was high.

“We have the broadest selection of NVIDIA compute instances around, but demand for our custom silicon, training, and inference is quite high, given its favorable price-performance benefits relative to available alternatives.”

Andy Jassy

Jassy also mentioned early adopters of AWS silicon in his investor letter, saying that “we already have several customers using our AI chips, including Anthropic, Airbnb, Hugging Face, Qualtrics, Ricoh, and Snap.” Anthropic, however, is a special case: Amazon is the startup’s biggest backer, having invested $4 billion in it under a deal that commits Anthropic to using AWS-designed silicon.

A Major AWS Component Leverages Nvidia GPUs

Amazon Web Services offers a variety of processors, from Nvidia’s Grace Hopper chips to AMD and Intel silicon. Most of its profitability, however, comes from designing its own data center chips, which lets it cut costs by avoiding GPU purchases from Nvidia.


Also read: Nvidia Experiences Remarkable Growth Amid Rising AI and GPU Demand

Amazon debuted its first AI chip, Inferentia, in 2018, but Nvidia still leads in offering solutions that are more widely adopted across industries. AWS, Microsoft, and Google are some of Nvidia’s largest customers, and all of these giants rent out GPUs through their cloud services.

In March, Adam Selipsky, CEO of AWS, attended Nvidia GTC 2023. The two companies made a joint announcement focused on their strategic collaboration to advance generative AI.

“The deep collaboration between our two organizations goes back more than 13 years, when together we launched the world’s first GPU cloud instance on AWS, and today we offer the widest range of NVIDIA GPU solutions for customers.”

Selipsky

Developers generally prefer Nvidia’s software platform, CUDA, because Nvidia has spent years of time and effort building it and the industry has widely adopted it, which makes their work easier. Amazon, on the other hand, still has to solve this puzzle through trial and error.
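To make that lock-in concrete, here is a minimal, hypothetical PyTorch sketch (not taken from the article; the package and device names are assumptions based on public PyTorch and AWS Neuron documentation). Code written for Nvidia GPUs typically selects the CUDA device directly, while running the same model on Trainium or Inferentia generally goes through AWS’s Neuron SDK and its PyTorch/XLA path, which is where much of the compatibility and migration friction described in the documents tends to appear.

```python
# Hypothetical illustration of CUDA-centric code versus an AWS Neuron migration.
# Package and device names are assumptions, not taken from the article.
import torch
import torch.nn as nn

model = nn.Linear(512, 512)      # stand-in for a real AI model
batch = torch.randn(8, 512)      # stand-in for a real input batch

# Typical Nvidia-first code path: the CUDA device is assumed throughout.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Targeting Trainium/Inferentia instead usually means switching to the
# Neuron SDK's PyTorch/XLA integration (if torch-neuronx is installed):
#   import torch_xla.core.xla_model as xm
#   device = xm.xla_device()
# plus re-validating performance and rewriting any custom CUDA kernels.

output = model.to(device)(batch.to(device))
print(output.shape)  # torch.Size([8, 512])
```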


Cryptopolitan reporting by Aamir Sheikh


