
Either chips stocks are absurdly cheap or someone is lying – Amazon’s $100B+ AI investment case study

In this post:

  • Amazon plans to invest $100 billion in CapEx for 2025, joining other tech giants in AI and data center investments despite mixed Q4 results.
  • CEO Andy Jassy assures long-term value from AI investments, with a focus on AWS and AI infrastructure amid growing demand for generative AI.
  • Analysts question the validity of massive AI spending, citing Chinese startup DeepSeek’s success and concerns over U.S. data center power shortages.

On February 6, e-commerce giant Amazon announced it will boost its spending in 2025 with a projected $100 billion earmarked for capital expenditures (CapEx). The announcement follows a fourth-quarter earnings report that saw mixed results for the company. 

While Amazon beat expectations on both the top and bottom lines, weaker-than-expected sales guidance for the current quarter overshadowed the positive numbers. As a result, Amazon’s stock fell by more than 2.6% in pre-market trading, according to CNBC.

The “ambitious” spending plan places Amazon alongside other tech titans like Meta, Alphabet, and Microsoft, all of which have recently proposed investments north of $65 billion in data centers and AI infrastructure. However, analysts are raising questions about the rationale behind these investments, particularly in light of recent price dips in chip stocks.

Capital expenditures towards AI surge

Yesterday, Amazon CEO Andy Jassy tried to reassure investors that the increased spending would be worthwhile in the long run. 

During a call following the company’s earnings release, Jassy explained that the majority of the $26.3 billion in CapEx spent in Q4 was directed toward AI for Amazon Web Services (AWS). Jassy projected that this would be a good representation of Amazon’s annualized CapEx rate for 2025.

“We’re focused on AI as a once-in-a-lifetime business opportunity,” Jassy told reporters. “This capital opportunity will benefit both our business and shareholders over the medium to long term.”


Amazon, like its competitors, is investing heavily to keep pace with the exponential demand for generative AI, which has surged since the launch of OpenAI’s ChatGPT in late 2022. The company has unveiled a range of AI products, including its own Nova models, Trainium chips, and a marketplace for third-party models called Bedrock.

Is the spending really worth it?

However, questions loom over whether such massive CapEx spending, especially on AI and chip-making companies, is justified. In a blunt post on X, capital markets commentators The Kobeissi Letter appeared perplexed by the combined $320 billion in CapEx investment pledges made by the four tech giants.

“Either chips stocks are absurdly cheap or someone is lying,” they wrote.

Chinese AI startup DeepSeek, for example, has claimed that it took just two months and a budget of less than $6 million to develop its R1 model, which it asserts rivals OpenAI’s o1 model.

DeepSeek’s short-term “success” caused the combined market value of chipmakers Nvidia and Broadcom to plummet by $800 billion, and if the startup’s claims about training its models with far less funding are true, then American companies could be in over their heads.

Moreover, the infrastructure needed to actually run the chips these companies are investing in is looking a little lacklustre. Data centers in the US are barely keeping up with the growing demand for AI services due to power constraints.


According to a recent study by RAND, the global demand for data center power could increase by 68 gigawatts (GW) by 2027. This would nearly double the global energy requirements for data centers from 2022 levels and approach California’s total power capacity of 86 GW.

The situation is particularly dire for data centers handling large AI training operations. These centers could require up to 1 GW of power by 2028 and as much as 8 GW by 2030, according to the research. 
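Taken at face value, the RAND figures allow a quick sanity check: if a 68 GW increase “nearly doubles” demand from 2022 levels, the 2022 baseline must itself sit just above 68 GW. A minimal back-of-the-envelope sketch of that arithmetic follows; the ~70 GW baseline is an assumption inferred from the study’s wording, not a figure the study states.

```python
# Back-of-the-envelope check of the RAND data center power projections.
# The 68 GW increase and California's 86 GW capacity come from the
# figures cited above; the 2022 baseline is an assumption (~70 GW),
# inferred from the claim that the increase "nearly doubles" 2022 demand.

PROJECTED_INCREASE_GW = 68    # added global data center demand by 2027
CALIFORNIA_CAPACITY_GW = 86   # California's total power capacity

baseline_2022_gw = 70         # assumed, not stated by the study

demand_2027_gw = baseline_2022_gw + PROJECTED_INCREASE_GW
growth_factor = demand_2027_gw / baseline_2022_gw
increase_vs_california = PROJECTED_INCREASE_GW / CALIFORNIA_CAPACITY_GW

print(f"Implied 2027 demand: {demand_2027_gw} GW")        # 138 GW
print(f"Growth vs 2022: {growth_factor:.2f}x")            # 1.97x
print(f"Added demand vs California: {increase_vs_california:.0%}")  # 79%
```

Under that assumed baseline, the projected increase alone amounts to roughly four-fifths of California’s entire generating capacity, which is the comparison the study is drawing.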

The US currently leads the world in both data centers and AI compute, but with demand outpacing supply, there are concerns that companies may be forced to relocate some of their infrastructure abroad. This could have serious implications for both the competitiveness of the US tech industry and the security of intellectual property.


Disclaimer. The information provided is not trading advice. Cryptopolitan.com holds no liability for any investments made based on the information provided on this page. We strongly recommend independent research and/or consultation with a qualified professional before making any investment decisions.
