Investors’ attention is squarely fixed on chip designer Nvidia Corp., which anticipates a significant surge in sales and earnings driven by its AI-focused chips for data centers. But even as the spotlight falls on Nvidia, some investors are looking beyond the tech giant for potential contenders in AI-chip development.
Nvidia is currently the leading provider of graphics processing units (GPUs) used as AI accelerators, a position reinforced by anticipation of strong results in its fiscal second-quarter report. Its semiconductor rivals Advanced Micro Devices Inc. and Intel Corp. are working to secure their shares of the lucrative AI-chip market, but their competing offerings are not yet available, and Nvidia is struggling to meet soaring demand.
That gap has created an opening for semiconductor startups focused on artificial intelligence. These startups, however, face a steep challenge in the race to supply the hardware, particularly semiconductors, needed to fuel the AI revolution.
The uphill climb for AI-chip startups
One such contender is d-Matrix, a Santa Clara-based startup specializing in power-efficient chiplets. Sid Sheth, the company’s co-founder and CEO, says there is significant demand from customers eager to collaborate and pilot its products in the first half of next year. According to Sheth, many customers are seeking alternatives because Nvidia cannot meet demand and its chips are expensive, prompting a search for solutions that could change the economics of the market. Customers, he says, have expressed keen interest and are ready to buy as soon as the products are available.
The company aims to have its chip sets ready for pilot testing with customers in the first half of 2024 and to reach mass production in the second half of that year. Like Nvidia, d-Matrix is partnering with Taiwan Semiconductor Manufacturing Co. to fabricate its chips, using older, more readily available manufacturing nodes that offer cost advantages. The chiplets, named Corsair, are specialized SRAM-based memory modules designed to execute the vast numbers of operations within AI algorithms. Notably, they are engineered to complement Intel’s dominant x86 architecture, addressing bottlenecks in running generative AI applications.
Diverse approaches to AI-chip development
The AI-chip startup landscape now includes a plethora of players seeking to either challenge or collaborate with Nvidia, focusing on AI acceleration both in data centers and at the edge. Karl Freund, founder and principal analyst at Cambrian AI Research, divides these startups into two groups. The first concentrates on training, a process that demands large numbers of GPUs for iterative data processing and an area where Nvidia holds a significant advantage. The second focuses on inference, or making decisions based on available data, a field Freund identifies as more crowded. Inference often occurs at the network’s edge, closer to users, offering efficiency advantages over cloud-based processing.
Freund predicts that Nvidia’s dominance in data centers will likely persist for the next three years, with emerging startups collectively capturing only a modest market share. He perceives the edge as the predominant arena for startup innovation, pointing to Qualcomm Inc. as a notable incumbent in this space.
The AI-focused chip startup ecosystem has drawn venture-capital interest, a bright spot amid decreased funding and limited IPO activity. SoftBank Group Corp.’s Arm Holdings IPO is expected to revive the market and spotlight AI-focused chip makers. VC investment in such firms dropped from $9 billion in 2021 to $4 billion in 2022; in 2023, $1.06 billion has gone into 97 deals for chip startups focused on AI and machine learning.
Cerebras Systems, valued at over $1 billion, stands out with its frisbee-sized chips and has secured a substantial contract for AI supercomputers; Freund sees it as a rival to Nvidia. Tenstorrent, led by renowned processor designer Jim Keller as CEO, has recently raised $100 million from investors including Samsung, LG Electronics, and Hyundai. The company builds on the RISC-V architecture; its Grayskull card is available now, with its Blackhole chip due in 2024.
The road ahead for AI-chip startups
As the AI-chip startup arena evolves, success is likely to favor companies that offer comprehensive software stacks or all-encompassing ecosystems. Drawing on his experience at Nvidia, Sasha Ostojic of Playground Global stresses that software readiness is pivotal, since Nvidia’s mature software ecosystem is itself a significant competitive edge, making software quality paramount for success. AI-chip ventures also operate at higher risk than software startups: they must manage intricate semiconductor design, software development, and manufacturing complexities, all of which intensify financial demands and risk.
As the sector advances, potential failures loom, but the allure of becoming the next Nvidia persists. Amid Nvidia’s AI-chip dominance, emerging players like d-Matrix, Cerebras, and Tenstorrent diligently strive to etch their positions within this dynamic landscape, marked by challenges in semiconductor development. Their efforts epitomize innovation, steering the industry toward a future driven by AI-powered technology.