Nvidia CEO Jensen Huang has long argued that the world's data centers need to be overhauled to keep up with the demands of generative AI: He claims it will cost $1 trillion over the next four to five years to train and run new AI models, double what's already being poured into digital infrastructure.
Nvidia itself has obviously been the biggest beneficiary of this: its stock price has skyrocketed, making it the most valuable company in the world.
But a flurry of earnings reports and AI-related deals over the past two weeks has also provided encouraging evidence that the boom sparked by the launch of OpenAI's generative AI chatbot, ChatGPT, is spreading. Of course, it remains to be seen whether this wave of spending is sustainable or large enough to justify a major rally in tech stocks, but it has at least provided some relief to bulls.
For example, shares of semiconductor maker Broadcom have risen more than 20% since the company said AI had helped boost its sales. If it can repeat that surge, it will join the rare group of tech companies valued at more than $1 trillion, more than six times what it was worth five years ago.
Much of the increase is driven by demand for AI accelerators — the chips that Broadcom custom designs to speed up AI calculations for customers like Google — but the surge also signals that high-speed networks are playing a more important role in data centers.
The massive amounts of data needed to train and run AI models require much faster connections between individual processors and between the various machines running in data centers: Broadcom CEO Hock Tan estimates that networking will account for 40% of the company's AI chip sales by the end of this year.
Meanwhile, shares of Oracle, a software maker that has lagged in the cloud, rose 17% on news of a deal to train OpenAI's large language models on its cloud infrastructure. The arrangement brings the Azure cloud services of OpenAI's partner, Microsoft, into giant new Oracle data centers, a once-unthinkable relationship between two of tech's oldest adversaries.
Hewlett Packard Enterprise, which had seemed set to miss out on the surge in demand for AI servers that has buoyed rivals Supermicro and Dell, finally got its moment on Wall Street: Its shares rose 24% after its latest earnings report as investors reassessed the company's place in the AI boom.
As news like this raises hopes that the AI boom is spreading to more suppliers, a few things are becoming clear: For one, its impact appears to be widespread, reaching many different parts of the “technology stack” – the hierarchical structure of components, from chips to software, needed to run today’s complex IT systems.
Nvidia remains positioned to be the overwhelming winner: Most of the company's sales come not from individual chips but from entire racks of networked servers. To extract the best performance from these systems, all of their elements need to be coordinated to work together, relying on Nvidia's proprietary technology in areas like networking.
Nvidia's major customers are desperate to reduce their reliance on the company and are pushing for new standards in everything from networking to AI software that will allow more competitors to emerge, but these efforts will take time.
Big tech companies are also becoming directly involved in more parts of the infrastructure needed for AI. A key part of Apple's AI announcement last week, for example, was the news that the company is designing its own servers based on its own chip designs. Apple already controls most of the key components in its phones; as the demands of AI force it to move more processing of customer data off those devices and back into its own facilities, a similar move in data centers looks likely.
These developments are forcing supplier business models to adapt, with companies like Broadcom playing a supporting role as customers take more control. The largest cloud companies, the so-called “hyperscalers,” also account for a growing share of overall demand, leaving suppliers reliant on a narrower set of big customers and more vulnerable in the event of a downturn. For now, though, Wall Street is fixated on how many more tech companies will get a lift as generative AI takes off.
richard.waters@ft.com