AI data centers will play a key role in the global AI arms race, as companies like Google, Amazon, and Microsoft invest billions of dollars in building the infrastructure to support generative AI. But as more AI data centers are built, companies face challenges including community skepticism and massive energy demands.
The global race for AI infrastructure is heating up as big tech companies continue to announce plans to build AI data centers in the U.S. and abroad. Google plans to invest 1 billion euros ($1.1 billion) to expand its data center in Finland to make it AI-ready, and $2 billion in an AI data center in Malaysia, while Amazon said it plans to spend $11 billion on a new data center in Indiana.
The U.S. government recognizes the economic impact of AI data centers, and President Joe Biden has endorsed Microsoft's $3.3 billion investment in the construction of an AI data center in Racine, Wisconsin. Biden noted that the AI data center will create 2,300 union construction jobs and 2,000 long-term full-time jobs. Microsoft also said it will provide Wisconsin workers with upskilling opportunities.
Alvin Nguyen, an analyst at Forrester Research, said U.S. government leaders will need to get involved in educating local communities as AI data centers emerge. For now, he said, there is "fear, uncertainty and doubt" as communities question the value of AI data centers, fearing they will consume large amounts of local energy while many of the jobs they bring will be automated.
AI data centers are very different from traditional data centers, Nguyen said. But they still require a workforce to operate, so upskilling and education will be key. Assessing local energy needs will also be something government officials will need to consider going forward.
“AI data centers can be good business,” Nguyen said, “but they have to be balanced with residents and other businesses.”
Energy consumption of AI data centers raises concerns
A data center is a facility made up of racks of computing infrastructure, such as servers and data storage, that support an organization's IT systems.
In a traditional data center, a typical workload uses 4 to 10 kilowatts (kW) of power per rack, Nguyen said. A generative AI rack, by contrast, can consume more than 200 kW, he said. A query to an AI model such as OpenAI's ChatGPT is estimated to require more than 10 times the power of a traditional Google search, according to a report from the Electric Power Research Institute.
"That's a two-order-of-magnitude increase in power density," Nguyen said. "For a lot of organizations that have older data centers, increasing power density by that amount is a big deal. It's hard to implement that later."
The challenge, Nguyen said, is cooling data centers at these higher power densities. Air cooling is common in traditional data centers, he noted, but once racks exceed 50 kW, liquid cooling is required. Liquid cooling transfers the heat to water, which then sheds it in cooling ponds.
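The figures above lend themselves to a back-of-envelope check. The following sketch is illustrative only, using the numbers cited in the article (4 to 10 kW for a traditional rack, 200-plus kW for a generative AI rack, and liquid cooling needed above roughly 50 kW per rack); the constant names and function are hypothetical, not from any vendor tool.

```python
# Illustrative figures from the article, not engineering specifications.
TRADITIONAL_RACK_KW = (4, 10)    # typical traditional workload range
AI_RACK_KW = 200                 # generative AI rack can exceed this
LIQUID_COOLING_THRESHOLD_KW = 50 # above this, air cooling no longer suffices

def cooling_method(rack_kw: float) -> str:
    """Pick a cooling approach based on per-rack power draw."""
    return "liquid" if rack_kw > LIQUID_COOLING_THRESHOLD_KW else "air"

for kw in (4, 10, 50, 200):
    print(f"{kw:>3} kW/rack -> {cooling_method(kw)} cooling")
```

Running this shows why retrofitting matters: the same facility that air-cooled every rack at traditional densities needs liquid cooling once AI racks arrive.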
The higher power density required in new AI data centers will also require upskilling of workers accustomed to working in traditional data centers, he said.
“There are fans and noise and it can be dangerous to be inside and sometimes it's too hot or too noisy to stay inside for long periods of time,” Nguyen said.
Like the internet and communications technology before it, AI should be made as widely available as possible, he said. AI is already becoming part of the digital divide that lawmakers have spent years trying to address.
"Those who can harness AI have a clear advantage over those who can't," Nguyen said. "Competition for resources will become more prevalent, but this is where education for local government officials is needed, and we need new ways to create toolkits that show them how much power is available [and] does it even make sense to have a data center?"
More AI data centers to come to the US
Nguyen said that while they won't all come in the form of massive AI data centers like the one Microsoft plans to build in Racine, smaller, localized AI data centers will likely start popping up, especially in areas where companies face data regulations.
One driver of smaller, localized AI data centers is data sovereignty laws in countries such as France and China, which require personal data to be stored in-country. Another is the need to reduce latency.
In fact, generic large language models are expensive to build and not suited to advising on specific problems, said Brian Hopkins, vice president of emerging technologies at Forrester. Organizations will likely need more customized models, he said.
“The generic standard instances of GPT are not well suited to advising customers on what tensile strength steel they should purchase for their engineering needs,” Hopkins said.
Demand for AI training and inference at edge locations will far exceed what AWS, Microsoft and Google can serve, he said, which is why big tech companies are investing billions of dollars in AI data centers.
Rather than consolidating all applications in existing data centers, big tech companies have shifted their strategies toward decentralized AI data centers that meet companies' specific data and training needs, Hopkins said.
Mackenzie Holland is a senior news writer covering big tech companies and federal regulation. Prior to joining TechTarget Editorial, she was a general assignment reporter for the Wilmington StarNews and a crime and education reporter for the Wabash Plain Dealer.