How AI, Energy Requirements Are Shaping Data Center Investment
As Microsoft invests $80 billion to build out AI-enabled data centers, tech companies face energy challenges alongside the cost of AI infrastructure.
January 23, 2025
Earlier this month, Microsoft announced it would invest about $80 billion in AI-enabled data centers. As part of what it calls a “golden opportunity for American AI,” the company will use the investment to train AI models and roll out AI and cloud-based applications.
“In FY 2025, Microsoft is on track to invest approximately $80 billion to build out AI-enabled data centers to train AI models and deploy AI and cloud-based applications around the world,” wrote Brad Smith, vice chair and president of Microsoft, in a Jan. 3 blog post. “More than half of this total investment will be in the United States, reflecting our commitment to this country and our confidence in the American economy.”
Microsoft will invest more than $35 billion across 14 countries to construct AI and cloud data center infrastructure. It is also collaborating with BlackRock and MGX to form an international investment fund that contributes up to an additional $100 billion for AI infrastructure and the AI supply chain, Smith wrote.
“Microsoft is all-in on AI much like they were with the cloud, and AI has eclipsed their cloud efforts much like the cloud eclipsed Windows,” says Rob Enderle, principal analyst with the Enderle Group, via email.
Tony Harvey, a senior director analyst at Gartner, also sees the investment as a sign that Microsoft is committed to AI and must invest heavily to lead the market in this area. He expects Microsoft’s AI data center investments to focus on large language models (LLMs) and then extend to AI agents.
“AI agents will be developed that will make decisions and take actions based on their inputs and environments,” Harvey says. “One key element that will need to be developed to enable these agents to interact in the physical world is an understanding of the physical world and its constraints.”
Hyperscalers Shell Out Billions on AI Infrastructure
Meanwhile, other hyperscalers are pouring money into AI infrastructure even as available data center capacity shrinks. AWS announced an $11 billion investment in AI and cloud infrastructure in Georgia, and Google plans to spend $100 billion on AI over time, Google DeepMind CEO Demis Hassabis said last year, according to Bloomberg.
Earlier this month, AI hyperscaler CoreWeave opened two AI data centers featuring Nvidia GPUs. Oracle and Elon Musk’s xAI have also pursued billion-dollar investments in AI data centers and their hardware, Harvey notes.
AI infrastructure spurred an 82% rise in hyperscale data center capital expenditures in the third quarter of 2024, according to research firm Dell’Oro Group’s Data Center IT Capex Quarterly Report.
Expect hyperscalers to invest in AI data centers outside typical locations like Virginia, Harvey says. He notes that xAI has created AI data centers in a factory in Tennessee, AWS invested in a data center in Georgia, Microsoft constructed a $3 billion data center campus in Racine, Wisconsin, and Google is investing in Indiana and Kansas City. Meanwhile, Meta revealed plans for a $10 billion AI data center in northeast Louisiana.
Power Constraints Affect Data Center Capacity
GenAI in particular is driving a sharp increase in electricity demand as hyperscalers expand data center capacity, according to Deloitte’s “TMT Predictions 2025” report on generative AI. In November, Gartner reported that 40% of AI data centers will be operationally limited by power availability by 2027.
As companies invest in building out AI-enabled data centers, high demand for power could hamper those efforts. In fact, AI will affect how data centers are built, according to Enderle. Companies may choose warm-water cooling over air or chilled-water cooling, he says.
Meanwhile, Harvey sees AI data centers shifting to denser, more modular designs.
“AI data centers need to be designed around 50 kilowatts (kW) to 100 kW per rack with close physical proximity between devices,” Harvey says. “The older data center designs of 5 kW to 20 kW per rack that spread out will be less popular, although there will still be a need for traditional CPU-based compute.”
Gartner predicts that by 2027, data centers will require 500 terawatt-hours (TWh) of power to run “incremental AI-optimized servers,” 2.6 times the 195 TWh required in 2023.
In its TMT Predictions report, Deloitte said that data centers account for only about 2% of global electricity consumption, or 536 TWh in 2025, but electricity use for GenAI training and inference is growing faster than for “other uses and applications.” The firm says global data center electricity consumption could roughly double to about 1,065 TWh by 2030.
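As a rough, illustrative check, those cited multiples translate into annualized growth rates. The endpoints below come from the Gartner and Deloitte figures above; the per-year rates are simple back-of-the-envelope arithmetic, not the firms’ own numbers:

# Back-of-the-envelope check on the cited projections (illustrative only;
# endpoints taken from the Gartner and Deloitte figures reported above).
gartner_2023_twh = 195    # power for incremental AI-optimized servers, 2023
gartner_2027_twh = 500    # Gartner's 2027 forecast
growth_ratio = gartner_2027_twh / gartner_2023_twh             # ~2.56x, the "2.6 times" cited
gartner_cagr = growth_ratio ** (1 / 4) - 1                     # ~26% per year over four years

deloitte_2025_twh = 536   # Deloitte's estimate of total data center consumption, 2025
deloitte_2030_twh = 1065  # Deloitte's 2030 scenario
deloitte_cagr = (deloitte_2030_twh / deloitte_2025_twh) ** (1 / 5) - 1   # ~15% per year over five years

print(f"AI-server power grows {growth_ratio:.2f}x, roughly {gartner_cagr:.0%} per year")
print(f"Total data center power grows roughly {deloitte_cagr:.0%} per year")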
Enderle predicts that nuclear and geothermal will be the key energy sources powering data centers.
“Microsoft and Google are looking at new or refurbished nuclear plants,” Enderle says. “Geothermal is less common.”
Power generation is a critical issue as new data centers are built out, Enderle explains.
“Energy is a significant problem, and there are efforts to create small-scale nuclear reactors (GE has a prototype out) that can be collocated and require little to no maintenance to resolve this problem,” Enderle says. “But the grid is not at all ready for these kinds of loads, making it critical that power-generation advancements be aggressively applied as these data centers are built out.”