TOPIC #3
AI and Energy
Will electricity supply become the deciding factor in the global AI race?
AI’s Growing Strain on the U.S. Power Grid: Strategic, Economic, and Policy Implications
The intersection of AI and electricity supply is an emerging strategic challenge for the United States. Ensuring adequate electricity supply for AI data centers will have profound implications for national security, economic competitiveness, and global leadership.
The Department of Energy estimates that data centers accounted for 4.4% of U.S. power consumption in 2023. By 2028, data centers could account for 6.7% to 12% of U.S. power consumption. By 2050, the electricity consumed for commercial computing is forecast to surpass any other end use in the commercial sector, including lighting, space cooling, and ventilation.
Against this backdrop, grid operators and state regulators are actively developing new policies to manage surging data center demand and its impacts on grid stability. Meanwhile, electric utilities are exploring AI-enabled solutions to address the very infrastructure challenges AI creates.

Brief Primer on the AI Technology Stack
The AI tech stack is a layered framework essential for building, deploying, and scaling artificial intelligence solutions in modern enterprises.
It consists of three main layers (see Figure 1):
- Infrastructure: Includes AI data centers, specialized chips (e.g., graphics processing units or GPUs), and core resources such as land, electricity, and connectivity.
- Platform: Encompasses AI software components, high-quality data management, and foundation models (large, pre-trained AI models). This layer enables rapid development, orchestration, and customization of AI solutions for diverse business needs.
- Applications: Features AI-infused software and smart devices that deliver intelligent automation, analytics, and user experiences.
FIGURE 1
The AI Technology Stack

Source: Microsoft
The Data Center Business Model: Turning FLOPS into Money
Next-generation data centers, or “AI factories,” are being designed specifically to maximize AI output and efficiency. These factories treat AI models as assembly lines, turning electricity and data into knowledge and profits (see Figure 2).
Key business model considerations include:
- Compute Power: The raw computing strength of an AI factory is measured in FLOPS (floating-point operations per second). In simple terms, a FLOP is a single mathematical calculation. Higher FLOPS means the factory can handle more complex AI tasks faster.
- Token Output: A token in AI is the smallest unit of data—such as a word, part of a word, or symbol—that an AI model processes to understand input and generate outputs like text or insights. AI factories generate revenue primarily through token production during inference (when the model responds to user requests). Each request processes and outputs billable tokens, directly linking factory output to revenue streams like API usage fees.
- Tokens per MW: A key efficiency metric for AI data centers, as it directly connects energy consumption to AI productivity (i.e., tokens generated per second) and revenue potential.
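The three metrics above can be combined into a simple back-of-the-envelope revenue model. The sketch below is illustrative only: the throughput, price, and utilization figures are assumptions for demonstration, not data from this report.

```python
# Illustrative sketch of AI factory economics: linking power draw,
# token throughput, and revenue. All numeric inputs are assumptions.

def annual_token_revenue(
    facility_mw: float,               # facility power draw, in MW
    tokens_per_sec_per_mw: float,     # the "tokens per MW" efficiency metric
    price_per_million_tokens: float,  # assumed billable API rate, USD
    utilization: float = 0.8,         # assumed fraction of the year serving inference
) -> float:
    """Estimate yearly inference revenue for an AI factory."""
    seconds_per_year = 365 * 24 * 3600
    tokens_per_year = (
        facility_mw * tokens_per_sec_per_mw * seconds_per_year * utilization
    )
    return tokens_per_year / 1e6 * price_per_million_tokens

# Example: a hypothetical 100 MW facility at an assumed 50,000 tokens/sec/MW,
# billed at an assumed $2 per million tokens.
revenue = annual_token_revenue(100, 50_000, 2.0)
print(f"${revenue / 1e9:.2f}B per year")  # prints $0.25B per year
```

The model makes the business case visible: revenue scales linearly with both facility size and tokens per MW, which is why efficiency per megawatt, not raw capacity alone, drives AI factory returns.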
FIGURE 2
AI Data Center or “Factory” Business Model

Source: ScottMadden
The Data Center Business Model (Cont.)
The pace of technology advancements will also impact AI factories. Over the last eight years, the performance of NVIDIA GPUs (measured in FLOPS) increased 225%, while data center power use declined 43% (see Figure 3). However, it is unclear to what degree these efficiency gains might partly offset the expected exponential growth in data center activity.
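Taken at face value, the two cited figures compound into a larger performance-per-watt gain than either suggests alone. A quick calculation makes this explicit:

```python
# Arithmetic check on the cited figures, taken at face value:
# +225% performance and -43% power imply a compound per-watt gain.
perf_factor = 1 + 2.25    # +225% performance -> 3.25x
power_factor = 1 - 0.43   # -43% power use  -> 0.57x
perf_per_watt = perf_factor / power_factor
print(f"~{perf_per_watt:.1f}x performance per watt")  # prints ~5.7x performance per watt
```

In other words, each watt delivered roughly 5.7 times more compute at the end of the period than at the start, which is the relevant quantity when asking whether efficiency can keep pace with demand growth.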

FIGURE 3
Performance of NVIDIA GPUs over Time, 2016-2024

Note: GPT-MoE Inference Workload = a workload in which a GPT-style model with a Mixture-of-Experts (MoE) architecture is used for inference (i.e., making predictions). Annual token revenue assumes a $0.01 per token cost. EF = exaflop, a measure of performance for a supercomputer that can calculate at least one quintillion floating-point operations per second.
Source: Bond Capital; NVIDIA (’25)
Measuring Performance with Tokens
Major AI players are increasingly highlighting token metrics in performance updates. For instance, in its FY25 Q3 earnings call, Microsoft reported processing more than 100 trillion tokens, up 5X year-over-year, with cost per token more than halved—directly tying AI scale and efficiency to energy inputs.
AI Data Centers: A National Security Interest
The United States and China are in a race to build superior AI capabilities. Winning the AI race may result in military superiority, economic dominance, and geopolitical influence over the coming decades.
The United States currently maintains a competitive advantage in this high-stakes race across multiple AI technologies and building blocks, but China is working aggressively to catch up (see Figure 4).
China’s AI ambitions are multipurpose. The U.S.-China Economic and Security Review Commission notes that China “is developing AI not only to advance China’s economic growth more broadly but also for military applications, such as autonomous unmanned systems, data processing, decision-making, and cognitive warfare.”
China is also working to diffuse its AI technologies globally:
- A state-backed Chinese startup—Zhipu AI—is offering AI solutions in the Middle East, Southeast Asia, and Africa. OpenAI claims the startup’s goal is to “lock Chinese systems and standards into emerging markets.”
- In recent decades, China used a similar strategy to successfully deploy Huawei telecommunication networks around the globe.
FIGURE 4
The United States and China Compete for AI Dominance

Source: U.S.-China Economic and Security Review Commission, 2024 Annual Report to Congress (November 2024)
Electricity Supply as a Strategic Priority
In the United States, electricity supply for new data centers has the potential to become a limiting factor in the global AI race.
The Center for Strategic and International Studies notes that “speed-to-power,” or the amount of time it takes a data center to receive electricity supply, is a “central principle driving data center investment in the near term.”
In an emerging AI world, electricity supply is a strategic priority with national security implications. Without expanding the electricity supply and accelerating speed-to-power, the United States risks undermining its current—but tentative—competitive advantage over China.


The rapid development of China’s AI sector has heightened competition between American and Chinese AI, with much of this likely to play out during the next four years in international markets around the world.... While the U.S. government rightly has focused on protecting sensitive AI components in secure data centers through export controls, an even more important element of this competition will involve a race between the United States and China to spread their respective technologies to other countries. Given the nature of technology markets and their potential network effects, this race between the U.S. and China for international influence likely will be won by the fastest first mover…. The best response for the United States is not to complain about the competition but to ensure we win the race ahead. This will require that we move quickly and effectively to promote American AI as a superior alternative. And it will need the involvement and support of American allies and friends.
—Brad Smith, Vice Chair & President, Microsoft (Jan. 3, 2025)
Developers Explore Power Generation Options
AI and technology companies are increasingly pursuing diverse strategies to secure electricity supply.
Recent examples include:
- xAI constructed a Memphis data center in just 122 days, installing ~35 gas turbines. In July 2025, it announced plans to import and rebuild a combined-cycle plant to power a second Memphis facility.
- Texas Tech University System announced a partnership with Fermi America in June 2025 to create an AI and energy campus near Amarillo. It plans to generate up to 11 GW, using natural gas, solar, wind, and four 1 GW Westinghouse AP1000 nuclear reactors, with construction starting in 2026 and first-phase operations by 2032.
- Google announced a partnership with Intersect Power in December 2024 to develop ~4 GW of solar and wind with 10 GWh of battery storage to support industrial parks housing its data centers.
- Diversified Energy, FuelCell Energy, and TESIAC announced in March 2025 a 360 MW off-grid data center generation initiative across VA, WV, and KY.
- In May 2025, Google secured approval from the Nevada PUC for its initiative to purchase 115 MW of geothermal power from NV Energy produced by Fervo Energy, marking a first-of-its-kind procurement under the Clean Transition Tariff.
- Since August 2024, Meta has secured geothermal deals totaling 300 MW. This includes 150 MW with XGS Energy in New Mexico and another 150 MW with Sage Geosystems at an undisclosed site east of the Rockies.
Unless regulatory approvals for grid-supplied generation and power delivery infrastructure are accelerated, the shift toward self-generation is expected to intensify.

Grid Operators, Regulators, and States Consider Reliability and Data Center Growth Policies
As data center loads proliferate, some grid operators and state regulators are considering new policies to alleviate concerns about grid impacts.
In ERCOT, some data centers, particularly cryptocurrency miners, have been observed to react to voltage disturbances by abruptly reducing consumption. Since November 2023, cryptocurrency mining has triggered 25 load loss events, ranging from 100 MW to 400 MW, leading to frequency disturbances and weakening grid stability.
In response to these challenges, the North American Electric Reliability Corporation (NERC) has formed a task force to explore strategies for mitigating voltage disturbances. The task force proposes two solutions:
- Establishing voltage ride-through standards for large loads
- Improving grid planning to reduce load tripping
NERC aims to release reliability guidelines in Q1 2026. Both approaches require detailed modeling and collaboration with data center developers, indicating increasing regulatory scrutiny of large energy users.

Grid Operators, Regulators, and States Consider Reliability (Cont.)
States are also responding with legislation covering curtailment, consumer protections, and electricity generation. Recent examples include:
- Texas: Enacted SB 6 in June 2025, requiring new large loads (≥75 MW) to install remote-controlled shutoff equipment, allowing ERCOT to disconnect non-critical loads during emergency firm load-shedding events.
- Oregon: Passed HB 3546 in June 2025, establishing a new utility rate class for data centers and crypto miners. The law requires 10-year contracts and cost recovery, ensuring residential ratepayers don’t subsidize infrastructure costs.
- Utah: Enacted SB 132 in March 2025, permitting data centers to self-generate or procure power from third parties if utilities cannot serve the load.
FERC is actively considering the implications of the increasing co-location of large loads, such as data centers, at generating facilities, especially in PJM. In a February 2025 order, FERC directed PJM to revisit its tariff to address the following concerns, among others:
- Lack of clarity and consistency on co-location arrangements, including rates, terms, and conditions of interconnection service specific to co-located large loads
- Whether co-located loads may benefit from transmission services as well as ancillary and black start services and what costs, if any, should be allocated to those loads
- Potential reliability and resource adequacy risks of co-located loads
The FERC docket remains open, and both FERC and RTOs/ISOs continue to consider these issues.

Hyperscalers have so far been hesitant in exploring market participation pathways, indicating to PJM that the risk of interruptions, especially for customer-facing processes, far exceeds any economic value of participation under current incentives/markets.
—Presentation by Tim Horger, Senior Director, PJM Forward Market Operations & Performance Compliance (at a May 9, 2025 large load additions workshop)
Exploring an AI-enabled Electric System
Multiple partnerships and initiatives are testing the ability of AI to enhance plant construction and grid operations.
Some notable examples:
- PJM is partnering with Google to use Google Cloud and DeepMind AI to automate and optimize PJM’s generation interconnection queue, processing thousands of applications by integrating databases into unified models for faster verification and planning.
- Exelon is using NVIDIA AI and machine learning to allow drone-based asset inspections to detect defects in real time.
- Palantir partnered with The Nuclear Company to develop the Nuclear Operating System (NOS), an AI-driven platform for accelerating U.S. nuclear reactor construction.
- Microsoft, in collaboration with Terra Praxis, plans to use AI to streamline and accelerate the repowering of retired coal plants with advanced nuclear.
- Schneider Electric launched One Digital Grid Platform, an integrated and AI-powered platform designed to enhance grid resiliency, reliability, and efficiency.
In a broader industry effort, the Electric Power Research Institute (EPRI) launched the Open Power AI Consortium. This consortium aims to develop and deploy AI across utility and grid operations to enhance grid reliability, optimize asset performance, and improve energy management (see Figure 6).
FIGURE 6
Key Objectives of EPRI’s Open Power AI Consortium

Source: EPRI
Implications
Electricity supply has the potential to become a fundamental constraint on U.S. AI dominance. Without addressing speed-to-power challenges and ensuring secure, reliable electricity access, America risks ceding its technological advantage to China. This makes electricity supply not only an economic issue but also a strategic national security priority, requiring immediate attention and coordinated action across government and industry.
Policymakers, utilities, and technology companies will need to closely collaborate to accelerate the deployment of new generation resources, modernize grid infrastructure, and streamline regulatory processes. Without proactive investment and innovation, regional disparities in electricity access could emerge, potentially limiting where next-generation AI infrastructure can be built within the United States.
These dynamics underscore the urgency of coordinated action to ensure that energy availability does not become a bottleneck for U.S. innovation and leadership in AI.
CONTACT OUR EXPERTS
On AI and Energy

Luke Martin
PARTNER
lukemartin@scottmadden.com 919.781.4191

Jon Kerner
PARTNER
jkerner@scottmadden.com 404.814.0020
RECENT INSIGHTS
Available at scottmadden.com
ScottMadden posts energy and utility industry-relevant content and publications on a regular basis. The list below is a sample of recent insights prepared by our consultants.