Power grids across the world are undergoing a difficult, lengthy, and expensive transition. Not only do power companies need to meet rapidly growing demand while reducing carbon emissions, but they are also required to maintain a stable supply despite a huge shift to less reliable, more intermittent power sources. Many of these power sources are distributed, requiring a significant grid redesign and upgrade to ensure power flows efficiently from where it is generated to where it is needed.
Escalating power demands from the data center sector are creating a major challenge for power providers, especially in locations such as Northern Virginia (US), Ireland, London (UK), and Frankfurt (Germany). To meet demand, providers are investing heavily in new capacity and increasing flexibility — but this is not enough. They want data centers to do the same. For many data centers, especially those planning new projects, it will no longer be sufficient to be a passive customer.
Even before the emergence of generative AI and its significant power demands, there were concerns about meeting the power requirements of data centers. In some regions and countries, data center power consumption accounts for more than 10 percent of the grid load and is rising rapidly — compounded by growing power demands from sources such as electric vehicles and HVAC systems.
This surging demand means that many applications for grid power have been, or are likely to be, denied or delayed, while aging infrastructure and extended equipment lead times hinder the timely deployment of new power generation.
Rising demand is also slowing down or reversing grid decarbonization efforts in some regions — despite regulatory mandates and public pressure to increase the pace. In the absence of viable alternatives, fossil fuel generation is being reintroduced to ensure there is an adequate supply of non-intermittent (firm) power available to meet demand.
These challenges have sparked a search for new power sources for data centers, exploration into new geographies, and improved methods to save, store, or distribute power. There is no single technology, however, that will have a significant impact in the next five years or more. With substantial new data center capacity scheduled to go online between 2025 and 2035, some regions are likely to face significant gaps between supply and demand.
Some companies building large campus facilities are exploring large-scale on-site power generation using gas turbines or fuel cells. These companies will be able to offset some of their investment by contributing excess power to the local grid. On-site generation will also ease planning approval and public acceptance while utilities invest in renewable power generation. But these companies are outliers.
Strategies that involve a range of actions, policies, and technologies and focus on increasing the efficiency and flexibility of existing and new assets include:
- The grid improves its use of power generation and distribution capacity through automation, transmission upgrades, and the application of intelligent management (including AI), advanced forecasting, and dynamic pricing.
- Large customers collaborate more closely with their grid suppliers, sometimes taking large loads off the grid to maintain frequency, voltage, or power availability. Data centers have participated minimally in this option so far, with operators often claiming their loads are either not flexible or need to be protected. However, unconditionally opting out may not be an option in the future.
Power management at scale
The intelligent management of power supply and demand, according to specific needs or objectives (such as cost savings or carbon reduction), is well understood but underutilized, and its benefits are widely underestimated.
In the data center, power management and power capping in servers and software-defined power for the dynamic allocation of battery or mains power across racks are two examples where significant energy (10 to 30 percent) can be saved, with minimal performance impact. Microgrids for campus-wide optimization and the sharing of power sources can also use policy-driven software to reduce costs or carbon.
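As a concrete illustration of the first example, per-server power capping can be sketched as a simple proportional policy. This is a hypothetical sketch, not any vendor's BMC or DCIM API; the budget and floor values are assumptions chosen for illustration.

```python
# Illustrative sketch of rack-level power capping (hypothetical policy,
# not a real BMC/DCIM API). Servers are throttled proportionally so the
# rack stays within a power budget, trading some performance for energy.

def assign_power_caps(draws_w, budget_w, floor_w=150):
    """Scale per-server power caps so the rack fits its budget.

    draws_w:  current per-server power draw, in watts
    budget_w: total rack power budget, in watts
    floor_w:  minimum cap per server, preserving a performance floor
    """
    total = sum(draws_w)
    if total <= budget_w:
        return list(draws_w)  # under budget: no capping needed
    scale = budget_w / total
    return [max(floor_w, d * scale) for d in draws_w]

# A rack drawing 1.3 kW capped to a 1 kW budget:
caps = assign_power_caps([400, 350, 300, 250], budget_w=1000)
```

Software-defined power systems apply the same idea dynamically, shifting the budget between racks and between mains and battery sources under policy control.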
For the utility grid, the challenges are far greater in terms of complexity and scale. While the focus has largely been on increasing power generation, transmission, and storage capacity, significant reductions in power losses, costs, and carbon emissions can be achieved by analyzing capacity and asset use — and directing supply and demand.
Techniques being developed or deployed include dynamic line rating, which exploits weather-driven changes in transmission capacity, and AI algorithms that build better real-time models of constantly changing demand, which can then be shaped by financial incentives.
Under demand response schemes, customers agree to curtail their demand for power (for a fee) at critical times. The International Energy Agency's (IEA's) Net Zero by 2050 Scenario, which is broadly supported by energy companies and countries across the world, calls for 500 GW of demand response capacity to be added to the grid by 2030, a 10-fold increase on 2023 capacity.
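The economics of such a scheme reduce to simple arithmetic. The sketch below uses hypothetical tariff numbers; as noted later in this article, real curtailment fees vary widely by market, urgency, and notice period.

```python
# Sketch of the economics of a demand response agreement. All tariff
# numbers here are hypothetical; real fees vary by market and contract.

def curtailment_revenue(curtailed_mw, hours, fee_per_mwh):
    """Revenue earned for shedding curtailed_mw of load for `hours`."""
    return curtailed_mw * hours * fee_per_mwh

# A facility shedding 20 MW for four hours at an assumed $200/MWh fee:
revenue = curtailment_revenue(20, 4, 200)  # 20 MW * 4 h * $200/MWh = $16,000
```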
Data centers are expected to play a part, as their share of consumption grows. Several utility operators and other parties at IEA’s Global Conference on Energy and AI towards the end of 2024 stressed the importance of demand response to help meet new energy demand — including from the data center sector. “Data centers are not flexible today. But they can be … and they will be,” said one executive from a power supplier in a stressed data center-heavy region.
Implications for data centers
What do the power companies mean by flexibility from data centers? There are three key areas:
- Collaboration. Utilities want data centers to share information earlier, and in more detail, on likely demand. They are also seeking collaboration on new standards for transferring loads to and from generators. Data center loads, especially if they all act the same way, can be substantial, and swings in demand from data centers are already causing some issues. Some utilities have also proposed greater collaboration between different industries with different, and possibly complementary, power profiles.
- Load shedding using on-site power. An increasing number of power companies want more data centers to load shed by using local on-site power sources — usually generators, but also UPS batteries (see below).
- Load shedding by time or location shifting. Some power companies want to encourage load shedding that does not involve using on-site power, which is often expensive and carbon-intensive. More data center operators, they think, can be persuaded (or contracted) to power down some workloads temporarily (time-based load shifting) or move the work to an alternative location (location-based load shifting). This is very rarely practiced today.
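The choice among the load shedding options above can be sketched as a simple placement policy. The function name, thresholds, and "stress" signal below are all hypothetical, for illustration only; no operator is known to use this exact logic.

```python
# Illustrative scheduler sketch for time- and location-based load
# shifting during grid stress. Names and thresholds are hypothetical.

def place_workload(deferrable, site_stress, alt_site_stress,
                   stress_limit=0.8):
    """Decide where/when to run a workload given grid stress (0..1).

    Returns "run_now", "migrate" (location shifting), or
    "defer" (time shifting).
    """
    if site_stress < stress_limit:
        return "run_now"  # local grid is healthy
    if not deferrable:
        return "run_now"  # latency-sensitive work stays put
    if alt_site_stress < stress_limit:
        return "migrate"  # another region has headroom
    return "defer"        # wait out the local grid event
```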
Shifting times, shifting loads
Currently, only a small proportion of operators have agreed to use their on-site power sources to support the grid, and this is mostly for stabilizing voltage and frequency. When UPS batteries and generators are used to help the grid, the risk of outages usually increases, albeit marginally: redundancy may be reduced at precisely the time the grid is most vulnerable, and air quality limits or permitted diesel running hours may be exceeded. Postponing, slowing, or moving IT loads, meanwhile, requires planning, spare capacity, and collaboration with IT clients, who often see little or no benefit.
Some workloads are different: AI training, for example, is not transaction-based or latency-sensitive. If well designed, training runs may be stopped and restarted without serious consequences. Their power, therefore, may be seen as a reserve by regulators and utilities. Bitcoin mining facilities are viewed in a similar way.
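The property that makes a training run interruptible is checkpointing. A minimal, framework-agnostic sketch is below; it is illustrative only, and real ML frameworks provide their own checkpoint APIs.

```python
# Minimal sketch of checkpoint/resume for an interruptible batch job,
# e.g. a training run paused during a grid event. Illustrative only.
import json
import os

def run_job(total_steps, ckpt_path, stop_at=None):
    """Run (or resume) a job, saving progress so it can be interrupted."""
    step = 0
    if os.path.exists(ckpt_path):
        with open(ckpt_path) as f:
            step = json.load(f)["step"]  # resume from the last checkpoint
    while step < total_steps:
        if stop_at is not None and step >= stop_at:
            break  # e.g. a grid curtailment signal arrives
        step += 1  # one unit of work
    with open(ckpt_path, "w") as f:
        json.dump({"step": step}, f)  # checkpoint progress
    return step
```

A job stopped partway through resumes from its saved step rather than restarting, which is why regulators and utilities can treat such loads as a power reserve.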
One example is in Texas, where the Electric Reliability Council of Texas (ERCOT) says forecast power from what it calls “large flexible loads” will rise 60 percent from 2024 to 2025. While participation in demand response is not mandatory, large data centers (and Bitcoin miners) are financially incentivized (and expected) to participate in a scheme to shed loads and protect grid reliability.
In Ireland, where data centers already account for 18 percent of all power use, new data centers are expected to agree to support the grid if required.
This expectation will increase. In the UK and Germany, government and utilities have discussed mandatory measures to force data centers to shed loads, possibly under critical national infrastructure policies. And some large operators may do so anyway. One hyperscale AI data center operator said: “If there is a regional grid crisis, we will shed load. The consequences of not doing so will be too great.”
Google is one example of a data center operator using demand response and intelligent load management to relieve grid pressures. It participates in schemes in Europe and Asia, and at several sites in the US. At critical times, it activates control systems that limit and reschedule non-urgent tasks.
Key consumer-facing services are not affected. Google has not, however, disclosed how much power has been (or may be) saved in this way, suggesting the figure is likely not yet high. In demand response, financial incentives are becoming stronger. Prices paid for curtailment or generator power vary widely, according to demand, urgency, and the level of notice provided by the utility. Flexible loads may provide new opportunities: a late-2024 proposal to integrate data center gensets into the capacity auction market in PJM (a large US power market region) would, if accepted, generate a significant flow of new revenue.
Large AI data centers could receive significant payments to stop running. During the Texas freeze in February 2021, the New York Times reported that the Bitcoin company Bitdeer was paid $18 million over four days to turn its computers off. While this is extreme, there are other examples of large payments. Those with big AI facilities are watching with interest.
In the longer term, power companies and regulatory bodies are likely to view these agreements as reactive and expensive. They will prefer to introduce a more collaborative and less expensive system, leveraging their position at an earlier stage in the planning and provisioning process.
Read the original article: https://www.datacenterdynamics.com/en/opinions/grid-demand-will-require-active-participation-from-data-centers/