An orchestra thrives on harmony, every instrument in tune and in lockstep with the conductor’s cues. Yet if even one note drifts out of sync, the entire symphony can be ruined.
The electrical grid is very much the same. Synchronization is essential to ensure that every generator and power source delivers electricity in exact alignment with the system’s frequency, voltage, and phase angle. Failure to do so can trigger severe issues, including grid instability, equipment damage, and mechanical stress.
As energy demand for AI compute continues to skyrocket, conventional wisdom has placed these facilities at odds with the grid, a system never designed for their unique power profiles. This situation is worsened by the cumbersome pace of grid infrastructure expansion, which lags far behind the insatiable appetite for power emanating from the data center sector. In hotspots like Virginia, data centers are waiting as long as ten years just to secure a connection, while local ratepayers are saddled with the growing costs of building out new transmission and power infrastructure to meet soaring demand.
For Dr. Varun Sivaram, CEO of Nvidia-backed startup Emerald AI, however, compute is the solution to this problem, and all it needs is the right conductor. Launched last year following a $24 million seed round, Emerald AI has positioned itself to hold the baton for this new kind of orchestra. The company believes AI data centers don’t need to be rigid, outsized loads on the grid; instead, they can become flexible, grid-supporting assets, bringing clarity to this electrical symphony.
A conductor on the grid
At the core of Emerald AI is its Emerald Conductor platform. Described by Sivaram as “an AI for AI,” the system orchestrates thousands of AI workloads across one or more data centers, dynamically adjusting operations to respond to grid conditions while ensuring the facility maintains performance.
The system achieves this through a closed-loop orchestration platform comprising an autonomous agent and a digital twin simulator. Designed to work independently and to rely on very simple inputs, the Conductor system tags jobs with priority or tolerance for slowing. It then dynamically manages compute demand, slowing specific processes, shifting workloads between locations, or adjusting chip clock frequencies, depending on performance requirements and grid signals.
“Emerald Conductor orchestrates AI factories, reduces grid stress, and maintains performance,” explains Sivaram. “We make the AI data center flexible. We accept grid signals, forecast, and orchestrate.”
If a grid operator signals that it needs the power load to drop, the Conductor can modulate workloads to achieve the reduction precisely and almost instantly. Conversely, when grid operators offer incentives for frequency regulation or load shifting, the platform lets data centers monetize their flexibility, benefiting both the grid and the data centers themselves.
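The behavior described above can be sketched, in highly simplified form, as a priority-aware load-shedding loop. Everything below – the job names, power figures, and `shed_load` helper – is a hypothetical illustration, not Emerald AI’s actual implementation:

```python
# Hypothetical sketch: throttle the most flexible AI jobs first until a
# grid-requested power reduction is met. Not Emerald Conductor's real code.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    power_kw: float      # current power draw
    min_power_kw: float  # floor below which the job cannot be throttled
    priority: int        # lower number = more tolerant of slowing

def shed_load(jobs, target_reduction_kw):
    """Cut power from the most slowdown-tolerant jobs first,
    returning the total reduction actually achieved."""
    shed = 0.0
    for job in sorted(jobs, key=lambda j: j.priority):
        if shed >= target_reduction_kw:
            break
        headroom = job.power_kw - job.min_power_kw
        cut = min(headroom, target_reduction_kw - shed)
        job.power_kw -= cut
        shed += cut
    return shed

jobs = [
    Job("inference-api", power_kw=400, min_power_kw=380, priority=9),  # latency-critical
    Job("training-run",  power_kw=900, min_power_kw=500, priority=3),  # checkpointable
    Job("batch-eval",    power_kw=300, min_power_kw=100, priority=1),  # fully deferrable
]

achieved = shed_load(jobs, target_reduction_kw=500)
print(f"Shed {achieved:.0f} kW")  # → Shed 500 kW (batch-eval gives 200, training-run 300)
```

In this toy version, the latency-critical inference job is never touched because the deferrable jobs alone cover the requested 500 kW reduction; a real orchestrator would also forecast grid signals and shift work between sites, as the article notes.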
Sivaram claims that what really sets Emerald AI apart from its competitors is its software-only approach. This, he argues, makes the Conductor completely hardware-agnostic and scalable across existing and future facilities, which “allows us to deploy across hundreds or even thousands of data centers without redesigning facilities from scratch,” he says. Notably, the system requires no access to sensitive model or training data, alleviating data protection concerns, and can therefore be deployed across most, if not all, data centers.
Practice makes perfect
This all sounds very good in theory, but what about in practice? To prove its efficacy, Emerald AI has conducted two commercial-scale demonstrations in Phoenix, Arizona, and Chicago, Illinois.
In Phoenix, the company partnered with Oracle on a proof-of-concept demonstration. During the test, the Conductor was able to modulate real AI workloads, achieving a 25 percent power reduction over three hours while maintaining workload performance. For Sivaram, the test represented “a clear, measurable proof point that these ideas work in the field – not just in the lab.”
In Chicago, the company raised the stakes. Unlike the Phoenix demonstration, where the team was aware of the workload profiles before the demo, in Chicago, the Conductor handled unknown, random workloads. This demonstration presented a significantly more challenging environment and, according to Sivaram, the platform proved resilient.
“We were pleasantly surprised at how robust the system was,” he recalls. “Some AI workloads malfunctioned mid-run. Emerald Conductor adapted automatically, ensuring stable power consumption that remained below the grid-defined power response target. It showcased the power of autonomous AI orchestration to gracefully maintain workload performance and meet grid needs.”
The company intends to undertake several more demonstration projects in the US, including a geographic workload migration project in late 2025. It is also engaging with regional transmission operators, such as PJM, to explore a large-scale rollout. In October, Nvidia announced it would deploy the Conductor platform at a data center it is building with Digital Realty in Manassas, Virginia. It was also reported that the GPU giant has invested in Emerald AI as part of an $18 million funding round.
Emerald’s next project, however, has an international flavor.
International ambitions
While the US remains Emerald AI’s primary market, the company is also looking overseas. In August, it announced a partnership with National Grid, the UK’s electricity and gas system operator, and as this magazine goes to press, a live trial is under way to test Emerald’s system under UK conditions.
“This is our first step internationally. We want Emerald Conductor to become a global standard for grid-friendly AI data centers,” Sivaram says.
So why the UK? For Sivaram, it came down to three reasons. The first was National Grid’s forward-looking strategy: the operator was openly receptive to what the Conductor system could offer utilities, supporting the grid while keeping data centers reliably connected.
It was a point underscored by Steve Smith, chief strategy and regulation officer at National Grid, at the time of the announcement: “As the UK’s digital economy grows, unlocking new ways to flexibly manage energy use is essential for connecting more data centers to our network efficiently.”
The second reason was National Grid’s transatlantic stature – as a British company active in both the UK and US markets – and its commitment to the technology. “They’ve invested in the program and agreed to a demo, which makes them the ideal partner for our first international launch,” says Sivaram.
The final, and most important, factor, notes Sivaram, was access to the NextGrid Alliance, a consortium of 150 utilities worldwide. Such a robust partner network means the deal could serve as a springboard for further international projects.
This aligns with the company’s broader partnership approach. Emerald AI has already leveraged Nvidia’s cloud partner network to test its technology across US data centers, laying the groundwork for broader deployment and continued global collaboration. Through the National Grid deal, Emerald AI hopes to exert the same leverage across the utility sector.
Net positive
For National Grid, the decision to partner with Emerald AI was based primarily on the potential impact it could have on how utilities view data centers on the grid.
For Sivaram, then, demonstrating the platform’s capabilities to as many stakeholders as possible is crucial to redefining the relationship between utilities and data centers – toward a future in which utilities could begin competing to connect data centers to their networks.
“Flexible data centers become assets. Utilities may even compete to connect you first,” says Sivaram. Data centers that use systems such as Emerald Conductor could benefit from “advanced interconnection,” he argues, jumping ahead in the queue for grid access and cutting connection times by months or even years – in a similar vein to battery energy storage systems (BESS), which earn priority access and lower costs for their role in stabilizing the grid.
In addition, with increased flexibility, the demand for massive build-outs of transmission infrastructure could abate, “making it possible to connect far more data centers with less immediate grid expansion,” contends Sivaram. He explains: “I’m not saying we won’t prudently upgrade networks – we will – but if data centers can be reliably flexible, we can avoid a lot of rushed, expensive reinforcement and reduce upward pressure on consumer electricity prices.”
A vision for the future
Looking ahead, Sivaram envisions two potential futures. In one, inflexible data centers overburden the grid, leading to blackouts, higher costs, and community resistance. In the other, data centers become active participants, stabilizing the grid, reducing costs, and driving economic development.
“AI factories can be the best friend a grid has ever had. If orchestrated correctly, you get a more reliable, cleaner, and more affordable energy system while powering the AI revolution,” Sivaram says.
However, with the company still in its infancy, we must wait to see whether it can, in fact, hit the right notes and become the power grid’s perfect conductor.
Read the original article: https://www.datacenterdynamics.com/en/analysis/in-perfect-harmony-how-emerald-ai-is-turning-data-centers-into-flexible-grid-assets/