As liquid cooling takes hold in data centers around the world, the market is awash with vendors hawking products that cater to every step of the supply chain.
Against this backdrop of heightened activity and myriad options, developing a bespoke solution might seem like an expensive and time-consuming fool’s errand. But OVHcloud views things differently.
“The philosophy OVH has today is ‘own the design, own the impact’,” says Ali Chehade, head of cooling R&D at the European public cloud provider, which has just launched a novel cooling system of its own making that has been in the works for the last two years.
He continues: “Sometimes that can be a hard thing to put in place. It took us years of R&D, a lot of investment, and we had to run several PhD programs, but now we own the whole infrastructure and we own the design, so we can say we own the impact.”
The impact of the new racks, dubbed the OVHcloud Smart Datacenter, is said by the firm to be significant. By rethinking the design of its cooling system, OVH claims it can reduce energy consumption by 50 percent and water usage by 30 percent, and serve thousands of servers with a single coolant distribution unit (CDU).
The technology is already installed at one of the firm’s data centers in France, and OVHcloud intends to roll it out across its IT estate around the world.
A smarter solution
Chehade has been at the helm of the smart data center project. A mechanical engineer by background, he joined OVHcloud a decade ago after completing a PhD at Orange Labs, where he developed a design for a passively cooled telecoms cabinet, utilizing the unit’s design features, rather than a mechanical system, to provide cooling to its components.
Though he contends liquid cooling has been at the heart of OVHcloud’s philosophy for more than two decades (the firm first started working with the technology in 2003), Chehade says interest has ramped up in the last three years, as the AI revolution has driven a rapid increase in component temperatures and rack densities.
“Before then, people were not motivated to do it [switch to liquid],” Chehade says. “But now it’s becoming obvious that IT equipment cannot be cooled by air alone, so there’s a need to address this issue because you can’t install racks or servers without liquid.
“For some companies, liquid cooling and the sustainability that comes with that is just a marketing message or an afterthought. We’ve been doing this for more than 20 years, so it’s part of the game. We always say that sustainability is frugality, and that’s what the new Smart Data Center enables.”
Horizontal scaling
OVHcloud kicked off the rack redesign two years ago, when the AI revolution was still in its infancy. Development work has been carried out at the firm’s R&D center in Croix, just outside the city of Lille in northern France.
The company had already been deploying a custom rack design, with server units arranged horizontally rather than vertically, for several years, and Chehade says the team recognized the need to take this design to the next level.
“Our racks, like any others on the market, contained an in-rack CDU, a cooling unit with a pump and valves and sensors,” he explains. “That pump consumes energy, so we started to look at how we could reduce that energy cost and make them more sustainable.”
The initial change OVH’s engineers made was to move the CDU outside the rack (it sits at the side of one of the company’s horizontal units). This in itself is not particularly revolutionary – in-row CDUs are a fairly common sight in data centers, located between cabinets to meet the cooling needs of a line of servers. However, the way in which the OVH system is set up is somewhat novel.
The company’s engineers started by arranging its racks into clusters, in which servers are connected in parallel, allowing individual units to be swapped out for maintenance without the entire data center’s cooling system having to grind to a halt. This is also fairly normal practice, but OVH switched things up by connecting these clusters together in series, allowing a single pump to deliver cooling fluid to multiple clusters at once, rather than each having its own individual CDU.
Chehade says there are two main reasons this is possible. One is that the required water pressure for the cooling system is pretty low, meaning a single pump packs enough power to deliver water to several clusters.
The other is that, by removing the CDU from the rack, the temperature of the cooling fluid is lower when it reaches the hardware. “In our old racks, the fluid was delivered via cold plate heat exchangers,” Chehade says. “This meant that though our facility water ran at 35°C (95°F), the temperature inside the rack was 45°C (113°F).”
With the pump outside the rack and the heat exchangers gone completely, replaced by what OVH describes as “direct-to-chip waterblocks” of the firm’s own design, the liquid now hits the hardware at a lower temperature. This means the same water can be used to cool several clusters of servers before it becomes too hot. “The liquid enters the first rack at 35°C, so the first cluster is chilled to a very low temperature,” Chehade says. “The last cluster is still cooled to a temperature lower than the one we had previously validated for our equipment.”
Indeed, Chehade says up to eight clusters of servers can be chilled using a single CDU. The differing cooling temperatures achieved in the clusters could also have implications for the type of hardware deployed, with the first clusters in the chain, which get access to the chilliest liquid, naturally lending themselves to the most powerful hardware.
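As a back-of-the-envelope illustration of why this works, a simple heat balance is enough: each cluster in the chain warms the shared coolant by ΔT = Q / (ṁ·c_p) before handing it on, so the inlet temperature climbs gradually along the loop. The per-cluster load, flow rate, and chain length in the sketch below are illustrative assumptions, not OVHcloud figures.

```python
# Illustrative sketch: coolant temperature rise across clusters plumbed in series.
# All figures below are assumptions for illustration, not OVHcloud specifications.

CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def inlet_temps(t_supply_c, cluster_loads_kw, flow_kg_s):
    """Return the coolant inlet temperature seen by each cluster in the chain.

    Each cluster raises the coolant temperature by dT = Q / (m_dot * c_p),
    so clusters further down the chain receive progressively warmer water.
    """
    temps, t = [], t_supply_c
    for load_kw in cluster_loads_kw:
        temps.append(round(t, 2))
        t += (load_kw * 1000.0) / (flow_kg_s * CP_WATER)
    return temps

# Assumed example: eight clusters of 40 kW each, one pump moving 8 kg/s of water.
print(inlet_temps(35.0, [40.0] * 8, 8.0))
# -> [35.0, 36.19, 37.39, 38.58, 39.78, 40.97, 42.17, 43.36]
# Under these assumptions, even the eighth cluster receives coolant below the
# roughly 45°C seen inside the old in-rack design.
```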
“If we have racks with different types of servers, or clients with different requirements, we can manage this using the mathematical algorithms we’ve developed to help our deployment teams,” Chehade adds.
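OVHcloud has not published those algorithms, but a hypothetical placement heuristic gives a flavour of the idea: rank cluster positions by coolant inlet temperature and give the most thermally demanding hardware the coolest slots. The function, model names, and temperature limits below are invented for illustration.

```python
# Hypothetical placement sketch: match the most heat-sensitive hardware to the
# coolest positions in the serial chain. Illustrative only; OVHcloud's actual
# deployment algorithms are not public.

def assign_hardware(cluster_inlet_temps_c, server_models):
    """Pair each cluster position with a server model.

    cluster_inlet_temps_c: coolant inlet temperature at each cluster position.
    server_models: list of (name, max_coolant_inlet_c) tuples.
    Places the models with the tightest thermal limits at the coolest positions.
    """
    coolest_first = sorted(range(len(cluster_inlet_temps_c)),
                           key=lambda i: cluster_inlet_temps_c[i])
    demanding_first = sorted(server_models, key=lambda m: m[1])
    plan = []
    for pos, (name, limit) in zip(coolest_first, demanding_first):
        if cluster_inlet_temps_c[pos] > limit:
            raise ValueError(f"{name} cannot be cooled at position {pos}")
        plan.append((pos, name))
    return plan

print(assign_hardware(
    [35.0, 36.2, 37.4, 38.6],
    [("gpu-dense", 37.0), ("storage", 45.0), ("cpu-general", 42.0), ("ai-train", 36.0)],
))
# -> [(0, 'ai-train'), (1, 'gpu-dense'), (2, 'cpu-general'), (3, 'storage')]
```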
Small is beautiful
Removing the CDU frees up between 4U and 6U of space inside the rack, allowing OVH to pack in more servers, Chehade says.
The company has also had some success reducing the size of the CDU itself, and has utilized AI systems, of the type normally found running in its data centers, as part of the project.
“Our old design had a lot of piping at the back, but with the new design, the piping is really simple,” Chehade explains. “That’s because we have a main component [the heat exchanger] that doesn’t exist anymore, so we have more space. Our CDU now takes up 1 sqm (10.7 sq ft), whereas the previous design was 2.5 sqm (26.9 sq ft).
“We have two pumps, one master and one slave, we have a flow meter which gives exact readings on temperature and flow, and we have a small reservoir, and that’s it. The static, mechanical components, which you often see in designs on the market today, have been replaced by intelligence.”
Where once there was a jumbled maze of pipes and cold plates, there now sits a box containing a digital system Chehade describes as “the brain” of the Smart Data Center. While some of this technology is still being kept under wraps, with various patents pending, OVH says the CDUs contain 30 sensors that monitor metrics from the racks, including pressure, speed, and water temperature, and can adjust cooling settings in real time.
This real-time optimization of cooling around server workloads, Chehade says, has the potential to greatly extend the lifespan of the equipment and to trim the data center’s overall power consumption.
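OVHcloud’s control logic remains under wraps, but a minimal sketch of a sensor-driven loop illustrates the general principle: read telemetry from the racks and nudge the pump speed toward a target return temperature. The sensor fields, setpoint, and simple proportional adjustment below are assumptions, not the company’s design.

```python
# Minimal illustrative control-loop sketch: read rack telemetry and adjust pump
# speed to hold a target return temperature. The field names, setpoint, and
# proportional scheme are assumptions, not OVHcloud's patented design.

from dataclasses import dataclass

@dataclass
class RackTelemetry:
    return_temp_c: float    # coolant temperature leaving the rack
    pressure_bar: float     # loop pressure at the rack manifold
    flow_l_per_min: float   # measured coolant flow

def adjust_pump_speed(current_speed_pct, racks, target_return_c=45.0, gain=2.0):
    """Proportional adjustment of pump speed based on the hottest rack.

    If the warmest return temperature exceeds the target, speed the pump up;
    if the loop runs cooler than needed, slow it down to save energy.
    """
    hottest = max(r.return_temp_c for r in racks)
    error = hottest - target_return_c
    new_speed = current_speed_pct + gain * error
    return max(20.0, min(100.0, new_speed))  # clamp to a safe operating range

racks = [RackTelemetry(43.1, 1.8, 90.0), RackTelemetry(46.4, 1.7, 88.0)]
print(round(adjust_pump_speed(60.0, racks), 1))  # -> 62.8, the pump speeds up slightly
```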
A final element of the system is the smart dry cooler, which sits outside the server room and is tasked with chilling the liquid once it has completed its circuit of the cooling loop. The new design is also 50 percent smaller than its predecessor, according to Chehade, and uses half the number of fans, contributing to the system’s reduced cooling power consumption and cutting ambient noise.
The first deployment of the Smart Data Center is at OVHcloud’s facility in Roubaix, France. Currently confined to a single room of 60 racks and 2,000 servers, all served by one CDU, it will soon be rolled out to other locations.
Chehade says parts of the system could be licensed to other data center operators keen to make their facilities more efficient. “It’s still early days, but we have had several requests from manufacturers keen to buy a licence for some of our patents,” he says. “It may not be the main components, or ‘bricks’ of the system, but it’s something we’re discussing.”
Reflecting on the project as a whole, Chehade describes it as an “intense” experience. “I love this work, it’s my passion, and what we’ve been doing here in Croix is really amazing, we’ve just been building, building, building,” he says. “It’s funny in a way, because when people think of AI, they think of energy consumption and a higher carbon footprint, but we’ve been using AI to reduce our water and electricity consumption.
“We’ve had to change how we think about data centers, and how we design them to fuse AI with industrial design, and create a single ecosystem between software and hardware.”
Read the original article: https://www.datacenterdynamics.com/en/analysis/how-ovh-is-cooling-the-cloud/






