If Batman built data centers, they would probably look a lot like Lefdal Mine Datacenter.
Located 60 meters (197 feet) below sea level and 700m (2,297 ft) inside a mountain, the formerly abandoned, but now very much repurposed, olivine mine is bathed in green light and soundtracked by the noise of construction work that echoes throughout its cavernous interior.
Described by those who work there as a “city of data centers” – the facility’s third level is 300 meters long and consists of 14 ‘streets,’ connected to an ‘avenue’ – data centers don’t come much more rugged (or Batcave-like) than this.
But Gotham City, this is not. Visiting this underground ‘city’ – for those not already living in the picturesque town of Nordfjordeid, west Norway, at least – requires an hour-long flight on a propeller plane from Oslo to Sandane, followed by a short ferry ride across a fjord, and then a drive along a winding road that hugs the edge of the water.
It’s an unassuming location for such a formidable structure, although don’t let its Instagrammable surroundings fool you. Lefdal is keen to flex its security credentials and reassure both current and prospective clients, citing specially designed security gates, two-factor access control systems, required tracking devices for everyone underground, and agreements with the local police and fire departments, as just some of the measures the facility has in place.
“This is one of the most secure facilities in the Nordics,” Lefdal’s chief commercial officer, Mats Andersson, says.
River deep, mountain high
For a data center, location is everything. Despite its sprawling internal footprint, from the outside Lefdal has an incredibly minimal visual presence, blending almost seamlessly into the surrounding rock face. This is certainly a benefit in a world where data centers are increasingly seen as the enemy, with Andersson telling DCD that in the entire time the data center has been open, not a single complaint has been raised by locals.
“A data center like this would normally, on the outside, be a big gray building with no windows and in the way of the neighborhoods and municipalities, but we’re hidden,” he says.
More importantly, however, the region in which the data center sits offers up to 1GW of hydro and wind power. During the last 20 years, Norway has boasted some of the lowest energy prices in Europe, with increasing production of renewable energy resulting in a surplus that ensures costs will remain low in the future.
Unsurprisingly, the facility is powered entirely by hydro and wind and has a guaranteed PUE of 1.15, and a WUE (Water Usage Effectiveness) that is “close to zero” – it has no evaporative systems for cooling, instead using water from the nearby 500-meter-deep fjord. Lefdal claims it can offer some of the lowest power prices in Norway.
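PUE is simply the ratio of total facility power to IT power, so a guaranteed figure translates directly into a fixed overhead budget for cooling and power distribution. A minimal illustrative sketch (the function name is ours; the 10MW hall size comes from Lefdal's data hall figures later in the piece):

```python
def total_facility_power(it_load_mw: float, pue: float) -> float:
    """Total facility power implied by a given IT load and PUE.

    PUE = total facility energy / IT equipment energy, so the
    overhead (cooling, distribution losses, lighting) comes to
    (pue - 1) * it_load_mw.
    """
    return it_load_mw * pue

# At the guaranteed PUE of 1.15, a fully loaded 10MW data hall
# would draw 11.5MW in total -- just 1.5MW of overhead.
print(total_facility_power(10, 1.15))  # → 11.5
```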
Lefdal opened as a data center in 2017 and, at present, the 120,000 sqm (1.3 million sq ft) facility is operational across three levels, although a fourth and fifth have been earmarked for future development.
Its data halls are currently located on two of the levels (Level 1 exclusively houses the data center’s power and network infrastructure), with the Phase I build out across Level 3 totaling 80MW, and Phase II on the second level totaling 120MW.
In April 2023, Lefdal added a connection to the regional 132kV power network to increase the available power to the facility, with the dual supplies each expandable up to 200MW of capacity. Meanwhile, the data center’s proximity to its local renewable power sources also means that it has minimal transmission losses.
For the third build-out phase, which will sit on Level 4 of the mine, Lefdal is hoping to add an additional 200MW of capacity to the site. At present, a total of 80MW of capacity is being utilized, with 10 percent set aside for use by the Norwegian government.
Despite the large amounts of power flowing to the site, Lefdal claims to be carbon neutral and is currently working with the Norwegian government on a scheme that would see its heated water reused by a nearby salmon hatchery. Should that plan be approved, Lefdal says it would support the production of six million salmon annually, achieving a 20-25 percent heat reuse efficiency.
“Once we have completed that, I would say that we are the greenest data center, at least in Europe,” Andersson says.
For clients who want to move into the underground data center, deployment typically takes about 12 months from the signing of the contract. Lefdal’s standard modular data halls are 12 meters (39 ft) high and four meters (13 ft) wide, but can vary in length, offering up to 800 sqm (8,610 sq ft) of whitespace per floor. Each data hall provides 10-15MW of capacity and is constructed as an individual building, served by shared power, cooling, and network infrastructure.
Lefdal says it can customize power density, temperature, humidity, operational equipment, tier levels, and all related services in its data halls. For air-cooled environments, inline cooling is used to transform cold water from the fjord into cold air, enabling densities of up to 50kW per rack in one of Lefdal’s 10MW halls.
At present, direct liquid-cooling solutions are available from HPE, Lenovo, Dell, and Kaytus, supporting 300kW per rack deployments. Lefdal says it is currently designing 400kW per unit solutions, in addition to working with “leading immersion cooling providers.”
The data center draws cold water from the fjord, which passes through heat exchangers to chill a freshwater circuit that delivers cold water under the raised floor. Like the data halls, the heat exchangers are modular, each with a 7.5MW capacity. There are 12 heat exchangers on Level 3, supplying a total of 90MW of cooling capacity.
Introducing Olivia
One customer that has opted to deploy its high-performance computing (HPC) infrastructure at Lefdal Mine Datacenter is Sigma2, the state-owned company responsible for providing national infrastructure for computational science in Norway.
In early September, DCD, along with a host of other guests, was invited to Lefdal to get a closer look at Olivia, a 225 million NOK ($22.6m) system built to support HPC and AI applications.
Inaugurated in June 2025 and named Olivia in tribute to olivine, the mineral once mined where the data center now sits, the HPE Cray Supercomputing EX system will support research across various sectors, including climate, health, oceans, and artificial intelligence (AI).
The system comprises 304 Nvidia GH200 GPUs and 64,512 AMD Epyc Turin CPU cores, interconnected by HPE Slingshot, and boasts 5.3PB of storage capacity. Delivering 13.2 petaflops of sustained Linpack performance at a power consumption of just 219kW, it offers 60.274 gigaflops of performance per watt.
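Those headline numbers are internally consistent: the quoted gigaflops-per-watt is simply sustained Linpack performance divided by power draw. A quick sanity check of the article's figures:

```python
# Olivia's published figures
linpack_pflops = 13.2   # sustained Linpack performance, petaflops
power_kw = 219          # power consumption, kW

# 1 petaflop = 1e6 gigaflops; 1 kW = 1e3 W
gflops_per_watt = (linpack_pflops * 1e6) / (power_kw * 1e3)
print(round(gflops_per_watt, 3))  # → 60.274, matching the quoted figure
```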
On the most recent edition of the Green500 list of the world’s most energy-efficient supercomputers, Olivia ranked 22nd. It ranked 117th on the Top500 list of the most powerful systems.
Helge Stranden, senior advisor for HPC and storage infrastructure at Sigma2, says that hosting Olivia at Lefdal was born out of a desire to secure the long-term predictability of the space and physical infrastructure that supercomputers need.
Until 2017, Sigma2 had four HPC facilities, located at the four major universities in Norway – Tromsø, Trondheim, Bergen, and Oslo – which form a collaboration known as NRIS (Norwegian Research Infrastructure Services).
The Norwegian University of Science and Technology (NTNU) in Trondheim houses Olivia’s predecessor, the 6.2 petaflops Betzy, while the 375 petaflops Lumi – co-owned by Sigma2 and the EuroHPC JU – is housed at CSC’s data center in Kajaani, Finland.
However, Sigma2 decided it wanted to reduce the number of locations to two and shore up the long-term predictability of both the space and physical infrastructure needed for future supercomputing deployments. That, Stranden says, has always been a challenge in the university buildings, which were not built with large-scale HPC in mind.
“At the same time, Sigma2 wanted to explore the possibility of having an external site,” Stranden says. Seven data centers across Norway put forward their offerings, but ultimately Lefdal won out, and the two companies signed a contract in 2021.
When it came to selecting a data center, Sigma2 judged all the prospective facilities on three main categories: price and technical performance, social and environmental performance, and the quality of service.
“Seventy percent of the infrastructure suppliers in Lefdal are local to this area,” Stranden says. “You don’t need to get there by plane to do support. Also, what we have experienced working with Lefdal Mine Datacenter staff and partners is that it feels like being a part of one big family, and that’s just really nice for us. It works very well that way.”
Lefdal’s predictable scalability, high levels of physical security, and “excellent separation” between colocation customers, as well as the salmon hatchery plans for excess heat reuse, also factored into Sigma2’s decision-making, he added.
While Olivia only came online four months ago, Stranden says early feedback from the supercomputer’s pilot users has been positive. Compared with Betzy, customers have reported that CPU jobs run more than 2x faster on Olivia, while GPU jobs are reportedly 3x faster than on Lumi, when using the same amount of resources.
Lefdal beyond
Although most supercomputers have a life span of five to seven years, Andersson says the data center is able to draw up contracts lasting 15 years or more, because customers see that the mine’s abundant power and space can provide a solid home for their compute infrastructure long into the future, not just in the short term.
Part of this future-proofing effort is prepping data halls for the next generation of racks – early this year, AMD unveiled its double-wide Helios rack-scale system based on the company’s forthcoming MI400 series of GPUs.
“Today, we’re rolling pre-configured racks into data halls, and [they] weigh a lot,” Andersson says. “But three to four years from now, we’re expecting racks that might need to be transported on skids three meters (10 ft) by six meters (20 ft) by three meters, weighing 35,000 lbs. A unit of AI, fabric-tested and purpose-built in one piece.
“So, when we talk to our clients, when we build these data halls, we are now preparing for wider doors and more solid floors. We’re preparing for weights above that and for even bigger skids. These are things we need to think about already, because when we build these data centers, they need to be future-proof but also support what’s coming on the logistics side.”
Andersson adds that for Lefdal, those conversations are easier to have because the data center often works with customers who hold regular meetings with Nvidia and AMD to talk about their pipelines.
“Always remember, if we were a data center in a normal building in a town somewhere with five stories, we would be worried about one megawatt per rack in a different way than we are [at Lefdal],” he says. “We’re lucky not to have to think about the cooling and the densities in the same way as they are.”
Read the original article: https://www.datacenterdynamics.com/en/analysis/data-mining-lefdal-mine-datacenter/