Every time you ask ChatGPT to write your investor update, somewhere a data centre hums to life, guzzles electricity and exhales a cloud of carbon. At billions of prompts a day, that adds up to a significant and rapidly growing share of global emissions.
And startups don’t just use ChatGPT. In a recent McKinsey survey, 78% of companies said they use AI in at least one business function.
According to Lawrence Berkeley National Laboratory, AI could soon consume more than half the electricity used by US data centres — equivalent to powering 22% of US households. Globally, the International Energy Agency expects emissions from data centre energy use to double in five years.
This raises an uncomfortable question for founders: Do startups know how green their compute is?
The opacity problem
The first problem is that companies rarely share their AI emissions; the second is that measuring AI's footprint is legitimately difficult.
Training emissions dominate the headlines, but inference — the model responding to your everyday prompts — now makes up 80–90% of AI’s compute demand. That usage changes hourly, depending on the grid mix, model size and even the time of day.
“We have a very vague idea of how much energy AI uses,” says Sasha Luccioni, AI researcher and climate lead at Hugging Face, a hub for AI models, datasets and developer tools. “Essentially all of the numbers that we have are based on estimates.”
But this doesn’t mean companies can’t try to measure their AI usage. Luccioni suggests a lifecycle analysis approach, which takes a broader perspective than just AI model training.
French AI startup Mistral did this recently. In July, it published the results of what it called “a first-of-its-kind comprehensive study” to quantify the environmental impacts of its large language models (LLMs).
It found that training and running its flagship Large 2 LLM produced approximately 20 kilotonnes of CO2 equivalents (CO2e) and consumed 281k cubic metres of water — which, according to The Register, is the equivalent of roughly 112 Olympic-sized swimming pools.
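The pool comparison can be sanity-checked with a quick calculation. A minimal sketch, assuming the commonly cited minimum Olympic pool volume of 2,500 cubic metres (50m × 25m × 2m); the water figure is the one Mistral reported:

```python
# Sanity check: Mistral reported ~281,000 cubic metres of water consumed.
# An Olympic pool at the minimum regulation depth holds 50 * 25 * 2 = 2,500 m^3.
water_m3 = 281_000
pool_m3 = 50 * 25 * 2  # 2,500 m^3 per pool (assumed volume)

pools = water_m3 / pool_m3
print(f"{pools:.0f} Olympic pools")  # roughly 112, matching The Register's figure
```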
Such transparency, though, is rare. At Hugging Face, Luccioni helped create the AI Energy Score, an initiative that rates and compares the energy efficiency of AI models. Through it, Luccioni says she has tested more than 100 open-source models, but she had trouble getting closed-source giants like Google and Microsoft to “play ball”.
“It’s considered a secret trade secret, I guess,” she says. “We do have lots of open source models, but how they measure up to ChatGPT or Gemini is still up in the air.”
Bigger isn’t always better
Another finding from the Mistral study is that larger models incur dramatically larger footprints. Luccioni says this is a key part of the AI Energy Score — encouraging companies to think about what models they are choosing.
“A model can be a tiny bit less high performing, but a lot, a lot more efficient,” she says. “The goal is really to incentivise people, developers specifically, to pick more energy efficient models, when they’re using models and when they’re deploying AI.”
Smaller models are often a better fit too, performing just as well on a given task with far less compute. The old “scale is all you need” assumption is increasingly being questioned; companies just need to put more thought into the training process.
“Using task specific models makes sense in a business use case, because often you have a specific task you want to do,” says Luccioni. “It costs less and it uses less energy. So it’s kind of like a win-win.”
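Luccioni's point about trading a little performance for a lot of efficiency can be made concrete. A minimal sketch of picking the cheapest-to-run model that still clears a task's accuracy bar; the model names and numbers here are entirely hypothetical, not measured values:

```python
# Hypothetical catalogue: accuracy on a task, and watt-hours per 1,000 queries.
# All names and figures are illustrative, not real benchmark results.
models = {
    "giant-llm":  {"accuracy": 0.92, "wh_per_1k_queries": 900.0},
    "medium-llm": {"accuracy": 0.90, "wh_per_1k_queries": 120.0},
    "tiny-task":  {"accuracy": 0.88, "wh_per_1k_queries": 15.0},
}

def pick_model(catalogue, min_accuracy):
    """Return the cheapest-to-run model that meets the accuracy requirement."""
    eligible = {name: m for name, m in catalogue.items()
                if m["accuracy"] >= min_accuracy}
    if not eligible:
        raise ValueError("no model meets the accuracy bar")
    return min(eligible, key=lambda n: eligible[n]["wh_per_1k_queries"])

print(pick_model(models, min_accuracy=0.89))  # "medium-llm": ~7x cheaper than the giant
```

The design choice mirrors the quote: a hard floor on quality, then energy as the tiebreaker, rather than defaulting to the largest model available.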
Users too have a choice, including at work. While the environmental impacts of casually using LLMs can be huge and disproportionate, simple models that detect emissions or optimise energy grids could play a powerful role in supporting climate action. A recent paper in Nature argued that the emissions savings from fully utilising AI in power, food and mobility could outstrip the increase in data centre-related emissions generated by all AI-related activities.
The infrastructure choice
But even the leanest model is only as green as the data centre powering it. Mattias Åström, cofounder and CEO of Swedish cloud provider Evroc, says location and design matter.
Hyperscale cloud providers (think AWS, GCP, Azure) are significantly more energy-efficient than traditional on-premises data centres, primarily due to economies of scale, advanced technologies and continuous optimisation efforts. Evroc wants to establish Europe’s first sovereign hyperscale cloud.
“It’s more energy efficient for two reasons,” says Åström. “One is that when you build it at scale, you can build more efficient data centres. But the other thing it has to do with how you’re using the resources.”
Evroc builds in cold climates (like Stockholm) and reuses GPU heat for local district heating, but there's more that could be done.
“What if you could train your AI models in Spain when there is sunshine, move AI training to the Netherlands when there is wind and overnight, you move it to the north of Europe, where you have hydropower?” says Åström.
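Åström's follow-the-renewables idea amounts to carbon-aware scheduling: place the next training job in whichever region currently has the cleanest grid. A toy sketch with made-up carbon-intensity figures; a real deployment would pull live numbers from a grid-data provider rather than hard-code them:

```python
# Hypothetical live grid carbon intensities in gCO2e per kWh; illustrative only.
grid_intensity = {
    "spain-solar":   120,  # sunny afternoon
    "nl-wind":       180,  # moderate wind
    "nordics-hydro":  30,  # overnight hydropower
}

def greenest_region(intensities):
    """Return the region with the lowest current grid carbon intensity."""
    return min(intensities, key=intensities.get)

print(greenest_region(grid_intensity))  # "nordics-hydro"
```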
Ask hard questions
Åström and Luccioni both advocate for more regulation when it comes to disclosing energy usage for AI.
“In the same way that when you buy a TV today or a washing machine, you have that sort of energy declaration, we need to force companies to disclose how much energy they are really spending on their data centres,” says Åström. “Or, as in Germany, they have actually restricted new data centres needing to have a certain efficiency.”
Because of the lack of disclosure, Luccioni says it’s hard for users and organisations to choose between providers. But she stresses that you can always ask.
“Companies can ask for information as well, it’s not only regulators,” she says. “If you’re wanting to shop around for a tool, I think it’s fair to ask for that information, and maybe it’s not readily available, but at least it will get the ball rolling in terms of getting numbers.
“Individuals have a certain amount of agency or power, and companies have a certain amount of power as well.”
Additional reporting by Maya Dharampal-Hornby.
Read the original article: https://sifted.eu/articles/how-green-is-your-compute/