Defence tech’s rise means that, as global conflicts multiply and intensify, decisions over which countries get access to next-generation battlefield technologies are increasingly being made at young, private companies.
The defence industry has historically been the preserve of several well-established ‘primes’ — like the UK’s BAE Systems, the US’s Lockheed Martin and France’s Thales — which have long-term, direct relationships with governments. And these companies have long been controversial for their products.
But as the technology that companies sell to governments becomes increasingly autonomous — with AI embedded into drones and surveillance systems — and with more young upstarts navigating military procurement, questions over who they’re selling to, and what processes they have in place around that, are surfacing.
“There should be a debate about this in society, and in companies,” says one defence founder, who requested not to be named.
How startups decide who to sell to
Many countries have export controls — restrictions on the trade of goods or services to certain governments. British companies, for example, can’t sell to countries including Russia, China, Iraq and Iran; German companies can’t sell to many of those same countries, plus the likes of Syria.
But startups can — and do — decide their own blacklists too.
“There’s a big no-no list: so, China, North Korea, Iran, they’re just off. After that, it becomes ad hoc,” one dual-use founder, who requested anonymity to speak freely, tells Sifted.
For potential sales, the founder adds, their chief of staff will look into the mechanics — and consequences.
“There’s a logistical part: what certification do we need for that country? What carrier can we use, and what are the tariffs, if there are any? … If that all goes, tick, tick, tick, then we basically look at our current customer base and say, ‘Who will get pissed off if we sell to them?’” they tell Sifted.
Europe’s best-funded defence startup, German AI unicorn Helsing, told Sifted the company’s customers are “primarily NATO member states and allied countries” — including Ukraine. Each potential deal involves a “diligent screening of persons and business cases” — and a discussion of whether or not it will support Helsing’s central mission: “protecting our democracies”.
Helsing says it uses an independent democracy index as a starting point but, beyond a set of countries that are “clear-cut” democracies, its management decides case by case whether or not to work with a given country.
Working with ‘democracies’
Meredith Veit, a technology and human rights researcher at the international NGO Business & Human Rights Resource Centre, believes that, for defence companies more generally, just working with “democracies” isn’t enough. There are “companies who are purportedly the arsenals for democracy, [but] if you’re missing the international humanitarian law piece […] then they shouldn’t be able to market themselves as such,” she argues — for example, if they’re selling to democracies but their technology is being used by bad actors or in unethical ways.
Veit believes there should be “financial market consequences for companies that actually don’t uphold themselves to these very idealistic marketing slogans that they’re fuelling into the hype.”
Helsing told Sifted that, for “edge case” countries, it seeks “independent advice and engages as many employees as possible in the company’s decision-finding. Ultimately it is the Helsing founders, as the responsible executives, that have to take the decision of whether to engage with a country.”
The first defence founder Sifted spoke with echoed the need for a “culture of internal debate” among employees. “What you don’t want is to be presented with a grey area opportunity and not to have debated it previously,” they said.
Michael Putz, the founder of Austrian startup Blackshark, which creates 3D digital twins for military, intelligence and disaster response operations, says a focus on democracies doesn’t work in practice.
“We quickly realised that limiting ourselves to working only with ‘democracies’ sounds principled, but breaks down in practice. Real-world use cases — like supporting disaster response or civilian protection — often happen in countries that don’t fit neatly into that definition,” says Putz. Instead, Blackshark has an ‘ethics council’ which assesses each case individually.
‘Export controls shouldn’t be considered as a green light’
Agris Kipurs, founder of Latvian advanced autonomous systems startup Origin Robotics, takes a simple approach: although the company’s focus is primarily on Europe, it will sell to anyone it can get an export licence approved for.
“I think it makes a tonne of sense to leave it to those who understand geopolitics way better than any founder does,” he says. “I don’t think that I’m more competent than the Ministry of Foreign Affairs of Latvia. I trust the job that they do.”
“It’s a very straightforward topic and I don’t know why one would even want to complicate it,” Kipurs says. “That only leads to more issues and additional complexity in terms of running the business.”
Veit takes a different stance, arguing that defence companies lean too heavily on the parameters of export controls without interrogating additional human rights considerations — especially for some of the burgeoning AI tech.
“Many companies overly rely on export controls,” she tells Sifted. “Export controls shouldn’t be considered as a green light, but currently it is; it just reflects whether or not countries are friends.
“The situation with Israel [and its strikes in Gaza] is highly contentious, and they’re on the list of friendlies for a number of countries still at the moment, even regardless of the grave allegations at hand.”
Monitoring the use
After startups decide to sell to a country, there’s a secondary issue: ensuring that the customer country is using their product for the intended use case.
The first defence founder Sifted spoke with said that, while some companies stipulate end-use cases in the contracts they sign with customers, enforcing those terms is, in some cases, near impossible.
It’s very hard to track how individual bullets are used on the battlefield, they say, whereas the location of aircraft carriers is something that armies can monitor more precisely.
“It has to come back to whether we trust the institution itself,” the founder says, adding that their company aims to use think tanks and research to assess things like whether a country has strong rule of law and neutral law enforcement, and whether its military operates in line with humanitarian law.
The other complexity, the founder notes, is that companies often sign lengthy contracts with governments, during which a country’s political climate or even regime can change.
“You may take a contract with a country for 10 years, so you need to look at the political stability and the direction of travel politically,” they say. “Difficulties arise when the starting point differs from what they end up doing.”
Things are even more complex if the technology is dual use, says Jack Wang, partner at Berlin-based VC Project A, which has invested in several defence startups including German autonomous drones maker Quantum Systems and German autonomous vehicles startup ARX Robotics.
A big reason for that is “if you sell to the military it is usually based on a specification requirement, often in a tender process for publicly searchable programmes — which by definition spells out what their use case is.”
But dual-use companies may be in a trickier position, Wang argues: unless companies have access to their products after they sell them, it can be hard to make sure the customer is using them as intended. He says most “knowledgeable” investors assess a startup’s “know your business” efforts (a process whereby the company verifies its customers) on a “best effort” basis.
It’s a key concern for human rights researchers like Veit. She points to the United Nations’ guiding principles around heightened human rights due diligence in wartime: “You’re supposed to ask more questions, you’re supposed to have a closer relationship with the client or the end user in order to understand how your products or services are being used and hopefully not abused,” she says.
While Veit says her organisation hasn’t necessarily observed great compliance with this, they have seen an enforced push for better oversight of use-cases in the Russia-Ukraine war. “We’re not seeing anything with that regard when it comes to Israel and Palestine at the moment,” she adds.
Controversial countries
While some countries are unambiguously off limits for startups to sell to, like North Korea or Russia, there are other countries that founders and VCs are currently mulling over.
The dual-use founder mentioned Azerbaijan, Colombia, Guatemala and Venezuela as iffy for them to sell to.
“Countries in the Middle East region including Israel are currently being debated by a lot of defence tech startups in Europe,” Wang adds. He says sentiment in Western countries towards Saudi Arabia has become more positive in the last decade, particularly in light of recent US-Saudi Arabia investment partnerships.
But he adds that Israel is “actually not a target market because their defence tech is generally more advanced than Europe’s.”
‘There’s very little transparency’
As more defence tech startups pop up and raise funding from increasingly eager VCs, the need for due diligence on both companies and countries is front and centre.
Wang says that founders and investors are typically aligned about which countries they should sell to, but that there’s an important know-your-business process for the startup. “Usually we [investors] need to see some tooling and process they perform that is similar to the checks banks do for new customers,” he says — to ensure that, for example, a drones startup like Quantum Systems is “not just selling drones to anyone who says they are who they are.”
Still, some believe investors and companies need to increase their scrutiny.
“Due diligence is extremely reactive,” argues Veit. It’s a longstanding issue: a Business & Human Rights Resource Centre report from December 2023 found that top VC firms were failing in their responsibility to respect human rights in relation to new GenAI technologies.
Veit adds: “There’s very little transparency, and there’s a lot of seemingly hyped marketing related to the need to defend democracy — but there’s no actual proof or public policies or procedures that explain how these tech companies guarantee that they are only selling to democracies.”
Read the original article: https://sifted.eu/articles/defence-tech-startups-sell-to-countries-controversy/