
European Commission says €31bn cost estimate for AI rules is ‘flawed’

A US think tank says the Artificial Intelligence Act will cost the European economy €10bn a year in compliance costs by 2025, but the European Commission disputes those figures.

The European Commission (EC) has dismissed as ‘flawed’ a US think tank’s estimate that its proposed new AI rules would cost the European economy €31bn over the next five years. But the Centre for Data Innovation (CDI) stands by its forecast, arguing there is “no urgent need to rush to regulate” the use of AI in the EU. The dispute reveals the increasingly contentious role that the EU is playing as a technology regulator, as its rules impact organisations around the world.

A US think tank argues that the European Commission’s proposed AI rules would cut investment by 20%. (Photo by Tobias Arhelger/Shutterstock)

In April of this year, the EC unveiled its proposal for an Artificial Intelligence Act (AIA) to regulate the use of AI. If passed by the European Parliament, the AIA would affect any company operating in the EU market that develops or wants to adopt machine learning-based tools as defined by the AIA.

Under the AIA, AI systems classified as ‘very high risk’ (such as the use of facial recognition technology in the private sector) would be prohibited outright. ‘High-risk’ AI applications – those that could pose a threat to someone’s safety or livelihood – would require inspection before deployment to ensure that systems are trained on unbiased data sets and with human oversight. Providers of ‘high-risk’ AI systems would have to demonstrate that a quality management system (QMS) is in place, among other measures.

In an approach mirroring GDPR, businesses that do not comply with the regulations could be fined up to €30m, or 6% of their annual turnover, with enforcement falling to member state regulators.


How will the Artificial Intelligence Act impact the European economy?

According to the CDI, which is part of the pro-industry US think tank Information Technology and Innovation Foundation (ITIF), the AIA “will be the world’s most restrictive regulation of AI tools”, which would curb AI use and development, and would likely lead to a “brain drain of European entrepreneurs with business ideas for AI”. It adds that although “well intended”, the AIA will severely impact Europe’s economy by making it less competitive.

Drawing on the EC’s own impact assessment for the AIA, the CDI estimates that the rules will cost EU businesses an additional €31bn in compliance costs between now and 2025, by which time annual compliance costs will exceed €10bn. It also predicts that the new AI rules will reduce European investment in AI by 20%, or €10.9bn, over the same period.

Responding to the report, an EC official said the Commission “disagrees with the findings, which appear to be flawed”. “The costs referred [to] in the report are largely overstated, partly due to misinterpretations of the Commission’s regulation proposal and its impact assessment,” the official said in a statement.

The EC argues that the CDI report overestimates the number of AI systems that will be subject to stringent compliance obligations. “In particular, the claim made that by 2025 the compliance cost caused by the AI Act will rise to more than €10bn per year misses the key point of the proposal – that it applies only to high-risk AI applications,” they said. “Instead of multiplying the percentage cost with the investment for high-risk AI applications, the numbers in the report are a result of multiplying with all AI investments.”

Benjamin Mueller, author of the CDI report, countered that tallying up the gross value added of the sectors covered by the ‘high risk’ designation shows that up to 35% of the European economy is exposed.

“The Commission, by contrast, assumes (arbitrarily) that only 10% of AI systems will be considered high risk,” Mueller told Tech Monitor by email. “It should also be noted that the vagueness and expansiveness of many technical definitions in the Act, as well as the ability for regulators to expand the ‘high risk’ list, mean that the share of AI systems exposed to the regulation is, if anything, going to go up over time.”

In response, the EU official said that “it is incorrect to assume that all AI applications used in the sectors or ‘broad areas’ listed in Annex III of the regulation proposal will be considered as high risk.”

The EC spokesperson also argued that the percentage applied in the CDI report does not take into account the argument, presented in its AIA impact assessment, that companies ordinarily need to ensure robustness and accuracy anyway to sell their products in competitive markets, so these costs are already part of their business-as-usual spending.

In response, Mueller told Tech Monitor: “If the Commission believes that to be true, surely that undermines the need for the law in the first place.” The EC spokesperson replied again, saying that the “public interests that the regulation aims to protect, notably fundamental rights and safety of people, cannot be left to the good will of companies and other actors”.

Compliance costs of the Artificial Intelligence Act

The EC also rejected the CDI’s assessment of the AIA compliance cost for small businesses. The CDI report claims that a small business (one with up to 50 employees or €10m turnover) “can expect total compliance costs of up to €400,000 for one high-risk AI product requiring a quality management assessment”, based on the EC’s own assessment of the regulation proposal. However, the EU official said that this claim is false, as the CDI report confuses the AIA’s impact assessment with a support study that the EC requested from the European Investment Bank.

“Regarding the cost, the section on smaller companies in the support study actually quotes a third expert on companies of 100 – not a small company of up to 50 – employees,” the EU official said. “Most importantly, though, the estimates by this expert of up to €400,000 – based on the medical devices sector – are for companies that would be obliged to introduce a new quality management system from scratch and the maximum possible costs.” This, they add, would be a “truly exceptional case” as most of these companies are already subject to product safety legislation and have QMSs in place, so they would not incur any additional costs.

Mueller insisted on the accuracy of his calculations, referring to the EC’s impact assessment study of the AIA, which states that “[t]he set-up of a QMS and the conformity assessment process for one AI product is estimated to cost between €193,000 and €330,050. An estimated additional yearly cost of €71,400 will also be borne by the company to maintain compliance over time.” Taken at the upper bound, the set-up cost plus one year of maintenance comes to roughly €401,000, in line with the €400,000 figure cited in his report.

He added that the assumption that most small companies already have QMSs in place only reinforces the notion that the AIA will benefit market incumbents and create high barriers of entry for new businesses: “If the Commission truly believes that companies will not bear costs for QMS systems, then they should put their money where their mouth is and offer to reimburse companies for any outlays to comply.”

“The Commission fully agrees with the need to support new players to invest in AI solutions and help them comply with the new rules,” the spokesperson said in response. “This is notably the purpose of the Coordinated Plan on AI, which was adopted together with the AI proposal, and which includes numerous actions and support measures to this end.”

“No urgent need to rush to regulate” AI

Although Mueller stressed that his paper does not argue against all regulation, he said that there is “no urgent need to rush to regulate” since existing EU legislation already regulates labour rights and outlaws discrimination and abuse of market power.

“If these laws turn out to be unsuitable in the context of AI, legislators should respond on the basis of evidence. Pre-emptively regulating a new technology not yet widely adopted is a uniquely European approach,” Mueller said, adding that the EU economy is already uncompetitive in digital markets. Instead, he suggested a regulatory framework that focuses on “algorithmic accountability” – a concept by which operators of AI systems are held accountable for the results of their algorithms and AI tools.

The EC spokesperson said the CDI report ignores the potential upside of the AI rules, “in terms of greater legal certainty, public trust and acceptance for the uptake of trustworthy AI solutions, and the positive reputational benefits that can actually stimulate companies to invest in and develop such AI solutions in Europe, increasing also their competitiveness to export them abroad.”

“There is no evidence that the much-cited ‘trust gap’ exists and is behind the lack of AI start-up success in Europe,” Mueller said. “This is a talking point regulators roll out to justify their interventionism – the cost of these interventions is then pooh-poohed and belittled.”

The AIA is unlikely to be enacted before 2023, at the earliest, and will be subject to considerable lobbying, debate and revision before then.

Cristina Lago

Associate editor

Cristina Lago is associate editor of Tech Monitor.