Demand for high-performance computing (HPC) systems, or supercomputers, continues to build as businesses and scientists undertake more data-intensive work. The growing importance of HPC means it has become a new frontier for territories seeking to bolster their digital sovereignty, and the European Union has big plans to grow its own expertise and resources in this area. However, political differences could hinder the bloc as it attempts to catch up with leaders in the field such as the US and Japan.
Europe’s supercomputing conundrum came into focus last month when the EU’s European High Performance Computing Joint Undertaking (EuroHPC JU) announced it had cancelled the procurement process for the MareNostrum 5 supercomputer. Due to be based at the Barcelona Supercomputing Centre, it would have been Europe’s most powerful supercomputer and, with expected peak performance of 200 petaflops, the second most advanced in the world. But in a statement, the EuroHPC JU said it “did not achieve the needed majority” to select a vendor for the contract, which would have been worth €223m over five years.
Political differences are thought to have been at the heart of the decision not to award the contract. Politico reported that countries including Spain favoured a bid from a consortium led by US-based IBM and Chinese-owned Lenovo, which offered to deliver the most technically advanced machine, while France was keen to choose French company Atos, whose bid would have made greater use of European supply chains and grown the continent’s autonomy in this key technology area.
It is currently unclear if, or when, the tender process for MareNostrum 5 will reopen, but as the political wrangling continues, Europe risks falling further behind its supercomputing rivals.
What makes a supercomputer?
While the term supercomputer may conjure an image of an enormous mainframe machine with myriad lights and dials, the modern HPC is a collection of computers working in tandem. “It’s a network where each computer is a node in the network,” explains Filipe Oliveira, a senior analyst for thematic research at GlobalData.
By connecting these machines you can achieve massive processing power far beyond the capabilities of a single computer. “For the last decade we’ve lived in what we call the petascale, which is the ability to do a million billion calculations [per second] on a computer,” says Professor Mark Parsons, director of the Edinburgh Parallel Computing Centre (EPCC) at Edinburgh University, Britain’s biggest supercomputing research centre.
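To make the node-network idea concrete, here is a minimal, hypothetical sketch of how one calculation can be split across cooperating processes. It uses mpi4py, the Python binding for MPI, the message-passing standard most supercomputers use to coordinate their nodes; the pi-estimation workload is purely illustrative.

```python
# Minimal sketch of the "network of nodes" model using mpi4py.
# Each MPI process stands in for one node; together they split a
# numerical integration and combine the partial results.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()  # this process's index in the "network"
size = comm.Get_size()  # total number of cooperating processes

# Estimate pi by integrating 4/(1+x^2) over [0, 1], with each
# process taking an interleaved share of the intervals.
n = 10_000_000
h = 1.0 / n
local = sum(4.0 / (1.0 + ((i + 0.5) * h) ** 2) for i in range(rank, n, size))

# Combine the partial sums from every process into one result.
pi = comm.reduce(local * h, op=MPI.SUM, root=0)
if rank == 0:
    print(f"pi ~= {pi:.10f}, computed by {size} cooperating processes")
```

Launched with, say, mpiexec -n 4 python pi.py, the same script runs unchanged whether its processes sit on one laptop or on thousands of networked nodes; that uniform programming model is what lets clusters of ordinary machines behave as a single supercomputer.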
Traditionally, supercomputers have mostly been used for scientific research, but they are “now being used in other areas, such as finance, retail and entertainment,” Oliveira says, particularly as more organisations deploy AI systems that require intensive processing power.
The growing interest in HPC has caught the eye of the hyperscale cloud providers, many of which offer HPC-as-a-service. But Parsons says this is on a much more limited basis than large national systems such as ARCHER, the UK’s national supercomputer, which is used by scientists across the country. “The Big Tech companies tell a good story about supercomputing,” he says. “But the scale is much lower. In the cloud you would typically run 20,000-30,000 cores, whereas ARCHER2, my new system, will have 750,000.”
“That’s not to say that what the cloud does isn’t useful,” he adds. “A very large number of simulations can be run on a few thousand cores. But when you’re looking to solve the most difficult challenges you need the national-scale resource which most countries are investing in.”
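Amdahl’s law, a standard scaling formula (not one the article itself invokes), makes Parsons’s distinction concrete: a job’s speedup on N cores is capped at 1/(s + (1 - s)/N), where s is the fraction of the work that cannot be parallelised. The illustrative Python below plugs in the core counts quoted above, with two assumed serial fractions:

```python
# Illustrative application of Amdahl's law:
# speedup(N) = 1 / (s + (1 - s) / N), where s is the serial
# (non-parallelisable) fraction of a job.
def amdahl_speedup(s: float, cores: int) -> float:
    return 1.0 / (s + (1.0 - s) / cores)

# The serial fractions are assumptions for illustration; the core
# counts are the cloud-scale and ARCHER2-scale figures quoted above.
for s in (1e-4, 1e-6):
    cloud = amdahl_speedup(s, 30_000)
    national = amdahl_speedup(s, 750_000)
    print(f"serial fraction {s:g}: {cloud:,.0f}x on 30k cores, "
          f"{national:,.0f}x on 750k cores")
```

With even a 0.01% serial component, the gain from extra cores saturates quickly (roughly 7,500x versus 9,900x here), which is why many simulations run happily on a few thousand cores; only near-perfectly parallel problems, the “most difficult challenges” Parsons describes, can exploit a 750,000-core national machine (roughly 29,000x versus 429,000x).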
The current target for investment is exascale supercomputing, three orders of magnitude faster than today’s petascale machines. Parsons is also the director of research computing for the Engineering and Physical Sciences Research Council (EPSRC), and is leading the work to develop an exascale supercomputer for the UK. “We’re just entering the age of exascale at the moment, but it will be a thousand times faster,” he says.
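In plain numbers, the jump from petascale (10^15 calculations per second) to exascale (10^18) turns days into minutes for a fixed workload; the snippet below is a back-of-the-envelope illustration with a made-up job size.

```python
# Back-of-the-envelope: what "a thousand times faster" means in practice.
PETAFLOPS = 1e15  # operations per second at petascale
EXAFLOPS = 1e18   # operations per second at exascale

workload = 1e21   # hypothetical job of 10^21 operations (illustrative only)

print(f"petascale machine: {workload / PETAFLOPS / 86_400:.1f} days")  # ~11.6 days
print(f"exascale machine:  {workload / EXAFLOPS / 60:.1f} minutes")    # ~16.7 minutes
```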
EU supercomputing plans: why has Europe fallen behind?
The first exascale supercomputer is likely to be the Fugaku machine in Japan, which currently runs at 500 petaflops but is set to be upgraded in the near future. An exascale machine is also being constructed in the US, and Parsons says China is probably working on a similar project. Of the ten fastest supercomputers in the world, only two are based in Europe, and both offer a fraction of the compute power of Fugaku.
Supercomputing R&D is also dominated by other parts of the world. Research from GlobalData shows that supercomputing-related patents registered in the past decade were mainly filed in the US and China; the UK, which has since left the EU, was the only European nation to file patents in this area.
Parsons says that though Europe and the UK have both historically played a big role in the global supercomputing ecosystem, this has diminished in the past ten years. "We've never had the same level of spending as the US or Japan, and in the last decade there's been very little investment at all," he says.
Supercomputing has often been used as a tool to boost digital sovereignty, the ability of governments to shape the technology platforms on which their countries depend. "What you see in these other countries is that the governments use the leading edge of supercomputing to support their indigenous IT industries," explains Parsons. "So in the US, for example, Intel, AMD and NVIDIA all benefit from these large system orders from government when they're doing their R&D. Fujitsu would never have developed the A64FX (its latest supercomputer processor) if the Japanese exascale project wasn't happening."
Europe's supercomputing efforts, meanwhile, have been hampered by politics. "A lot of what's going on inside EuroHPC is not about the technology itself, but the geopolitics," says Parsons.
Established in 2018, EuroHPC is a joint undertaking that aims to deploy "world-class exascale supercomputers in Europe". So far it has helped fund seven supercomputers across the continent, as well as other research. But the controversy around the stalled MareNostrum 5 tender laid bare the tension between providing the best resources for businesses and scientists, which aren't available solely from European suppliers, and the need to boost technology sovereignty by supporting local supply chains.
Oliveira says that, for most end users, "I don't think it matters where the infrastructure is or where the compute power comes from, as long as you get the outcome you need." But, he adds, "from a geopolitical standpoint it does matter because if Europe doesn't have the know-how and the infrastructure it will be dependent on other powers around the world. The US government has changed now, but for the last four years [under President Trump] it wasn't very pro-EU. Having gone through that I think the EU has realised it needs more autonomy in these key technology areas."
Can the EU (and the UK) narrow the supercomputing gap?
The EU has plans to bolster its tech sovereignty in several areas, such as the GAIA-X European cloud computing network. Professor Parsons says some building blocks are already in place to do the same in supercomputing. "Atos has developed systems for manufacturing high-performance computers, and is absolutely a world leader in terms of the quality of its products," he says. "But it has to produce them in conjunction with a company like [US-based] Intel, so the next step in the EuroHPC plans is to create a European processor."
To do this, an offshoot of EuroHPC, the European Processor Initiative (EPI), has been created with the aim of providing a low-power, European-made processor suitable for supercomputers. A private company, SiPEARL, has been incorporated to manufacture and distribute the Arm-based processor, which the EPI hopes will be powering an EU exascale supercomputer by 2023.
The MareNostrum 5 affair demonstrates how an age-old EU problem – marrying up the different priorities of the member states – could hold back further development of European supercomputers, Oliveira says. "Conflicts between people in France and Spain can seem minor, but multiply that by 27 and it's always going to be a challenge, whereas in the US or China they have more control," he adds. "But I think it's possible [to narrow the gap] if they can focus on finding an agreement. The EU knows it needs to get its act together on this."
Now free of EU politics, the post-Brexit UK also has plans for supercomputers, and Parsons says the government will be releasing a strategy around large-scale computing in the next few weeks. "The EPSRC plan is to have an exascale supercomputer up and running in the UK by 2025," he says. "That would make us competitive with Europe. We'd still be a wee bit behind the US, China and Japan, but it would put us at the level we should be as a science superpower."