Some interesting findings have come out of the 2010 Top 500 Supercomputer survey of global supercomputer capacity – and yes, one of them is that the People’s Republic of China has, surprise surprise, significant HPC (high-performance computing) capacity. But that’s far from being the only notable thing in the poll.
For a start, the popular press is far too interested in raw speed. Yes, a petaflop is a very fast thing: 1,000 trillion calculations per second (as you will remember from Computer Science 101, and/or your handy Wikipedia cheat sheet). And yes, the ‘No 1’, the Cray set up at the Oak Ridge National Laboratory in Tennessee, has a top speed of 1.75 petaflops, while China’s closest match is rated at a top speed of 1.20. For the record, two of the PRC arrays are now ranked among the top 10 fastest computers in the world.
But speed isn’t the only metric for ranking an HPC system. Storage capacity is also very important, as is the machine’s ability to access, and keep safe, the large datasets one uses such an array for. That has to be borne in mind when looking at this data. The number of cores (processors) is also worth recording, as this list does, as are interconnects (which it doesn’t) and the allied storage stack (ditto).
How does the UK fare in all this? We have some interesting results and (cue Union Jack waving, if so inclined) some 38 machines ranked. As might be expected, defence, financial services and academic research facilities are all well represented. The ‘fastest’ is the University of Edinburgh supercomputer, ranked at number 16, with a top speed of 0.27 petaflops and nearly 47,000 cores. As Edinburgh is one of the global centres for AI research (a notorious eater of run-time computing!), no surprise there. Also in academe, Southampton University, which we recently featured in CBR, comes in at 83, with 8,000 cores.
Like the US, the UK government likes to run a lot of nuclear war and weapons simulations (probably, on balance, better than doing it ‘live’). Hence the supercomputer capacity (a Bull cluster, 3,744 cores) at the Atomic Weapons Establishment. And yes, Spooks fans – there is a Classified element, albeit, disappointingly for James Bond fans, quite a small one: an IBM machine with 3,184 cores.
But then we have some intriguing additions. Yes, the City has a few – we had to have some way of working out how to package all those CDOs, after all – and one would be kind of surprised if the Met Office wasn’t on here (sleep easy – it is). But why does the ‘food industry’ need so many? At least three are listed (though not named). Does Tesco need one to work out what on Earth to do, practically, with all the data lying around on those storecards? R&D on new ways to make turkey twizzlers?
But there is an additional name in here that struck me: Computacenter (UK) is identified, along with a couple of unnamed ‘IT service providers’. It has to be down to the managed services element of its work, but wouldn’t that normally be covered by a data centre or two? Or is it powering its cloud that way? We tried to find out by asking, but the firm must have been busy swapping out a core or two and didn’t get back to us by deadline. We’ll keep trying and, if we get an answer, will come back to you.
Put all this together and we have a picture that says global capacity for high-end computing remains strong – but the story, I think, is less about national virility and the rise of China than about the range of applications these machines are used for.
That food industry requirement remains puzzling…
The BBC has quite a nice infographic to help you dive into the list here.