July 31, 2009 (updated 19 Aug 2016, 10:06am)

Supercomputer produces pretty picture


By Jason Stamper

Take a butcher’s at the image below: pretty, isn’t it? It’s the visualisation of a supernova performed on a supercomputer at the U.S. Department of Energy’s Argonne National Laboratory.

[Image: visualisation of a supernova rendered on Argonne’s supercomputer]

What’s perhaps more interesting than the fact that a multi-million-dollar computer can draw a pretty picture is that the image is now drawn on the supercomputer itself, rather than the numbers being crunched first and then visualised with separate software running on graphics processing units.

To produce the image on Argonne’s Blue Gene/P supercomputer, 160,000 computing cores work together in parallel. Today’s typical laptop, by comparison, has two. In fact, if you wanted to attempt this kind of picture on a typical home PC, it would take you three years just to download the data.

The latest volume-rendering techniques in use at Argonne can make sense of the billions of tiny points of data collected from an X-ray, an MRI or a researcher’s simulation.
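For a sense of what’s involved, here is a minimal sketch of volume rendering in Python: rays are marched through a grid of voxels and the samples blended front to back with the standard ‘over’ operator. It is a toy illustration, not Argonne’s code; the grid size, transfer function and stand-in ‘supernova’ density field are all invented.

```python
# Toy volume rendering: orthographic ray marching along the z axis,
# compositing samples front to back with the "over" operator.
import numpy as np

def render_volume(volume):
    """volume: 3D array of densities in [0, 1]; returns a 2D greyscale image."""
    nx, ny, nz = volume.shape
    colour = np.zeros((nx, ny))   # accumulated intensity per ray
    alpha = np.zeros((nx, ny))    # accumulated opacity per ray
    for k in range(nz):           # step each ray one voxel deeper
        sample = volume[:, :, k]
        s_alpha = 0.05 * sample   # simple transfer function: denser = more opaque
        colour += (1.0 - alpha) * s_alpha * sample
        alpha += (1.0 - alpha) * s_alpha
    return colour

# Stand-in data: a fuzzy sphere in a 64^3 grid, a (very) crude supernova.
grid = np.linspace(-1, 1, 64)
x, y, z = np.meshgrid(grid, grid, grid, indexing="ij")
density = np.clip(1.0 - np.sqrt(x**2 + y**2 + z**2), 0.0, 1.0)
image = render_volume(density)
```

Scale that loop up to billions of voxels per time step and it becomes clear why it matters where the marching happens.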

Usually, the supercomputer’s work stops once the data has been gathered; the data is then sent to a set of graphics processing units (GPUs), which create the final visualisations.

But the driving commercial force behind developing GPUs has been the video game industry, so GPUs aren’t always well suited for scientific tasks. In addition, the sheer amount of data that has to be transferred from location to location eats up valuable time and disk space.


“It’s so much data that we can’t easily ask all of the questions that we want to ask: each new answer creates new questions and it just takes too much time to move the data from one calculation to the next,” said Mark Hereld, who leads the visualization and analysis efforts at the Argonne Leadership Computing Facility. “That drives us to look for better and more efficient ways to organize our computational work.”

Argonne researchers wanted to know if they could improve performance by skipping the transfer to the GPUs and instead performing the visualizations right there on the supercomputer. They tested the technique on a set of astrophysics data and found that they could indeed increase the efficiency of the operation.
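In practice, performing the visualisations right there on the supercomputer means something like the following: each core ray-marches the slab of the volume it already holds, and the resulting partial images are merged in depth order. The sketch below simulates the cores with a plain Python loop; a real system would distribute the slabs across nodes (for example with MPI), and everything here is illustrative rather than Argonne’s implementation.

```python
# Simulated in-situ parallel rendering: render per-slab partial images,
# then composite them in depth order with the "over" operator.
import numpy as np

def render_slab(slab):
    """Ray-march one slab front to back; return its partial (colour, alpha)."""
    colour = np.zeros(slab.shape[:2])
    alpha = np.zeros(slab.shape[:2])
    for k in range(slab.shape[2]):
        s_alpha = 0.05 * slab[:, :, k]
        colour += (1.0 - alpha) * s_alpha * slab[:, :, k]
        alpha += (1.0 - alpha) * s_alpha
    return colour, alpha

def composite(parts):
    """Merge per-slab images, nearest slab first; 'over' is associative."""
    colour, alpha = parts[0]
    for c, a in parts[1:]:
        colour = colour + (1.0 - alpha) * c
        alpha = alpha + (1.0 - alpha) * a
    return colour

grid = np.linspace(-1, 1, 64)
x, y, z = np.meshgrid(grid, grid, grid, indexing="ij")
density = np.clip(1.0 - np.sqrt(x**2 + y**2 + z**2), 0.0, 1.0)

slabs = np.array_split(density, 8, axis=2)  # one slab per simulated "core"
parts = [render_slab(s) for s in slabs]     # each core renders its own data
image = composite(parts)                    # only small images get merged
```

The point is that the full volume never has to leave the machine: the data each core touches stays where the simulation put it, and only small partial images travel.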

“We were able to scale up to large problem sizes of over 80 billion voxels per time step and generated images up to 16 megapixels,” said Tom Peterka, a postdoctoral appointee in Argonne’s Mathematics and Computer Science Division.
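Some back-of-the-envelope arithmetic shows why rendering in place pays off; the byte sizes below are assumptions, since the article only gives voxel and pixel counts:

```python
# Assumed sizes: 4-byte floats per voxel, 3 bytes per pixel.
voxels_per_step = 80e9
raw_bytes = voxels_per_step * 4   # ~320 GB of raw data per time step
image_bytes = 16e6 * 3            # ~48 MB for a 16-megapixel image
print(raw_bytes / image_bytes)    # in-place rendering ships ~6,700x less data
```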

So it really is more than just a pretty picture: it’s something of a breakthrough in supercomputer visualisations.

Read more about it here.
