August 6, 2015 (updated 30 Aug 2016, 12:38pm)

As it hunts for the ‘God particle’, CERN is changing enterprise storage as we know it

Tarkan Maner, Nexenta chairman & CEO, looks for the big bang in enterprise storage, taking inspiration from the work within the Large Hadron Collider.

By Ellie Burns

After a two-year upgrade, the Large Hadron Collider (LHC) at CERN in Switzerland, the biggest and most complex machine ever created, restarted experiments on June 11th. The LHC has now recommenced its work at massively increased collision energy: particles are accelerated to almost the speed of light and smashed into each other in the 16-mile tunnel that runs one hundred metres underground and houses the experiments, all in the hope of uncovering the secrets of ‘the Big Bang’.

Of course, the amount of data each experiment produces is phenomenal: each detector inside the LHC has 100 million readout channels and takes 40 million pictures per second. All of these pictures must be saved and, at some point, analysed. During the experiment’s development phase some 15 years ago, CERN knew that the storage technology required to handle the exabytes of data being produced didn’t exist, so it developed the technology itself.
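To see why off-the-shelf storage was never an option, a quick back-of-envelope sum helps. The Python sketch below is purely illustrative: the one-megabyte-per-picture figure is an assumption made for the sake of the arithmetic, not a number from CERN.

    # Illustrative back-of-envelope for LHC detector data rates.
    # ASSUMPTION: ~1 MB per recorded "picture" (event); a ballpark chosen
    # for illustration, not an official CERN figure.
    PICTURES_PER_SECOND = 40_000_000   # 40 million pictures per second
    BYTES_PER_PICTURE = 1_000_000      # assumed ~1 MB per picture

    raw_bytes_per_second = PICTURES_PER_SECOND * BYTES_PER_PICTURE
    print(f"Raw rate: {raw_bytes_per_second / 1e12:.0f} TB/s")  # ~40 TB/s

    # Even keeping only one picture in every 100,000 leaves a torrent:
    kept_fraction = 1e-5
    print(f"Filtered rate: {raw_bytes_per_second * kept_fraction / 1e9:.1f} GB/s")

At tens of terabytes per second of raw output, it is little wonder that no storage product on the market 15 years ago could cope.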

To save the enormous amount of research data being generated and, in turn, make it readily available to its researchers, CERN developed a tiered storage structure with data centres dotted all over the world. With CERN itself as ‘tier zero’, the starting point, data is distributed to a small number of tier one centres and on to many more, smaller, tier two and tier three centres. Data transfer rates of up to 7Gb/s help make the data available for the 14 million analysis jobs submitted every month.
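That 7Gb/s figure is easier to appreciate with a concrete example. The short sketch below works out how long it would take to push a single petabyte between tiers at that rate; the one-petabyte payload is an illustrative assumption, not a figure from CERN.

    # How long does 1 PB take to move at 7 Gb/s (gigabits per second)?
    # ASSUMPTION: the 1 PB payload is illustrative only.
    payload_bits = 1e15 * 8          # 1 PB expressed in bits
    link_bits_per_second = 7e9       # 7 Gb/s inter-tier transfer rate

    seconds = payload_bits / link_bits_per_second
    print(f"{seconds / 86_400:.1f} days")   # roughly 13 days

Even at CERN’s transfer rates, a petabyte takes nearly two weeks to move, which is exactly why the tiered model distributes data close to where the analysis jobs actually run.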

Over the decades, CERN has gained a reputation for development work in data storage, cloud technologies, data analytics and data security that pushes the limits of existing technologies. Equally, as one of the top research facilities in the world, this involvement in technological advancement has produced a number of successful spin-offs from its primary work. The WWW, for example, was invented at CERN, which later developed grid computing to distribute data to 140 data centres in 40 countries around the globe for its more than 10,000 researchers.

With its expertise in particles, analytics and IT, CERN also contributes to accelerator developments aimed at treating cancer more effectively than standard procedures like radiation therapy. Radiation detectors and computer tomography are other examples of CERN technologies being used in medicine.

It’s interesting to note here that CERN recently launched ‘openlab’, a unique public-private partnership with leading ICT companies to accelerate the development of cutting-edge solutions for the worldwide LHC community. The initiative gives CERN early access to new technologies and lets it work with partners to test their limits, ascertaining whether they can cope with the extreme amounts of data the LHC creates.

For the broader industry, it is a win-win situation: test cycles for product development are very short, and companies benefit from the toughest test environment available for their technologies. If they’re good enough for CERN, surely they’re good enough for the wider market?

CERN’s work might be the most extreme example of data growth, but from a business perspective, many parallels can be drawn. For one, we increasingly find ourselves in a position where data collection is overwhelming the technology being used to collect and store it.

Whether the aim is to save 40 million selfies, 40 billion tweets or 40 million research reports, today’s companies are handling more data than ever before. Secondly, as at CERN, storage is quickly emerging as the critical bottleneck that many organisations must address.

In the UK in particular, many enterprises, regardless of sector, remain wholly reliant on traditional hardware storage solutions. These setups are simply incapable of scaling up or out on demand, let alone doing so in a financially sustainable way. Nor does it help that legacy storage devices based on proprietary hardware are designed to be deployed from a single location and cannot meet the immediate demands of big data.
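The difference between scaling up and scaling out is easy to see in miniature. The toy Python model below is hypothetical throughout (the per-unit capacities and the 500TB controller ceiling are invented for illustration): a scale-up array grows only until its controller tops out, while a scale-out pool keeps growing linearly as commodity nodes are added.

    # Toy model of storage growth: scale-up vs scale-out.
    # All numbers are hypothetical, chosen purely for illustration.
    SCALE_UP_CEILING_TB = 500            # assumed controller capacity ceiling

    def scale_up_capacity(shelves: int, tb_per_shelf: int = 50) -> int:
        """Capacity grows until the array's ceiling, then stops."""
        return min(shelves * tb_per_shelf, SCALE_UP_CEILING_TB)

    def scale_out_capacity(nodes: int, tb_per_node: int = 50) -> int:
        """Capacity grows linearly: just add another commodity node."""
        return nodes * tb_per_node

    for units in (5, 10, 20, 40):
        print(units, scale_up_capacity(units), scale_out_capacity(units))

Past ten units the scale-up array is stuck at its ceiling while the scale-out pool carries on growing; that, in a nutshell, is the on-demand scalability traditional setups cannot offer.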

Fortunately, innovation and discovery are not confined to the walls of the LHC. While many businesses are not in a position comparable to CERN’s, with the time, money and expertise on tap to reimagine and reconfigure the technology they rely on, software-defined, open source-driven models are stepping readily into the void.

Offering cost-effective flexibility and, when it comes to storage, end-to-end provisioning and management, solutions like these future-proof business infrastructure and operations for years to come.
