Typically the supercomputer market is seen as one dominated by large organisations in the world of research and education, but Cray is seeing an increase in sales to commercial customers.

In a recent revenue outlook, the company revealed that 15% of its revenue is expected to come from commercial customers, roughly double the figure from 2014. So why is this growth happening?

CBR spoke to Barry Bolding, Chief Strategy Officer and SVP of Cray, about why the explosion of sensor data is helping to boost its business in other areas.

One of the key reasons is that commercial organisations are looking to handle ever larger amounts of data, which raises the demands on their simulations. That in turn drives the need for bigger, faster storage and faster supercomputers, and Bolding says this is why the company has been growing at the rate it has.

I am often told that cloud is the great enabler of IoT, the category sensors fall under, so it would make sense to use the cloud to move these large amounts of data so that they can be analysed.

However, that is when you run into the data gravity problem, which arises because data behaves as though it has its own mass. The problem with putting very large volumes of data into the cloud is that it can then be difficult to get them back out.

This isn’t to say that cloud isn’t good or useful, Bolding said: "Cloud can be very useful for putting in vast quantities of data if the data is broken up into very small chunks and you can run distributed simulations on the data."
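To make the distinction concrete, the pattern Bolding describes is essentially embarrassingly parallel work: split the data, let each piece be processed in isolation, then combine the results. The snippet below is a minimal sketch only; the chunk size, the simulate_chunk placeholder and the local process pool standing in for cloud workers are all assumptions for illustration, not anything Cray or a cloud provider prescribes.

```python
from multiprocessing import Pool

CHUNK_SIZE = 10_000  # assumed chunk size; in practice tuned to the data and the workers

def simulate_chunk(chunk):
    """Hypothetical, independent simulation over one chunk of sensor readings."""
    # Each chunk is processed in isolation, so workers never need to talk to
    # each other -- the loosely coupled case Bolding says suits the cloud.
    return sum(reading * reading for reading in chunk)  # placeholder computation

def split_into_chunks(data, size):
    """Break the full dataset into independent pieces."""
    return [data[i:i + size] for i in range(0, len(data), size)]

if __name__ == "__main__":
    data = [float(i) for i in range(100_000)]   # stand-in for a stream of sensor data
    chunks = split_into_chunks(data, CHUNK_SIZE)
    with Pool() as pool:                        # local stand-in for distributed cloud workers
        partial_results = pool.map(simulate_chunk, chunks)
    print(sum(partial_results))                 # combine the per-chunk results
```

No chunk ever waits on another here, which is what makes this kind of workload a comfortable fit for distributed cloud resources.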

If you are doing tightly coupled simulations on the data, though, and doing so 24/7, then "cloud becomes a cost inefficient place to do those computations," said Bolding.

This is why companies such as GE and Exxon Mobil are using supercomputers. If you look at the oil and gas industry, there are areas of research that produce vast amounts of data, and that data cannot realistically be sent to the cloud in a short period of time.

Clive Longbottom, founder of Quocirca, told CBR: "Seismic research provides so much data that it makes far more sense for a company to deal with this in its own way using its own supercomputers."

CERN is another example, with the Large Hadron Collider creating terabytes of data in a single burst which cannot all be sent to the cloud.

"It needs a supercomputer environment to rapidly and effectively deal with the data and do first pass analysis. Once it has filtered out the important data, it can then move this to the cloud for further, batch analysis as required," said Longbottom.

What it comes down to is figuring out what the cloud is good for and what supercomputers are good for; this of course depends on what the workload is, what it requires and how the technologies evolve over time.

It’s difficult to paint a picture of what the future will hold for either technology, but Bolding says that supercomputers could eventually become indistinguishable from an ‘insight device’, which is essentially an analytics engine.

Bolding said: "We believe that in the future, meaning 10 years from now, a supercomputer will be indistinguishable from an insight device.

"It will provide insight to the key problem that you have with your business and so whether that insight is what should I invest in, whether that is in to is someone trying to infiltrate, how can I predict their behaviour and act before they do."

This type of insight would require a coupling of complex simulations and complex data analytics: "I do believe that the supercomputer of tomorrow is an insight device; it has to be providing insight that changes your business on a daily basis," said Bolding.

While Cray’s supercomputers evolve to adjust to people’s needs, so does the cloud market and Cray’s future could well depend on what form that evolution takes.

If cloud companies go for a one-size-fits-all approach and simply scale out, then supercomputers will have a reasonable future, Longbottom told me.

"However, if cloud providers understand that different workloads need different resources and so create clouds that consist of a range of resource types, then more of what people see as supercomputer workloads can and will be moved to the cloud," said Longbottom.

This is already being done by some vendors like AWS with storage and IBM with SoftLayer, which could significantly increase the pressure on supercomputer vendors if more cloud players go down this route.

Vendors like HP, Dell and Supermicro are all trying to find ways to differentiate themselves from cloud because of the pressure they are under for certain workloads, but the future may hold a place for supercomputers and cloud to work side by side.

Bolding said he believes that in a modern, large commercial company there is a place for each, and that you will see the two complement each other rather than compete.

It may be the case that there will continue to be organisations that see the benefits of having a supercomputer and use the cloud as an off-load engine. This means certain batch jobs would go to the cloud, while the supercomputer is used to meet demands as and when they arise.