Big data is big business – everyone knows that. Mobile and internet technology have made data ubiquitous; virtually every activity creates a digital trace – whether it’s going for a run, doing your weekly grocery shop, or even just sitting in traffic – and this information is valuable.

Darren Watkins, managing director for VIRTUS Data Centres.

Industrial giants such as GE and Siemens now call themselves data companies, and organisations all over the world understand that harnessing and analysing data is crucial, giving them scope to identify new opportunities, make smarter business decisions and drive profits.

However, while big data strategies have become a normal part of doing business for many, that doesn’t mean they’re easy to get right. According to the NewVantage Partners Big Data Executive Survey 2017, while 95 percent of Fortune 1000 business leaders said that their organisations had undertaken a big data project in the last five years, less than half (48.4 percent) felt that their big data initiatives had achieved measurable results.

So clearly, organisations are facing some major challenges when it comes to implementing their big data strategies. More broadly, there is still an intrinsic fear factor: the belief that big data works only for the giants, who have the money and wherewithal to do it well.

To some extent that’s true. Looking at the big three, The Economist tells us: “Google can see what people search for, Facebook what they share, and Amazon what they buy. They have a ‘God’s eye view’ of activities in their own markets and beyond.” Of course, this means they can see trends in the market, allowing them to make strategic acquisitions or develop their own products – and, ultimately, helping to stifle competition.

But we believe the benefits of big data are open to all if we can better understand the challenges and pitfalls, and take positive steps to overcome them. So, let’s start by looking at the primary concerns of organisations.


Storage, storage (and more storage)

It’s perhaps no surprise that the collection and intelligent use of data has reached the top of the agenda for many organisations – and data intelligence has firmly escaped the shackles of being just “an IT issue”. But for us, while big data is undoubtedly now a strategic boardroom discussion, the real issues – and the real solutions – still sit with the IT department within an organisation, and more specifically, in the data centre.

In its Digital Universe report, IDC says that the amount of information stored in the world’s IT systems is doubling about every two years. By 2020, the total will be enough to fill a stack of tablets reaching from the earth to the moon 6.6 times – with enterprises having responsibility or liability for about 85 percent of that information. So, the most obvious challenge associated with big data is simply storing and analysing swathes of information.

At the root of the issue, big data storage has two key requirements: it must handle very large amounts of data and keep scaling to keep up with growth, and it must provide the input/output operations per second (IOPS) necessary to deliver data to analytics tools.
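As a rough illustration of that second requirement, the short Python sketch below converts a target analytics throughput into the IOPS a storage system would need to sustain. The throughput and I/O-size figures are illustrative assumptions of our own, not measurements from any particular deployment.

```python
# Back-of-envelope IOPS estimate: the I/O operations per second a storage
# tier must sustain to feed an analytics workload at a target data rate.
# All figures are illustrative assumptions, not measured values.

def required_iops(throughput_mb_per_s: float, io_size_kb: float) -> float:
    """IOPS needed to deliver the given throughput at the given I/O size."""
    return (throughput_mb_per_s * 1024) / io_size_kb

# A scan-heavy analytics job reading 2 GB/s in 256 KB requests...
print(required_iops(2048, 256))   # 8192 IOPS
# ...versus a lookup-heavy workload doing 8 KB random reads at 200 MB/s.
print(required_iops(200, 8))      # 25600 IOPS
```

The point of the comparison is that the same headline throughput can translate into very different IOPS demands depending on the access pattern – which is why the analytics workload, not just the data volume, should drive storage choices.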

Speed is also key; one of big data’s defining characteristics is the demand for real-time or near real-time responses. Financial applications need to give traders information on commodities quickly, in order to make buy or sell decisions – and succeed in a high-velocity, competitive industry. Police departments are increasingly accessing data to give real-time intelligence on suspects, or to make an immediate impact at an investigation or crime scene in a way that wouldn’t previously have been possible.

But this puts intense pressure on the security, servers, storage and network of any organisation – and the impact of these demands is being felt across the entire technological supply chain. IT departments need to deploy more forward-looking capacity management to be able to proactively meet the demands that come with processing, storing and analysing machine-generated data.
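To make that kind of forward-looking capacity management concrete, here is a minimal sketch of the projection involved, using the IDC “doubling roughly every two years” growth rate quoted earlier and an assumed starting footprint – the starting figures are purely illustrative.

```python
# Minimal capacity projection: if stored data doubles roughly every two
# years, how much capacity is needed each year, and when does currently
# provisioned capacity run out? Starting figures are illustrative assumptions.

def projected_capacity_tb(current_tb: float, years: float,
                          doubling_period_years: float = 2.0) -> float:
    """Compound growth: capacity needed after `years` at the given doubling rate."""
    return current_tb * 2 ** (years / doubling_period_years)

current_data_tb = 500    # assumed data stored today
provisioned_tb = 2000    # assumed capacity already provisioned

for year in range(1, 7):
    need = projected_capacity_tb(current_data_tb, year)
    status = "OK" if need <= provisioned_tb else "EXPAND"
    print(f"Year {year}: ~{need:,.0f} TB needed ({status})")
```

Even this crude model shows the shape of the problem: a comfortable fourfold headroom today is exhausted within four to five years at a two-year doubling rate, which is why capacity planning has to look well beyond the next budget cycle.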


Turning confusion into clarity

Historically, for a data centre to “meet new needs”, it would simply add floor space to accommodate more racks and servers. However, the demands for increased IT resources and productivity have also come hand-in-hand with increased need for higher efficiencies, better cost savings and lower environmental impact. So it’s perhaps not surprising that on-premise IT is on the decline and colocation facilities are becoming increasingly dominant within the enterprise.

High Performance Computing (HPC), once seen as the preserve of large corporations, is also now being looked at as a way to meet the challenge – requiring data centres to adopt high-density innovation strategies in order to maximise productivity and efficiency, and to increase the available power density and the “per foot” computing power of the data centre.
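To illustrate what “per foot” computing power means in practice, the sketch below compares the same floor area at standard and high rack densities. The rack powers, rack counts and servers-per-kilowatt figure are assumptions made for the example, not the specifications of any real facility.

```python
# Illustrative power-density comparison: how much compute a fixed floor area
# supports at standard versus high rack densities. All figures are assumptions
# for the sake of the example, not the specifications of any real facility.

def compute_per_sq_ft(rack_power_kw: float, racks: int,
                      floor_area_sq_ft: float, servers_per_kw: float):
    """Return (kW per sq ft, servers per sq ft) for a row of racks."""
    total_kw = rack_power_kw * racks
    return total_kw / floor_area_sq_ft, total_kw * servers_per_kw / floor_area_sq_ft

# Same 1,000 sq ft of floor space, assuming roughly 2 servers per kW of IT load:
print(compute_per_sq_ft(5, 20, 1000, 2))    # standard ~5 kW racks  -> 0.1 kW, 0.2 servers per sq ft
print(compute_per_sq_ft(30, 20, 1000, 2))   # high density ~30 kW racks -> 0.6 kW, 1.2 servers per sq ft
```

On these assumptions, the same footprint delivers six times the computing power once racks are designed, powered and cooled for high density – which is the efficiency argument behind purpose-built high and ultra-high density facilities.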

And, of course, cloud computing (an HPC user’s dream) offers almost unlimited storage and instantly available, scalable computing resources – giving enterprise users the very real opportunity to rent infrastructure that they could not otherwise afford to purchase.

But with so many choices available, there remains some confusion in the market. Looking at HPC specifically, VIRTUS’ own research showed that although most businesses are familiar with high performance computing and high-density data centre solutions, confusion remains around the overall costs associated with the technology. Businesses seem unaware of the performance and financial efficiencies achievable by deploying high performance computing into specifically designed high and ultra-high density data centres.

Of course, with confusion comes the need for information, and we’d advise seeking knowledge from experts, fully auditing the possibilities and, importantly, appreciating that one size doesn’t fit all. Organisations need to take a flexible approach to storage and processing. Companies must choose the most appropriate partner to meet their pricing and performance needs – whether on-premise, in the cloud or both – and have the flexibility to scale their storage and processing capabilities as required. They must also make sure they aren’t paying for more than they need, and look for a disruptive commercial model that gives absolute flexibility – from a rack to a suite, for a day to a decade.


The big security challenge

It’s perhaps obvious that the more data is stored, the more vital it becomes to ensure its security. The big data revolution has moved at considerable speed, and while security catches up, organisations are potentially more vulnerable.

Malicious attacks on IT systems are becoming more complex and new malware is constantly being developed – and, unfortunately, companies that work with big data face these issues on a daily basis. A lack of data security can lead to significant financial losses and reputational damage for a company, and, as far as big data is concerned, losses due to poor IT security can exceed even the worst expectations.

This is another area where colocation wins out – moving into a shared environment means that IT can more easily expand and grow without compromising security or performance. Indeed, by choosing colocation, companies are effectively renting a small slice of the best infrastructure money can buy – uninterruptible power and grid supply, backup generators, super-efficient cooling, 24/7 security and dual-path multi-fibre connectivity – all for a fraction of the cost of buying and implementing it themselves.

There are multiple websites and articles dedicated to asking the right questions of your colocation or cloud provider – and we agree that upfront peace of mind on security issues is vital. Businesses should be asking where the provider’s data centre is located, how it is protected against natural disasters, security threats and power outages, whether there is a robust disaster recovery plan and comprehensive back-up solutions, and how the provider deals with potential data loss.


Success lies in the data centre

The demands that come with big data mean that, ultimately, the data centre now sits firmly at the heart of the business. Beyond simply storing machine-generated data, the ability to access and interpret it as meaningful, actionable information – and to do so very quickly – is vitally important, and a robust, sustainable IT strategy therefore has the potential to give companies a huge competitive advantage.

Getting the data centre strategy right means that a company has an intelligent and scalable asset that enables choice and growth. Get it wrong, however, and it becomes a fundamental constraint on innovation. If organisations really want to overcome the challenges that have led so many big data projects to fail, they must ensure their data centre strategy is ready and able to deal with the next generation of computing and performance needs.